I have some logs that output information in JSON. This is for collection into Elasticsearch.
Some testers and operations people want to be able to read logs on the servers.
Here is some example JSON:
{
"#timestamp": "2015-09-22T10:54:35.449+02:00",
"#version": 1,
"HOSTNAME": "server1.example",
"level": "WARN",
"level_value": 30000,
"logger_name": "server1.example.adapter",
"message": "message"
"stack_trace": "ERROR LALALLA\nERROR INFO NANANAN\nSOME MORE ERROR INFO\nBABABABABABBA BABABABA ABABBABAA BABABABAB\n"
}
And so on.
Is it possible to make jq print a real newline instead of the \n character sequence seen in the value of .stack_trace?
Sure! Using the -r option, jq will print string contents directly to the terminal instead of as JSON escaped strings.
jq -r '.stack_trace'
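For example, assuming the log objects are in a file (call it server.log; the name is just illustrative), printing every stack trace with real line breaks could look like this:

# -r prints the string itself rather than its JSON-escaped form,
# so the \n sequences become real newlines
jq -r '.stack_trace' server.log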
Unless you're constrained to using jq only, you can "fix" (or actually "un-JSON-ify") the jq output with sed:
cat the-input | jq . | sed 's/\\n/\n/g'
If you happen to have tabs in the input as well (\t in JSON), then:
cat the-input | jq . | sed 's/\\n/\n/g; s/\\t/\t/g'
This would be especially handy if your stack_trace was generated by Java (you didn't say what the source of the logs is), as Java stack trace lines begin with <tab>at<space>.
Warning: naturally, this is not strictly correct: a JSON string containing an escaped backslash followed by n (\\n in the JSON text) will come out as a backslash followed by a real newline, whereas it should come out as \n. While not correct, it's certainly sufficient for peeking at the data by humans. The sed patterns can be further improved to take care of this (at the cost of readability).
The input as originally given isn't quite valid JSON, and it's not clear precisely what the desired output is, but the following might be of interest. It is written for the current version of jq (version 1.5) but could easily be adapted for jq 1.4:
def json2qjson:
  def pp: if type == "string" then "\"\(.)\"" else . end;
  . as $in
  | foreach keys[] as $k (null; null; "\"\($k)\": \($in[$k] | pp)" );
def data: {
  "@timestamp": "2015-09-22T10:54:35.449+02:00",
  "@version": 1,
  "HOSTNAME": "server1.example",
  "level": "WARN",
  "level_value": 30000,
  "logger_name": "server1.example.adapter",
  "message": "message",
  "stack_trace": "ERROR LALALLA\nERROR INFO NANANAN\nSOME MORE ERROR INFO\nBABABABABABBA BABABABA ABABBABAA BABABABAB\n"
};
data | json2qjson
Output:
$ jq -rnf json2qjson.jq
"#timestamp": "2015-09-22T10:54:35.449+02:00"
"#version": 1
"HOSTNAME": "server1.example"
"level": "WARN"
"level_value": 30000
"logger_name": "server1.example.adapter"
"message": "message"
"stack_trace": "ERROR LALALLA
ERROR INFO NANANAN
SOME MORE ERROR INFO
BABABABABABBA BABABABA ABABBABAA BABABABAB
"
This is the JSON I am working with:
{
"content": {
"macOS": {
"releases": [
{
"version": "3.21",
"updateItems": [
{
"id": 1,
"title": "Automatic detection for inactivity is here",
"message": "We've finally added a feature long requested - Orby now detects when you've been inactive on your computer. You can modify the maximum allowable inactive time through settings, or just turn it off if you don't need it",
"image": "https://static.image.png"
},
{
"id": 2,
"title": "In case you missed it... We have an iOS app 📱 🙌",
"message": "It's far from perfect, but it's come a long way since we first pushed out version 1.0. We don't that many users on it so far, but I'm hoping that it's useful. Please send any feedback and feature requests my way",
"image": "https://static.image.png"
}
]
}
]
},
"iOS": {
"releases": [
{
"version": "1.31",
"updateItems": [
{
"image": "https://static.image.png",
"id": 1,
"link": "orbit://com.orbit:/settings",
"message": "Strict mode offers a fantastic new way to keep your focus and get more done. To enable it, go to settings and toggle it on. Now when you want to run a timer, put your device face down on a surface. The timer will stop if you pick it up.",
"title": "Strict mode is here 🙌 ⚠️ ⚠️ ⚠️"
}
]
}
]
}
}
}
I want to translate all the title values and message values (I use translate-shell's trans command).
In my attempt, I loop through the releases to get their indices, then loop through the updateItems to get their indices, translate the title and message based on both indices, and then attempt to assign these new values in place of the existing title and message.
I've noticed that this results in all the title values being the same, and all the message values being the same.
I'm clearly using jq the wrong way, but I'm unsure how to correct it.
Please help.
curl ${URL} | jq >englishContent.json
LANGUAGES=(
# "en"
# "de"
# "fr"
# "es"
"it"
# "ja"
# "ko"
# "nl"
# "pt-BR"
# "ru"
# "zh-Hans"
)
for language in $LANGUAGES; do
  # Create new json payload from the downloaded english content json
  result=$(jq '{ "language": "'$language'", "response": . }' englishContent.json)
  # Get the total number of release items
  macOS_releases_length=$(jq -r ".response.content.macOS.releases | length" <(echo "$result"))
  # Iterate over releases items and then use index to substitute values into nested arrays
  macOS_releases_length=$(expr "$macOS_releases_length" - 1)
  for macOS_release_index in $(seq 0 $macOS_releases_length); do
    update_items_length=$(jq ".response.content.macOS.releases[$macOS_release_index].updateItems | length" <(echo "$result"))
    update_items_length=$(expr "$update_items_length" - 1)
    for update_item_index in $(seq 0 $update_items_length); do
      title=$(jq ".response.content.macOS.releases[$macOS_release_index].updateItems[$update_item_index].title" <(echo "$result"))
      translated_title=$(trans -brief -no-warn :$language $title | xargs)
      message=$(jq ".response.content.macOS.releases[$macOS_release_index].updateItems[$update_item_index].message" <(echo "$result"))
      translated_message=$(trans -brief -no-warn :$language $message | xargs)
      result=$(jq --arg release_index "$(echo "$macOS_release_index" | jq 'tonumber')" --arg item_index "$("$update_item_index" | jq 'tonumber')" --arg new_title $translated_title '.response.content.macOS.releases['$release_index'].updateItems['$item_index'].title |= $new_title' <(echo "$result"))
      result=$(jq --arg release_index "$(echo "$macOS_release_index" | jq 'tonumber')" --arg item_index "$("$update_item_index" | jq 'tonumber')" --arg new_message $translated_message '.response.content.macOS.releases['$release_index'].updateItems['$item_index'].message |= $new_message' <(echo "$result"))
    done
  done
  echo $result
done
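For what it's worth, a likely reason every title and message ends up identical is the quoting in the last two jq calls: '$release_index' and '$item_index' sit outside the single quotes, so they are expanded by the shell, not by jq, and those shell variables are never set (the loop variables are named macOS_release_index and update_item_index). The path therefore degenerates to releases[].updateItems[], writing the same value into every item. Below is a minimal sketch of the in-place update for one item, passing the numeric indices with --argjson instead of splicing them into the filter text (variable names are the ones from your loop; the exact layout is just illustrative):

# hypothetical replacement for the two assignment calls inside the inner loop
result=$(jq --argjson r "$macOS_release_index" \
            --argjson i "$update_item_index" \
            --arg new_title "$translated_title" \
            --arg new_message "$translated_message" \
            '.response.content.macOS.releases[$r].updateItems[$i].title = $new_title
             | .response.content.macOS.releases[$r].updateItems[$i].message = $new_message' \
            <<<"$result")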
I have a JSON endpoint that I can fetch with curl, and a local YAML file. I want to get the difference between the two and then delete the extra entry from the endpoint, using the id that belongs to its name.
The JSON endpoint:
[
{
"hosts": [
"server1"
],
"id": "qz9o847b-f07c-49d1-b1fa-e5ed0b2f0519",
"name": "V1_toto_a"
},
{
"hosts": [
"server2"
],
"id": "a6aa847b-f07c-49d1-b1fa-e5ed0b2f0519",
"name": "V1_tata_b"
},
{
"hosts": [
"server3"
],
"id": "a6d9ee7b-f07c-49d1-b1fa-e5ed0b2f0519",
"name": "V1_titi_c"
}
]
files.yml
---
instance:
  toto:
    name: "toto"
  tata:
    name: "tata"
Between the JSON endpoint and the local file, I want to delete the entry for titi by its id, because it is the difference between the two sources.
declare -a arr=(_a _b _c)
ar=$(cat files.yml | grep name | cut -d '"' -f2 | tr "\n" " ")
fileItemArray=($ar)
ARR_PRE=("${fileItemArray[@]/#/V1_}")
for i in "${arr[@]}"; do local_var+=("${ARR_PRE[@]/%/$i}"); done
remote_var=$(curl -sX GET "XXXX" | jq -r '.[].name | @sh' | tr -d \'\")
diff_=$(echo ${local_var[@]} ${remote_var[@]} | tr ' ' '\n' | sort | uniq -u)
The output is titi.
The code works, but now I want to delete titi by its id dynamically:
curl -X DELETE "XXXX" $id_titi
I am trying to do this with a bash script, but I have no idea how to continue...
Your endpoint is not proper JSON, as it has commas after the .name fields but no following field, and no commas between the elements of the top-level array.
If this is not just a typo from pasting your example into this question, then you'd need to address this first before proceeding. This is how it should look:
[
{
"hosts": [
"server1"
],
"id": "qz9o847b-f07c-49d1-b1fa-e5ed0b2f0519",
"name": "toto"
},
{
"hosts": [
"server2"
],
"id": "a6aa847b-f07c-49d1-b1fa-e5ed0b2f0519",
"name": "tata"
},
{
"hosts": [
"server3"
],
"id": "a6d9ee7b-f07c-49d1-b1fa-e5ed0b2f0519",
"name": "titi"
}
]
If your endpoint is proper JSON, try the following. It extracts the names from your .yml file (just as you do - there are plenty of more efficient and less error-prone ways, but I'm trying to stay as close to your approach as possible), but instead of a Bash array it generates a JSON array using jq, which for Bash is a simple string. For your curl output it's basically the same thing, extracting a (JSON) array of names into a Bash string. Note that in both cases I use quotes <var>="$(…)" to capture strings that may include spaces (although I also use the -c option for jq to compact its output to a single line). For the difference between the two, everything is taken over by jq, as it can easily be fed the JSON arrays as variables, perform the subtraction, and output in your preferred format:
fromyml="$(cat files.yml | grep name | cut -d '"' -f2 | jq -Rnc '[inputs]')"
fromcurl="$(curl -sX GET "XXXX" | jq -c 'map(.name)')"
diff="$(jq -nr --argjson fromyml "$fromyml" --argjson fromcurl "$fromcurl" '
$fromcurl - $fromyml | .[]
')"
The Bash variable diff now contains a list of names only present in the curl output ($fromcurl - $fromyml), one per line (if, unlike in your example, there happens to be more than one). If the curl output has duplicates, they will still be included (use $fromcurl - $fromyml | unique | .[] to get rid of them):
titi
As you can see, this solution has three calls to jq. I'll leave it to you to further reduce that number as it fits your general workflow (basically, it can be put together into one).
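To actually issue the DELETE calls, you still need the id that goes with each name in diff. One way (a sketch, assuming the corrected endpoint JSON shown above and keeping "XXXX" as the placeholder URL) is to let jq pick the objects whose name is missing from the YAML list and emit their ids:

endpoint="$(curl -sX GET "XXXX")"
# ids of all endpoint entries whose name does not appear in $fromyml
jq -r --argjson fromyml "$fromyml" \
  '.[] | select(.name as $n | $fromyml | index($n) | not) | .id' <<<"$endpoint" |
while IFS= read -r id; do
  # adjust the URL shape to whatever your DELETE endpoint expects
  curl -X DELETE "XXXX/$id"
done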
Getting the output of a program into a variable can be done using read.
perl -M5.010 -MYAML -MJSON::PP -e'
sub get_next_file { local $/; "".<> }
my %filter = map { $_->{name} => 1 } values %{ Load(get_next_file)->{instance} };
say for grep !$filter{$_}, map $_->{name}, @{ decode_json(get_next_file) };
' b.yaml a.json |
while IFS= read -r id; do
curl -X DELETE ..."$id"...
done
I used Perl here because what you had (grep + cut) is not a real way to parse a YAML file. The snippet requires the YAML Perl module to be installed.
What am I doing?
I have one JSON file, sonar-report.json. I want to iterate over sonar-report.json in a shell script to read values from the JSON.
To parse the JSON file I am using jq: https://stedolan.github.io/jq/
So I was trying to execute the following code in a shell script:
alias jq=./jq-win64.exe
for key in $(jq '.issues | keys | .[]' sonar-report.json); do
  echo "$key"
  line=$(jq -r ".issues[$key].line" sonar-report.json)
done
Problem
When I execute this, the console gives me this error:
jq: error: syntax error, unexpected INVALID_CHARACTER (Windows cmd shell quoting issues?) at <top-level>, line 1:
If I update my script above and use static array indices, then the script works fine:
alias jq=./jq-win64.exe
for key in $(jq '.issues | keys | .[]' sonar-report.json); do
  echo "$key"
  line0=$(jq -r ".issues[0].line" sonar-report.json)
  line1=$(jq -r ".issues[1].line" sonar-report.json)
done
So, in the end, what I want is to iterate over the values and print them to the console like this:
alias jq=./jq-win64.exe
for key in $(jq '.issues | keys | .[]' sonar-report.json); do
  line=$(jq -r ".issues[$key].line" sonar-report.json)
  echo $line
done
So the output should be:
15
This is my JSON file, sonar-report.json:
{
"issues": [
{
"key": "016B7970D27939AEBD",
"component": "bits-and-bytes:src/main/java/com/catalystone/statusreview/handler/StatusReviewDecisionLedHandler.java",
"line": 15,
"startLine": 15,
"startOffset": 12,
"endLine": 15,
"endOffset": 14,
"message": "Use the \"equals\" method if value comparison was intended.",
"severity": "MAJOR",
"rule": "squid:S4973",
"status": "OPEN",
"isNew": true,
"creationDate": "2019-06-21T15:19:18+0530"
},
{
"key": "AWtqCc-jtovxS8PJjBiP",
"component": "bits-and-bytes:src/test/java/com/catalystone/statusreview/service/StatusReviewInitiationSerivceTest.java",
"message": "Fix failing unit tests on file \"src/test/java/com/catalystone/statusreview/service/StatusReviewInitiationSerivceTest.java\".",
"severity": "MAJOR",
"rule": "common-java:FailedUnitTests",
"status": "OPEN",
"isNew": false,
"creationDate": "2019-06-18T15:32:08+0530"
}
]
}
Please help me. Thanks in advance.
This looks to me like an instance of Windows/Unix line-ending incompatibility, indicated in jq bugs 92 (for Cygwin) and 1870 (for MSYS2).
Any of the workarounds indicated in those bug reports should work, but once the fix gets into the release binary (presumably v1.7), the simplest solution is to use the new -b command-line option. (The option is available in recent jq preview builds; see the second bug report listed above):
for key in $(jq -b '.issues | keys | .[]' sonar-report.json); do
  line=$(jq -rb ".issues[$key].line" sonar-report.json)
  # I added quotes in the next line, because it's better style.
  echo "$line"
done
Until the next version of jq is available, or if you don't want to upgrade for some reason, a good workaround is to just remove the CRs by piping the output of jq through tr -d '\r':
for key in $(jq '.issues | keys | .[]' sonar-report.json | tr -d '\r'); do
  line=$(jq -r ".issues[$key].line" sonar-report.json | tr -d '\r')
  echo "$line"
done
However, as pointed out in a comment by Cyrus, you probably don't need to iterate line-by-line in a shell loop, which is incredibly inefficient since it leads to reparsing the entire JSON input many times. You can use jq itself to iterate, with the much simpler:
jq '.issues[].line' sonar-report.json
which will parse the JSON file just once, and then produce each .line value in the file. (You probably still want to use the -b command-line option or other workaround, depending on what you intend to do with the output.)
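If you do need each value in a shell variable, you can still keep it to a single parse by reading jq's output into a Bash array (a sketch for Bash 4+, reusing the tr workaround; note that issues without a .line field come out as null):

# one parse of the file; strip CRs so the values are clean with Windows builds of jq
readarray -t lines < <(jq '.issues[].line' sonar-report.json | tr -d '\r')
for line in "${lines[@]}"; do
  echo "$line"
done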
I am new to jq, so if this is not a jq question or a JSON question, please point me in the right direction. I am not sure of the correct terminology, which makes it hard for me to properly articulate the problem.
I am using curl to pull some JSON, and I want to filter the records by specific values. Here is some of the sample JSON:
{
"id": "593f468c81aaa30001960e16",
"name": "Name 1",
"channels": [
"593f38398481bc00019632e5"
],
"geofenceProfileId": null
}
{
"id": "58e464585180ac000a748b57",
"name": "Name 2",
"channels": [
"58b480097f04f20007f3cdca",
"580ea26616de060006000001"
],
"geofenceProfileId": null
}
{
"id": "58b4d6db7f04f20007f3cdd2",
"name": "Name 3",
"channels": [
"58b8a25cf9f6e19cf671872f"
],
"geofenceProfileId": "57f53018271c810006000001"
}
When I run the following command:
curl -X GET -H 'authorization: Basic somestring=' "https://myserver/myjson" |
jq '.[] | {id: .id, name: .name, channels: .channels, geofenceProfileId: .geofenceProfileId}' |
jq '.[] | select(.channels == 58b8a25cf9f6e19cf671872f)'
I get the following error:
jq: error: syntax error, unexpected IDENT, expecting ';' or ')' (Unix shell quoting issues?) at , line 1:
.[] | select(.channels == 58b8a25cf9f6e19cf671872f)
jq: 1 compile error
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 351k 0 351k 0 0 1109k 0 --:--:-- --:--:-- --:--:-- 1110k
Is this error because jq pretty-prints the output of the first statement, while the second statement expects it in one code block? If so, how do I convert it back to a non-pretty-printed format, or how can I use jq to run a new filter on the output?
Basically I am trying to parse hundreds of records and filter out all of the records that are in a specific channel number or have a specific geofenceProfileId.
I'd suggest you start with:
jq 'select(.channels | index("58b8a25cf9f6e19cf671872f"))'
In fact, this might even be exactly the filter you want. If you want to remove the "channels" once you've made the selection, you could augment the filter above as follows:
select(.channels | index("58b8a25cf9f6e19cf671872f")) | del(.channels)
The main thing to note is that one can create a pipeline WITHIN a single invocation of jq. So most likely you'll end up with: curl ... | jq ...
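Concretely, that single pipeline might look something like this (a sketch reusing the ids from your sample; adjust or drop either condition as needed, and if the endpoint actually returns a top-level array, as your first command suggests, prepend .[] | to the filter):

curl -X GET -H 'authorization: Basic somestring=' "https://myserver/myjson" |
  jq 'select((.channels | index("58b8a25cf9f6e19cf671872f")) or (.geofenceProfileId == "57f53018271c810006000001"))'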
Btw
The jq expression {"id": .id} can be abbreviated to {id}, so instead of:
{id: .id, name: .name, channels: .channels, geofenceProfileId: .geofenceProfileId}
you could write:
{id, name, channels, geofenceProfileId}
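So, for instance, the projection step of your original pipeline shrinks to:

jq '.[] | {id, name, channels, geofenceProfileId}'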
Probably not related to your case, but I managed to transform my command
npm pkg get version -ws | jq "select(to_entries | min_by(.value) | .value)"
into
npm pkg get version -ws | jq "to_entries | min_by(.value) | .value"
and the result is the same. Maybe it helps. So the idea is to pipe inside the jq statement.