How to extract multiple values from a JSON object using the jq command

I am trying to get multiple values from a JSON object:
{
  "nextToken": "9i2x1mbCpfo5hQ",
  "jobSummaryList": [
    {
      "jobName": "012210",
      "jobId": "0196f81cae73"
    }
  ]
}
I want nextToken's value and jobName in one jq command.

Use jq's comma operator, which runs both filters on the same input (see the jq manual: https://stedolan.github.io/jq/manual/):
jq '.nextToken, .jobSummaryList[].jobName' file
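For example, against the sample input above this prints each value on its own line; adding -r emits raw strings without the JSON quotes (a sketch, assuming the input is saved in file.json):

```shell
# The comma operator feeds the same input to both filters,
# producing one result per filter, in order
jq -r '.nextToken, .jobSummaryList[].jobName' file.json
```
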

Related

jq: Conditional insert using "lookup" & "target" JSON objects

I'm trying to improve a bash script I wrote using jq (Python version), but can't quite get the conditional nature of the task at hand to work.
The task: insert array from one JSON object ("lookup") into another ("target") only if the key of the "lookup" matches a particular "higher-level" value in the "target". Assume that the two JSON objects are in lookup.json and target.json, respectively.
A minimal example to make this clearer:
"Lookup" JSON:
{
  "table_one": [
    "a_col_1",
    "a_col_2"
  ],
  "table_two": [
    "b_col_1",
    "b_col_2",
    "b_col_3"
  ]
}
"Target" JSON:
{
  "top_level": [
    {
      "name": "table_one",
      "tests": [
        {
          "test_1": {
            "param_1": "some_param"
          }
        },
        {
          "test_2": {
            "param_1": "another_param"
          }
        }
      ]
    },
    {
      "name": "table_two",
      "tests": [
        {
          "test_1": {
            "param_1": "some_param"
          }
        },
        {
          "test_2": {
            "param_1": "another_param"
          }
        }
      ]
    }
  ]
}
I want the output to be:
{
  "top_level": [
    {
      "name": "table_one",
      "tests": [
        {
          "test_1": {
            "param_1": "some_param"
          }
        },
        {
          "test_2": {
            "param_1": "another_param",
            "param_2": [
              "a_col_1",
              "a_col_2"
            ]
          }
        }
      ]
    },
    {
      "name": "table_two",
      "tests": [
        {
          "test_1": {
            "param_1": "some_param"
          }
        },
        {
          "test_2": {
            "param_1": "another_param",
            "param_2": [
              "b_col_1",
              "b_col_2",
              "b_col_3"
            ]
          }
        }
      ]
    }
  ]
}
Hopefully, that makes sense. Early attempts slurped both JSON blobs and assigned them to two variables. I'm trying to select for a match on [roughly] ($lookup | keys[]) == $target.top_level.name, but I can't quite get this match or the subsequent array insert working.
Any advice is well-received!
Assuming the JSON samples have been corrected, and that the following program is in the file "target.jq", the invocation:
jq --argfile lookup lookup.json -f target.jq target.json
produces the expected result.
target.jq
.top_level |= map(
  $lookup[.name] as $value
  | .tests |= map(
      if has("test_2")
      then .test_2.param_2 = $value
      else . end) )
Caveat
Since --argfile is officially deprecated, you might wish to choose an alternative method of passing in the contents of lookup.json, but --argfile is supported by all extant versions of jq as of this writing.
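For instance, --slurpfile is one such alternative. It binds the variable to an array of all the JSON values in the file, so the lookup object becomes $lookup[0]; here is a sketch of the same program adapted accordingly:

```shell
# Same logic as target.jq, but the lookup table arrives via --slurpfile,
# which wraps the file's contents in an array (hence the [0] index)
jq --slurpfile lookup lookup.json '
  .top_level |= map(
    $lookup[0][.name] as $value
    | .tests |= map(
        if has("test_2")
        then .test_2.param_2 = $value
        else . end))
' target.json
```
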
The jq answer is already given, but the ask itself is fascinating: it requires a cross-lookup from a source file into the file being inserted, so I couldn't help also providing an alternative solution using the jtc utility:
<target.json jtc -w'<name>l:<N>v[-1][tests][-1:][0]' \
-i file.json -i'<N>t:' -T'{"param_2":{{}}}'
A brief overview of the options used:
-w'<name>l:<N>v[-1][tests][-1:][0]' - selects the insertion points in the source (target.json): it finds each name key and memorizes its value in namespace N (to be looked up in the inserted file), then walks 1 level up the JSON tree, selects the tests label, then the last entry in it, and finally addresses the first element of that entry
-i file.json - makes the insertion from the file
-i'<N>t:' - this walk over file.json recursively finds the tag (label) preserved in namespace N by the corresponding -w walk (without this insert option carrying a walk argument, the whole file would be inserted at the -w insertion points)
-T'{"param_2":{{}}}' - finally, a template is applied to the insertion result, transforming the found entry (in file.json) into one with the right label
PS. I'm the developer of jtc - a multithreaded JSON processing utility for Unix.
PPS. The disclaimer is required by SO.

Use jq to replace many values with variable values

Using jq, is it possible to replace the value of each parameter in the sample JSON with the value of the variable that is the initial value?
In my scenario, Azure DevOps does not carry out any kind of variable substitution on the JSON file, so I need to do it manually. So, for example, if $SUBSCRIPTION_ID is set to abc-123, I'd like to use jq to update the JSON file.
I can pull out the values using .parameters[].value, but I can't seem to find a way of setting each individual value.
The main challenge here is that the solution should be reusable, and different JSON files will have different parameters, so I don't think I can use --argjson.
Example
Original JSON
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/parametersTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "subscriptionId": {
      "value": "$SUBSCRIPTION_ID"
    },
    "topicName": {
      "value": "$TOPIC_NAME"
    }
  }
}
Variables
SUBSCRIPTION_ID="abc-123"
TOPIC_NAME="SomeTopic"
Desired JSON
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/parametersTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "subscriptionId": {
      "value": "abc-123"
    },
    "topicName": {
      "value": "SomeTopic"
    }
  }
}
Export those variables so that you can access them from within jq.
export SUBSCRIPTION_ID TOPIC_NAME
jq '.parameters[].value |= (env[.[1:]] // .)' file
The // . part leaves variables absent from the environment as-is; you can drop it if that's not needed.
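To illustrate (a sketch; the variables must be exported so jq's env builtin can see them):

```shell
export SUBSCRIPTION_ID="abc-123" TOPIC_NAME="SomeTopic"
# env[.[1:]] strips the leading "$" and looks the remaining name up
# in the environment; "// ." keeps the original value on a miss
jq '.parameters[].value |= (env[.[1:]] // .)' file.json
```
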
Use --arg; essentially, you are just going to ignore the attempt at parameterizing the JSON and simply replace the values unconditionally.
jq --arg x "$SUBSCRIPTION_ID" \
   --arg y "$TOPIC_NAME" \
   '.parameters.subscriptionId.value = $x
    | .parameters.topicName.value = $y' \
   config.json
Note that --arg passes the shell value as a JSON string, whereas --argjson would reject a bare string like abc-123; also, assignments are chained with | (a ; here would be a jq syntax error).
Here is a "data-driven" approach based on the contents of the schema and the available environment variables:
export SUBSCRIPTION_ID="abc-123"
export TOPIC_NAME="SomeTopic"
< schema.json jq '.parameters
|= map_values(if .value | (startswith("$") and env[.[1:]])
then .value |= env[.[1:]] else . end)'
Notice that none of the template names appear in the jq program.
If your shell supports it, you could avoid the "export" commands by prefacing the jq command with the variable assignments along the lines of:
SUBSCRIPTION_ID="abc-123" TOPIC_NAME="SomeTopic" jq -f program.jq schema.json
Caveat
Using environment variables to pass in the parameter values may not be such a great idea. Two alternatives would be to provide the name-value pairs in a text file or as a JSON object. See also Using jq as a template engine

transform json to add array objects

I need to transform an array by adding additional objects -
I have:
"user_id":"testuser"
"auth_token":"abcd"
I need:
"key":"user_id"
"value":"testuser"
"key":"auth_token"
"value":"abcd"
I have been using jq but can't figure out how to do it. Do I need to transform this into a multi-dimensional array first?
I have tried multiple jq queries but can't find one that works.
When I try using jq I get
jq: error: syntax error, unexpected $end, expecting QQSTRING_TEXT or QQSTRING_INTERP_START or QQSTRING_END (Unix shell quoting issues?) at , line 1
Your input is not JSON; it's just a bunch of what could be thought of as key/value pairs. Assuming your JSON input actually looked like this:
{
  "user_id": "testuser",
  "auth_token": "abcd"
}
You could get an array of key/value pair objects using to_entries.
$ jq 'to_entries' input.json
[
  {
    "key": "user_id",
    "value": "testuser"
  },
  {
    "key": "auth_token",
    "value": "abcd"
  }
]
If, on the other hand, your input really is those bare key/value lines, you would need to convert them into a format that can be processed. Fortunately, you can read them in as raw strings and parse them with regular expressions or basic string manipulation.
$ jq -Rn '[inputs|capture("\"(?<key>[^\"]+)\":\"(?<value>[^\"]*)\"")]' input.txt
$ jq -Rn '[inputs|split(":")|map(fromjson)|{key:.[0],value:.[1]}]' input.txt
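For example, writing the raw lines to a file and running the capture-based variant (a sketch; the file name input.txt is an assumption):

```shell
# Create a file of bare key/value lines (not valid JSON on its own)
printf '%s\n' '"user_id":"testuser"' '"auth_token":"abcd"' > input.txt
# -R reads raw text lines; -n plus inputs consumes them all into one array
jq -Rn '[inputs | capture("\"(?<key>[^\"]+)\":\"(?<value>[^\"]*)\"")]' input.txt
```
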
You can use the to_entries filter for that.
Robust conversion of key:value lines to JSON.
If the key:value specifications would be valid JSON except for the
missing punctuation (opening and closing braces etc), then a simple and quite robust approach to converting these key:value pairs to a single valid JSON object is illustrated by the following:
cat <<EOF | jq -nc -R '["{" + inputs + "}" | fromjson] | add'
"user_id": "testuser"
"auth_token" : "abcd"
EOF
Output
{
  "user_id": "testuser",
  "auth_token": "abcd"
}

How to create key of object in json with jq?

I have a JSON file as below:
{
  "HealthCheckPath": "/",
  "HealthCheckIntervalSeconds": 30
}
I want to create a key named Test. My desired result:
"Test": {
  "HealthCheckPath": "/",
  "HealthCheckIntervalSeconds": 30
}
I want to do this with jq, or some other way via the bash shell.
Run the following to get the desired output:
jq '{ "Test" : .}' file.json

Need help! - Unable to load JSON using COPY command

Need your expertise here!
I am trying to load a JSON file (generated by JSON dumps) into Redshift using the COPY command; the file is in the following format:
[
  {
    "cookieId": "cb2278",
    "environment": "STAGE",
    "errorMessages": [
      "70460"
    ]
  },
  {
    "cookieId": "cb2271",
    "environment": "STG",
    "errorMessages": [
      "70460"
    ]
  }
]
We ran into the error "Invalid JSONPath format: Member is not an object."
When I get rid of the square braces [] and remove the "," separators between the JSON dicts, it loads perfectly fine:
{
  "cookieId": "cb2278",
  "environment": "STAGE",
  "errorMessages": [
    "70460"
  ]
}
{
  "cookieId": "cb2271",
  "environment": "STG",
  "errorMessages": [
    "70460"
  ]
}
But in reality, most JSON files from APIs have this formatting.
I could do a string replace or regex to get rid of the , and [], but I am wondering if there is a better way to load into Redshift seamlessly without modifying the file.
One way to convert a JSON array into a stream of the array's elements is to pipe the former into jq '.[]'. The output is sent to stdout.
If the JSON array is in a file named input.json, then the following command will produce a stream of the array's elements on stdout:
$ jq ".[]" input.json
If you want the output in JSON Lines format, use the -c switch (i.e. jq -c '.[]' input.json).
For more on jq, see https://stedolan.github.io/jq
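For example, with the two-element array from the question (a sketch; the array is assumed to be saved in input.json):

```shell
# Stream the array's elements as individual JSON values, one per line,
# which is the newline-delimited format Redshift's COPY expects
jq -c '.[]' input.json
```
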