How to insert JSON as a string into another JSON

I am writing a script (a bash script for an Azure pipeline) and I need to combine JSON from different variables. For example, I have:
TYPE='car'
COLOR='blue'
ADDITIONAL_PARAMS='{"something": "big", "etc":"small"}'
So, as you can see, I have several string variables and one which contains JSON.
I need to combine these variables in this format (and I can't :( ):
some_script --extr-vars --extra_vars '{"var_type": "'$TYPE'", "var_color": "'$COLOR'", "var_additional_data": "'$ADDITIONAL_PARAMS'"}'
But this combination does not work; I end up with a string like:
some_script --extr-vars --extra_vars '{"var_type": "car", "var_color": "blue", "var_additional_data": " {"something": "big", "etc":"small"} "}'
which is not correct, valid JSON.
How can I combine existing JSON (already formatted with double quotes ") with other variables? I am using bash / console / the yq utility (to convert YAML to JSON)

Use jq to generate the JSON. (You can probably do this in one step with yq, but I'm not as familiar with that tool.)
ev=$(jq --arg t "$TYPE" \
        --arg c "$COLOR" \
        --argjson ap "$ADDITIONAL_PARAMS" \
        -n '{var_type: $t, var_color: $c, var_additional_data: $ap}')
some_script --extr-vars --extra_vars "$ev"
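For the variables in the question, $ev then carries the additional params embedded as a real object rather than a quoted string (jq pretty-prints by default):
echo "$ev"
{
  "var_type": "car",
  "var_color": "blue",
  "var_additional_data": {
    "something": "big",
    "etc": "small"
  }
}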

Related

Unable to assign multiple values from a file to a json request

I have a simple file with this test line:
mmm@gmail.com 31460 147557432
My goal is to send it as JSON data.
In my while loop I can echo the variables in the second line of my code example.
However, when I attempt to assign them to jsonstring and echo, the values are not populated.
What do I need to do to pass these values to my json string?
while read emailvar idvar expirevar; do
echo "$emailvar:$expirevar:$idvar"
jsonstring=$idvar $emailvar $expirevar
echo "$jsonstring"
#jsonstring='{"user_id":"$idvar","email":"$emailvar","custom_attributes":{"Program_Expires_at":"$expirevar"}}'
done < "tempdata.txt"
#!/bin/bash
while read line;
do
line_array=($line)
emailvar=${line_array[0]}
expirevar=${line_array[1]}
idvar=${line_array[2]}
jsonstring='{"user_id": "'$idvar'", "email": "'$emailvar'", "custom_attributes":{"Program_Expires_at": "'$expirevar'"}'
echo $jsonstring
done < 'tempdata.txt'
Output:
You have to escape the whitespace to make it part of the string, rather than creating a simple command with some pre-command assignments.
jsonstring=$idvar\ $emailvar\ $expirevar
more commonly written as
jsonstring="$idvar $emailvar $expirevar"
In your commented assignment, you used single quotes, which prevent parameter expansion. You need to use double quotes, which requires manually escaping the interior double quotes. More robust, though, is to use a tool like jq to generate the JSON for you: it will take care of escaping any characters in your variables to generate valid JSON.
jsonstring=$(jq -n \
--arg id "$idvar" \
--arg email "$emailvar" \
--arg expire "$expirevar" \
'{user_id: $id,
email: $email,
custom_attributes: {Program_Expires_at: $expire}}'
)
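For the sample line in tempdata.txt this produces the following (note that --arg always passes values as strings; use --argjson, or a tonumber filter, if you want the numbers unquoted):
{
  "user_id": "31460",
  "email": "mmm@gmail.com",
  "custom_attributes": {
    "Program_Expires_at": "147557432"
  }
}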
It seems this is essentially a problem with how bash handles variables and parameter expansion. I believe the solution here is basically to add a bunch of double quotes.
Double quotes enable parameter expansion for the variables; for JSON output in this bash script, we'll need nested double quotes.
To fix this, we can:
put double quotes (") surrounding the value for jsonstring
escape double quotes surrounding strings used within the value for jsonstring with \
If you'd like $idvar and $expirevar to be interpreted as numbers instead of strings, you don't need escaped double-quotes around these values.
For example:
#!/bin/bash
while read emailvar idvar expirevar; do
jsonstring="{\"user_id\":$idvar,\"email\":\"$emailvar\",\"custom_attributes\":{\"Program_Expires_at\":$expirevar}}"
echo "$jsonstring"
done < "tempdata.txt"
Example output:
user@pc: bash ./script.sh
{"user_id":31460,"email":"mmm@gmail.com","custom_attributes":{"Program_Expires_at":147557432}}
user@pc: bash ./script.sh | jq .
{
"user_id": 31460,
"email": "mmm#gmail.com",
"custom_attributes": {
"Program_Expires_at": 147557432
}
}

How to extract elements from a string value in json, using jq [duplicate]

I'm trying to get jq to parse a JSON structure like:
{
"a" : 1,
"b" : 2,
"c" : "{\"id\":\"9ee ...\",\"parent\":\"abc...\"}\n"
}
That is, an element in the JSON is a string with escaped json.
So, I have something along the lines of
$ jq [.c] myFile.json | jq [.id]
But that crashes with jq: error: Cannot index string with string
This is because the output of .c is a string, not more JSON.
How do I get jq to parse this string?
My initial solution is to use sed to replace all the escape chars (\":\", \",\" and \") but that's messy, I assume there's a way built into jq to do this?
Thanks!
edit:
Also, the jq version available here is:
$ jq --version
jq version 1.3
I guess I could update it if required.
jq has the fromjson builtin for this:
jq '.c | fromjson | .id' myFile.json
fromjson was added in version 1.4.
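If you want the id as a raw string without the surrounding JSON quotes, add the -r flag:
jq -r '.c | fromjson | .id' myFile.json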
You can use the raw output option (-r), which will unescape characters:
jq -r .c myfile.json | jq .id
ADDENDUM: This has the advantage that it works in jq 1.3 and up; indeed, it should work in every version of jq that has the -r option.
Motivation: you want to parse a JSON string - that is, you have a JSON object that's wrapped in quotes and represented as a string buffer, and you want to unescape it and convert it into a valid JSON object. For example:
an escaped JSON string:
"{\"name\":\"John Doe\",\"position\":\"developer\"}"
the expected result ( a JSON object ):
{"name":"John Doe","position":"developer"}
Solution: In order to unescape a JSON string and convert it into a valid JSON object, use the sed tool on the command line with regex expressions to remove/replace specific characters:
cat current_json.txt | sed -e 's/\\\"/\"/g' -e 's/^.//g' -e 's/.$//g'
s/\\\"/\"/g replacing all backslashes and quotes ( \" ) into quotes only (")
s/^.//g replacing the first character in the stream to none character
s/.$//g replacing the last character in the stream to none character
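Applied to the example above (assuming current_json.txt contains the escaped string):
$ cat current_json.txt
"{\"name\":\"John Doe\",\"position\":\"developer\"}"
$ cat current_json.txt | sed -e 's/\\\"/\"/g' -e 's/^.//g' -e 's/.$//g'
{"name":"John Doe","position":"developer"}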

Passing a path ("key1.key2") from a bash variable to jq

I am having trouble accessing bash variable inside 'jq'.
The snippet below shows my bash loop to check for missing keys in a JSON file.
#!/bin/sh
for key in "key1" "key2.key3"; do
echo "$key"
if ! cat ${JSON_FILE} | jq --arg KEY "$key" -e '.[$KEY]'; then
missingKeys+=${key}
fi
done
JSON_FILE:
{
"key1": "val1",
"key2": {
"key3": "val3"
}
}
The script works correctly for top level keys such as "key1". But it does not work correctly (returns null) for "key2.key3".
'jq' on the command line does return the correct value
cat input.json | jq '.key2.key3'
"val3"
I followed answers from other posts to come to this solution. However can't seem to figure out why it does not work for nested json keys.
Using --arg prevents your data from being incorrectly parsed as syntax. Usually, a shell variable you're passing into jq contains literal data, so this is the correct thing.
In this case, your variable contains syntax, not literal data: The . isn't part of the string you want to do a lookup by, but is instead an instruction to jq to do two separate lookups one after the other.
So, in this case, you should do the more obvious thing, instead of using --arg:
jq -e ".$KEY"

Get field from JSON object using jq and command line argument

Assume the following JSON file
{
"foo": "hello",
"bar": "world"
}
I want to get the foo field from the JSON object in a standalone object, and I do this:
<file jq '{foo}'
{
"foo": "hello"
}
Now the field I actually want is coming from the shell and is given to jq as an argument like this:
<file jq --arg myarg "foo" '{$myarg}'
{
"myarg": "foo"
}
Unfortunately this doesn't give the expected result {"foo":"hello"}.
Any idea why the name of the variable gets into the object?
A workaround to this is to explicitly define the object:
<file jq '{($myarg):.[$myarg]}'
Fine, but is there a way to use the shortcut syntax as explained in the man page, but with a variable?
You can use this to select particular fields of an object: if the input is an object with “user”, “title”, “id”, and “content” fields and you just want “user” and “title”, you can write
{user: .user, title: .title}
Because that is so common, there’s a shortcut syntax for it: {user, title}.
If that matters, I'm using jq version 1.5
In short, no. The shortcut syntax can only be used under very special conditions. For example, it cannot be used with key names that are jq keywords.
Alternatives
The method described in the Q is the preferred one, but for the record, here are two alternatives:
jq --arg myarg "foo" '
.[$myarg] as $v | {} | .[$myarg] = $v'
And of course there's the alternative that comes with numerous caveats:
myarg=foo ; jq "{ $myarg }"
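For reference, with the sample file the explicit form from the question produces the expected result:
<file jq --arg myarg "foo" '{($myarg): .[$myarg]}'
{
  "foo": "hello"
}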

Build a JSON string with Bash variables

I need to read these bash variables into my JSON string and I am not familiar with bash. Any help is appreciated.
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING='{"bucketname":"$BUCKET_NAME"","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'
echo $JSON_STRING
You are better off using a program like jq to generate the JSON, if you don't know ahead of time if the contents of the variables are properly escaped for inclusion in JSON. Otherwise, you will just end up with invalid JSON for your trouble.
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING=$( jq -n \
--arg bn "$BUCKET_NAME" \
--arg on "$OBJECT_NAME" \
--arg tl "$TARGET_LOCATION" \
'{bucketname: $bn, objectname: $on, targetlocation: $tl}' )
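With the values above, $JSON_STRING then holds correctly escaped JSON (jq pretty-prints by default; add -c if you want a single line):
echo "$JSON_STRING"
{
  "bucketname": "testbucket",
  "objectname": "testworkflow-2.0.1.jar",
  "targetlocation": "/opt/test/testworkflow-2.0.1.jar"
}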
You can use printf:
JSON_FMT='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}\n'
printf "$JSON_FMT" "$BUCKET_NAME" "$OBJECT_NAME" "$TARGET_LOCATION"
much clearer and simpler
A possibility:
#!/bin/bash
BUCKET_NAME="testbucket"
OBJECT_NAME="testworkflow-2.0.1.jar"
TARGET_LOCATION="/opt/test/testworkflow-2.0.1.jar
# one line
JSON_STRING='{"bucketname":"'"$BUCKET_NAME"'","objectname":"'"$OBJECT_NAME"'","targetlocation":"'"$TARGET_LOCATION"'"}'
# multi-line
JSON_STRING="{
\"bucketname\":\"${BUCKET_NAME}\",
\"objectname\":\"${OBJECT_NAME}\",
\"targetlocation\":\"${TARGET_LOCATION}\"
}"
# [optional] validate the string is valid json
echo "${JSON_STRING}" | jq
In addition to chepner's answer, it's also possible to construct the object completely from args with this simple recipe:
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING=$(jq -n \
--arg bucketname "$BUCKET_NAME" \
--arg objectname "$OBJECT_NAME" \
--arg targetlocation "$TARGET_LOCATION" \
'$ARGS.named')
Explanation:
--null-input | -n disables reading input. From the man page: Don't read any input at all! Instead, the filter is run once using null as the input. This is useful when using jq as a simple calculator or to construct JSON data from scratch.
--arg name value passes values to the program as predefined variables: value is available as $name. All named arguments are also available as $ARGS.named
Because the format of $ARGS.named is already an object, jq can output it as is.
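A minimal illustration of $ARGS.named (available since jq 1.6); note the values arrive as strings because they were passed with --arg:
jq -n --arg a 1 --arg b 2 '$ARGS.named'
{
  "a": "1",
  "b": "2"
}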
First, don't use ALL_CAPS_VARNAMES: it's too easy to accidentally overwrite a crucial shell variable (like PATH)
Mixing single and double quotes in shell strings can be a hassle. In this case, I'd use printf:
bucket_name=testbucket
object_name=testworkflow-2.0.1.jar
target_location=/opt/test/testworkflow-2.0.1.jar
template='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}'
json_string=$(printf "$template" "$bucket_name" "$object_name" "$target_location")
echo "$json_string"
For homework, read this page carefully: Security implications of forgetting to quote a variable in bash/POSIX shells
A note on creating JSON with string concatenation: there are edge cases. For example, if any of your strings contain double quotes, you can get broken JSON:
$ bucket_name='a "string with quotes"'
$ printf '{"bucket":"%s"}\n' "$bucket_name"
{"bucket":"a "string with quotes""}
To do this more safely with bash, we need to escape that string's double quotes:
$ printf '{"bucket":"%s"}\n' "${bucket_name//\"/\\\"}"
{"bucket":"a \"string with quotes\""}
I had to work out all possible ways to deal with JSON strings in a command request. Please look at the following code to see why using single quotes can fail if used incorrectly.
# Create Release and Tag commit in Github repository
# returns string with in-place substituted variables
json=$(cat <<-END
{
"tag_name": "${version}",
"target_commitish": "${branch}",
"name": "${title}",
"body": "${notes}",
"draft": ${is_draft},
"prerelease": ${is_prerelease}
}
END
)
# returns raw string without any substitutions
# single or double quoted delimiter - check HEREDOC specs
json=$(cat <<-!"END" # or 'END'
{
"tag_name": "${version}",
"target_commitish": "${branch}",
"name": "${title}",
"body": "${notes}",
"draft": ${is_draft},
"prerelease": ${is_prerelease}
}
END
)
# prints fully formatted string with substituted variables as follows:
echo "${json}"
{
"tag_name" : "My_tag",
"target_commitish":"My_branch"
....
}
Note 1: Use of single vs double quotes
# enclosing in single quotes means no variable substitution
# (treats everything as raw char literals)
echo '${json}'
${json}
echo '"${json}"'
"${json}"
# enclosing in single quotes and outer double quotes causes
# variable expansion surrounded by single quotes(treated as raw char literals).
echo "'${json}'"
'{
"tag_name" : "My_tag",
"target_commitish":"My_branch"
....
}'
Note 2: Caution with Line terminators
Note that the json string is formatted with line terminators such as LF \n
or carriage return \r (if it was encoded on Windows it contains CRLF \r\n).
Using the tr (translate) utility from the shell, we can remove the line terminators if any:
# following code serializes json and removes any line terminators
# in substituted value/object variables too
json=$(echo "$json" | tr -d '\n' | tr -d '\r' )
# string enclosed in single quotes are still raw literals
echo '${json}'
${json}
echo '"${json}"'
"${json}"
# After CRLF/LF are removed
echo "'${json}'"
'{ "tag_name" : "My_tag", "target_commitish":"My_branch" .... }'
Note 3: Formatting
While manipulating a JSON string with variables, we can use a combination of ' and " as follows, if we want to protect some raw literals while using outer double quotes for in-place substitution/string interpolation:
# mixing ' and "
username=admin
password=pass
echo "$username:$password"
admin:pass
echo "$username"':'"$password"
admin:pass
echo "$username"'[${delimiter}]'"$password"
admin[${delimiter}]pass
Note 4: Using in a command
The following curl request uses the json serialized above (with existing \n already removed):
response=$(curl -i \
--user ${username}:${api_token} \
-X POST \
-H 'Accept: application/vnd.github.v3+json' \
-d "$json" \
"https://api.github.com/repos/${username}/${repository}/releases" \
--output /dev/null \
--write-out "%{http_code}" \
--silent
)
So when using it in command variables, validate that it is properly formatted before using it :)
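A quick way to do that validation (jq exits non-zero when its input is not valid JSON):
echo "$json" | jq . >/dev/null || echo "invalid JSON" >&2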
If you need to build a JSON representation where members mapped to undefined or empty variables should be omitted, then jo can help.
#!/bin/bash
BUCKET_NAME=testbucket
OBJECT_NAME=""
JO_OPTS=()
if [[ ! "${BUCKET_NAME}x" = "x" ]] ; then
JO_OPTS+=("bucketname=${BUCKET_NAME}")
fi
if [[ ! "${OBJECT_NAME}x" = "x" ]] ; then
JO_OPTS+=("objectname=${OBJECT_NAME}")
fi
if [[ ! "${TARGET_LOCATION}x" = "x" ]] ; then
JO_OPTS+=("targetlocation=${TARGET_LOCATION}")
fi
jo "${JO_OPTS[#]}"
The output of the commands above would be just (note the absence of objectname and targetlocation members):
{"bucketname":"testbucket"}
This can be done the following way:
JSON_STRING='{"bucketname":"'$BUCKET_NAME'","objectname":"'$OBJECT_NAME'","targetlocation":"'$TARGET_LOCATION'"}'
For Node.js Developer, or if you have node environment installed, you can try this:
JSON_STRING=$(node -e "console.log(JSON.stringify({bucketname: '$BUCKET_NAME', objectname: '$OBJECT_NAME', targetlocation: '$TARGET_LOCATION'}))")
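Assuming the variables from the question are set, this prints a compact string (JSON.stringify emits no extra whitespace):
echo "$JSON_STRING"
{"bucketname":"testbucket","objectname":"testworkflow-2.0.1.jar","targetlocation":"/opt/test/testworkflow-2.0.1.jar"}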
The advantage of this method is that you can easily convert a very complicated JSON object (such as an object containing an array, or one where you need an int value instead of a string) to a JSON string without worrying about invalid JSON errors.
The disadvantage is that it relies on a Node.js environment.
These solutions come a little late but I think they are inherently simpler than previous suggestions (avoiding the complications of quoting and escaping).
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
# Initial unsuccessful solution
JSON_STRING='{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'
echo $JSON_STRING
# If your substitution variables have NO whitespace this is sufficient
JSON_STRING=$(tr -d '[:space:]' <<JSON
{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
JSON
)
echo $JSON_STRING
# If your substitution variables are more general and maybe have whitespace this works
JSON_STRING=$(jq -c . <<JSON
{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
JSON
)
echo $JSON_STRING
#... A change in layout could also make it more maintainable
JSON_STRING=$(jq -c . <<JSON
{
"bucketname" : "$BUCKET_NAME",
"objectname" : "$OBJECT_NAME",
"targetlocation" : "$TARGET_LOCATION"
}
JSON
)
echo $JSON_STRING
To build upon Hao's answer using NodeJS: you can split up the lines, and use the -p option which saves having to use console.log.
JSON_STRING=$(node -pe "
JSON.stringify({
bucketname: process.env.BUCKET_NAME,
objectname: process.env.OBJECT_NAME,
targetlocation: process.env.TARGET_LOCATION
});
")
An inconvenience is that you need to export the variables beforehand, i.e.
export BUCKET_NAME=testbucket
# etc.
Note: You might be thinking, why use process.env? Why not just use single quotes and have bucketname: '$BUCKET_NAME', etc so bash inserts the variables? The reason is that using process.env is safer - if you don't have control over the contents of $TARGET_LOCATION it could inject JavaScript into your node command and do malicious things (by closing the single quote, e.g. the $TARGET_LOCATION string contents could be '}); /* Here I can run commands to delete files! */; console.log({'a': 'b. On the other hand, process.env takes care of sanitising the input.
You could use envsubst:
export VAR="some_value_here"
echo '{"test":"$VAR"}' | envsubst > json.json
It might also be a "template" file:
//json.template
{"var": "$VALUE", "another_var":"$ANOTHER_VALUE"}
So after you could do:
export VALUE="some_value_here"
export ANOTHER_VALUE="something_else"
cat json.template | envsubst > misha.json
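One caveat: envsubst performs plain text substitution and does no JSON escaping, so a value containing a double quote silently produces invalid JSON:
export VAR='say "hi"'
echo '{"test":"$VAR"}' | envsubst
{"test":"say "hi""}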
For the general case of building JSON from bash with arbitrary inputs, many of the previous answers (even the highly voted ones using jq) omit cases where the variables contain a " double quote or a \n newline escape sequence, and where you need complex string concatenation of the inputs.
When using jq you need to printf %b the input first to get the \n converted to real newlines, so that once you pass through jq you get \n back and not \\n.
I found this with version with nodejs to be quite easy to reason about if you know javascript/nodejs well:
TITLE='Title'
AUTHOR='Bob'
JSON=$( TITLE="$TITLE" AUTHOR="$AUTHOR" node -p 'JSON.stringify( {"message": `Title: ${process.env.TITLE}\n\nAuthor: ${process.env.AUTHOR}`} )' )
It's a bit verbose due to process.env, but it allows you to properly pass the variables from the shell and then format things inside (nodejs) backticks in a safe way.
This outputs:
printf "%s\n" "$JSON"
{"message":"Title: Title\n\nAuthor: Bob"}
(Note: when having a variable with \n always use printf "%s\n" "$VAR" and not echo "$VAR", whose output is platform-dependent! See here for details)
Similar thing with jq would be
TITLE='Title'
AUTHOR='Bob'
MESSAGE="Title: ${TITLE}\n\nAuthor: ${AUTHOR}"
MESSAGE_ESCAPED_FOR_JQ=$(printf %b "${MESSAGE}")
JSON=$( jq '{"message": $jq_msg}' --arg jq_msg "$MESSAGE_ESCAPED_FOR_JQ" --null-input --compact-output --raw-output --monochrome-output )
(the last two params are not necessary when running in a subshell, but I just added them so that the output is the same when you run the jq command in a top-level shell).
Bash will not insert variables into a single-quote string. In order to get the variables bash needs a double-quote string.
You need to use double-quote string for the JSON and just escape double-quote characters inside JSON string.
Example:
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING="{\"bucketname\":\"$BUCKET_NAME\",\"objectname\":\"$OBJECT_NAME\",\"targetlocation\":\"$TARGET_LOCATION\"}"
echo $JSON_STRING
if you have node.js and minimist installed globally:
jc() {
node -p "JSON.stringify(require('minimist')(process.argv), (k,v) => k=='_'?undefined:v)" -- "$#"
}
jc --key1 foo --number 12 --boolean \
--under_score 'abc def' --'white space' ' '
# {"key1":"foo","number":12,"boolean":true,"under_score":"abc def","white space":" "}
you can post it with curl or the like:
curl --data "$(jc --type message --value 'hello world!')" \
--header 'content-type: application/json' \
http://server.ip/api/endpoint
be careful that minimist will parse dots:
jc --m.room.member @gholk:ccns.io
# {"m":{"room":{"member":"@gholk:ccns.io"}}}
Used this for AWS Macie configuration:
JSON_CONFIG=$( jq -n \
--arg bucket_name "$BUCKET_NAME" \
--arg kms_key_arn "$KMS_KEY_ARN" \
'{"s3Destination":{"bucketName":$bucket_name,"kmsKeyArn":$kms_key_arn}}'
)
aws macie2 put-classification-export-configuration --configuration "$JSON_CONFIG"
You can simply make a call like this to print the JSON.
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
echo '{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'
or
JSON_STRING='{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'
echo $JSON_STRING