File content to single JSON string value with bash

I would like to read a file and put the whole content into a single string that is escaped to be used in a JSON object.
And I want to do it on the commandline/terminal (Linux).

Version 1
WARNING: With this solution the content of the file can be too big to fit in an argument!
jq -n \
--arg content "$(cat theFile.txt)" \
'{ theContent : $content }' \
| \
jq '.theContent'
Version 2
Jeff Mercado provided a more compact solution for the first part - so I adapted that in my code as follows:
jq -Rs \
'{ theContent: . }' \
theFile.txt \
| \
jq '.theContent'
Version 3
Now Jeff Mercado provided a more compact solution for what I was looking for:
jq -Rs '.' theFile.txt

A more direct way to do that is to use the raw input (-R) combined with slurp (-s) parameters to read the entire input as a single string. Then take that input and store it in the appropriate property. You don't need to pass it in as a separate parameter.
$ jq -Rs '{ theContent: . }' theFile.txt
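A quick sanity check of the escaping, using a scratch file:
printf 'line one\nline "two"\n' > theFile.txt
jq -Rs '.' theFile.txt
# "line one\nline \"two\"\n"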

Related

Can you separate distinct JSON attributes into two files using jq?

I am following this tutorial from Vault about creating your own certificate authority. I'd like to separate the response (I changed the output to an API call using cURL to see the response) into two distinct files, one file holding the certificate and issuing_ca attributes, the other containing the private_key. The tutorial uses jq to parse JSON objects, but my unfamiliarity with jq isn't helping here, and most searches return info on how to merge JSON using jq.
I've tried running something like
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
jq -r '.data.certificate, .data.issuing_ca > test.cert.pem \
jq -r '.data.private_key' > test.key.pem
or
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
| jq -r '.data.certificate, .data.issuing_ca > test.cert.pem \
| jq -r '.data.private_key' > test.key.pem
but no dice.
It is not an issue with the jq invocation, but with the way the output files get written: as your usage shows, after test.cert.pem is written, the JSON at the read end of the pipe has already been consumed and is no longer available to extract the private_key contents.
To duplicate the contents at the write end of the pipe, use tee along with process substitution. The following should work in bash/zsh or ksh93, but not in a plain POSIX Bourne shell (sh):
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
| tee >( jq -r '.data.certificate, .data.issuing_ca' > test.cert.pem) \
>(jq -r '.data.private_key' > test.key.pem) \
>/dev/null
See this in action
jq -n '{data:{certificate: "foo", issuing_ca: "bar", private_key: "zoo"}}' \
| tee >( jq -r '.data.certificate, .data.issuing_ca' > test.cert.pem) \
>(jq -r '.data.private_key' > test.key.pem) \
>/dev/null
and now observe the contents of both the files.
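Since the sample data is fixed, the expected file contents are deterministic:
cat test.cert.pem
# foo
# bar
cat test.key.pem
# zoo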
You could abuse jq's ability to write to standard error (version 1.6 or later) separately from standard output.
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
| jq -r '.data as $f | ($f.private_key | stderr) | ($f.certificate, $f.issuing_ca)' > test.cert.pem 2> test.key.pem
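To try the stderr trick without Vault, you can reuse the sample object from the tee demo above. Be aware that jq 1.6 writes stderr's input JSON-encoded (the key file would gain surrounding quotes), while newer releases write it raw, so check how your jq behaves before relying on this.
jq -n '{data:{certificate: "foo", issuing_ca: "bar", private_key: "zoo"}}' \
| jq -r '.data as $f | ($f.private_key | stderr) | ($f.certificate, $f.issuing_ca)' > test.cert.pem 2> test.key.pem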
There's a general technique for this type of problem that is worth mentioning because it has minimal prerequisites (just jq and awk) and scales well with the number of files. It is also quite efficient, in that only one invocation each of jq and awk is needed. The idea is to set up a pipeline of the form jq ... | awk .... There are many variants of the technique, but in the present case the following suffices:
jq -rc '
.data
| "test.cert.pem",
"\t\(.certificate)",
"\t\(.issuing_ca)",
"test.key.pem",
"\t\(.private_key)"
' | awk -F\\t 'NF == 1 {fn=$1; next} {print $2 > fn}'
Notice that this works even if the items of interest are strings with embedded tabs.
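To watch the jq-to-awk handoff without Vault, feed it the same sample object used in the earlier demos:
jq -n '{data:{certificate: "foo", issuing_ca: "bar", private_key: "zoo"}}' \
| jq -rc '
.data
| "test.cert.pem",
"\t\(.certificate)",
"\t\(.issuing_ca)",
"test.key.pem",
"\t\(.private_key)"
' | awk -F\\t 'NF == 1 {fn=$1; next} {print $2 > fn}'
Afterwards test.cert.pem contains foo and bar, and test.key.pem contains zoo.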

/bin/jq: Argument list too long

I'm using a Groovy script and wrote this line:
Version_List=$(jq -r '.items[] .version' "${projectContent}")
where ${projectContent} is another variable holding the result of a curl command.
When I run the pipeline I get this error: /bin/jq: Argument list too long
Use jq -Rs . <<< "${projectContent}" to convert the variable content into a JSON string (you may even use curl directly, like so: curl … | jq -Rs .), then import that string into a jq variable using --slurpfile with process substitution:
jq --slurpfile projectContent <(
jq -Rs . <<< "${projectContent}"
) '...' # your jq code here using variable $projectContent
Don't forget to also add either -n or an input file.
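A self-contained sketch, using a hypothetical items payload in place of the curl result; note the fromjson step, since --slurpfile delivers the content as a single JSON string that has to be parsed before .items can be read:
projectContent='{"items":[{"version":"1.0"},{"version":"2.0"}]}'
jq -rn --slurpfile projectContent <(
jq -Rs . <<< "${projectContent}"
) '$projectContent[0] | fromjson | .items[].version'
# 1.0
# 2.0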

use curl/bash command in jq

I am trying to get a list of URL after redirection using bash scripting. Say, google.com gets redirected to http://www.google.com with 301 status.
What I have tried is:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read line; do
curl -LSs -o /dev/null -w %{url_effective} $line 2>/dev/null
done
So, is it possible to use commands like curl inside jq for processing JSON objects?
I want to add the resulting URL to existing JSON structure like:
[
{
"url": "google.com",
"redirection": "http://www.google.com"
},
{
"url": "microsoft.com",
"redirection": "https://www.microsoft.com"
}
]
Thank you in advance..!
curl is capable of making multiple transfers in a single process, and it can also read command line arguments from a file or stdin, so you don't need a loop at all. Just put that JSON into a file and run this:
jq -r '"-o /dev/null\nurl = \(.[].url)"' file |
curl -sSLK- -w'%{url_effective}\n' |
jq -R 'fromjson | map(. + {redirection: input})' file -
This way only 3 processes will be spawned for the whole task, instead of n + 2 where n is the number of URLs.
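For the curious: the first jq stage emits a curl config that -K- reads from stdin. For the sample JSON it looks like this:
-o /dev/null
url = google.com
-o /dev/null
url = microsoft.com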
I would generate a dictionary with jq per url and slurp those dictionaries into the final list with jq -s:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read url; do
redirect=$(curl -LSs \
-o /dev/null \
-w '%{url_effective}' \
"${url}" 2>/dev/null)
jq --null-input --arg url "${url}" --arg redirect "${redirect}" \
'{url:$url, redirect: $redirect}'
done | jq -s '.'
Alternative (first) solution:
You can output the url and the effective_url as tab separated data and create the output json with jq:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read line; do
prefix="${line}\t"
curl -LSs -o /dev/null -w "${prefix}"'%{url_effective}'"\n" "$line" 2>/dev/null
done | jq -r --raw-input 'split("\t")|{"url":.[0],"redirection":.[1]}'
Both solutions will generate valid json, independently of whatever characters the url/effective_url might contain.
Trying to keep this in JSON all the way is pretty cumbersome. I would simply try to make Bash construct a new valid JSON fragment inside the loop.
So in other words, if $url is the URL and $redirect is where it redirects to, you can do something like
printf '{"url": "%s", "redirection": "%s"}\n' "$url" "$redirect"
to produce JSON output from these strings. So tying it all together
jq -r '.[].url' <<<"$json" |
while read -r url; do
printf '{"url:" "%s", "redirection": "%s"}\n' \
"$url" "$(curl -LSs -o /dev/null -w '%{url_effective}' "$url")"
done |
jq -s '.'
This is still pretty brittle; in particular, if either of the printf input strings could contain a literal double quote, that should properly be escaped.
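A minimal guard, assuming only double quotes need handling (backslashes and control characters would still slip through, which is why the jq-based answers remain the robust option), is bash's pattern substitution:
printf '{"url": "%s", "redirection": "%s"}\n' \
"${url//\"/\\\"}" "${redirect//\"/\\\"}"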

JQ JSON Creation

New to JSON creation, decided to try JQ from this question. However, when I tried implementing it, all I received was the help page for JQ and these types of errors: ./test.sh: line 43: --: command not found for the last three lines.
#!/bin/bash
echo Starting
sensorType="XXX"
sensorLocation="XXX"
sensorCommand="XXXXX"
# Hardware information.
systemTime=$(date +%d-%m-%Y_%H:%M:%S)
kernalName=$(uname -s)
nodeName=$(uname -i)
kernalRelease=$(uname -r)
kernalVersion=$(uname -v)
machine=$(uname -m)
processor=$(uname -p)
hardwarePlatform=$(uname -i)
operatingSystem=$(uname -o)
timeup=$(uptime)
# Software information.
softVersion=$(XXX version)
JSON_STRING=$( jq -n \
-- arg systemTime "$systemTime" \
-- arg sensorType "$sensorType" \
-- arg sensorLocation "$sensorLocation" \
-- arg sensorCommand "$sensorCommand" \
-- arg kernalName "$kernalName" \
-- arg nodeName "$nodeName" \
-- arg kernalRelease "$kernalRelease" \
-- arg kernalVersion "$kernalVersion" \
-- arg machine "$machine" \
-- arg processor "$processor"
-- arg hardwarePlatform "$hardwarePlatform" \
-- arg operatingSystem "$operatingSystem" \
-- arg timeup "$timeup" \
-- arg softVersion "$softVersion" \
'{systemTime: $systemTime, sensorType: $sensorType, sensorLocation: $sensorLocation, kernalName: $kernalName, nodeName: $nodeName, kernalRelease: $kernalRelease, machine: $machine, processor: $processor, hardwarePlatform: $hardwarePlatform, operatingSystem: $operatingSystem, timeup: $timeup, softVersion: $softVersion }' )
echo $JSON_STRING
Not sure if this is the most efficient way of using JQ or if there is a better way to implement it. But I would love to hear if there is a more efficient / easier way to accomplish this.
Use an array to store the arguments before calling jq; it's easier to spread the arguments across multiple lines in an array assignment, as the backslashes aren't necessary.
jq_args=(
--arg systemTime "$systemTime"
--arg sensorType "$sensorType"
--arg sensorLocation "$sensorLocation"
--arg sensorCommand "$sensorCommand"
--arg kernalName "$kernalName"
--arg nodeName "$nodeName"
--arg kernalRelease "$kernalRelease"
--arg kernalVersion "$kernalVersion"
--arg machine "$machine"
--arg processor "$processor"
--arg hardwarePlatform "$hardwarePlatform"
--arg operatingSystem "$operatingSystem"
--arg timeup "$timeup"
--arg softVersion "$softVersion"
)
JSON_STRING=$( jq -n "${jq_args[@]}" '{
systemTime: $systemTime,
sensorType: $sensorType,
sensorLocation: $sensorLocation,
kernalName: $kernalName,
nodeName: $nodeName,
kernalRelease: $kernalRelease,
machine: $machine,
processor: $processor,
hardwarePlatform: $hardwarePlatform,
operatingSystem: $operatingSystem,
timeup: $timeup,
softVersion: $softVersion
}' )
If you are using a version of bash that supports associative arrays, you can further simplify the building of jq_args:
declare -A x
x=([systemTime]="$systemTime"
[sensorType]="$sensorType"
# etc
)
declare -a jq_args
for k in "${!x[@]}"; do
jq_args+=(--arg "$k" "${x[$k]}")
done
JSON_STRING=$( jq -n "${jq_args[@]}" ... )
The first problem, as @Inian pointed out, is that there should be no space between "--" and "arg".
The second problem is that there should be no spaces after the backslash when it is used (as here) for line-continuation: for the backslash to serve as a line-continuation character, it must escape (i.e. immediately precede) the newline.
Otherwise, except possibly for some oddities such as $(XXX version), you should be good to go, which is not to say there aren't better ways to create the JSON object. See the next section for an illustration of an alternative approach.
Illustration of an alternative approach
The following approach can be used even if the keys and/or values contain control characters:
FMT="%s\0%s\0"
(
printf $FMT systemTime $(date +%d-%m-%Y_%H:%M:%S)
printf $FMT newline "$(echo a; echo b)"
) | jq -sR '[split("\u0000") | range(0; length-1; 2) as $i | {(.[$i]): .[$i + 1]}] | add'
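Illustrative output (compacted; the date value will vary), showing the embedded newline surviving as \n:
# {"systemTime":"<date>","newline":"a\nb"}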
If it is known that no keys or values contain literal newline characters, the following variant, which has the main advantage of not requiring the "-s" option, could be used:
(
echo systemTime
date +%d-%m-%Y_%H:%M:%S
echo uname
uname -a
echo softVersion
echo XXX version
) | jq -nR '[inputs as $key | {($key): input}] | add'
Not sure if this is the most efficient way of using JQ
Here are two alternative approaches. The first is based on the TSV format, and the second the CSV format.
Using the TSV format
The following assumes that neither literal tabs nor literal newlines appear within keys or values.
FMT="%s\t%s\n"
(
printf $FMT systemTime $(date +%d-%m-%Y_%H:%M:%S)
printf $FMT uname "$(uname -a)"
printf $FMT softVersion "XXX version"
) | jq -nR '[inputs | split("\t") | {(.[0]): .[1]}] | add'
Using a CSV-to-JSON utility
The approach illustrated here is probably mainly of interest if one starts with a CSV file of key-value pairs.
In the following, we'll use the excellent csv2json utility at https://github.com/fadado/CSV. There are actually two executables provided there; both convert each CSV row to a JSON array. We'll use the extended one, symlinked to csv2json.
(
echo systemTime, $(date +%d-%m-%Y_%H:%M:%S)
echo uname, "$(uname -a)"
echo softVersion, XXX version
) | csv2json \
| jq -n '[inputs as [$key, $value] | {($key): $value}] | add'
If you have all the values as environment variables, you could just utilize the env object to get the values. The syntax then becomes trivial.
systemTime=$(date +%d-%m-%Y_%H:%M:%S) \
kernalName=$(uname -s) \
nodeName=$(uname -i) \
kernalRelease=$(uname -r) \
kernalVersion=$(uname -v) \
machine=$(uname -m) \
processor=$(uname -p) \
hardwarePlatform=$(uname -i) \
operatingSystem=$(uname -o) \
timeup=$(uptime) \
jq -n 'env | {
systemTime,
sensorType,
sensorLocation,
kernalName,
nodeName,
kernalRelease,
machine,
processor,
hardwarePlatform,
operatingSystem,
timeup,
softVersion
}'
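The bare-key shorthand ({systemTime} meaning {systemTime: .systemTime}) is standard jq. A quick way to see it, using variables present in most sessions (output depends on your environment):
jq -cn 'env | {HOME, USER}'
# {"HOME":"/home/you","USER":"you"}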

Build a JSON string with Bash variables

I need to read these bash variables into my JSON string and I am not familiar with bash. any help is appreciated.
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING='{"bucketname":"$BUCKET_NAME"","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'
echo $JSON_STRING
You are better off using a program like jq to generate the JSON, if you don't know ahead of time if the contents of the variables are properly escaped for inclusion in JSON. Otherwise, you will just end up with invalid JSON for your trouble.
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING=$( jq -n \
--arg bn "$BUCKET_NAME" \
--arg on "$OBJECT_NAME" \
--arg tl "$TARGET_LOCATION" \
'{bucketname: $bn, objectname: $on, targetlocation: $tl}' )
You can use printf:
JSON_FMT='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}\n'
printf "$JSON_FMT" "$BUCKET_NAME" "$OBJECT_NAME" "$TARGET_LOCATION"
much clearer and simpler
A possibility:
#!/bin/bash
BUCKET_NAME="testbucket"
OBJECT_NAME="testworkflow-2.0.1.jar"
TARGET_LOCATION="/opt/test/testworkflow-2.0.1.jar
# one line
JSON_STRING='{"bucketname":"'"$BUCKET_NAME"'","objectname":"'"$OBJECT_NAME"'","targetlocation":"'"$TARGET_LOCATION"'"}'
# multi-line
JSON_STRING="{
\"bucketname\":\"${BUCKET_NAME}\",
\"objectname\":\"${OBJECT_NAME}\",
\"targetlocation\":\"${TARGET_LOCATION}\"
}"
# [optional] validate the string is valid json
echo "${JSON_STRING}" | jq
In addition to chepner's answer, it's also possible to construct the object completely from args with this simple recipe:
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING=$(jq -n \
--arg bucketname "$BUCKET_NAME" \
--arg objectname "$OBJECT_NAME" \
--arg targetlocation "$TARGET_LOCATION" \
'$ARGS.named')
Explanation:
--null-input | -n disables reading input. From the man page: Don't read any input at all! Instead, the filter is run once using null as the input. This is useful when using jq as a simple calculator or to construct JSON data from scratch.
--arg name value passes values to the program as predefined variables: value is available as $name. All named arguments are also available as $ARGS.named
Because the format of $ARGS.named is already an object, jq can output it as is.
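A minimal demonstration (with -c for compact output):
jq -cn --arg bn testbucket --arg on testworkflow-2.0.1.jar '$ARGS.named'
# {"bn":"testbucket","on":"testworkflow-2.0.1.jar"}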
First, don't use ALL_CAPS_VARNAMES: it's too easy to accidentally overwrite a crucial shell variable (like PATH)
Mixing single and double quotes in shell strings can be a hassle. In this case, I'd use printf:
bucket_name=testbucket
object_name=testworkflow-2.0.1.jar
target_location=/opt/test/testworkflow-2.0.1.jar
template='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}'
json_string=$(printf "$template" "$bucket_name" "$object_name" "$target_location")
echo "$json_string"
For homework, read this page carefully: Security implications of forgetting to quote a variable in bash/POSIX shells
A note on creating JSON with string concatenation: there are edge cases. For example, if any of your strings contain double quotes, you can end up with broken JSON:
$ bucket_name='a "string with quotes"'
$ printf '{"bucket":"%s"}\n' "$bucket_name"
{"bucket":"a "string with quotes""}
To do this more safely with bash, we need to escape the string's double quotes:
$ printf '{"bucket":"%s"}\n' "${bucket_name//\"/\\\"}"
{"bucket":"a \"string with quotes\""}
I had to work out all the possible ways to deal with JSON strings in a command request. Please look at the following code to see why using single quotes can fail if used incorrectly.
# Create Release and Tag commit in Github repository
# returns string with in-place substituted variables
json=$(cat <<-END
{
"tag_name": "${version}",
"target_commitish": "${branch}",
"name": "${title}",
"body": "${notes}",
"draft": ${is_draft},
"prerelease": ${is_prerelease}
}
END
)
# returns raw string without any substitutions
# single or double quoted delimiter - check HEREDOC specs
json=$(cat <<-"END" # or 'END'
{
"tag_name": "${version}",
"target_commitish": "${branch}",
"name": "${title}",
"body": "${notes}",
"draft": ${is_draft},
"prerelease": ${is_prerelease}
}
END
)
# prints fully formatted string with substituted variables as follows:
echo "${json}"
{
"tag_name" : "My_tag",
"target_commitish":"My_branch"
....
}
Note 1: Use of single vs double quotes
# enclosing in single quotes means no variable substitution
# (treats everything as raw char literals)
echo '${json}'
${json}
echo '"${json}"'
"${json}"
# enclosing in single quotes and outer double quotes causes
# variable expansion surrounded by single quotes(treated as raw char literals).
echo "'${json}'"
'{
"tag_name" : "My_tag",
"target_commitish":"My_branch"
....
}'
Note 2: Caution with Line terminators
Note that the json string is formatted with line terminators such as LF \n
or carriage return \r (if it was encoded on Windows it contains CRLF \r\n).
Using the tr (translate) utility, we can remove any line terminators:
# following code serializes json and removes any line terminators
# in substituted value/object variables too
json=$(echo "$json" | tr -d '\n' | tr -d '\r' )
# string enclosed in single quotes are still raw literals
echo '${json}'
${json}
echo '"${json}"'
"${json}"
# After CRLF/LF are removed
echo "'${json}'"
'{ "tag_name" : "My_tag", "target_commitish":"My_branch" .... }'
Note 3: Formatting
While manipulating a json string with variables, we can use a combination of ' and " as follows, if we want to protect some raw literals while using outer double quotes for in-place substitution/string interpolation:
# mixing ' and "
username=admin
password=pass
echo "$username:$password"
admin:pass
echo "$username"':'"$password"
admin:pass
echo "$username"'[${delimiter}]'"$password"
admin[${delimiter}]pass
Note 4: Using in a command
The following curl request sends the serialized json (with the \n already removed):
response=$(curl -i \
--user ${username}:${api_token} \
-X POST \
-H 'Accept: application/vnd.github.v3+json' \
-d "$json" \
"https://api.github.com/repos/${username}/${repository}/releases" \
--output /dev/null \
--write-out "%{http_code}" \
--silent
)
So when using it for command variables, validate if it is properly formatted before using it :)
If you need to build a JSON representation where members mapped to undefined or empty variables should be omitted, then jo can help.
#!/bin/bash
BUCKET_NAME=testbucket
OBJECT_NAME=""
JO_OPTS=()
if [[ ! "${BUCKET_NAME}x" = "x" ]] ; then
JO_OPTS+=("bucketname=${BUCKET_NAME}")
fi
if [[ ! "${OBJECT_NAME}x" = "x" ]] ; then
JO_OPTS+=("objectname=${OBJECT_NAME}")
fi
if [[ ! "${TARGET_LOCATION}x" = "x" ]] ; then
JO_OPTS+=("targetlocation=${TARGET_LOCATION}")
fi
jo "${JO_OPTS[#]}"
The output of the commands above would be just (note the absence of objectname and targetlocation members):
{"bucketname":"testbucket"}
It can be done the following way:
JSON_STRING='{"bucketname":"'$BUCKET_NAME'","objectname":"'$OBJECT_NAME'","targetlocation":"'$TARGET_LOCATION'"}'
For Node.js Developer, or if you have node environment installed, you can try this:
JSON_STRING=$(node -e "console.log(JSON.stringify({bucketname: '$BUCKET_NAME', objectname: '$OBJECT_NAME', targetlocation: '$TARGET_LOCATION'}))")
Advantage of this method is you can easily convert very complicated JSON Object (like object contains array, or if you need int value instead of string) to JSON String without worrying about invalid json error.
Disadvantage is it's relying on Node.js environment.
These solutions come a little late, but I think they are inherently simpler than the previous suggestions (avoiding the complications of quoting and escaping).
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
# Initial unsuccessful solution
JSON_STRING='{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'
echo $JSON_STRING
# If your substitution variables have NO whitespace this is sufficient
JSON_STRING=$(tr -d '[:space:]' <<JSON
{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
JSON
)
echo $JSON_STRING
# If your substitution variables are more general and maybe have whitespace this works
JSON_STRING=$(jq -c . <<JSON
{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
JSON
)
echo $JSON_STRING
#... A change in layout could also make it more maintainable
JSON_STRING=$(jq -c . <<JSON
{
"bucketname" : "$BUCKET_NAME",
"objectname" : "$OBJECT_NAME",
"targetlocation" : "$TARGET_LOCATION"
}
JSON
)
echo $JSON_STRING
To build upon Hao's answer using NodeJS: you can split up the lines, and use the -p option which saves having to use console.log.
JSON_STRING=$(node -pe "
JSON.stringify({
bucketname: process.env.BUCKET_NAME,
objectname: process.env.OBJECT_NAME,
targetlocation: process.env.TARGET_LOCATION
});
")
An inconvenience is that you need to export the variables beforehand, i.e.
export BUCKET_NAME=testbucket
# etc.
Note: You might be thinking, why use process.env? Why not just use single quotes and have bucketname: '$BUCKET_NAME', etc so bash inserts the variables? The reason is that using process.env is safer - if you don't have control over the contents of $TARGET_LOCATION it could inject JavaScript into your node command and do malicious things (by closing the single quote, e.g. the $TARGET_LOCATION string contents could be '}); /* Here I can run commands to delete files! */; console.log({'a': 'b. On the other hand, process.env takes care of sanitising the input.
You could use envsubst:
export VAR="some_value_here"
echo '{"test":"$VAR"}' | envsubst > json.json
It might also be used with a "template" file:
//json.template
{"var": "$VALUE", "another_var":"$ANOTHER_VALUE"}
So after you could do:
export VALUE="some_value_here"
export ANOTHER_VALUE="something_else"
cat json.template | envsubst > misha.json
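Two caveats: envsubst performs no JSON escaping, so a value containing a double quote still breaks the document, and by default it substitutes every $NAME it recognizes. GNU envsubst accepts a SHELL-FORMAT argument restricting which variables get replaced:
envsubst '$VALUE $ANOTHER_VALUE' < json.template > misha.json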
For the general case of building JSON from bash with arbitrary inputs, many of the previous responses (even the highly voted ones using jq) omit cases where the variables contain a " double quote or a \n newline escape sequence, and where you need complex string concatenation of the inputs.
When using jq you need to printf %b the input first to get the \n converted to real newlines, so that once you pass through jq you get \n back and not \\n.
I found this version with nodejs quite easy to reason about, if you know javascript/nodejs well:
TITLE='Title'
AUTHOR='Bob'
JSON=$( TITLE="$TITLE" AUTHOR="$AUTHOR" node -p 'JSON.stringify( {"message": `Title: ${process.env.TITLE}\n\nAuthor: ${process.env.AUTHOR}`} )' )
It's a bit verbose due to process.env, but it allows you to properly pass the variables from the shell and then format things inside the (nodejs) backticks in a safe way.
This outputs:
printf "%s\n" "$JSON"
{"message":"Title: Title\n\nAuthor: Bob"}
(Note: when having a variable with \n always use printf "%s\n" "$VAR" and not echo "$VAR", whose output is platform-dependent! See here for details)
Similar thing with jq would be
TITLE='Title'
AUTHOR='Bob'
MESSAGE="Title: ${TITLE}\n\nAuthor: ${AUTHOR}"
MESSAGE_ESCAPED_FOR_JQ=$(printf %b "${MESSAGE}")
JSON=$( jq '{"message": $jq_msg}' --arg jq_msg "$MESSAGE_ESCAPED_FOR_JQ" --null-input --compact-output --raw-output --monochrome-output )
(the last two params are not necessary when running in a subshell, but I just added them so that the output is then same when you run the jq command in a top-level shell).
Bash will not expand variables inside a single-quoted string; for the variables to be expanded, bash needs a double-quoted string.
So use a double-quoted string for the JSON and escape the double-quote characters inside it.
Example:
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
JSON_STRING="{\"bucketname\":\"$BUCKET_NAME\",\"objectname\":\"$OBJECT_NAME\",\"targetlocation\":\"$TARGET_LOCATION\"}"
echo $JSON_STRING
If you have node.js and minimist installed globally:
jc() {
node -p "JSON.stringify(require('minimist')(process.argv), (k,v) => k=='_'?undefined:v)" -- "$#"
}
jc --key1 foo --number 12 --boolean \
--under_score 'abc def' --'white space' ' '
# {"key1":"foo","number":12,"boolean":true,"under_score":"abc def","white space":" "}
You can then post it with curl:
curl --data "$(jc --type message --value 'hello world!')" \
--header 'content-type: application/json' \
http://server.ip/api/endpoint
Be careful that minimist will parse dots (and remember to quote values starting with #, which would otherwise begin a shell comment):
jc --m.room.member '#gholk:ccns.io'
# {"m":{"room":{"member":"#gholk:ccns.io"}}}
Used this for AWS Macie configuration:
JSON_CONFIG=$( jq -n \
--arg bucket_name "$BUCKET_NAME" \
--arg kms_key_arn "$KMS_KEY_ARN" \
'{"s3Destination":{"bucketName":$bucket_name,"kmsKeyArn":$kms_key_arn}}'
)
aws macie2 put-classification-export-configuration --configuration "$JSON_CONFIG"
You can simply make a call like this to print the JSON.
#!/bin/sh
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
echo '{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'
or
JSON_STRING='{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'
echo $JSON_STRING