How to have a good json format in Bash? - json

I have a script that, once it has extracted the values, should output them in JSON format.
Below is my script:
#!/bin/bash
if
[ "$(egrep -l 'TR.834' /home/user/for_test_dir/*)" ];
then
egrep -l 'TR.834' /home/user/for_test_dir/* > /tmp/result.txt
chmod 777 /tmp/result.txt
#json_File=~/outboundFile.json
info_template='
{
"date" : "%s",
"time" : "%s",
"type" : "%s",
"num" : "%s"
},\n'
echo '{' #>> $json_File
echo -e '\t"List": {' #>> $json_File
echo -e '\t\t"Data" : [' #>> $json_File
a=1
while read LINE
do
fullpath[$a]=$LINE
filename[$a]=`basename $LINE`
cat $LINE | awk '{gsub("-",RS);print}' > /tmp/temp2.txt
chmod 777 /tmp/temp2.txt
cat /tmp/temp2.txt | awk '{gsub("*",RS);print}' > /tmp/temp.txt
chmod 777 /tmp/temp.txt
DATE[$a]=`sed -n '10p' < /tmp/temp.txt`
TIME[$a]=`sed -n '11p' < /tmp/temp.txt`
TYPE[$a]=`sed -n '28p' < /tmp/temp.txt`
NUM[$a]=`grep -oP 'BEN.\K[A-Z0-9_\s-]*' < /tmp/temp2.txt | awk '{ printf "%s,", $0 }' | sed 's/,\+$//' | awk '!x[$0]++'`
printf "$info_template" "${DATE[$a]}" \
"${TIME[$a]}" \
"${TYPE[$a]}" \
"${NUM[$a]}" #>> $json_File
a=`expr $a + 1`
done < /tmp/result.txt
fi
echo -e "\t\t]" #>> $json_File
echo -e "\t}" #>> $json_File
echo -e "}" #>> $json_File
My problem arises when I extract more than one value. For example: "NUM": "3232", "232343", "23345",
and my expected output for this would be like this:
"NUM" : [
"3232",
"232343",
"23345"
]
For a single value, "NUM" : "23234" (the format stays as is). I badly need your help.
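Not from the original thread, but here is a minimal pure-bash sketch of the single-value/array distinction asked for above (the helper name emit_num is made up for illustration):

```shell
# Hypothetical helper: print "num" as a JSON array when given several
# values, or as a plain string when given exactly one.
emit_num() {
    local -a vals=("$@")
    if [ "${#vals[@]}" -gt 1 ]; then
        printf '"num" : [\n'
        local i
        for i in "${!vals[@]}"; do
            if [ "$i" -lt $(( ${#vals[@]} - 1 )) ]; then
                printf '\t"%s",\n' "${vals[$i]}"   # comma after all but the last
            else
                printf '\t"%s"\n' "${vals[$i]}"    # no trailing comma
            fi
        done
        printf ']\n'
    else
        printf '"num" : "%s"\n' "${vals[0]}"
    fi
}

emit_num 23234
emit_num 3232 232343 23345
```

Splitting the comma-separated grep output into words first (e.g. with IFS=, read -ra parts) would let you call it as emit_num "${parts[@]}".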

Once the output is valid JSON, you can pretty-print it, for example with Python:
echo '{ "NUM": ["3232", "232343", "23345"] }' | python -m json.tool
or with Ruby:
sudo gem install json
echo '{ "NUM": ["3232", "232343", "23345"] }' | prettify_json.rb

Related

Using jq with an unknown amount of arguments from shell script?

I'm about to lose my mind working with jq for the first time today. I've tried every ever so dirty way to make this work somehow. Let's get down to business:
I'm trying to further develop the pretty barebones API for Unifi Controllers for some use cases:
https://dl.ui.com/unifi/6.0.41/unifi_sh_api
Here's some mock up data (the real json data is thousands of lines):
{
"meta": {
"rc": "ok"
},
"data": [
{
"_id": "61a0da77f730e404af0edc3c",
"ip": "10.19.31.120",
"mac": "ff:ec:ff:89:ff:58",
"name": "Test-Accesspoint"
}
]
}
Here is my advancedapi.sh
#!/bin/bash
source unifiapi.sh
unifi_login 1>/dev/null
if [ "$1" ];
then
command=$1
shift
else
echo "Please enter command"
exit
fi
#jq -r '.data[] | "\(.name),\(.mac)"'
assembleKeys(){
c="0"
separator=";"
keys=$(for i in "$@"
do
c=$(expr $c + 1)
if [ $c -gt 1 ];
then
echo -n "${separator}\(.${i})"
else
echo -n "\(.${i})"
fi
done
echo)
}
assembleJQ(){
c=0
jq=$(echo -en 'jq -r -j \x27.data[] | '
for i in "$@"
do
if [ "$c" != "0" ];
then
echo -en ",\x22 \x22,"
fi
c=1
echo -en ".$i"
done
echo -en ",\x22\\\n\x22\x27")
echo "$jq"
}
getDeviceKeys(){
assembleJQ "$@"
#unifi_list_devices | jq -r '.data[]' | jq -r "${keys}"
#export keys=mac
#export second=name
#unifi_list_devices | jq -r -j --arg KEYS "$keys" --arg SECOND "$second" '.data[] | .[$KEYS]," ",.[$SECOND],"\n"'
unifi_list_devices | $jq
#unifi_list_devices | jq -r -j '.data[] | .mac," ",.name,"\n"'
}
"$command" "$@"
Users should be able to call any function from the API with as many arguments as they want.
So basically this:
#export keys=mac
#export second=name
#unifi_list_devices | jq -r -j --arg KEYS "$keys" --arg SECOND "$second" '.data[] | .[$KEYS]," ",.[$SECOND],"\n"'
which works, but only with a limited number of arguments. I want to pass "$@" to jq.
I'm trying to get the name and mac-address of a device by calling:
./advancedapi.sh getDeviceKeys mac name
this should return the mac-address and name of the device. But it only gives me this:
jq -r -j '.data[] | .mac," ",.name,"\n"'
jq: error: syntax error, unexpected INVALID_CHARACTER, expecting $end (Unix shell quoting issues?) at <top-level>, line 1:
'.data[]
jq: 1 compile error
The top-most line being the jq command, which works perfectly fine when called manually.
The assembleKeys function was my attempt at generating a string that looks like this:
jq -r '.data[] | "\(.name),\(.mac)"'
Can anyone explain to me, preferably in-depth, how to do this? I'm going insane here!
Thanks!
I want to pass "$@" to jq.
There are actually many ways to do this, but the simplest might be using the --args command-line option, as illustrated here:
File: check
#!/bin/bash
echo "${@}"
jq -n '$ARGS.positional' --args "${@}"
Example run:
./check 1 2 3
1 2 3
[
"1",
"2",
"3"
]
Projection
Here's an example that might be closer to what you're looking for.
Using your sample JSON:
File: check2
#!/bin/bash
jq -r '
def project($array):
. as $in | reduce $array[] as $k ([]; . + [$in[$k]]);
$ARGS.positional,
(.data[] | project($ARGS.positional))
| @csv
' input.json --args "${@}"
./check2 name mac
"name","mac"
"Test-Accesspoint","ff:ec:ff:89:ff:58"
There is an error in the question's JSON: it contains trailing commas, and those commas are not allowed.
$
$ cat j.json | jq -r '.data[] | "\(.name)█\(.mac)█\(.ip)█\(._id)"'
Test-Accesspoint█ff:ec:ff:89:ff:58█10.19.31.120█61a0da77f730e404af0edc3c
$ cat j.json
{
"meta": {
"rc": "ok"
},
"data": [
{
"_id": "61a0da77f730e404af0edc3c",
"ip": "10.19.31.120",
"mac": "ff:ec:ff:89:ff:58",
"name": "Test-Accesspoint"
}
]
}
$
If the API gives you JSON like that, you should probably clean it up before piping it into jq (maybe with sed or something like it).
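Not part of the original answer, but one way to do that cleanup is a sed sketch that strips trailing commas before a closing } or ] (GNU sed assumed, for -E and the :a;N;$!ba slurp loop that pulls the whole file into the pattern space so end-of-line commas are caught too):

```shell
# Remove trailing commas so jq accepts the document.
# The input here is a stand-in for the broken API output.
printf '{\n  "NUM": [\n    "3232",\n  ],\n}\n' \
  | sed -E ':a;N;$!ba;s/,[[:space:]]*([]}])/\1/g'
```

After the substitution there are no commas left in this small example, so jq would parse the result without complaint.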

Redirecting mysql output to prompt using shell

I'm writing a shell script to automatically run multiple SQL queries against multiple databases. My code works well, but apart from "select" queries, none of the other queries display anything in the prompt while executing. How can I force the query output to be shown in the prompt?
That's some of my code:
for x in "${db[@]}"
do
found=0
for enreg in `cat /home/dbfile.csv`
do
#extracting database data from csv
DBNAME=`echo $enreg | awk -F";" '{ print $4 }'`
if [ $x = $DBNAME ]
then
PASS=`echo $enreg | awk -F";" '{ print $2 }'`
HOST=`echo $enreg | awk -F";" '{ print $3 }'`
USERNAME=`echo $enreg | awk -F";" '{ print $1 }'`
# Running queries in database $DBNAME
for y in "${req[@]}"
do
echo "
"
mysql -u $USERNAME -p$PASS -h $HOST $DBNAME -e "$y"
echo "
"
done
found=1
break
fi
done
if [ $found -eq 0 ]
then
echo "Database $x doesn't exist"
fi
done
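No answer was recorded for this question, but a hedged suggestion: the mysql client is simply silent for most non-SELECT statements in batch mode. Echoing the statement yourself, plus the client's verbose flag (-v, repeatable; my assumption is that -vv also prints row-count summaries), should put something on the prompt. A sketch with a hypothetical run_query wrapper:

```shell
# Hypothetical wrapper: print the statement ourselves, then run it with
# the client's verbose flag so non-SELECT statements also produce output.
run_query() {
    local query=$1
    echo "Running: $query"
    mysql -u "$USERNAME" -p"$PASS" -h "$HOST" "$DBNAME" -vv -e "$query"
}
```

Inside the inner loop above, the mysql invocation would become run_query "$y".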

Send bash array to json file

I pulled some config variables from a JSON file with jq.
Once they are modified, I want to write the whole config array (which can contain keys that were not there at first) back to the JSON file.
The "foreach" part seems quite obvious.
But how do I express "set keyA to valueA, or add keyA=>valueA if it is missing" in the conf file?
I'm stuck with something like
for key in "${!conf[@]}"; do
value=${conf[$key]}
echo $key $value
jq --arg key $key --arg value $value '.$key = $value' $conf_file > $new_file
done
Thanks
Extended solution with single jq invocation:
Sample conf array:
declare -A conf=([status]="finished" [nextTo]="2018-01-24" [result]=true)
Sample file conf.json:
{
"status": "running",
"minFrom": "2018-01-23",
"maxTo": "2018-01-24",
"nextTo": "2018-01-23",
"nextFrom": "2018-01-22"
}
Processing:
jq --arg data "$(paste -d':' <(printf "%s\n" "${!conf[@]}") <(printf "%s\n" "${conf[@]}"))" \
'. as $conf | map($data | split("\n")[]
| split(":") | {(.[0]) : .[1]})
| add | $conf + .' conf.json > conf.tmp && mv conf.tmp conf.json
The resulting conf.json contents:
{
"status": "finished",
"minFrom": "2018-01-23",
"maxTo": "2018-01-24",
"nextTo": "2018-01-24",
"nextFrom": "2018-01-22",
"result": "true"
}
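A side note on the output above (my addition, not part of the original answer): --arg always passes values as JSON strings, which is why result came out as "true" rather than true. If the types matter, jq's --argjson parses the value as JSON and keeps them:

```shell
# --arg yields a string; --argjson parses the value as JSON first.
jq -n --arg a true --argjson b true '{as_string: $a, as_bool: $b}'
```

Here as_string ends up as the string "true" while as_bool is the boolean true.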
I finally arrived at the following solution: not as compact as RomanPerekhrest's, but a bit more human-readable. Thanks.
function update_conf() {
echo update_conf
if [ -n "$1" ] && [ -n "$2" ]; then
conf[$1]=$2
fi
for key in "${!conf[@]}"; do
value=${conf[$key]}
jq --arg key "$key" --arg value "$value" '.[$key] = $value' $confFile > $confTempFile && mv $confTempFile $confFile
done
}

Add values named per element in loop

I have the next piece of code:
echo -n "{ \"data\":" > test.json
echo -n " [" >> test.json
sqcur=0
sqtotal=`ls /etc/squid?.conf | wc -l`
tcp_ports=$(grep "port" /etc/squid?.conf |awk '{print $2}' |sed 's|.*[0-9]*\:||g'|sort|uniq|xargs)
for squidconf in `ls /etc/squid?.conf`; do
let "sqcur+=1" > /dev/null
squidident=`echo $squidconf | sed 's/.*\///' | awk -F '.' '{print $1}'`
tcp_ports=$(grep "port" $squidconf |awk '{print $2}' |sed 's|.*[0-9]*\:||g'|sort|uniq|xargs)
if [ $sqcur -lt $sqtotal ]; then
echo -n " { \"{#SQPROC}\": \"/usr/local/squid/bin/squid\", \"{#SQPROC_IDENT}\": \"${squidident}\", \"{#SQPROC_ARGS}\": \"-D -C -F -f ${squidconf}\", \"{#SQPROC_PORT}\": \"${tcp_ports}\", \"{#SQPROC_CONF}\": \"${squidconf}\" }," >> test.json
else
echo -n " { \"{#SQPROC}\": \"/usr/local/squid/bin/squid\", \"{#SQPROC_IDENT}\": \"${squidident}\", \"{#SQPROC_ARGS}\": \"-D -C -F -f ${squidconf}\", \"{#SQPROC_PORT}\": \"${tcp_ports}\", \"{#SQPROC_CONF}\": \"${squidconf}\" }" >> test.json
fi
done
echo -n " ]" >> test.json
echo "}" >> test.json
It gives the next content in json file:
{
"data": [
{
"{#SQPROC}": "/usr/local/squid/bin/squid",
"{#SQPROC_IDENT}": "squid1",
"{#SQPROC_ARGS}": "-D -C -F -f /etc/squid1.conf",
"{#SQPROC_PORT}": "1111 2222",
"{#SQPROC_CONF}": "/etc/squid1.conf"
},
But I want in case of multiple values of port to obtain the next result:
{
"data": [
{
"{#SQPROC}": "/usr/local/squid/bin/squid",
"{#SQPROC_IDENT}": "squid1",
"{#SQPROC_ARGS}": "-D -C -F -f /etc/squid1.conf",
"{#SQPROC_PORT_1111}": "1111",
"{#SQPROC_PORT_2222}": "2222",
"{#SQPROC_CONF}": "/etc/squid1.conf"
},
Could you please help with it?
You're probably looking for something like this:
for port in $tcp_ports
do
all_ports="${all_ports} \"{#SQPROC_PORT_$port}\": \"$port\","
done
At the end of this loop you'll have an all_ports variable containing something like:
"{#SQPROC_PORT_1111}": "1111", "{#SQPROC_PORT_2222}": "2222",
for example.
To integrate that easily into the code you already have, I would propose a few small changes to your for-loop. In each iteration, build up a variable json_content that contains the data for that iteration, and at the end of the iteration just print $json_content:
echo -n "{ \"data\":" > test.json
echo -n " [" >> test.json
sqcur=0
sqtotal=`ls /etc/squid?.conf | wc -l`
for squidconf in `ls /etc/squid?.conf`; do
let "sqcur+=1" > /dev/null
squidident=`echo $squidconf | sed 's/.*\///' | awk -F '.' '{print $1}'`
tcp_ports=$(grep "port" $squidconf |awk '{print $2}' |sed 's|.*[0-9]*\:||g'|sort|uniq|xargs)
json_content="{ \"{#SQPROC}\": \"/usr/local/squid/bin/squid\", \"{#SQPROC_IDENT}\": \"${squidident}\", \"{#SQPROC_ARGS}\": \"-D -C -F -f ${squidconf}\","
for port in $tcp_ports
do
json_content="$json_content \"{#SQPROC_PORT_$port}\": \"$port\","
done
json_content="$json_content \"{#SQPROC_CONF}\": \"${squidconf}\" }"
if [ $sqcur -lt $sqtotal ]; then
json_content="${json_content},"
fi
echo -n " ${json_content}" >> test.json
done
echo -n " ]" >> test.json
echo "}" >> test.json
I can't verify the rest of the loop; I have no squid! But since you recalculate tcp_ports inside the for-loop, I removed the initialization before the loop.
Hope this helps,
Good luck!
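An alternative sketch, not from the original answer and assuming jq is available: let jq build each object, so quoting and commas are handled for you (the conf path and ports are the sample values from the question):

```shell
ports="1111 2222"              # space-separated, as produced by the xargs pipeline
squidconf=/etc/squid1.conf
# Fold the port list into one object, adding a {#SQPROC_PORT_<port>} key per port.
jq -n --arg conf "$squidconf" --arg ports "$ports" '
  reduce ($ports | split(" "))[] as $p
    ({"{#SQPROC_CONF}": $conf};
     . + {("{#SQPROC_PORT_\($p)}"): $p})'
```

The remaining {#SQPROC...} keys could be added to the init object the same way, and jq -c would give the one-line-per-object form used in the script above.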

Modifying bash script to take each line in a file and execute command

I need to modify a bash script to take each line in a file and execute a command. I currently have this:
#!/bin/bash
if [ -z "$1" ] ; then
echo "Lipsa IP";
exit;
fi
i=1
ip=$1
while [ $i -le `wc -l pass_file | awk '{print $1}'` ] ; do
if [ -n "$ip" ]; then
rand=`head -$i pass_file | tail -1`
user=`echo $rand | awk '{print $1}'`
pass=`echo $rand | awk '{print $2}'`
CMD=`ps -eaf | grep -c mysql`
if [ "$CMD" -lt "50" ]; then
./mysql $ip $user $pass &
else
sleep 15
fi
i=`expr $i + 1`
fi
done
The password file is in format and name pfile:
username password
The intranet hosts file is in this format (line-by-line) and name hlist:
192.168.0.1
192.168.0.2
192.168.0.3
Any suggestions?
I don't understand what you want to do that you are not already doing. Do you want to use the IP number file in some fashion?
Anyway, the way you extract the username and password from the password file is unnecessarily complicated (to put it politely); you can iterate over the lines in a file in a much simpler fashion. Instead of:
while [ $i -le `wc -l pass_file | awk '{print $1}'` ] ; do
rand=`head -$i pass_file | tail -1`
user=`echo $rand | awk '{print $1}'`
pass=`echo $rand | awk '{print $2}'`
# ...
i=`expr $i + 1`
done
Just use the bash (Posix) read command:
while read -r user pass __; do
# ...
done < pass_file
(The __ is in case there is a line in the pass_file with more than two values; the last variable name in the read command receives "the rest of the line").
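To illustrate that splitting behaviour (the input lines here are invented):

```shell
# Fields beyond the named variables all land in the last one (__),
# so user and pass stay clean even on longer lines.
while read -r user pass __; do
    printf '%s -> %s\n' "$user" "$pass"
done <<'EOF'
alice secret these extra words land in __
bob hunter2
EOF
```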
I searched the web again and found cleaner code, which I adapted to suit my needs.
#!/bin/bash
while read ip
do
if [ -n "$ip" ]
then
while read user pass
do
CMD=`ps -eaf | grep -c mysql`
if [ "$CMD" -gt "50" ]
then
sleep 15
fi
./mysql $ip $user $pass &
done < pass_file
fi
done < host_file
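Not part of the original answer, but two hedged refinements to that adapted script: quote the expansions, and wait in a loop (the single sleep above still launches the job even if the mysql count is still over the limit). It is wrapped in a function here, run_all being a made-up name, purely to make it reusable:

```shell
# host_file and pass_file as in the script above; ./mysql is the same
# helper binary the original invokes.
run_all() {
    while read -r ip; do
        [ -n "$ip" ] || continue
        while read -r user pass; do
            # Wait until the number of mysql processes drops, instead of
            # sleeping once and launching regardless.
            while [ "$(ps -eaf | grep -c mysql)" -gt 50 ]; do
                sleep 15
            done
            ./mysql "$ip" "$user" "$pass" &
        done < pass_file
        wait    # let this host's jobs finish before the next host (a choice)
    done < host_file
}
```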