Add values named per element in loop - json

I have the following piece of code:
echo -n "{ \"data\":" > test.json
echo -n " [" >> test.json
sqcur=0
sqtotal=`ls /etc/squid?.conf | wc -l`
tcp_ports=$(grep "port" /etc/squid?.conf |awk '{print $2}' |sed 's|.*[0-9]*\:||g'|sort|uniq|xargs)
for squidconf in `ls /etc/squid?.conf`; do
    let "sqcur+=1" > /dev/null
    squidident=`echo $squidconf | sed 's/.*\///' | awk -F '.' '{print $1}'`
    tcp_ports=$(grep "port" $squidconf |awk '{print $2}' |sed 's|.*[0-9]*\:||g'|sort|uniq|xargs)
    if [ $sqcur -lt $sqtotal ]; then
        echo -n " { \"{#SQPROC}\": \"/usr/local/squid/bin/squid\", \"{#SQPROC_IDENT}\": \"${squidident}\", \"{#SQPROC_ARGS}\": \"-D -C -F -f ${squidconf}\", \"{#SQPROC_PORT}\": \"${tcp_ports}\", \"{#SQPROC_CONF}\": \"${squidconf}\" }," >> test.json
    else
        echo -n " { \"{#SQPROC}\": \"/usr/local/squid/bin/squid\", \"{#SQPROC_IDENT}\": \"${squidident}\", \"{#SQPROC_ARGS}\": \"-D -C -F -f ${squidconf}\", \"{#SQPROC_PORT}\": \"${tcp_ports}\", \"{#SQPROC_CONF}\": \"${squidconf}\" }" >> test.json
    fi
done
echo -n " ]" >> test.json
echo "}" >> test.json
It produces the following content in the JSON file:
{
    "data": [
        {
            "{#SQPROC}": "/usr/local/squid/bin/squid",
            "{#SQPROC_IDENT}": "squid1",
            "{#SQPROC_ARGS}": "-D -C -F -f /etc/squid1.conf",
            "{#SQPROC_PORT}": "1111 2222",
            "{#SQPROC_CONF}": "/etc/squid1.conf"
        },
But when there are multiple port values, I want to obtain the following result instead:
{
    "data": [
        {
            "{#SQPROC}": "/usr/local/squid/bin/squid",
            "{#SQPROC_IDENT}": "squid1",
            "{#SQPROC_ARGS}": "-D -C -F -f /etc/squid1.conf",
            "{#SQPROC_PORT_1111}": "1111",
            "{#SQPROC_PORT_2222}": "2222",
            "{#SQPROC_CONF}": "/etc/squid1.conf"
        },
Could you please help with it?

You're probably looking for something like this:
for port in $tcp_ports
do
    all_ports="${all_ports} \"{#SQPROC_PORT_$port}\": \"$port\","
done
At the end of this loop you'll have an all_ports variable containing something like:
"{#SQPROC_PORT_1111}": "1111", "{#SQPROC_PORT_2222}": "2222",
for example.
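As a quick sanity check, the loop can be run standalone with a hypothetical two-port list (the ports 1111 and 2222 are made up for illustration):

```shell
# Build the per-port key/value pairs from a space-separated port list.
tcp_ports="1111 2222"
all_ports=""
for port in $tcp_ports; do
    all_ports="${all_ports} \"{#SQPROC_PORT_$port}\": \"$port\","
done
echo "$all_ports"
```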
To integrate that easily into the code you already have, I would propose a few small changes to your for-loop: on each iteration, build up a variable json_content containing the JSON data for that iteration, and at the end of the iteration just print $json_content:
echo -n "{ \"data\":" > test.json
echo -n " [" >> test.json
sqcur=0
sqtotal=`ls /etc/squid?.conf | wc -l`
for squidconf in `ls /etc/squid?.conf`; do
    let "sqcur+=1" > /dev/null
    squidident=`echo $squidconf | sed 's/.*\///' | awk -F '.' '{print $1}'`
    tcp_ports=$(grep "port" $squidconf |awk '{print $2}' |sed 's|.*[0-9]*\:||g'|sort|uniq|xargs)
    json_content="{ \"{#SQPROC}\": \"/usr/local/squid/bin/squid\", \"{#SQPROC_IDENT}\": \"${squidident}\", \"{#SQPROC_ARGS}\": \"-D -C -F -f ${squidconf}\","
    for port in $tcp_ports
    do
        json_content="$json_content \"{#SQPROC_PORT_$port}\": \"$port\","
    done
    json_content="$json_content \"{#SQPROC_CONF}\": \"${squidconf}\" }"
    if [ $sqcur -lt $sqtotal ]; then
        json_content="${json_content},"
    fi
    echo -n " ${json_content}" >> test.json
done
echo -n " ]" >> test.json
echo "}" >> test.json
I can't verify the rest of the loop, since I have no squid to test against! But as you re-calculate tcp_ports inside the for-loop, I removed the initialization before the loop.
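As a side note, the sqcur/sqtotal counter could be avoided entirely by collecting the objects in a bash array and joining them with commas afterwards. A minimal sketch with made-up idents and a made-up port list standing in for the real squid configs:

```shell
# Sketch: collect each JSON object in an array, then join with "," at the
# end, so no element counter is needed. Idents and ports are hypothetical.
objects=()
for squidident in squid1 squid2; do
    tcp_ports="1111 2222"                 # would come from the conf file
    ports_json=""
    for port in $tcp_ports; do
        ports_json="${ports_json} \"{#SQPROC_PORT_${port}}\": \"${port}\","
    done
    objects+=("{${ports_json} \"{#SQPROC_IDENT}\": \"${squidident}\" }")
done
joined=$(IFS=,; echo "${objects[*]}")     # join array elements with ","
echo "{ \"data\": [ ${joined} ] }"
```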
Hope this helps,
Good luck!

Related

How to parse the csv file without two blank fields in awk?

The CSV structure is as shown below.
"field1","field2","field3,with,commas","field4",
There are four fields in the CSV file:
the first: field1
the second: field2
the third: field3,with,commas
the fourth: field4
Here is my regular expression for awk's field separator.
'^"|","|",$'
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F '^"|","|",$' '{print NF}'
6
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F '^"|","|",$' '{print $1}'
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F '^"|","|",$' '{print $2}'
field1
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F '^"|","|",$' '{print $3}'
field2
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F '^"|","|",$' '{print $4}'
field3,with,commas
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F '^"|","|",$' '{print $5}'
field4
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F '^"|","|",$' '{print $6}'
Two issues remain with my regular expression '^"|","|",$':
1. The 4 fields are parsed as 6 fields by '^"|","|",$'.
2. $1 and $6 are parsed as blank.
How can I write a regular expression format such that:
echo '"field1","field2","field3,with,commas","field4",' |awk -F format '{print NF}'
4
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F format '{print $1}'
field1
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F format '{print $2}'
field2
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F format '{print $3}'
field3,with,commas
debian8#hwy:~$ echo '"field1","field2","field3,with,commas","field4",' |awk -F format '{print $4}'
field4
A workaround could be to set FS to '","' and use gsub to remove the stray quote characters at the start and end of each record:
echo '"field1","field2","field3,with,commas","field4",' | awk -v FS='","' '{gsub(/^"|",$/, ""); print NF, $1, $2, $3, $4}'
4 field1 field2 field3,with,commas field4
I think that the FPAT variable is probably what you want. Take a look at the documentation and examples in the GNU Awk User's Guide.

How to convert a text file to JSON format using a shell script

sample text
[08/12/2016 01:26:17 kup01 - NBU status: 96, EMM status: No media is]
expecting output:
$json ="[{""date"":""08/12/2016"",
""time"":""01:26:17"",
""host"":"kup01",
"status code":"09",
"emm status":"No media is""
}]
Try this code.
-bash-4.1$ sh test
{
"date":"08/12/2016"
"time":"01:26:17"
"host":"kup01"
"status code":"96"
"emm status":"No media is"
}
-bash-4.1$ cat test
#!/bin/bash
SHELL_OUTPUT="[08/12/2016 01:26:17 kup01 - NBU status: 96, EMM status: No media is]"
echo -e "{
\"date\":\"$(echo $SHELL_OUTPUT|sed -e 's/\[//' -e 's/\]//'|awk '{print $1}')\"
\"time\":\"$(echo $SHELL_OUTPUT|sed -e 's/\[//' -e 's/\]//'|awk '{print $2}')\"
\"host\":\"$(echo $SHELL_OUTPUT|sed -e 's/\[//' -e 's/\]//'|awk '{print $3}')\"
\"status code\":\"$(echo $SHELL_OUTPUT|sed -e 's/\[//' -e 's/\]//'|awk '{print $7}'|sed 's/,$//')\"
\"emm status\":\"$(echo $SHELL_OUTPUT|sed -e 's/\[//' -e 's/\]//'|awk -F ': ' '{print $NF}')\"
}"
-bash-4.1$
Or try this:
echo "08/12/2016 01:26:17 kup01 - NBU status: 96, EMM status: No media is" | awk '{print "$json =\"[{\"\"date\"\":\"\"" $1 "\"\", \"\"time\"\":\"\"" $2 "\"\", \"\"host\"\":\"" $3 "\", \"status code\":\"" $7 "\", \"emm status\":\"" $10" "$11" "$12"\"\" }]"}'
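For reference, the same sample line can also be turned into valid JSON (commas between the pairs, proper brackets) with bash's read and printf; the field positions are guesses based on the single sample shown:

```shell
# Sketch: strip the brackets, split the sample line once with read, and let
# printf place the pieces. Field positions are assumed from the sample.
line="[08/12/2016 01:26:17 kup01 - NBU status: 96, EMM status: No media is]"
line=${line#\[}; line=${line%\]}                  # drop the surrounding [ ]
read -r date time host _ _ _ status rest <<<"$line"
printf '[{"date":"%s","time":"%s","host":"%s","status code":"%s","emm status":"%s"}]\n' \
    "$date" "$time" "$host" "${status%,}" "${rest#*status: }"
```

The %-trim drops the trailing comma after the status code, and the #-trim keeps only the text after "status: " for the EMM message.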

Modifying bash script to take each line in a file and execute command

I need to modify a bash script to take each line in a file and execute a command. I currently have this:
#!/bin/bash
if [ -z "$1" ] ; then
    echo "Lipsa IP";
    exit;
fi
i=1
ip=$1
while [ $i -le `wc -l pass_file | awk '{print $1}'` ] ; do
    if [ -n "$ip" ]; then
        rand=`head -$i pass_file | tail -1`
        user=`echo $rand | awk '{print $1}'`
        pass=`echo $rand | awk '{print $2}'`
        CMD=`ps -eaf | grep -c mysql`
        if [ "$CMD" -lt "50" ]; then
            ./mysql $ip $user $pass &
        else
            sleep 15
        fi
        i=`expr $i + 1`
    fi
done
The password file is named pfile and has this format:
username password
The intranet hosts file is named hlist and is in this format (one host per line):
192.168.0.1
192.168.0.2
192.168.0.3
Any suggestions?
I don't understand what you want to do that you are not already doing. Do you want to use the IP number file in some fashion?
Anyway, the way you extract the username and password from the password file is unnecessarily complicated (to put it politely); you can iterate over the lines in a file in a much simpler fashion. Instead of:
while [ $i -le `wc -l pass_file | awk '{print $1}'` ] ; do
    rand=`head -$i pass_file | tail -1`
    user=`echo $rand | awk '{print $1}'`
    pass=`echo $rand | awk '{print $2}'`
    # ...
    i=`expr $i + 1`
done
Just use the bash (POSIX) read command:
while read -r user pass __; do
    # ...
done < pass_file
(The __ is in case there is a line in the pass_file with more than two values; the last variable name in the read command receives "the rest of the line").
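A tiny runnable demo of that pattern, with made-up usernames and passwords:

```shell
# The third variable soaks up any extra tokens, so a line with trailing
# junk does not shift the password into the wrong variable.
printf 'alice secret1 extra tokens\nbob secret2\n' |
while read -r user pass __; do
    echo "user=$user pass=$pass rest=$__"
done
```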
I searched the web again and found cleaner code, which I adapted to suit my needs:
#!/bin/bash
while read ip
do
    if [ -n "$ip" ]
    then
        while read user pass
        do
            CMD=`ps -eaf | grep -c mysql`
            if [ "$CMD" -gt "50" ]
            then
                sleep 15
            fi
            ./mysql $ip $user $pass &
        done < pass_file
    fi
done < host_file

How to have a good json format in Bash?

I have a script which, when it gets the values, outputs them in JSON format.
Below is my script:
#!/bin/bash
if [ "$(egrep -l 'TR.834' /home/user/for_test_dir/*)" ]; then
    egrep -l 'TR.834' /home/user/for_test_dir/* > /tmp/outbound.txt
    chmod 777 /tmp/result.txt
    #json_File=~/outboundFile.json
    info_template='
{
"date" : "%s",
"time" : "%s",
"type" : "%s",
"num" : "%s"
},\n'
    echo '{' #>> $json_File
    echo -e '\t"List": {' #>> $json_File
    echo -e '\t\t"Data" : [' #>> $json_File
    a=1
    while read LINE
    do
        fullpath[$a]=$LINE
        filename[$a]=`basename $LINE`
        cat $LINE | awk '{gsub("-",RS);print}' > /tmp/temp2.txt
        chmod 777 /tmp/temp2.txt
        cat /tmp/temp2.txt | awk '{gsub("*",RS);print}' > /tmp/temp.txt
        chmod 777 /tmp/temp.txt
        DATE[$a]=`sed -n '10p' < /tmp/temp.txt`
        TIME[$a]=`sed -n '11p' < /tmp/temp.txt`
        TYPE[$a]=`sed -n '28p' < /tmp/temp.txt`
        NUM[$a]=`grep -oP 'BEN.\K[A-Z0-9_\s-]*' < /tmp/temp2.txt | awk '{ printf "%s,", $0 }'| sed 's/,\+$//' | awk '!x[$0]++'`
        printf "$info_template" "${DATE[$a]}"\
            "${TIME[$a]}"\
            "${TYPE[$a]}"\
            "${NUM[$a]}" #>> $json_File
    done < /tmp/result.txt
fi
echo -e "\t\t]" #>> $json_File
echo -e "\t}" #>> $json_File
echo -e "}" #>> $json_File
My problem here is when I extract more than one piece of data, for example: "NUM": "3232", "232343", "23345",
and my expected output for this would be like this:
"NUM" : [
"3232",
"232343",
"23345"
]
For single data, "NUM" : "23234" (the format would stay as-is). I badly need your help.
echo '{ "NUM": "3232", "232343", "23345" }' | python -m json.tool
or
sudo gem install json
echo '{ "NUM": "3232", "232343", "23345" }' | prettify_json.rb
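Note that { "NUM": "3232", "232343", "23345" } is not valid JSON, so a pretty-printer alone cannot produce the requested array form. Here is a minimal sketch of the single-value/array switch described in the question; num_json is a hypothetical helper and the values below are made up:

```shell
# Print "NUM" as a plain string for a single value, as an array otherwise.
num_json() {
    if [ "$#" -eq 1 ]; then
        printf '"NUM" : "%s"\n' "$1"
        return
    fi
    echo '"NUM" : ['
    i=0
    for n in "$@"; do
        i=$((i + 1))
        # comma after every element except the last
        if [ "$i" -lt "$#" ]; then echo "  \"$n\","; else echo "  \"$n\""; fi
    done
    echo ']'
}

num_json 23234                  # single value: plain string form
num_json 3232 232343 23345     # multiple values: array form
```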

bash script - grep a json variable

I would like the simplest solution for my pretty basic bash script:
#!/bin/bash
# Weather API url format: http://api.wunderground.com/api/{api_key}/conditions/q/CA/{location}.json
# http://api.wunderground.com/api/5e8747237f05d669/conditions/q/CA/tbilisi.json
api_key=5e8747237f05d669
location=tbilisi
temp=c
api=$(wget -qO- http://api.wunderground.com/api/$api_key/conditions/q/CA/$location.json)
temp_c=$api | grep temp_c
temp_f=$api | grep temp_f
if [ $temp = "f" ]; then
echo $temp_f
else
echo $temp_c
fi
grep returns empty. This is my first bash script and I'm still getting the hang of the syntax, so please point out obvious errors.
I also don't understand why I have $() for wget.
You can use:
temp_c=$(echo $api|awk '{print $2}' FS='temp_c":'|awk '{print $1}' FS=',')
temp_f=$(echo $api|awk '{print $2}' FS='temp_f":'|awk '{print $1}' FS=',')
Instead of:
temp_c=$api | grep temp_c
temp_f=$api | grep temp_f
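If you would rather stay with grep, grep -oE can isolate the value directly; the response fragment below (a current_observation object wrapping temp_c) is an assumed shape for illustration, not verified against the live API:

```shell
# Assumed response fragment; the first grep isolates the "temp_c": pair,
# the second keeps just the trailing number.
api='{"current_observation": {"temp_c": 21.5, "temp_f": 70.7}}'
temp_c=$(echo "$api" | grep -oE '"temp_c": *[-0-9.]+' | grep -oE '[-0-9.]+$')
echo "$temp_c"
```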
I am getting the following JSON response from curl and storing it in a variable:
CURL_OUTPUT='{ "url": "protocol://xyz.net/9999" , "other_key": "other_value" }'
Question: I want to read the url key's value and extract the id from that url.
Answer: _ID=$(echo $CURL_OUTPUT |awk '{print $2}' FS='url":' |awk '{print $1}' FS=',' | awk '{print $2}' FS='"'|awk '{print $4}' FS='/')
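For what it's worth, the four-awk pipeline can also be replaced with plain parameter expansion, assuming the structure of the sample shown:

```shell
# Assumes the sample shape: take the text after "url": ", cut it at the
# next double quote, then keep whatever follows the last slash.
CURL_OUTPUT='{ "url": "protocol://xyz.net/9999" , "other_key": "other_value" }'
url=${CURL_OUTPUT#*\"url\": \"}    # strip through the value's opening quote
url=${url%%\"*}                    # strip from the value's closing quote on
_ID=${url##*/}                     # keep the part after the last slash
echo "$_ID"
```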