I have a bash script that outputs JSON, but apparently I can't have a comma after the last item in a list, as seen below. How do I get the last item in the list to not have a comma on the end, while keeping it on all the rest of the items?
Ideally, I'd like to do this in the loop, but I'm okay with stripping it off after creation.
Here is a portion of the script.
#Print out a list of hosts in groups
for group in $grouplist; do
    printf '\t"%s": {\n' $group
    echo -ne '\t\t"hosts": ['
    for host in `grep $group $results|awk '{print $1}'`; do
        printf '"%s", ' "$host"
    done
    echo ']'
    echo -e '\t},'
done
It outputs a JSON list ending like this.
},
}
I have this in a function so I have messed around with a variety of sed stuff to clean it up, but haven't managed to pull it off.
function | sed "s/something/another/"
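For what it's worth, a post-processing pass along these lines would strip any comma that directly precedes a closing bracket or brace, even across newlines (a sketch only; my_json_function is a placeholder for your function, and the -z/-E options assume GNU sed):

# hypothetical cleanup pass: -z reads the whole stream as one record, so the
# comma and the closing bracket/brace may sit on different lines
my_json_function | sed -zE 's/,[[:space:]]*([]}])/\1/g'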
Maybe you can get the length of $grouplist and keep track of the loop count, then echo with an if/then/else, something like this:
len=$(wc -w <<< $grouplist)
i=0
for group in $grouplist; do
    let i++
    printf '\t"%s": {\n' $group
    echo -ne '\t\t"hosts": ['
    for host in `grep $group $results|awk '{print $1}'`; do
        printf '"%s", ' "$host"
    done
    echo ']'
    if (( $i < $len )); then
        echo -e '\t},'
    else
        echo -e '\t}'
    fi
done
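Alternatively, you could sidestep the counting for the inner list by collecting the hosts into an array and joining them when printing (just a sketch, reusing the question's grep/awk pipeline and the same $group/$results variables):

# sketch: build the host list in an array, then join with commas on output,
# so no trailing comma is ever produced
hosts=()
for host in $(grep "$group" "$results" | awk '{print $1}'); do
    hosts+=("\"$host\"")
done
printf '\t\t"hosts": [%s]\n' "$(IFS=,; echo "${hosts[*]}")"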
If there is an alternative to bash, I would appreciate that too.
I have a large dump of MySQL commands (over 10 GB).
When restoring the dump I get a few warnings and occasionally an error. I need to execute those commands and process all warnings and errors, and preferably do it automatically.
mysql --show-warnings
tee logfile.log
source dump.sql
The logfile will contain many lines saying each command was successful, plus some warnings, particularly about truncated columns. But since the original file has tens of thousands of very large INSERTs, the log is not particularly helpful. Besides, this approach requires some kind of supervised interaction. (I cannot put it in a crontab, for example.)
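For reference, the non-interactive equivalent of those three steps would be something like this (a sketch; --force makes the client keep going after SQL errors, but the resulting log is just as flat):

mysql --show-warnings --force < dump.sql > logfile.log 2>&1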
#!/bin/bash
echo "tee logfile.log" > script.sql
echo "source $1" >> script.sql
mysql --show-warnings < script.sql > tmpfile.log 2>&1
cat tmpfile.log >> logfile.log
The tee command doesn't work in this batch environment. I can capture all the warnings, but I cannot figure out which command produced each warning.
So I came up with this small monstrosity:
#!/bin/bash
ERRFILE=$(basename "$0" .sh).err.log
LOGFILE=$(basename "$1" .sql).log

log_action() {
    WARN=$(cat)
    [ -z "$WARN" ] || echo -e "Line ${1}: ${WARN}\n${2}" >> "$LOGFILE"
}

echo 0 > "$ERRFILE"

log_error() {
    ERNO=$(cat "$ERRFILE")
    ERR=$(cat)
    [ -z "$ERR" ] || echo -e "*** ERROR ***\nLine ${1}: ${ERR}\n${2}" >> "$LOGFILE"
    (( ERNO++ ))
    echo $ERNO > "$ERRFILE"
}

COUNT=0
COMMAND=''

echo -e "**** BEGIN $(date +%Y-%m-%d\ %H:%M:%S)\n" > "$LOGFILE"

exec 4> >(log_action $COUNT "$COMMAND")
exec 5> >(log_error $COUNT "$COMMAND")
exec 3> >(mysql --show-warnings >&4 2>&5)

while IFS='' read -r LINE || [[ -n "$LINE" ]]
do
    (( COUNT++ ))
    [ ${#LINE} -eq 0 ] && continue        # discard blank lines
    [ "${LINE:0:2}" = "--" ] && continue  # discard comments
    COMMAND+="$LINE"                      # build command
    [ "${LINE: -1}" != ";" ] && continue  # if not finished, keep building
    echo $COMMAND >&3                     # otherwise execute
    COMMAND=''
done < "$1"

exec 3>&-
exec 5>&-
exec 4>&-

echo -e "**** END $(date +%Y-%m-%d\ %H:%M:%S)\n" >> "$LOGFILE"

ERRS=$(cat "$ERRFILE")
[ "$ERRS" = 0 ] || echo "${ERRS} Errors." >&2
This scans the file at $1 and sends the commands to an open MySQL connection at &3. That part is working fine.
The capture of warnings and errors is not working though.
It only records the first error.
It only records the first warning.
I haven't found a good way to pass the line number $COUNT and the offending command $COMMAND to the logging functions.
The only error appears after the timestamps, and the only warning appears after the error, which does not match the chronology of the script.
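For illustration, one possible direction (only a sketch, not a working fix) would be to tag the stream itself rather than pass $COUNT and $COMMAND into the already-running logging processes, by sending a marker statement down the same connection before each command so each warning in the output can be traced back to a line number:

# hypothetical variation of the "execute" step above: the marker row shows up
# in mysql's normal output just before whatever the next statement prints
printf 'SELECT "line %d" AS marker;\n' "$COUNT" >&3
printf '%s\n' "$COMMAND" >&3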
I can't seem to understand the output of the following code snippet. I am trying to print the function's return value in a loop.
contains () {
    local e
    for e in "${@:2}"; do [[ "$e" == "$1" ]] && return 0; done
    return 1
}
line="ayush"
line2="this is a line containing ayush"
contains $line $line2
echo $? #prints 0
for i in 1 2 3; do
    contains "$line" "$line2"
    echo $? #prints 1 every time
done
@Ayush Goel
The problem is here:
contains () {
    local e
    for e in "${@:2}"; do [[ "$e" == "$1" ]] && return 0; done
    return 1
}
line="ayush"
line2="this is a line containing ayush"
contains $line $line2
echo $? #prints 0
for i in 1 2 3; do
    contains $line $line2 # <------------------ no quotes here
    echo $? # Now it will print 0
done
Difference between $var and "$var":

1) The $var case:

var="this is the line"
for i in $var; do
    printf '%s\n' "$i"
done

Here it will print

this
is
the
line

because the unquoted $var is word-split on whitespace, so the loop runs once per word.

2) The "$var" case:

var="this is the line"
for i in "$var"; do
    printf '%s\n' "$i"
done

This will print

this is the line

because "$var" is treated as a single argument, so the loop runs only once with the whole string.
I want to get the number of commits per day per hour. With the following command, I am able to get the output in JSON format. However, I would like to know if I can add keys to the values in the JSON output from the command line?
curl https://api.github.com/repos/test/myrepo/stats/punch_card
Current output:
[
0,
2,
32
]
Expected output:
[
day: 0,
hour: 2,
commits: 32
]
Since you haven't specified anything beyond "command line", I'm assuming you want a bash-based solution. This simple (though kind of ugly) script will do what you want, while maintaining indentation (apart from the closing square bracket of the overall response):
#!/bin/bash
resp=$(curl https://api.github.com/repos/test/myrepo/stats/punch_card)
nextPref=""
for val in $resp
do
    echo "$nextPref $val"
    if [[ $val == "[" && $nextPref == "" ]]
    then
        nextPref=" "
    elif [[ $val == "[" && $nextPref == " " ]]
    then
        nextPref=" day:"
    elif [[ $nextPref == " day:" ]]
    then
        nextPref=" hour:"
    elif [[ $nextPref == " hour:" ]]
    then
        nextPref=" commits:"
    elif [[ $nextPref == " commits:" ]]
    then
        nextPref=" "
    fi
done
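If installing jq is an option, the same reshaping is a one-liner and produces valid JSON objects (a sketch; the key names simply mirror the expected output above):

curl -s https://api.github.com/repos/test/myrepo/stats/punch_card |
  jq 'map({day: .[0], hour: .[1], commits: .[2]})'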
So I have a bash script which outputs details on servers. The problem is that I need the output to be JSON. What is the best way to go about this? Here is the bash script:
# Get hostname
hostname=`hostname -A 2> /dev/null`

# Get distro
distro=`python -c 'import platform ; print platform.linux_distribution()[0] + " " + platform.linux_distribution()[1]' 2> /dev/null`

# Get uptime
if [ -f "/proc/uptime" ]; then
    uptime=`cat /proc/uptime`
    uptime=${uptime%%.*}
    seconds=$(( uptime%60 ))
    minutes=$(( uptime/60%60 ))
    hours=$(( uptime/60/60%24 ))
    days=$(( uptime/60/60/24 ))
    uptime="$days days, $hours hours, $minutes minutes, $seconds seconds"
else
    uptime=""
fi

echo $hostname
echo $distro
echo $uptime
So the output I want is something like:
{"hostname":"server.domain.com", "distro":"CentOS 6.3", "uptime":"5 days, 22 hours, 1 minutes, 41 seconds"}
Thanks.
If you only need to output a small JSON, use printf:
printf '{"hostname":"%s","distro":"%s","uptime":"%s"}\n' "$hostname" "$distro" "$uptime"
Or if you need to produce a larger JSON, use a heredoc as explained by leandro-mora. If you use the here-doc solution, please be sure to upvote his answer:
cat <<EOF > /your/path/myjson.json
{"id" : "$my_id"}
EOF
Some of the more recent distros have a file called /etc/lsb-release or a similar name (cat /etc/*release). Therefore, you could possibly do away with your dependency on Python:
distro=$(awk -F= 'END { print $2 }' /etc/lsb-release)
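On a typical Ubuntu box, for example, /etc/lsb-release looks roughly like this (treat it as an assumed sample; contents vary by distro and release), so the awk above prints the second field of the last line, i.e. the quoted description:

DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=20.04
DISTRIB_CODENAME=focal
DISTRIB_DESCRIPTION="Ubuntu 20.04.6 LTS"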
As an aside, you should probably do away with using backticks. They're a bit old-fashioned.
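For example, the first line of the script could be written either way; $( ) nests cleanly and reads better:

# backticks (the older form)
hostname=`hostname -A 2> /dev/null`

# $( ) command substitution (the usual recommendation nowadays)
hostname=$(hostname -A 2> /dev/null)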
I find it much easier to create the JSON using cat:
cat <<EOF > /your/path/myjson.json
{"id" : "$my_id"}
EOF
I'm not a bash ninja at all, but I wrote a solution that works perfectly for me. So, I decided to share it with the community.
First of all, I created a bash script called json.sh:
arr=()
while read x y; do
    arr=("${arr[@]}" $x $y)
done

vars=(${arr[@]})
len=${#arr[@]}

printf "{"
for (( i=0; i<len; i+=2 ))
do
    printf "\"${vars[i]}\": ${vars[i+1]}"
    if [ $i -lt $((len-2)) ] ; then
        printf ", "
    fi
done
printf "}"
echo
And now I can easily execute it:
$ echo key1 1 key2 2 key3 3 | ./json.sh
{"key1":1, "key2":2, "key3":3}
@Jimilian's script was very helpful for me. I changed it a bit to send data to Zabbix auto discovery:
arr=()
while read x y; do
    arr=("${arr[@]}" $x $y)
done

vars=(${arr[@]})
len=${#arr[@]}

printf "{\n"
printf '\t"data":[\n'
for (( i=0; i<len; i+=2 ))
do
    printf '\t{ "{#VAL1}":"%s",\t"{#VAL2}":"%s" }' "${vars[i]}" "${vars[i+1]}"
    if [ $i -lt $((len-2)) ] ; then
        printf ",\n"
    fi
done
printf "\n"
printf "\t]\n"
printf "}\n"
echo
Output:
$ echo "A 1 B 2 C 3 D 4 E 5" | ./testjson.sh
{
    "data":[
    { "{#VAL1}":"A",    "{#VAL2}":"1" },
    { "{#VAL1}":"B",    "{#VAL2}":"2" },
    { "{#VAL1}":"C",    "{#VAL2}":"3" },
    { "{#VAL1}":"D",    "{#VAL2}":"4" },
    { "{#VAL1}":"E",    "{#VAL2}":"5" }
    ]
}
I wrote a tiny program in Go, json_encode. It works pretty well for such cases:
$ ./getDistro.sh | json_encode
["my.dev","Ubuntu 17.10","4 days, 2 hours, 21 minutes, 17 seconds"]
data=$(echo " BUILD_NUMBER : ${BUILD_NUMBER} , BUILD_ID : ${BUILD_ID} , JOB_NAME : ${JOB_NAME} " | sed 's/ /"/g')
Output => data="BUILD_NUMBER":"29","BUILD_ID":"29","JOB_NAME":"OSM_LOG_ANA" (the sed turns every space in the echoed string into a double quote)
To answer the subject line: if you need to take tab-separated output from any command line tool and format it as a JSON list of lists, you can do the following:
echo <tsv_string> | python3 -c "import sys,json; print(json.dumps([row.split('\t') for row in sys.stdin.read().splitlines() if True]))"
For example, to get a zfs list output as json:
zfs list -Hpr -o name,creation,mountpoint,mounted | python3 -c "import sys,json; print(json.dumps([row.split('\t') for row in sys.stdin.read().splitlines() if True]))"