extract token from curl result by shell script - json

I write this script
#!/bin/bash
# cm.sh
curl -i \
  -H "Content-Type: application/json" \
  -d '
{ "auth": {
    "identity": {
      "methods": ["password"],
      "password": {
        "user": {
          "name": "admin",
          "domain": { "id": "default" },
          "password": "secret"
        }
      }
    }
  }
}' \
  "http://localhost/identity/v3/auth/tokens" ; echo
echo $tokenizer1
echo $tokenizer2
But whichever tool I try (awk or sed), the output is the same:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   540  100   312  100   228    312    228  0:00:01 --:--:-- 0:00:01  5142
My goal is to put the token in a variable for later use.
Thanks in advance.

Instead of using the direct result of cURL, you could save the result in a file and use your grep command on it.
Something like this, maybe:
curl -o boulou.txt http://localhost/identity/v3/auth/tokens && cat boulou.txt | grep "X-Subject-Token" | awk '{printf $2}'
Edit: if you just want your desired output, add --silent to the cURL command:
curl -o boulou.txt http://localhost/identity/v3/auth/tokens --silent && cat boulou.txt | grep "X-Subject-Token" | awk '{printf $2}'
Edit 2: If you want to export it and delete your file, you could use something like this:
export OS_TOKEN=$(curl -o billy.txt http://localhost/identity/v3/auth/tokens --silent && cat billy.txt | grep "X-Subject-Token" | awk '{printf $2}') && rm billy.txt
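A variant of the same idea that skips the body entirely: curl's -D (--dump-header) option writes just the response headers to a file. A minimal sketch, assuming the POST request from the question, with its JSON body saved in a hypothetical request.json:
# -D writes only the response headers to headers.txt; -o discards the body
curl -sS -D headers.txt \
  -H "Content-Type: application/json" \
  -d @request.json \
  -o /dev/null \
  "http://localhost/identity/v3/auth/tokens"
# header lines end in CRLF, so drop the trailing carriage return
OS_TOKEN=$(grep -i "X-Subject-Token" headers.txt | awk '{print $2}' | tr -d '\r')
rm headers.txt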

"How do I use grep/awk to extract a header in a field from curl when I pass it a JSON document when the contents is stored in a variable?" is a very tricky and unique problem indeed.
However, if you gradually mock out every part of your code to narrow it down, you'll discover that this is the much easier question you could have researched or asked instead:
How do I use grep/awk on contents from a variable?
I have a variable containing HTTP headers, and I want to extract the value of one of them. Here's an example:
variable='Foo: 1
Bar: 2
Baz: 3'
This is what I've tried to get 2 from Bar:
# Just hangs
tokenizer1=$variable `grep "Bar" | awk '{printf $2}'`
# Is empty
tokenizer2=$variable | `grep "Bar" | awk '{printf $2}'`
The answer here is to use echo to pipe the contents so that grep can read it on stdin:
tokenizer3=$(echo "$variable" | grep "Bar" | awk '{printf $2}')
This is easily applied to your example, assuming the output of your curl command (run with -i so the headers are included) has been captured in a variable named token:
tokenizer3=$(echo "$token" | grep "X-Subject-Token" | awk '{printf $2}')
echo "The value is $tokenizer3"

Related

Using jq with an unknown amount of arguments from shell script?

I'm about to lose my mind working with jq for the first time today. I've tried every ever so dirty way to make this work somehow. Let's get down to business:
I'm trying to further develop the pretty barebones API for Unifi Controllers for some use cases:
https://dl.ui.com/unifi/6.0.41/unifi_sh_api
Here's some mock-up data (the real JSON data is thousands of lines):
{
  "meta": {
    "rc": "ok"
  },
  "data": [
    {
      "_id": "61a0da77f730e404af0edc3c",
      "ip": "10.19.31.120",
      "mac": "ff:ec:ff:89:ff:58",
      "name": "Test-Accesspoint"
    }
  ]
}
Here is my advancedapi.sh
#!/bin/bash
source unifiapi.sh
unifi_login 1>/dev/null

if [ "$1" ]; then
    command=$1
    shift
else
    echo "Please enter command"
    exit
fi

#jq -r '.data[] | "\(.name),\(.mac)"'
assembleKeys(){
    c="0"
    seperator=";"
    keys=$(for i in "$#"
    do
        c=$(expr $c + 1)
        if [ $c -gt 1 ]; then
            echo -n "${seperator}\(.${i})"
        else
            echo -n "\(.${i})"
        fi
    done
    echo)
}

assembleJQ(){
    c=0
    jq=$(echo -en 'jq -r -j \x27.data[] | '
    for i in "$#"
    do
        if [ "$c" != "0" ]; then
            echo -en ",\x22 \x22,"
        fi
        c=1
        echo -en ".$i"
    done
    echo -en ",\x22\\\n\x22\x27")
    echo "$jq"
}

getDeviceKeys(){
    assembleJQ "$#"
    #unifi_list_devices | jq -r '.data[]' | jq -r "${keys}"
    #export keys=mac
    #export second=name
    #unifi_list_devices | jq -r -j --arg KEYS "$keys" --arg SECOND "$second" '.data[] | .[$KEYS]," ",.[$SECOND],"\n"'
    unifi_list_devices | $jq
    #unifi_list_devices | jq -r -j '.data[] | .mac," ",.name,"\n"'
}

"$command" $#
Users should be able to call any function from the API with as many arguments as they want.
So basically this:
#export keys=mac
#export second=name
#unifi_list_devices | jq -r -j --arg KEYS "$keys" --arg SECOND "$second" '.data[] | .[$KEYS]," ",.[$SECOND],"\n"'
which works, but only with a limited number of arguments. I want to pass $# to jq.
I'm trying to get the name and mac-address of a device by calling:
./advancedapi.sh getDeviceKeys mac name
This should return the MAC address and name of the device, but it only gives me this:
jq -r -j '.data[] | .mac," ",.name,"\n"'
jq: error: syntax error, unexpected INVALID_CHARACTER, expecting $end (Unix shell quoting issues?) at <top-level>, line 1:
'.data[]
jq: 1 compile error
The top-most line being the jq command, which works perfectly fine when called manually.
The assembleKeys function was my attempt at generating a string that looks like this:
jq -r '.data[] | "\(.name),\(.mac)"'
Can anyone explain to me, preferably in-depth, how to do this? I'm going insane here!
Thanks!
"I want to pass $# to jq."
Note that in the shell, "$#" is only the number of positional parameters; "$@" expands to the parameters themselves, which is what you actually want to pass along. With that in mind, there are many ways to do this, but the simplest might be using jq's --args command-line option, as illustrated here:
File: check
#!/bin/bash
echo "${@}"
jq -n '$ARGS.positional' --args "${@}"
Example run:
./check 1 2 3
1 2 3
[
"1",
"2",
"3"
]
Projection
Here's an example that might be closer to what you're looking for.
Using your sample JSON:
File: check2
#!/bin/bash
jq -r '
  def project($array):
    . as $in | reduce $array[] as $k ([]; . + [$in[$k]]);
  $ARGS.positional,
  (.data[] | project($ARGS.positional))
  | @csv
' input.json --args "${@}"
./check2 name mac
"name","mac"
"Test-Accesspoint","ff:ec:ff:89:ff:58"
There was an error in the JSON as originally posted: trailing commas that are not needed. With those removed, this works:
$
$ cat j.json | jq -r '.data[] | "\(.name)█\(.mac)█\(.ip)█\(._id)"'
Test-Accesspoint█ff:ec:ff:89:ff:58█10.19.31.120█61a0da77f730e404af0edc3c
$ cat j.json
{
  "meta": {
    "rc": "ok"
  },
  "data": [
    {
      "_id": "61a0da77f730e404af0edc3c",
      "ip": "10.19.31.120",
      "mac": "ff:ec:ff:89:ff:58",
      "name": "Test-Accesspoint"
    }
  ]
}
$
If the API gave you this JSON, you should probably fix it before piping it into jq (maybe with sed or something like it).
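For example, a rough clean-up with GNU sed that slurps the file and deletes any comma followed only by whitespace and a closing brace or bracket (a sketch, not a real JSON parser; it would also touch such sequences inside strings):
# :a;N;$!ba reads the whole file into the pattern space first
sed -E ':a;N;$!ba; s/,([[:space:]]*[]}])/\1/g' j.json | jq .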

How to grep two row values from a same json file and display them as columnwise side by side in another file?

example_json_file:
{
  "href": "/users/115",
  "id": 115,
  "username": "test",
  "locked": false,
  "type": "local",
  "effective_groups":
}
Assume we have a lot of these in a file. I want to grep the id and username values from the JSON file and put them side by side as comma-separated columns in another file.
I tried the following with grep:
sed -E 's/},\s*{/},\n{/g' user_id_json.txt | grep '"id"'
It helps to grep individual field values. However, I need help merging two field values (id and username) and putting them in a separate file as comma-separated values, using a single command.
Expected output:
test,115
With valid JSON and jq:
jq -r '"\(.id),\(.username)"' file
Output:
115,test
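Since the file holds many of these objects back to back, note that jq applies the same filter to every top-level object in the stream, so the one-liner above already covers the whole file; redirecting its output produces the CSV in one step (users.csv is a made-up output name):
jq -r '"\(.id),\(.username)"' user_id_json.txt > users.csv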
This requires ed. If we hypothetically assume that each JSON object has the same number of lines (which I know we probably shouldn't), then...
#!/bin/sh
cat >> edchop+.txt << EOF
1,8W temp
1,8d
wq
EOF
while [ $(wc -l example_json_file | cut -d' ' -f1) -gt 0 ]
do
    ed -s example_json_file < edchop+.txt
    name=$(sed -n '/username/p' temp | cut -d':' -f2 | tr -d ' ' | tr -d ',' | tr -d '\"')
    id=$(sed -n '/id/p' temp | cut -d':' -f2 | tr -d ' ' | tr -d ',')
    echo -e "${name}","${id}" >> example_csvfile
done
rm -v ./edchop+.txt
rm -v ./temp

use curl/bash command in jq

I am trying to get a list of URLs after redirection, using bash scripting. Say, google.com gets redirected to http://www.google.com with a 301 status.
What I have tried is:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read line; do
curl -LSs -o /dev/null -w %{url_effective} $line 2>/dev/null
done
So, is it possible to use commands like curl inside jq for processing JSON objects?
I want to add the resulting URL to the existing JSON structure, like this:
[
  {
    "url": "google.com",
    "redirection": "http://www.google.com"
  },
  {
    "url": "microsoft.com",
    "redirection": "https://www.microsoft.com"
  }
]
Thank you in advance!
curl is capable of making multiple transfers in a single process, and it can also read command line arguments from a file or stdin, so you don't need a loop at all; just put that JSON into a file and run this:
jq -r '"-o /dev/null\nurl = \(.[].url)"' file |
curl -sSLK- -w'%{url_effective}\n' |
jq -R 'fromjson | map(. + {redirection: input})' file -
This way only 3 processes will be spawned for the whole task, instead of n + 2 where n is the number of URLs.
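For reference, the config that the first jq invocation feeds to curl -K- looks like this for the sample input, one url = line per transfer, each preceded by its own -o option:
-o /dev/null
url = google.com
-o /dev/null
url = microsoft.com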
I would generate a dictionary with jq per url and slurp those dictionaries into the final list with jq -s:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read url; do
redirect=$(curl -LSs \
-o /dev/null \
-w '%{url_effective}' \
"${url}" 2>/dev/null)
jq --null-input --arg url "${url}" --arg redirect "${redirect}" \
'{url:$url, redirect: $redirect}'
done | jq -s
Alternative (first) solution:
You can output the url and the effective_url as tab separated data and create the output json with jq:
json='[{"url":"google.com"},{"url":"microsoft.com"}]'
echo "$json" | jq -r '.[].url' | while read line; do
prefix="${line}\t"
curl -LSs -o /dev/null -w "${prefix}"'%{url_effective}'"\n" "$line" 2>/dev/null
done | jq -r --raw-input 'split("\t")|{"url":.[0],"redirection":.[1]}'
Both solutions will generate valid json, independently of whatever characters the url/effective_url might contain.
Trying to keep this in JSON all the way is pretty cumbersome. I would simply try to make Bash construct a new valid JSON fragment inside the loop.
So in other words, if $url is the URL and $redirect is where it redirects to, you can do something like
printf '{"url": "%s", "redirection": "%s"}\n' "$url" "$redirect"
to produce JSON output from these strings. So, tying it all together:
jq -r '.[].url' <<<"$json" |
while read -r url; do
    printf '{"url": "%s", "redirection": "%s"}\n' \
        "$url" "$(curl -LSs -o /dev/null -w '%{url_effective}' "$url")"
done |
jq -s
This is still pretty brittle; in particular, if either of the printf input strings could contain a literal double quote, that should properly be escaped.
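One way to close that gap is to let jq do the quoting, as the earlier answer does; a sketch of the same loop with printf replaced by jq -n --arg, so quotes and backslashes in either string are escaped correctly:
jq -r '.[].url' <<<"$json" |
while read -r url; do
    # --arg escapes arbitrary characters when building the JSON values
    jq -n --arg url "$url" \
        --arg redirection "$(curl -LSs -o /dev/null -w '%{url_effective}' "$url")" \
        '{url: $url, redirection: $redirection}'
done |
jq -s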

Sh Script JSON values from JSON string

I have a file which contains a JSON object as a string:
{"STATUS":[{"STATUS":"S","When":1530779438,"Code":70,"Msg":"CGMiner stats","Description":"cgminer 4.9.0"}],"STATS":[{"CGMiner":"4.9.0","Miner":"9.0.0.5","CompileTime":"Sat May 26 20:42:30 CST 2018","Type":"Antminer Z9-Mini"},{"STATS":0,"ID":"ZCASH0","Elapsed":179818,"Calls":0,"Wait":0.000000,"Max":0.000000,"Min":99999999.000000,"GHS 5s":"16.39","GHS av":16.27,"miner_count":3,"frequency":"750","fan_num":1,"fan1":5760,"fan2":0,"fan3":0,"fan4":0,"fan5":0,"fan6":0,"temp_num":3,"temp1":41,"temp2":40,"temp3":43,"temp2_1":56,"temp2_2":53,"temp2_3":56,"temp_max":43,"Device Hardware%":0.0000,"no_matching_work":0,"chain_acn1":4,"chain_acn2":4,"chain_acn3":4,"chain_acs1":" oooo","chain_acs2":" oooo","chain_acs3":" oooo","chain_hw1":0,"chain_hw2":0,"chain_hw3":0,"chain_rate1":"5.18","chain_rate2":"5.34","chain_rate3":"5.87"}],"id":1}
Now I want to get some values from keys in this object within a sh script.
The following command works, but not for all keys.
This works (I get "750"):
grep -o '"frequency": *"[^"]*"' LXstats.txt | grep -o '"[^"]*"$'
But this does not (output is empty):
grep -o '"fan_num": *"[^"]*"' LXstats.txt | grep -o '"[^"]*"$'
Same with this:
grep -o '"fan1": *"[^"]*"' LXstats.txt | grep -o '"[^"]*"$'
I'm working on a Xilinx OS which has no Python, so jq will not work, and grep has no -P option. So, does anyone have an idea how to make this work? :)
Thanks and best regards,
dave
When you want to do more than just g/re/p you should be using awk, not combinations of greps+pipes, etc. (Incidentally, your grep fails for fan_num and fan1 because those values are numbers, not quoted strings, so the "[^"]*" part of your pattern never matches.)
$ awk -v tag='frequency' 'match($0,"\""tag"\": *(\"[^\"]*|[0-9]+)") { val=substr($0,RSTART,RLENGTH); sub(/^"[^"]+": *"?/,"",val); print val }' file
750
$ awk -v tag='fan_num' 'match($0,"\""tag"\": *(\"[^\"]*|[0-9]+)") { val=substr($0,RSTART,RLENGTH); sub(/^"[^"]+": *"?/,"",val); print val }' file
1
$ awk -v tag='fan1' 'match($0,"\""tag"\": *(\"[^\"]*|[0-9]+)") { val=substr($0,RSTART,RLENGTH); sub(/^"[^"]+": *"?/,"",val); print val }' file
5760
The above will work with any awk in any shell on any UNIX box. If you have GNU awk, with its 3rd arg to match() and gensub(), you can write it a bit more briefly:
$ awk -v tag='frequency' 'match($0,"\""tag"\": *(\"[^\"]*|[0-9]+)",a) { print gensub(/^"/,"",1,a[1]) }' file
750

Parsing curl json output using awk

Using this command in a loop:
curl -s -X POST -F "filedata=@inputfile" -F "containerid=documentLibrary" -F "destination=filelocation" http://user:pass@server:host/service/api/upload
I am able to get the following:
{
  "nodeRef": "123",
  "fileName": "filename.pdf",
  "status":
  {
    "code": 200,
    "name": "OK",
    "description": "File uploaded successfully"
  }
}
This snippet will appear hundreds or thousands of times. I need to extract just the nodeRef value (123) and put it into a CSV, using awk or any other parsing tool that does not require installation, unlike jq.
Lacking specialized JSON tools, perhaps grep:
... | grep -oP '(?<="nodeRef": ")\w+'
This still assumes something about the format, though...
A few awk approaches:
<curl command> | awk -F'"' '/nodeRef/{ print $(NF-1) }'
<curl command> | awk '/nodeRef/{X=$NF;gsub(/[",]/,Y,X);print X}'
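Either approach prints just 123 for the sample response. Since the uploads run in a loop, the whole loop can be piped through a single awk and the values collected into a file (a sketch; the file glob and noderefs.csv are made-up names):
for f in *.pdf; do    # hypothetical set of input files
    curl -s -X POST -F "filedata=@$f" -F "containerid=documentLibrary" \
        -F "destination=filelocation" http://user:pass@server:host/service/api/upload
done | awk -F'"' '/"nodeRef"/{ print $(NF-1) }' > noderefs.csv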