find dir1 | grep .json | python -mjson.tool $1 > /dev/null
I am using the above command, but it doesn't take the files as input. What should I do to check all the JSON files in a folder and validate that each one is proper JSON?
I found this a while back and have been using it:
for j in *.json; do python -mjson.tool "$j" > /dev/null || echo "INVALID $j" >&2; done;
I'm sure it compares similarly to dasho-o's answer, but I'm including it as another option.
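If you also need to descend into subdirectories, a recursive variant might look like this (a sketch, assuming bash 4+ for globstar):
shopt -s globstar nullglob # ** matches recursively; unmatched patterns expand to nothing
for j in **/*.json; do
python -m json.tool "$j" > /dev/null || echo "INVALID $j" >&2
done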
You need to use 'xargs'.
The find/grep pipeline writes the names of the JSON files to its standard output. You need to build command lines from those file names, which is what xargs does:
find dir1 | grep .json | xargs -L1 python -mjson.tool > /dev/null
Side notes:
Since 'find' has filtering and execution predicates, a more compact command line can be created:
find dir1 -name '*.json' -exec python -mjson.tool '{}' ';'
Also consider using 'jq' as a lightweight alternative for validation instead of Python.
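For example, this sketch reports every file jq cannot parse (assuming jq is installed; it exits non-zero on a parse error):
find dir1 -name '*.json' -exec sh -c 'jq empty "$1" 2> /dev/null || echo "INVALID $1" >&2' _ {} \;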
I have an output.txt file where my script stores some output. I just need to get the value of id, which is on the first line of output.txt, into myscript.sh. Can someone suggest a way to do that?
{"id":"**dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf**","name":"new","url":"https://dev.azure.com/vishalmishra2424/82c93136-470c-4be0-b6da-a8234f49a695/_apis/git/repositories/dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf","project":{"id":"82c93136-470c-4be0-b6da-a8234f49a695","name":"vishalmishra","url":"https://dev.azure.com/vishalmishra2424/_apis/projects/82c93136-470c-4be0-b6da-a8234f49a695","state":"wellFormed","revision":12,"visibility":"public","lastUpdateTime":"2021-04-22T14:24:47.2Z"},"size":0,"remoteUrl":"https://vishalmishra2424#dev.azure.com/vishalmishra2424/vishalmishra/_git/new","sshUrl":"git#ssh.dev.azure.com:v3/vishalmishra2424/vishalmishra/new","webUrl":"https://dev.azure.com/vishalmishra2424/vishalmishra/_git/new","isDisabled":false}
The snippet you posted looks like JSON, and the file utility, which can guess different types of files, says so too:
$ file output.txt
output.txt: JSON data
You should use JSON-aware tools to extract the value of id, for example jq:
$ jq -r '.id' output.txt
dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf
or jshon:
$ jshon -e id < output.txt
"**dad04f6d-4e06-4420-b0bc-cb2dcfee2dcf**"
When I run a command in the terminal it works just fine, but when I put the same command in a .sh script and run it, it doesn't give any output. What might be the reason for this?
The command:
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
done < <(jq -r '.data | to_entries[] | [(.key|ascii_upcase), .value] | @tsv' data.json)
This is expected, since export sets the environment variable only for that particular shell.
Docs -
The export command is used to export a variable or function to the environment of all the child processes running in the current shell. It exports a variable or function with a value.
export -f functionname # exports a function in the current shell
So when you create a sh script, it runs the specified commands in a different shell, which terminates once the script exits.
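A quick way to see this (a minimal demo; demo.sh is a hypothetical two-line script):
$ cat demo.sh
#!/bin/bash
export HELLO1=world1
$ ./demo.sh
$ echo "$HELLO1"   # prints an empty line: the export died with the script's shell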
It does work within the sh script itself -
data.sh
#!/bin/bash
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
echo $HELLO1
echo $SAMPLEKEY
done < <(jq -r '.data | to_entries[] | [(.key|ascii_upcase), .value] | @tsv' data.json)
Output -
$ ./data.sh
"world1"
"world1"
"samplevalue"
Which suggests that your variables are getting exported, but only for that particular shell environment.
If you want to make them persistent, export them from ~/.bashrc or ~/.profile.
Once you put them in ~/.bashrc or ~/.profile, you will see output like the below.
I used ~/.bash_profile on my macOS machine -
Last login: Thu Jan 25 15:15:42 on ttys006
"world1"
"world1"
"samplevalue"
viveky4d4v@020:~$ echo $SAMPLEKEY
"samplevalue"
viveky4d4v@020:~$ echo $HELLO1
"world1"
viveky4d4v@020:~$
Which clarifies that your env variables will get exported whenever you open a new shell; the logic for this lies in .bashrc (https://unix.stackexchange.com/questions/129143/what-is-the-purpose-of-bashrc-and-how-does-it-work).
Put your script as-is at the end of ~/.bashrc -
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
echo $HELLO1
echo $SAMPLEKEY
done < <(jq -r '.data | to_entries[] | [(.key|ascii_upcase), .value] | @tsv' data.json)
You need to make sure that data.json stays in the user's home directory.
Basically: a child process can't change the environment of its parent process.
You need to source the script instead of executing it:
source your_script.sh
source runs the script in the current shell, which makes it possible to modify the environment.
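With the data.sh from above, that looks like this (assuming the same data.json):
$ source ./data.sh
"world1"
"world1"
"samplevalue"
$ echo $SAMPLEKEY
"samplevalue"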
Alternatively you can create a function in your shell startup files (e.g. ~/.bashrc):
my_function() {
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
done < <(jq -r '.data | to_entries[] | [(.key|ascii_upcase), .value] | @tsv' /path/to/data.json)
}
After you've started a new shell you can run
my_function
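and the exported variables will be visible in your current shell (again assuming the same data.json):
$ my_function
$ echo $HELLO1
"world1"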
Is there command-line documentation for using jq? I am currently running this command:
%jq% -f JSON.txt -r ".sm_api_content"
It is supposed to read from JSON.txt and output the value of sm_api_content (which is a string).
But I am getting this error:
jq: error: Could not open file .sm_api_content: No such file or directory
Can anyone help me out here?
-f is for specifying a filename to read your "filter" from, the filter in this case being .sm_api_content.
It sounds as if you just want to run jq without -f, e.g.
jq -r .sm_api_content JSON.txt
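If you do want -f, the filter itself goes in a file (a sketch; filter.jq is a hypothetical file name):
$ echo '.sm_api_content' > filter.jq
$ jq -r -f filter.jq JSON.txt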
I am trying to use grep to search JSON output. I used a curl command to return the data from a particular Codeship build, and I want to use grep to store the ID value in a variable. However, after I run the command and echo the value of the variable, it is blank.
Below are the commands:
export API_KEY=abc123
export PROJECT_ID=123456
export LAST_BUILD_ID=$(curl -s https://codeship.com/api/v1/projects/$PROJECT_ID.json?api_key=$API_KEY | grep -Eo '"builds":\[{"id":\d+' | grep -Eo --color=never '\d+' | tail -1)
export LAST_BUILD_URL=$(echo "https://codeship.com/api/v1/builds/$LAST_BUILD_ID/restart.json?api_key=$API_KEY")
My response: never use grep or regexes to parse JSON.
Instead, use a proper JSON parser.
In shell, take a look at jq.
Example (adapt it a bit):
#!/bin/bash
API_KEY=abc123
PROJECT_ID=123456
json=$(curl -s "https://codeship.com/api/v1/projects/$PROJECT_ID.json?api_key=$API_KEY")
LAST_BUILD_ID=$(jq '.builds | .[] | .never' <<< "$json") # just guessing
LAST_BUILD_URL="https://codeship.com/api/v1/builds/$LAST_BUILD_ID/restart.json?api_key=$API_KEY"
Note: if you provide the JSON, I will be able to be more specific with the jq command.
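For instance, if the API lists the most recent build first, something like this might work (the .builds[0].id path is an assumption until the JSON is known):
LAST_BUILD_ID=$(jq -r '.builds[0].id' <<< "$json") # assumes the newest build comes first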
How can I extract a .depot file on HP-UX?
The .depot file is a tarred directory structure, with some of the files gzipped under the same name as the original.
Note that my environment is quite limited: I can't have root, and I don't have swinstall.
http://forums13.itrc.hp.com/service/forums/questionanswer.do?admit=109447627+1259826031876+28353475&threadId=1143807
Ideally, the solution should work on Linux too.
I have tried untarring it and then running gunzip -f -r -d -v --suffix= .
But the problem is that the gzipped files have no suffix, so in the end gzip deletes them.
It was relatively easy:
for f in $(find . -type f); do
mv "$f" "$f.gz"
gunzip "$f.gz"
done
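One caveat: gunzip fails on files that were never gzipped and leaves them renamed with a .gz suffix. A more defensive sketch (relying on the file utility to detect gzip data) renames only what is actually compressed:
find . -type f | while read -r f; do
# only touch files that really are gzip-compressed
file "$f" | grep -q 'gzip compressed' && mv "$f" "$f.gz" && gunzip "$f.gz"
done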