I still cannot parse JSON data in Linux.
I need a Linux command that parses JSON data into a readable string.
Someone told me to use underscore-cli (https://npmjs.org/package/underscore-cli).
I installed and used it, but the result is still unreadable.
My data:
"2005\u5e7405\u670812\u65e5(\u6728) 02\u664216\u5206"
According to this link,
http://json.parser.online.fr/
the result is
"2005年05月12日(木) 02時16分"
Is there any other way to parse this JSON data?
Please help.
Try jq: http://stedolan.github.com/jq/
echo '"2005\u5e7405\u670812\u65e5(\u6728) 02\u664216\u5206"' | ./jq .
"2005年05月12日(木) 02時16分"
jq takes escaped Unicode and outputs it as UTF-8.
I want to extract information like Name, Description, LHOST, and SOURCE from the following file.
https://github.com/rapid7/metasploit-framework/blob/master/modules/auxiliary/docx/word_unc_injector.rb
Could anybody let me know how to do so robustly? One way is to convert the Ruby file into JSON format (JSON can then be processed easily). But I am not sure how to convert a Ruby file to JSON. Could anybody let me know? Thanks.
The following script rb2json0.rb works.
#!/usr/bin/env ruby
# Read Ruby source from stdin, parse it into an s-expression with Ripper,
# and print the result as JSON.
require 'ripper'
require 'json'
puts Ripper.sexp($stdin.read).to_json
$ ./rb2json0.rb <<< 'def hello(world) "Hello, #{world}!"; end'
["program",[["def",["#ident","hello",[1,4]],["paren",["params",[["#ident","world",[1,10]]],null,null,null,null,null,null]],["bodystmt",[["string_literal",["string_content",["#tstring_content","Hello, ",[1,18]],["string_embexpr",[["var_ref",["#ident","world",[1,27]]]]],["#tstring_content","!",[1,33]]]]],null,null,null]]]]
Is there a way to convert JSONL to JSON in Linux while preserving the full depth of the JSONL file? I found some methods based on jq, but they didn't work with the full depth of the JSONL file.
Would something like this work?
#!/bin/sh
# Wrap a JSONL file in brackets; add a comma after every line except the
# last one, so the result is valid JSON with no trailing comma.
echo "[" >"$1.json"
perl -pe 's/$/,/ unless eof' <"$1" >>"$1.json"
echo "]" >>"$1.json"
I am quite confused as to what you want to do. But when it comes to jq, normally I process things line by line, with each line being an atomic JSON object. Something like
cat file | jq some-options 'some commands' > output.txt
Sometimes I get the output in tsv format and pipe it into awk. jq is very friendly with line-by-line objects.
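A small sketch of that TSV step, where .name and .value are hypothetical field names standing in for whatever keys the real objects carry; awk then picks out the second column:
jq -r '[.name, .value] | @tsv' file.jsonl | awk -F'\t' '{ print $2 }'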
To convert a large JSON list into line-by-line format, just parse the large object in any programming language and serialize the inner objects back to JSON, line by line (see the sketch below).
But if you have already parsed the large object, I suggest you do the required processing in jq directly, without serializing the inner objects back...
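The list-to-lines direction is a one-liner in jq itself: compact output (-c) prints each element of the top-level array on its own line. A sketch, assuming the list lives in big_list.json:
jq -c '.[]' big_list.json > lines.jsonl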
I wanted to parse this JSON:
{"FileStatus":{"accessTime":1472892839430,"blockSize":134217728,"childrenNum":0,"fileId":17226,"group":"admin","length":115714,"modificationTime":1469649837471,"owner":"admin","pathSuffix":"","permission":"755","replication":2,"storagePolicy":0,"type":"FILE"}}
I tried something like this, but was not able to get it to work.
$ {"FileStatus":{"accessTime":1472892839430}} | jq '.FileStatus.accessTime'
Error:
-bash: {FileStatus:{accessTime:1472892839430}}: command not found
Can someone help me parse this whole JSON?
To make a command read a string on stdin in bash, use a "here string" like this:
$ jq '.FileStatus.accessTime' <<<'{"FileStatus":{"accessTime":1472892839430}}'
1472892839430
Also, you need to quote the text properly so that bash doesn't try to interpret it in some way you don't intend. When you want to preserve the text literally, use single quotes (').
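Equivalently, you can pipe the string in with echo; the single quotes again keep bash from touching the contents:
echo '{"FileStatus":{"accessTime":1472892839430}}' | jq '.FileStatus.accessTime'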
I have working code that parses JSON output in KornShell by treating it as a string of characters. The issue I have is that the vendor keeps changing the position of the field that I am interested in. I understand that in JSON we can parse by key-value pairs.
Is there something out there that can do this? I am interested in a specific field, and I would like to use it to run checks on the status of another REST API call.
My sample JSON output is like this:
JSONDATA value:
{
"status": "success",
"job-execution-id": 396805,
"job-execution-user": "flexapp",
"job-execution-trigger": "RESTAPI"
}
I would need the job-execution-id value to monitor this job through the rest of the script.
I am using the following command to parse it:
RUNJOB=$(print ${DATA} | cut -f3 -d':' | cut -f1 -d',' | tr -d '[:blank:]') >> ${LOGDIR}/${LOGFILE}
The problem with this is that it relies on fields delimited by ':', and the field position has been known to be changed by the vendors during releases.
So I am trying to see if I can use a utility out there that would always give me the key-value pair of "job-execution-id": 396805, no matter where it is in the JSON output.
I started looking at jsawk, but it requires the js interpreter to be installed on our machines, which I don't want. Any hint on how to find which RPM I would need to solve this?
I am using RHEL5.5.
Any help is greatly appreciated.
The ast-open project has libdss (and a dss wrapper), which supposedly can be used with ksh. Documentation is sparse and limited to a few messages on the ast-user mailing list.
The regression tests for libdss contain some json and xml examples.
I'll try to find more info.
Python is included by default with CentOS so one thing you could do is pass your JSON string to a Python script and use Python's JSON parser. You can then grab the value written out by the script. An example you could modify to meet your needs is below.
Note that by specifying other dictionary keys in the Python script you can get any of the values you need without having to worry about the order changing.
Python script:
#get_job_execution_id.py
# The try/except is because you'll probably have Python 2.4 on CentOS 5.5,
# and the straight "import json" statement won't work unless you have Python 2.6+.
try:
    import json
except ImportError:
    import simplejson as json

import sys

# The JSON document is passed as the first command-line argument.
json_data = sys.argv[1]
data = json.loads(json_data)
job_execution_id = data['job-execution-id']

# Print just the value so the calling shell can capture it.
sys.stdout.write(str(job_execution_id))
KornShell script that executes it:
#get_job_execution_id.sh
#!/bin/ksh
JSON_DATA='{"status":"success","job-execution-id":396805,"job-execution-user":"flexapp","job-execution-trigger":"RESTAPI"}'
EXECUTION_ID=$(python get_job_execution_id.py "$JSON_DATA")
echo $EXECUTION_ID
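As an aside, on machines where jq is available (it may well not be packaged for something as old as RHEL 5.5), the same extraction is a one-liner; keys containing hyphens just need quoting inside the filter:
echo "$JSON_DATA" | jq -r '."job-execution-id"'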
Trying to use a query with mongoexport results in an error. But the same query is evaluated by the mongo-client without an error.
In mongo-client:
db.listing.find({"created_at":new Date(1221029382*1000)})
with mongoexport:
mongoexport -d event -c listing -q '{"created_at":new Date(1221029382*1000)}'
The generated error:
Fri Nov 11 17:44:08 Assertion: 10340:Failure parsing JSON string near:
$and: [ {
0x584102 0x528454 0x5287ce 0xa94ad1 0xa8e2ed 0xa92282 0x7fbd056a61c4
0x4fca29
mongoexport(_ZN5mongo11msgassertedEiPKc+0x112) [0x584102]
mongoexport(_ZN5mongo8fromjsonEPKcPi+0x444) [0x528454]
mongoexport(_ZN5mongo8fromjsonERKSs+0xe) [0x5287ce]
mongoexport(_ZN6Export3runEv+0x7b1) [0xa94ad1]
mongoexport(_ZN5mongo4Tool4mainEiPPc+0x169d) [0xa8e2ed]
mongoexport(main+0x32) [0xa92282]
/lib/libc.so.6(__libc_start_main+0xf4) [0x7fbd056a61c4]
mongoexport(__gxx_personality_v0+0x3d9) [0x4fca29]
assertion: 10340 Failure parsing JSON string near: $and: [ {
But doing the multiplication beforehand and passing the literal value to Date in mongoexport:
mongoexport -d event -c listing -q '{"created_at":new Date(1221029382000)}'
works!
Why is mongo evaluating the queries differently in these two contexts?
The mongoexport command-line utility supports passing a query in JSON format, but you are trying to evaluate JavaScript in your query.
The JSON format was originally derived from JavaScript's object notation, but the contents of a JSON document can be parsed without eval()ing it in a JavaScript interpreter.
You should consider JSON as representing "structured data" and JavaScript as "executable code". So there are, in fact, two different contexts for the queries you are running.
The mongo command-line utility is an interactive JavaScript shell which includes a JavaScript interpreter as well as some helper functions for working with MongoDB. While the JavaScript object format looks similar to JSON, you can also use JavaScript objects, function calls, and operators.
Your example of 1221029382*1000 is a math operation that would be executed by the JavaScript interpreter if you ran it in the mongo shell; in JSON it is an invalid value for a new Date, so mongoexport exits with a "Failure parsing JSON string" error.
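Since your second command shows that a literal millisecond value parses fine, one workaround is to let the shell do the multiplication before mongoexport ever sees the query; a small sketch using shell arithmetic expansion:
mongoexport -d event -c listing -q "{\"created_at\":new Date($((1221029382*1000)))}"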
I also got this error doing a mongoexport, but for a different reason, so I'll share my solution here, since I ended up on this SO page while trying to solve my issue.
I know it has little to do with this question, but the title of this post brought it up in Google, and since I was getting the exact same error I'll add an answer. Hopefully it helps someone.
I was trying to do an ObjectId _id query in the Windows console. The problem was that I needed to wrap the JSON query in double quotes, and the ObjectId also had to be in double quotes (not single!), so I had to escape the ObjectId's quotes.
mongoexport -u USERNAME -pPASSWORD -d DATABASE -c COLLECTION --query "{_id : ObjectId(\"5148894d98981be01e000011\")}"
If I wrap the JSON query in single quote on Windows, I get this error:
ERROR: too many positional options
And if I use single quotes around the ObjectId, I get this error:
Assertion: 10340:Failure parsing JSON string near: _id
So, yeah. Good luck.