mongoexport JSON parsing error

Trying to use a query with mongoexport results in an error, but the same query is evaluated by the mongo client without an error.
In the mongo client:
db.listing.find({"created_at":new Date(1221029382*1000)})
With mongoexport:
mongoexport -d event -c listing -q '{"created_at":new Date(1221029382*1000)}'
The generated error:
Fri Nov 11 17:44:08 Assertion: 10340:Failure parsing JSON string near:
$and: [ {
0x584102 0x528454 0x5287ce 0xa94ad1 0xa8e2ed 0xa92282 0x7fbd056a61c4
0x4fca29
mongoexport(_ZN5mongo11msgassertedEiPKc+0x112) [0x584102]
mongoexport(_ZN5mongo8fromjsonEPKcPi+0x444) [0x528454]
mongoexport(_ZN5mongo8fromjsonERKSs+0xe) [0x5287ce]
mongoexport(_ZN6Export3runEv+0x7b1) [0xa94ad1]
mongoexport(_ZN5mongo4Tool4mainEiPPc+0x169d) [0xa8e2ed]
mongoexport(main+0x32) [0xa92282]
/lib/libc.so.6(__libc_start_main+0xf4) [0x7fbd056a61c4]
mongoexport(__gxx_personality_v0+0x3d9) [0x4fca29]
assertion: 10340 Failure parsing JSON string near: $and: [ {
But doing the multiplication beforehand and passing the result to mongoexport:
mongoexport -d event -c listing -q '{"created_at":new Date(1221029382000)}'
works!
Why is mongo evaluating the queries differently in these two contexts?

The mongoexport command-line utility supports passing a query in JSON format, but you are trying to evaluate JavaScript in your query.
The JSON format was originally derived from JavaScript's object notation, but the contents of a JSON document can be parsed without eval()ing it in a JavaScript interpreter.
You should consider JSON as representing "structured data" and JavaScript as "executable code". So there are, in fact, two different contexts for the queries you are running.
The mongo command-line utility is an interactive JavaScript shell which includes a JavaScript interpreter as well as some helper functions for working with MongoDB. While the JavaScript object format looks similar to JSON, you can also use JavaScript objects, function calls, and operators.
Your example of 1221029382*1000 is a math expression that the JavaScript interpreter evaluates when you run it in the mongo shell; in JSON it is an invalid value for a new Date, so mongoexport exits with a "Failure parsing JSON string" error.
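If you want to keep the multiplication in the command line, one workaround sketch (assuming a POSIX shell) is to let the shell do the arithmetic before mongoexport ever sees the query:
mongoexport -d event -c listing -q "{\"created_at\": new Date($((1221029382 * 1000)))}"
The shell expands $((1221029382 * 1000)) to 1221029382000, so mongoexport receives exactly the literal form shown above to work.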

I also got this error doing a mongoexport, but for a different reason. It has little to do with this question, but the title of this post brought it up in Google for the exact same error, so I'll share my solution here. Hopefully it helps someone.
I was trying to do a MongoId _id query in the Windows console. The problem was that I needed to wrap the JSON query in double quotes, and the ObjectId also had to be in double quotes (not single!), which meant escaping the ObjectId quotes.
mongoexport -u USERNAME -pPASSWORD -d DATABASE -c COLLECTION --query "{_id : ObjectId(\"5148894d98981be01e000011\")}"
If I wrap the JSON query in single quote on Windows, I get this error:
ERROR: too many positional options
And if I use single quotes around the ObjectId, I get this error:
Assertion: 10340:Failure parsing JSON string near: _id
So, yeah. Good luck.

Related

How can I fix a JSON error in a shell script?

I have a bash script which sends curl POST requests. I want to pass data as bash script parameters. However, one of the parameters contains spaces, and the request fails with the error below.
Error parsing JSON data.\n\tString not terminated on line
In the shell script, I'm building the argument in this format: {"name":"'$2'"}
Could you please help me to solve that issue?
Thanks
jq is good not only for manipulating existing JSON data, but also for creating new data: it correctly escapes characters that can't appear unescaped in JSON strings, and takes care of proper quoting. Something like
curl ... -d"$(jq -n --arg val "$2" '{name: $val}')"
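For example, with a made-up value containing a space (the -c flag just compacts the output):
jq -nc --arg val "John Doe" '{name: $val}'
prints {"name":"John Doe"}, with all JSON quoting and escaping handled by jq.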
It would help if you added more detail when asking the question. I assume your JSON looks like this:
{
  "name": "argument passed"
}
curl -XPOST "your/url/here" -H 'Content-Type: application/json' -d'{"name":"'"$1"'"}'
Save the above command as post_request.sh (feel free to change the name), then run it like this:
sh post_request.sh "argument passed"
"argument passed" will be your name, with the space preserved. (The extra double quotes around $1 keep the shell from word-splitting the argument; values containing characters that need JSON escaping are still a problem, which is why the jq approach above is more robust.)

How to properly format JSON in PowerShell while using aws-cli?

Error sending JSON structure using aws-cli in Powershell. Specifically a call to put an item into an existing DynamoDB table.
The problem seems to be the lack of double quotes around keys and values in the JSON object I'm attempting to send. I've read that PowerShell is finicky about passing double quotes to external programs.
Unfortunately, since my org uses okta for authenticating AWS requests, I have to use Powershell.
I've tried everything that I've seen here:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertto-json?view=powershell-6
...here:
Error parsing parameter '--expression-attribute-values': Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)
...here:
https://github.com/aws/aws-cli/issues/1326
...and here:
PowerShell: best way to escape double quotes in string passed to an external program? E.g., a JSON string
WHAT I'VE TRIED:
This is the basic first attempt:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item '{"deviceId": {"S":"amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4"},"roomNumber": {"N":9110}}' --return-consumed-capacity TOTAL
Then I tried escaping with backslash:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item {\"deviceId\":{\"S\":\"amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4\"},\"roomNumber\": {\"N\":9110}} --return-consumed-capacity TOTAL
Then escaping with backtick (which I've replaced here with an asterisk so SO would read it as code) and backslash:
{*"deviceId*": {*"S*":*"amzn1.ask.device.AEH2LHYGV7GSPP
5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC
7CSQK634K4APEBRNVOKVZIDECOCBBIFB4*"},*"roomNumber*": {*"N*":9110}}" --return- consumed-capacity TOTAL
I then tried a "here string" to no avail.
EXPECTATIONS and RESULTS:
I would expect a method of escaping that's in the Microsoft documentation to work.
Each of the above gave this error with a variation of the problematic "JSON received" based on the escape method, but it never had double quotes around keys and values:
Error parsing parameter '--item': Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
JSON received: {deviceId: {S:amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4},roomNumber: {N:9110}}
The only thing that seemed to work was using "file://file.json" as the input to --item, which I couldn't find documented anywhere... I think it was on that github thread I linked. However, I'd rather not have to edit a file every time I want to send JSON with an AWS API call... Here it is:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item file://file.json --return-consumed-capacity TOTAL
Can anyone provide info other than what's listed here as to why the above methods wouldn't work? Have I just implemented them incorrectly?
Thanks.
I was having the same issue while trying to send a message to Amazon SQS with JSON body using PowerShell. After trying different escape characters, the following worked for me.
aws sqs send-message --queue-url "<queue-url>" --message-body '{\""key1\"": \""value1\"",\""key2\"": \""value2\"",\""key3\"": \""value3\"" }'
OS: Windows 10 Pro (Version 1803)
AWS CLI version: 1.16.180
For further information, see the official documentation: "Using quotation marks with strings in the AWS CLI".
You could try the PowerShell "stop parsing symbol" (i.e. "--%") at the start of the command; it tells PowerShell to pass the rest of the parameters through verbatim.
PS> okta-aws --% dh dynamodb put-item --table-name AlexaRoomLookup-dev --item '{"deviceId": {"S":"amzn1.ask.device.AEH...etc...FB4"},"roomNumber": {"N":9110}}' --return-consumed-capacity TOTAL
See about_parsing for more details...
It won't help if your json is in a variable, but if it's hard-coded like your example above it might work.
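If the JSON does live in a variable, a workaround sketch that builds on the file:// input the asker found: construct the item as a PowerShell hashtable and let ConvertTo-Json produce the quoting (the file name item.json is arbitrary; note that DynamoDB's JSON format expects the N value as a string):
PS> $item = @{ deviceId = @{ S = "amzn1.ask.device.AEH...etc...FB4" }; roomNumber = @{ N = "9110" } } | ConvertTo-Json -Compress
PS> Set-Content -Path item.json -Value $item
PS> okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item file://item.json --return-consumed-capacity TOTAL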

Force mongodb to output strict JSON

I want to consume the raw output of some MongoDB commands in other programs that speak JSON. When I run commands in the mongo shell, the output is Extended JSON in "shell mode", with special fields like NumberLong, Date, and Timestamp. I see references in the documentation to "strict mode", but I see no way to turn it on for the shell, or a way to run commands like db.serverStatus() in things that do output strict JSON, like mongodump. How can I force Mongo to output standards-compliant JSON?
There are several other questions on this topic, but I don't find any of their answers particularly satisfactory.
The MongoDB shell speaks JavaScript, so the answer is simple: use JSON.stringify(). If your command is db.serverStatus(), then you can simply do this:
JSON.stringify(db.serverStatus())
This won't output the proper "strict mode" representation of each of the fields ({ "floatApprox": <number> } instead of { "$numberLong": "<number>" }), but if what you care about is getting standards-compliant JSON out, this'll do the trick.
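To consume this from another program, a sketch using the mongo client's --quiet and --eval options (myDbName is a placeholder):
mongo --quiet --eval 'print(JSON.stringify(db.serverStatus()))' myDbName | json_pp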
I have not found a way to do this in the mongo shell, but as a workaround, mongoexport can run queries and its output uses strict mode and can be piped into other commands that expect JSON input (such as json_pp or jq). For example, suppose you have the following mongo shell command to run a query, and you want to create a pipeline using that data:
db.myItemsCollection.find({creationDate: {$gte: ISODate("2016-09-29")}}).pretty()
Convert that mongo shell command into this shell command, piping (for the sake of example) to json_pp:
mongoexport --jsonArray -d myDbName -c myItemsCollection -q '{"creationDate": {"$gte": {"$date": "2016-09-29T00:00Z"}}}' | json_pp
You will need to convert the query into strict mode format, and pass the database name and collection name as arguments, as well as quote properly for your shell, as shown here.
In the case of findOne:
JSON.stringify(db.Bill.findOne({'a': '123'}))
In the case of a cursor:
db.Bill.find({'a': '123'}).forEach(r=>print(JSON.stringify(r)))
or
print('[') + db.Bill.find().limit(2).forEach(r=>print(JSON.stringify(r) + ',')) + print(']')
will output something like
[{"a":"123"},{"a":"234"},]
Note the trailing ',' after the last item in the second form; you must remove it to get valid JSON.
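A tidier sketch for the cursor case is to materialize the cursor with its toArray() helper and stringify the whole array (same strict-mode caveats as JSON.stringify above):
JSON.stringify(db.Bill.find({'a': '123'}).toArray())
This emits a well-formed JSON array with no trailing comma to clean up.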
To build on the answer from @jbyler, you can strip out the numberLong wrappers using sed after you get your data, if you're using Linux.
mongoexport --jsonArray -d dbName -c collection -q '{fieldName: {$regex: ".*turkey.*"}}' | sed -r 's/\{ "[$]numberLong" : "([0-9]+)" }/"\1"/g' | json_pp
EDIT: This will transform a given document, but will not work on a list of documents. Changed find to findOne.
Adding
.forEach(function(results){results._id=results._id.toString();printjson(results)})
to a findOne() will output valid JSON.
Example:
db
.users
.findOne()
.forEach(function (results) {
results._id = results._id.toString();
printjson(results)
})
Source: https://www.mydbaworld.com/mongodb-shell-output-valid-json/

Parse error creating Erlang JSON string

I'm having trouble properly escaping a string I'm trying to use to represent JSON in Erlang. I'm not sure why this particular sequence is giving the parser trouble. I have this string in a Basho Bench configuration file.
'{
  "stats":"completed",
  "times":[
    {
      "time":"2014-10-29T23:40:46.558Z"
    }
  ]
}'
I am getting this error:
23:37:18.521 [error] Failed to parse config file server/http.config.erl: {29,erl_scan,{illegal,atom}}
It seems like maybe the issue is the numbers in the string but I don't get how I would escape them. Any thoughts?
You provided insufficient information, but in any case server/http.config.erl is not JSON. It is an Erlang term file, so this error comes from the Erlang parser. The whole text you provided is parsed as an atom because ' is the delimiter for atoms.
The string is not a string: in Erlang, single quotes denote an atom. It must be wrapped in double quotes to be interpreted as a string.
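For illustration, a minimal sketch of the fix as a Basho Bench config entry (the key name json_payload is hypothetical): escape the inner double quotes with backslashes and terminate the term with a period.
{json_payload, "{\"stats\":\"completed\",\"times\":[{\"time\":\"2014-10-29T23:40:46.558Z\"}]}"}.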

Ways to parse JSON using KornShell

I have working code that parses JSON output in KornShell by treating it as a string of characters. The issue I have is that the vendor keeps changing the position of the field that I am interested in. I understand that in JSON we can parse it by key-value pairs.
Is there something out there that can do this? I am interested in a specific field and would like to use it to run checks on the status of another REST API call.
My sample json output is like this:
JSONDATA value :
{
  "status": "success",
  "job-execution-id": 396805,
  "job-execution-user": "flexapp",
  "job-execution-trigger": "RESTAPI"
}
I would need the job-execution-id value to monitor this job through the rest of the script.
I am using the following command to parse it:
RUNJOB=$(print ${DATA} |cut -f3 -d':'|cut -f1 -d','| tr -d [:blank:]) >> ${LOGDIR}/${LOGFILE}
The problem with this is that it depends on the fields being delimited by :, and the field positions have been known to change in vendor releases.
So I am trying to see if I can use a utility that would always give me the key-value pair "job-execution-id": 396805, no matter where it is in the JSON output.
I started looking at jsawk, but it requires a JS interpreter to be installed on our machines, which I don't want. Any hint on how to go about finding which RPM I need to solve this?
I am using RHEL5.5.
Any help is greatly appreciated.
The ast-open project has libdss (and a dss command-line wrapper) which supposedly can be used with ksh. Documentation is sparse and limited to a few messages on the ast-user mailing list.
The regression tests for libdss contain some json and xml examples.
I'll try to find more info.
Python is included by default with CentOS so one thing you could do is pass your JSON string to a Python script and use Python's JSON parser. You can then grab the value written out by the script. An example you could modify to meet your needs is below.
Note that by specifying other dictionary keys in the Python script you can get any of the values you need without having to worry about the order changing.
Python script:
#get_job_execution_id.py
# The try/except is because you'll probably have Python 2.4 on CentOS 5.5,
# and the straight "import json" statement won't work unless you have Python 2.6+.
try:
    import json
except:
    import simplejson as json
import sys
json_data = sys.argv[1]
data = json.loads(json_data)
job_execution_id = data['job-execution-id']
sys.stdout.write(str(job_execution_id))
KornShell script that executes it (the shebang must be the first line, and the script name must match the Python file above):
#!/bin/ksh
# get_job_execution_id.sh
JSON_DATA='{"status":"success","job-execution-id":396805,"job-execution-user":"flexapp","job-execution-trigger":"RESTAPI"}'
EXECUTION_ID=`python get_job_execution_id.py "$JSON_DATA"`
echo $EXECUTION_ID
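If you can count on Python 2.6+ (where json is in the standard library), the parse can also be inlined without a separate script file; a sketch reusing the same sample data:
#!/bin/ksh
JSON_DATA='{"status":"success","job-execution-id":396805,"job-execution-user":"flexapp","job-execution-trigger":"RESTAPI"}'
EXECUTION_ID=$(print "$JSON_DATA" | python -c 'import sys, json; print json.load(sys.stdin)["job-execution-id"]')
echo $EXECUTION_ID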