After installing a HAL Validator (https://paragsarin.medium.com/hal-api-validation-754fd3b2c96) and preparing a cURL request, I received the error "0xE2 is an invalid start of a property name". The response from my HAL service already validates as JSON, so why am I getting this error?
Turns out this error came from parsing a JSON-based configuration file of the validator, not from validating a HAL response (which is also JSON). The JSON in the configuration file turned out to contain left and right "smart" double quotes (disallowed) rather than standard double quotes. These characters came from copying text from a web browser into an editor when creating the configuration file. Swapping the left and right quotes for standard double quotes fixed the issue.
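As a minimal sketch of what went wrong: JSON only allows the plain ASCII double quote (U+0022) around property names, and the curly quotes pasted from a browser (U+201C/U+201D) both start with the byte 0xE2 in UTF-8, which matches the error message. A small normalizer like this (hypothetical helper, not part of the validator) repairs such a file:

```javascript
// Curly "smart" double quotes are U+201C / U+201D; their UTF-8
// encoding starts with 0xE2, hence the "0xE2 is an invalid start of
// a property name" error. Replace them with straight quotes.
function fixSmartQuotes(text) {
  return text.replace(/[\u201C\u201D]/g, '"');
}

// A config pasted from a browser fails to parse until normalized:
const pasted = '{\u201Cname\u201D: 1}';
const repaired = fixSmartQuotes(pasted); // '{"name": 1}'
```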
For me, the issue was caused by the toString method in my POST request:
httpRequest.body(requestParams.toString())
Removing toString resolved the issue:
httpRequest.body(requestParams);
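A hedged illustration of why the toString call can break the body (assuming requestParams is a plain object in a JavaScript-style client; the exact client library here is not stated): the default toString does not serialize an object to JSON, so the server receives "[object Object]" instead of a parseable body.

```javascript
// Hypothetical params object for illustration only.
const requestParams = { name: 'James', age: 25 };

// Default toString does NOT produce JSON:
const viaToString = requestParams.toString(); // "[object Object]"

// Proper JSON serialization (what the client does internally when
// handed the object itself):
const viaJson = JSON.stringify(requestParams); // '{"name":"James","age":25}'
```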
Newman Version (can be found via newman -v): 4.2.2
OS details (type, version, and architecture): Windows 10 Pro, Version 1803. Running all files locally, but hitting an internal API.
Are you using Newman as a library, or via the CLI? CLI
Did you encounter this recently, or has this bug always been there: This is a new collection
Expected behaviour: I need to use a CSV file to import data into the request body of POST requests. All values MUST be strings. My CSV works correctly in Postman, but in Newman it fails with the error: Invalid closing quote at line 2; found """ instead of delimiter ",".
Command / script used to run Newman: newman run allPatients.postman_collection.json -e New_QA.postman_environment.json -d 2.csv
Sample collection, and auxiliary files (minus the sensitive details):
In Postman, when I run the requests, all values are strings and must be surrounded by double quotes. I use a CSV file that looks like this:
"bin","pcn","group_id","member_id","last_name","first_name","dob","sex","pharmacy_npi","prescriber_npi"
"""012353""","""01920000""","""TESTD 273444""","""Z9699879901""","""Covg""","""MC""","""19500101""","""2""","""1427091255""","""1134165194"""
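For reference on why this file looks the way it does: RFC 4180 CSV escapes a double quote inside a quoted field by doubling it, so a field whose *value* is "012353" (quotes included) is written as """012353""". Postman accepts that convention; Newman's bundled parser apparently does not in this version. A small sketch of the RFC 4180 quoting rule (hypothetical helper):

```javascript
// RFC 4180: wrap the field in quotes and double any embedded quote.
function quoteField(value) {
  return '"' + value.replace(/"/g, '""') + '"';
}

// The string `"012353"` (with its quotes) becomes `"""012353"""`:
const field = quoteField('"012353"');
```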
When I run the same CSV data file in Newman, I get the error above. I have tried a few options I've seen on this forum without any luck, such as using escape syntax for double quotes like:
"/"text/""
The only things I've tried that have not failed pre-run with an error like above include removing the double-quotes entirely or replacing them with single-quotes. When I do this, I get 400 Bad Request, which I suspect is due to me sending invalid data-types.
Please close this issue. It was the result of human error.
I was able to fix this by correctly using the syntax suggested elsewhere.
"bin","pcn","group_id","member_id","last_name","first_name","dob","sex","pharmacy_npi","prescriber_npi"
"\"012353\"","\"01920000\"","\"TESTD 273444\"","\"Z9699879901\"","\"Covg\"","\"MC\"","\"19500101\"","\"2\"","\"1427091255\"","\"1134165194\""
So I have a function whose sole purpose is to parse JSONs that are dynamically generated, but not by my program. Sometimes those JSONs might not meet proper syntax for a variety of reasons, ranging from network lag to simply broken JSON structures. To parse a JSON I use
try { JSON.parse(LARGE_JSON); } catch (e) { /* log the error */ }
For some reason, after many parses it seems like my Node.js process is leaking memory and finally encountering a syntax error:
SyntaxError: Unexpected token X in JSON at position 19837
at JSON.parse (<anonymous>)
at Pipe.channel.onread (internal/child_process.js:470:28)
That error crashes the whole process even though I am using try/catch. So I really don't understand how to tackle either the memory leak or the crashing process. What might the issue be?
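One likely reading of the stack trace: a try/catch only covers errors thrown synchronously inside it, and the trace points at Pipe.channel.onread in internal/child_process.js, i.e. a JSON.parse running inside Node's own IPC message handler on a different call stack, which your try/catch never sees. Wrapping your own parse is still sound, as this sketch shows:

```javascript
// A try/catch around JSON.parse itself always catches the parse error,
// because JSON.parse throws synchronously on the same call stack.
function safeParse(text) {
  try {
    return { ok: true, value: JSON.parse(text) };
  } catch (e) {
    return { ok: false, error: e.message };
  }
}
// Errors raised on other stacks (like Node's internal IPC handler)
// must be handled where they are thrown, or as a last resort via
// process.on('uncaughtException', handler).
```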
I am trying to import JSON in Spoon. It works just fine with a .json file, but when I try it from a URL I get an Unexpected Error, followed by a Java NullPointerException, when executing the transformation.
I get the same error with "JSON input", and with "Get content" followed by "Extract from stream", which seems to behave very similarly.
For a simple test I used this URL:
http://echo.jsontest.com/name/James/age/25
I tried the "JSON Input" step, selected the checkbox marking the source field as a URL, and tried the URL above (with and without the http://). In the fields parameter I used the same ones that worked well with a file input (instead of a URL).
So the JSONPaths are $.name and $.age
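For reference, the echo service builds its response from the URL path, so the test URL should return something shaped like the object below (the exact value types are an assumption). Against that document the two JSONPaths select the two fields:

```javascript
// Assumed shape of the response from the echo URL above:
const doc = { name: 'James', age: '25' };

// $.name and $.age correspond to these lookups:
const nameVal = doc.name; // 'James'
const ageVal = doc.age;   // '25'
```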
I also tried starting from the sample data "Json input - read incoming stream" and switching it to a URL, as described in this post:
http://forums.pentaho.com/showthread.php?135741-Issues-with-JSON-Input-step
I am working on a remote server running Debian (Jessie).
I use Pentaho 6.1.
I feel like I've tried everything but I might be missing something obvious.
I apologize for my poor level of English. If more information is needed, just ask. Thank you.
I started it all over again. It works now, though I have no idea why.
I'm using SJCL, and it works fine with small ASCII strings.
But when I try to decode this piece of JSON (the result of the encryption of an HTML page) I get a "this is not JSON!" error.
The JSON was produced by SJCL, and while I did encode and decode it using LZW and base64, I don't get this error for small strings with the same workflow.
I tracked the error message back to the decode function. I assume the regexes are failing, but I don't understand why, as this seems to be a perfectly formed JSON string to me.
However, I could be wrong: if I run a JavaScript eval on it, it fails with a syntax error, yet if I dump it to a file, Python parses it fine.
The JSON at your "this piece of JSON" link starts and ends with a double-quote character. Is that quote actually part of the contents of the JSON? If it is, I believe that is your problem. Otherwise, it looks like valid JSON to me.
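A short illustration of this answer's point (the field names are only an example of SJCL's object-shaped ciphertext): if stray outer quotes are literally part of the payload, the first unescaped inner quote terminates the string early and parsing fails; without them, the payload is a plain object and parses fine.

```javascript
// Returns whether a string is valid JSON.
function parses(text) {
  try { JSON.parse(text); return true; } catch (e) { return false; }
}

parses('{"iv":"abc"}');   // a plain object: valid
parses('"{"iv":"abc"}"'); // stray outer quotes around it: invalid
```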
OK, I was doing a double pass of base64 encoding: one before encryption and one after. Removing the first pass makes it work.