Newman CLI treating strings in CSV differently from Postman (parsing failure)

Newman Version (can be found via newman -v): 4.2.2
OS details (type, version, and architecture): Windows 10 Pro, Version 1803. Running all files locally, but hitting an internal API.
Are you using Newman as a library, or via the CLI? CLI
Did you encounter this recently, or has this bug always been there: This is a new collection
Expected behaviour: I need to use a CSV file to import data into the request body of POST requests. All values MUST be strings. My CSV works correctly in Postman, but fails in Newman with the error: Invalid closing quote at line 2; found """ instead of delimiter ","
Command / script used to run Newman: newman run allPatients.postman_collection.json -e New_QA.postman_environment.json -d 2.csv
Sample collection, and auxiliary files (minus the sensitive details):
In Postman, when I run the requests, all values are strings and must be surrounded by double quotes. I use a CSV file that looks like this:
"bin","pcn","group_id","member_id","last_name","first_name","dob","sex","pharmacy_npi","prescriber_npi"
"""012353""","""01920000""","""TESTD 273444""","""Z9699879901""","""Covg""","""MC""","""19500101""","""2""","""1427091255""","""1134165194"""
When I run the same CSV data file in Newman, I get the error above. I have tried a few options I've seen on this forum without any luck, such as using escape syntax for double quotes:
"/"text/""
The only things I've tried that do not fail pre-run with an error like the above are removing the double quotes entirely or replacing them with single quotes. When I do this, I get 400 Bad Request, which I suspect is due to me sending invalid data types.

Please close this issue. It was the result of human error.
I was able to fix this by correctly using the syntax suggested elsewhere.
"bin","pcn","group_id","member_id","last_name","first_name","dob","sex","pharmacy_npi","prescriber_npi"
"\"012353\"","\"01920000\"","\"TESTD 273444\"","\"Z9699879901\"","\"Covg\"","\"MC\"","\"19500101\"","\"2\"","
\"1427091255\"","\"1134165194\""

Related

SyntaxError: Unexpected token � in JSON at position 0

I'm getting the above error while trying to load a simple JSON file:
{
"$schema": "http://json-schema.org/draft-04/schema#"
}
What I'm doing is trying to use act with this GitHub Action. However, even when loading the JSON with a dummy project, it still gives me that error.
As you can see, the JSON is clearly valid; I'm not sure why it complains about that weird character at the beginning of the JSON. I double-checked with this JSON validator and it says it's valid.
This was under Windows. Redirecting the output on Windows introduced some stray characters at the beginning of the file:
generate-schema -j .\config.json > .\config.schema.json.
I reran the command on a Linux machine.

Terminal reports an issue with a Perl script

I started facing an issue with one of our Perl scripts, which was working fine until last month. Can someone help me with this?
malformed JSON string, neither array, object, number, string or atom, at character offset 0
(before "<!doctype html><html...") at ad_lib.pm line 985.
and below is the line in question:
my $response = from_json(qx{$BASE_HASH{CURL} -X GET -H "Content-Type: application/json" -H "Authorization:Basic $encoded" "https://localhost:9090/nwrestapi/v2/global/protectiongroups/" -k -1 2>/dev/null});
$encoded is built as follows:
my %BASE_HASH = ();
read_config(\%BASE_HASH);
my $encoded = MIME::Base64::encode($BASE_HASH{NW_USER} . ":" . $BASE_HASH{NW_PW});
It looks like the response you're getting from your HTTP request used to be a JSON string and is now an HTML document.
from_json is trying to decode it as a JSON string but failing, because what it finds is neither an array, object, number, string, nor atom: it's HTML.
So the problem is more likely with the external service you're talking to than with the Perl script you're using, which would also explain why it started failing all of a sudden.
It seems the newer version of the application ships a slightly newer SSL configuration and is unable to decode the username/password. Instead of $encoded, I passed the chunked/decoded value to the file, and the scripts are working fine.
The only difference I found between the old and new versions when executing the script is the SSL cipher suite:
on system with old app version: TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
On system with new app version: TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
I still need to figure out what's wrong with $encoded and how to pass the values on, maybe quoting, etc.
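One thing worth checking with $encoded: Perl's MIME::Base64::encode appends a trailing newline unless you pass an empty string as the second (line-ending) argument, i.e. MIME::Base64::encode($string, ""), and a stray newline inside an Authorization header can upset stricter servers. That may or may not be what changed here. For comparison, the equivalent encoding with no trailing newline, sketched in Python:

```python
import base64

def basic_auth_token(user: str, password: str) -> str:
    """Return the value for an 'Authorization: Basic ...' header (no trailing newline)."""
    return base64.b64encode(f"{user}:{password}".encode()).decode("ascii")

print("Authorization: Basic " + basic_auth_token("user", "pass"))
```

If the Perl value and this value differ by a trailing "\n", that newline is the likely culprit rather than the SSL version.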

How can I display an XML page instead of JSON for a dataset?

I am using the pycsw extension to produce a CSW service. I have harvested data from one CKAN instance [1] into another [2], and am now looking to run the pycsw 'paster load' command:
paster ckan-pycsw load -p /etc/ckan/default/pycsw.cfg -u [CKAN INSTANCE]
I get the error:
Could not pass xml doc from [ID], Error: Start tag expected, '<' not found, line 1, column 1
I think it is because when I visit this url:
[CKAN INSTANCE 2]/harvest/object/[ID]
it comes up with a JSON document as opposed to the XML it is expecting.
I have run the pycsw load command on other CKAN instances and have had no problems with them; they also display an XML file at the URL stated above. So I wanted to know: how do I get CKAN to serve an XML file instead of JSON?
Thanks in advance for any help!
As you've worked out, your datasets need to be in ISO(XML) format to load into a CSW server. A CKAN instance only has a copy of a dataset in ISO(XML) format if it harvested it from a CSW.
If you use the CKAN(-to-CKAN) harvester in the chain then the ISO(XML) record doesn't get transferred with it. So you'd either need to add this functionality to the CKAN(-to-CKAN) harvester, or get rid of the CKAN-to-CKAN harvest step.
Alternatively if the record originated in a CKAN, then it has no ISO(XML) version anyway, and you'd need to create that somehow.
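For what it's worth, the pycsw error ("Start tag expected, '<' not found") simply means the first non-whitespace character of the fetched document was not `<`. A quick way to check what a harvest-object URL is actually serving before running the load, sketched in Python (fetch the body however you like, then classify it; the function name is mine):

```python
import json

def classify_doc(text: str) -> str:
    """Rough guess at whether a fetched document body is XML, JSON, or something else."""
    head = text.lstrip()
    if head.startswith("<"):
        return "xml"        # the start tag pycsw's parser is looking for
    try:
        json.loads(head)
        return "json"       # what a CKAN-to-CKAN harvest object typically serves
    except ValueError:
        return "unknown"

print(classify_doc('{"id": 1}'))            # json
print(classify_doc("<gmd:MD_Metadata/>"))   # xml
```

Running this over each harvest object makes it easy to spot which records lack an ISO(XML) representation.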

HTTP request error from running Postman Collection tests with Newman?

I've been using Newman, the new command line for Postman, attempting to run collection tests that work fine when I run them through the packaged app's Jetpacks add-on but do not run properly from the command line. Although the JSON collection file I'm passing does contain the proper header declarations, I don't have any other clues at this point, so I suspect this may be an HTTP header issue. But I'm not sure exactly what is wrong, as I am rather new to using Postman.
The tests I'm trying to run are on some calls to an ASP.NET web API: very simple one-line JavaScript tests that check the server response, like the ones in this tutorial.
A sample line that I enter into the console:
$ newman -c collectionfile.json -e environmentfile.json -n 5
produces this result:
RequestError: [token] terminated. Error: undefined
Any suggestions/help would be appreciated.
I ran into this problem as well and spent quite a few hours trying to figure it out. Eventually I realized that an address such as "www.google.com" will work in the chrome plugin UI, but not in Newman. For it to work in Newman, the address must be "https://www.google.com". Also, make sure that you have your data file (if you are using variables like {{url}}) set up correctly. You can use "-d dataFile.json" to define a data file. More information on that here.

During json decode "Invalid \uXXXX\uXXXX surrogate pair" appears on GAE server, but not on localhost

In my app, I'm running the following command locally on Python 2.6:
json.loads('"xxxx \ud83d xxxx"')
And it parses the string no problem.
But once I upload the code to the GAE server, the following error appears:
"Invalid \uXXXX\uXXXX surrogate pair"
Any suggestions? Could it be because I'm running Python 2.6 locally, while GAE runs on Python 2.5?
The string that causes the problem comes from the API of a well-known site, so it's 100% valid. How do I force GAE to parse it correctly?
IMO the JSON implementations on your computer and on GAE differ. Here is a simplejson implementation which shows the conditions under which you get the error.
Update: It looks like you have to prefix your string with u like u"xxxx \ud83d xxxx"
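The difference likely comes down to how each json/simplejson version treats lone surrogates: as a JSON escape, \ud83d is a high surrogate with no low-surrogate partner, which some decoders accept and others reject. In a Python 2 byte string the \u is not interpreted, so the decoder sees the six-character escape; prefixing with u makes Python itself produce the (unpaired) surrogate character before JSON ever sees it. A sketch of the escaped form on a decoder that tolerates lone surrogates (today's CPython 3 stdlib json does):

```python
import json

# Doubling the backslash hands the \ud83d escape to the JSON decoder, not to Python
s = json.loads('"xxxx \\ud83d xxxx"')
print(len(s))  # prints 11: the lone surrogate decodes to a single code unit
```

A stricter decoder would raise on the same input, which matches the behavior difference seen between localhost and GAE.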