I am trying to pass a JSON string in the environment:
- name: Start {{service_name}}
  shell: "<<starting springboot jar>> --server.port={{service_port}}"
  environment:
    - SPRING_APPLICATION_JSON: '{"test-host.1":"{{test_host_1}}","test-host.2":"{{test_host_2}}"}'
Here test_host_1 is 172.31.00.00 and test_host_2 is 172.31.00.00.
But in spring logs, I get JSON parse exception where it prints
Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character (''' (code 39)): was expecting double-quote to start field name
at [Source: {'test-host.1': '172.31.00.00', 'test-host.2': '172.31.00.00'}; line: 1, column: 3]
As seen, the double quotes are converted to single quotes!
I tried escaping the double quotes, but with no luck.
Any idea why this happens, or any workaround?
This is a quirk of the Ansible template engine: if a string looks like an object (starts with { or [), Ansible converts it into an object. See the Ansible source code.
To prevent this, you can use one of the STRING_TYPE_FILTERS:
- SPRING_APPLICATION_JSON: "{{ {'test-host.1':test_host_1,'test-host.2':test_host_2} | to_json }}"
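Putting that together with the task from the question, the whole thing could look like this sketch (the jar invocation is the question's placeholder):
- name: Start {{service_name}}
  shell: "<<starting springboot jar>> --server.port={{service_port}}"
  environment:
    # to_json returns a string, so Ansible's object coercion never kicks in
    - SPRING_APPLICATION_JSON: "{{ {'test-host.1': test_host_1, 'test-host.2': test_host_2} | to_json }}"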
P.S. This is why the space-character hack from techraf's answer works: the startswith("{") check fails, so Ansible doesn't convert the string to an object.
Quick hack: add a space to the variable definition (after the first single quote). A single leading space doesn't influence the actual variable value (the space will be ignored):
- name: Start {{service_name}}
  shell: "<<starting springboot jar>> --server.port={{service_port}}"
  environment:
    - SPRING_APPLICATION_JSON: ' {"test-host.1":"{{test_host_1}}","test-host.2":"{{test_host_2}}"}'
With the space, Ansible passes the following to the shell (test1 and test2 are values I set):
SPRING_APPLICATION_JSON='"'"' {"test-host.1":"test1","test-host.2":"test2"}'"'"'
Without the space:
SPRING_APPLICATION_JSON='"'"'{'"'"'"'"'"'"'"'"'test-host.2'"'"'"'"'"'"'"'"': '"'"'"'"'"'"'"'"'test2'"'"'"'"'"'"'"'"', '"'"'"'"'"'"'"'"'test-host.1'"'"'"'"'"'"'"'"': '"'"'"'"'"'"'"'"'test1'"'"'"'"'"'"'"'"'}'"'"'
The key order is reversed too. It seems that without the space Ansible interprets the value as JSON, and with the space it treats it as a string.
I don't really understand why this happens...
I am trying to read JSON from a text file using the command below:
{__FileToString((${JSON_FILE},,)).replaceAll(' ','')}
I get a "File not readable" message, and the request fails with:
Error: {"timestamp":1586945558777,"status":400,"error":"Bad Request","exception":"org.springframework.http.converter.HttpMessageNotReadableException","message":"Could not read document: Unexpected character ('' (code 95)): was expecting double-quote to start field name\n at [Source: java.io.PushbackInputStream#6df97f39; line: 1, column: 3]; nested exception is com.fasterxml.jackson.core.JsonParseException: Unexpected character ('' (code 95)): was expecting double-quote to start field name\n at [Source: java.io.PushbackInputStream#6df97f39; line: 1, column: 3]","path":"/service"}
I have gone through all the related posts but still haven't been able to find a solution. Can anyone help?
References:
JMeter - How to read JSON file?
https://devqa.io/perf/jmeter-send-json-file-as-request-in-body
https://www.360logica.com/blog/how-to-use-http-request-to-send-multiple-json-files
Thanks,
Mrinalini
The correct syntax would be:
${__strReplace(${__FileToString(${JSON_FILE})}, ,,)}
You cannot append normal Java functions like String.replaceAll() to JMeter functions. If you want to get rid of whitespace characters, you need to invoke the __strReplace() function as shown above. (This is a custom JMeter function; it can be installed using the JMeter Plugins Manager.)
If you cannot use JMeter Plugins for some reason you can achieve the same using __groovy() function like:
${__groovy(new File(vars.get('JSON_FILE')).text.replaceAll(' '\, ''),)}
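One caveat (my note, not part of the original answer): replaceAll(' ', '') also strips spaces inside JSON string values. If the file may contain such values, re-serializing the JSON minifies it without touching the values, along these lines:
${__groovy(groovy.json.JsonOutput.toJson(new groovy.json.JsonSlurper().parse(new File(vars.get('JSON_FILE')))),)}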
I'm getting an error sending a JSON structure using the aws-cli in PowerShell, specifically on a call to put an item into an existing DynamoDB table.
The problem seems to be the lack of double quotes around keys and values in the JSON object I'm attempting to send. I've read that PowerShell is finicky about passing double quotes through to external programs.
Unfortunately, since my org uses Okta for authenticating AWS requests, I have to use PowerShell.
I've tried everything that I've seen here:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertto-json?view=powershell-6
...here:
Error parsing parameter '--expression-attribute-values': Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)
...here:
https://github.com/aws/aws-cli/issues/1326
...and here:
PowerShell: best way to escape double quotes in string passed to an external program? E.g., a JSON string
WHAT I'VE TRIED:
This is the basic first attempt:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item '{"deviceId": {"S":"amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4"},"roomNumber": {"N":9110}}' --return-consumed-capacity TOTAL
Then I tried escaping with backslash:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item {\"deviceId\":{\"S\":\"amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4\"},\"roomNumber\": {\"N\":9110}} --return-consumed-capacity TOTAL
Then escaping with backtick (which I've replaced here with an asterisk so SO would read it as code) and backslash:
{*"deviceId*": {*"S*":*"amzn1.ask.device.AEH2LHYGV7GSPP
5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC
7CSQK634K4APEBRNVOKVZIDECOCBBIFB4*"},*"roomNumber*": {*"N*":9110}}" --return- consumed-capacity TOTAL
I then tried a "here string" to no avail.
EXPECTATIONS and RESULTS:
I would expect a method of escaping that's documented by Microsoft to work.
Each of the attempts above gave the following error, with a variation of the problematic "JSON received" depending on the escape method, but the JSON never had double quotes around keys and values:
Error parsing parameter '--item': Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
JSON received: {deviceId: {S:amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4},roomNumber: {N:9110}}
The only thing that seemed to work was using file://file.json as the input to --item, which I couldn't find documented anywhere (I think it was in the GitHub thread I linked). However, I'd rather not have to edit a file every time I want to send JSON with an AWS API call. Here it is:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item file://file.json --return-consumed-capacity TOTAL
Can anyone provide info other than what's listed here as to why the above methods wouldn't work? Have I just implemented them incorrectly?
Thanks.
I was having the same issue while trying to send a message to Amazon SQS with a JSON body using PowerShell. After trying different escape characters, the following worked for me.
aws sqs send-message --queue-url "<queue-url>" --message-body '{\""key1\"": \""value1\"",\""key2\"": \""value2\"",\""key3\"": \""value3\"" }'
OS: Windows 10 Pro (Version 1803)
AWS CLI version: 1.16.180
For further information, see the official documentation: Using quotation marks with strings in the AWS CLI.
You could try the PowerShell "stop-parsing symbol" (--%) right after the command name. It tells PowerShell to pass the rest of the parameters through verbatim.
PS> okta-aws --% dh dynamodb put-item --table-name AlexaRoomLookup-dev --item '{"deviceId": {"S":"amzn1.ask.device.AEH...etc...FB4"},"roomNumber": {"N":9110}}' --return-consumed-capacity TOTAL
See about_parsing for more details...
It won't help if your json is in a variable, but if it's hard-coded like your example above it might work.
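Building on the asker's observation that file://file.json works, here is a minimal sketch that generates the file from a PowerShell hashtable instead of hand-editing it (assuming the okta-aws wrapper forwards arguments to the AWS CLI unchanged):
# Build the DynamoDB item as a hashtable and let ConvertTo-Json produce the quoting.
$item = @{
    deviceId   = @{ S = 'amzn1.ask.device.AEH...etc...FB4' }  # device ID elided as in the question
    roomNumber = @{ N = '9110' }  # DynamoDB expects the N value as a JSON string
}
$tmp = New-TemporaryFile
$item | ConvertTo-Json -Depth 5 | Set-Content -Path $tmp.FullName -Encoding ascii
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item "file://$($tmp.FullName)" --return-consumed-capacity TOTAL
Remove-Item $tmp.FullName
This sidesteps PowerShell's quote mangling entirely, since the JSON never passes through the command line.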
I have the following two lines of code:
json_str = _cases.to_json
path += " #{USER} #{PASS} #{json_str}"
When I use the debugger, I noticed that json_str appears to be formatted as JSON:
"[["FMCE","Wiltone","Wiltone","04/10/2018","Marriage + - DOM"]]"
However, when I interpolate it into another string, the quotes are removed:
"node superuser 123456 [["FMCE","Wiltone","Wiltone","04/10/2018","Marriage + - DOM"]]"
Why does string interpolation remove the quotes from the JSON string, and how can I resolve this?
I did find one solution to the problem, which was manually escaping the string:
json_str = _cases.to_json.gsub('"','\"')
path += " #{USER} #{PASS} \"#{json_str}\""
So basically I escape the double quotes generated by the to_json call, then manually add two escaped quotes around the interpolated variable. This produces the desired result:
node superuser 123456 "[[\"FMCE\",\"Wiltone\",\"Wiltone\",\"04/10/2018\",\"Marriage + - DOM\"]]"
Notice how the outer quotes around the collection are not escaped, but the strings inside the collection are escaped. That will enable JavaScript to parse it with JSON.parse.
It is important to note that in this part:
json_str = _cases.to_json.gsub('"','\"')
it is adding a LITERAL backslash. Not an escape sequence.
But in this part:
path += " #{USER} #{PASS} \"#{json_str}\""
The \" wrapping the interpolated variable is an escape sequence and NOT a literal backslash.
Why do you think the first and last quote marks are part of the string? They do not belong to the JSON format. Your program’s behavior looks correct to me.
(Or more precisely, your program seems to be doing exactly what you told it to. Whether your instructions are any good is a question I can’t answer without more context.)
It's hard to tell with the small sample, but it looks like you might be getting the quotes from your debugger output. Assuming the output of .to_json is a string (it usually is), "#{json_str}" should be exactly equal to json_str. If it isn't, that's a bug in Ruby somehow (doubtful).
If you need the quotes, you need to either add them manually or escape the string using whatever escape function is appropriate for your use case. You could even use .to_json as your escape function ("#{json_str.to_json}", for example).
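If path is ultimately run through a shell (which the node superuser 123456 prefix suggests), a sketch of a less error-prone route than hand-placed escapes is Shellwords, or skipping the shell entirely:
require 'json'
require 'shellwords'

cases = [["FMCE", "Wiltone", "Wiltone", "04/10/2018", "Marriage + - DOM"]]
json_str = cases.to_json

# Shellwords adds whatever quoting the shell needs around the JSON.
system("node superuser 123456 #{Shellwords.escape(json_str)}")

# Or bypass the shell: each argv element reaches node verbatim, no escaping at all.
system("node", "superuser", "123456", json_str)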
I am trying to submit a Spark job to Spark Job Server with input in JSON format. However, in my case one of the values contains a '!' character, which prevents it from being parsed. Here are my input and the response.
Input
curl -d "{"test.input1"="abc", "test.input2"="def!"}" 'http://localhost:8090/jobs?appName=my_spark_job&classPath=com.example.spark.job.MySparkJob'
Response
"result": "Cannot parse config: String: 1: Reserved character '!' is not allowed outside quotes (if you intended '!' (Reserved character '!' is not allowed outside quotes) to be part of the value for 'test.input2', try enclosing the value in double quotes, or you may be able to rename the file .properties rather than .conf)"
The value of "test.input2" is already in double quotes. I tried adding single/double quotes but it still didn't work. Any thoughts on how I can get it to parse?
Thanks
Put everything in a JSON file and do the following:
curl --data-binary @/path/to/json/file 'http://localhost:8090/jobs?appName=my_spark_job&classPath=com.example.spark.job.MySparkJob'
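The inline form from the question can also work if the whole body is wrapped in single quotes, so the shell passes the inner double quotes through untouched (a sketch, assuming a POSIX shell):
curl -d '{"test.input1":"abc", "test.input2":"def!"}' 'http://localhost:8090/jobs?appName=my_spark_job&classPath=com.example.spark.job.MySparkJob'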
I'm having trouble properly escaping a string I'm trying to use to represent JSON in Erlang. I'm not sure why this particular sequence is giving the parser trouble. I have this string in a Basho Bench configuration file.
'{
"stats":"completed",
"times":[
{
"time":"2014-10-29T23:40:46.558Z"
}
]
}'
I am getting this error:
23:37:18.521 [error] Failed to parse config file server/http.config.erl: {29,erl_scan,{illegal,atom}}
It seems like the issue might be the numbers in the string, but I don't understand how I would escape them. Any thoughts?
You provided insufficient info, but in any case server/http.config.erl is not JSON. It is an Erlang term, so this error comes from the Erlang parser. The whole text you provided is parsed as an atom because of the ', which is the delimiter for atoms.
The string is not a string. Single quotes denote an atom. It must be wrapped in double quotes to be interpreted as a string.
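A sketch of the corrected config value: keep the double quotes as the string delimiters and backslash-escape the quotes that belong to the JSON itself:
"{
    \"stats\":\"completed\",
    \"times\":[
        {
            \"time\":\"2014-10-29T23:40:46.558Z\"
        }
    ]
}"
This now parses as an Erlang string (a list of characters) rather than an atom.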