Parsing JSON in Delphi with slashes

I receive JSON strings from a WebService with slashes in it.
When I parse them, Delphi adds backslashes.
V := TJSONObject.ParseJSONValue( '{"Kode":"ABC/123" }') ;
V.ToString returns : {"Kode":"ABC\\/123"}
What is best practice to parse JSON strings in Delphi?
And why does the function V.Value always return an empty string?
In the unit System.JSON, I see the following code:
function TJSONAncestor.Value: string;
begin
  Result := '';
end;
Do I need to add code myself to System units that ship with Delphi?

\ is required to be escaped as \\, so your claim that "Just removing the backslash in the result does not work, because the value could also be "ABC\123"" is wrong, as it would have to be "ABC\\123" instead.
The only other characters required to be escaped are " (as \") and the control characters below U+0020. / is certainly not required to be escaped, but it MAY be escaped at the discretion of the implementation. It is not invalid to do so. As to WHY ToString() does that, you would have to ask Embarcadero.
If you needed to, you could easily do a search-and-replace of \/ into / without affecting other escape sequences, like \\ and \". Or, try using ToJSON() instead of ToString().
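This optional-escape behavior is easy to confirm with any JSON library, not just Delphi's. As a language-neutral illustration (Go is used here only because it is easy to run standalone; it is not part of the original question), both the escaped and the unescaped form decode to the same value:

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    var a, b map[string]string
    // "\/" is an optional escape; both documents describe the same value.
    // (Errors are ignored for brevity.)
    json.Unmarshal([]byte(`{"Kode":"ABC\/123"}`), &a)
    json.Unmarshal([]byte(`{"Kode":"ABC/123"}`), &b)
    fmt.Println(a["Kode"] == b["Kode"], a["Kode"]) // true ABC/123
}

So the \/ produced by ToString() is cosmetic; any conforming parser gives you back a plain / when it reads the value.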
Regarding TJSONAncestor.Value(), TJSONAncestor is a base class, and its Value() is a virtual method that returns a blank string by default. Descendant classes (TJSONString, TJSONNumber, etc.) can override Value() to return more meaningful strings. But in your example, your input string represents a JSON object, so V will be pointing at a TJSONObject instance, and TJSONObject does not override Value(), because it does not make sense for an object to be represented by a string. This is even documented behavior:
http://docwiki.embarcadero.com/Libraries/en/System.JSON.TJSONAncestor.Value
Returns the string representation of a simple JSON element like a string, number or boolean.
For structured JSON elements like an object and array returns empty string.

Related

Calling an API returns expected JSONArray, found JSONObject

I'm calling an API from Go and trying to push json string data from another api call into it.
I can hand craft the calls using a payload like
payload := strings.NewReader(`[{"value1":333, "value2":444}]`)
and everything is happy.
I'm now trying to convert this to take the JSON string {"value1":333, "value2":444} as an input parameter of type string to a function, but when I try to use that as the payload, the API responds with
expected type: JSONArray, found: JSONObject
I naively tried setting the input to the function as []string and appending the data to an array as the input, but then strings.NewReader complained that it was being fed an array... which it was.
I'm at a loss to work out how to convert a string of json into a json array that the api will be happy with.
I tried just surrounding the string with [] but the compiler threw a fit about incorrect line termination.
I must have been doing something wrong with the string; surrounding the {} with [] let the function pass the data, but there must be a better way than this.
Any ideas, or am I making this harder than it should be?
You were on the right track with the brackets, but you actually need to append the characters to the string. For example:
str := `{"value1":333, "value2":444}`
str = "[" + str + "]"
// [{"value1":333, "value2":444}]
https://play.golang.org/p/rWHCLDCAngd
If you use brackets outside a string or rune literal, then it is parsed as Go language syntax.
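If you would rather avoid plain string surgery, a slightly more defensive variant (a sketch; the wrapInArray helper is made up for illustration) is to wrap the already-serialized object in a one-element []json.RawMessage and re-marshal it, which also rejects malformed input instead of blindly concatenating brackets:

package main

import (
    "encoding/json"
    "fmt"
)

// wrapInArray is a hypothetical helper: it puts one already-serialized JSON
// object into a one-element JSON array, reusing encoding/json so malformed
// input produces an error instead of a broken payload.
func wrapInArray(obj string) (string, error) {
    arr, err := json.Marshal([]json.RawMessage{json.RawMessage(obj)})
    if err != nil {
        return "", err
    }
    return string(arr), nil
}

func main() {
    payload, err := wrapInArray(`{"value1":333, "value2":444}`)
    if err != nil {
        panic(err)
    }
    fmt.Println(payload) // [{"value1":333,"value2":444}] (insignificant whitespace is normalized)
}

The result is the same one-element array; the only visible difference is that the encoder normalizes whitespace.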

Escape string in a JSON object

We use Freemarker to transform one JSON to another. The input JSON is something like this:
{"k1": "a", "k2":"line1. \n line2"}
After processing with the FreeMarker template, the JSON is converted to:
{ \n\n "p1": "a", \n\n "p2": "line1. \n line2"}
Here is the logic we use to do the transformation
final Map<String, Object> input = JsonConverter.convertFromJson(input, Map.class);
final Template template = freeMarkerConfiguration.getTemplate("Template1.ftl");
final Writer out = new StringWriter();
template.process(input, out);
out.flush();
final String newlineFilteredResult = new JSONObject(out.toString()).toString();
The conversion to JSON object fails due to a newline character inside a string for key k2 and gives the following exception:
Caused by: org.json.JSONException: Unterminated string at ...
I tried using the following but nothing works:
1. JSONObject.quote
2. JSONValue.escape
3. out.toString().replaceAll("[\n\r]+", "\\n");
I get the following exception due to the newline characters at the beginning as well:
Caused by: org.json.JSONException: Missing value at 1 [character 2 line 1]
Could someone please point me in the correct direction.
Edit
After further clarification from the OP: he had "${key}": "${value}" in his FreeMarker template, and ${value} could contain line breaks. The solution in this case is to use ${value?json_string}.
Starting from FreeMarker 2.3.32 you can write "${key}": ${value?c} instead of "${key}": "${value}", because if the left-side of ?c is a string, now instead of failing, it quotes and escapes the string. Thus you don't even have to know if the left-side is a number/boolean, which must not be quoted (and ?c won't quote them), or a string, which must be quoted, as it's automatic.
Also, if the left-side value is known to be missing/null sometimes, then ?cn will handle that case by printing a null literal.
Also, check out the c_format setting for best results, but by default string formatting is JSON compatible, so using ?c will be an improvement even without setting that.
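For what it's worth, the underlying constraint is not FreeMarker- or org.json-specific: the JSON grammar forbids raw control characters such as newlines inside string values, so the value has to be emitted as \n. A quick standalone check (sketched in Go purely for illustration; it is not part of the original Java/FreeMarker setup):

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // A raw (unescaped) newline inside a JSON string is rejected by any
    // conforming parser -- this is the "Unterminated string" situation.
    bad := []byte("{\"k2\": \"line1.\nline2\"}")
    fmt.Println(json.Valid(bad)) // false

    // Marshalling the value escapes the newline as \n, which is what
    // ?json_string (or ?c) does for you on the FreeMarker side.
    escaped, _ := json.Marshal("line1.\nline2")
    fmt.Println(string(escaped)) // "line1.\nline2"
}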

AWS Lambda error: 'Could not parse request body into json' when URL parameter contains JSON array

I am trying to invoke my Lambda function by passing parameters as below. It contains an apostrophe (').
https://invoke_url?param=[["kurlo jack's book","Adventure Books",8.8,1]]
Stringified, it becomes `https://invoke_url?param=%5B%5B%229780786706211%22s....`
I used the mapping below to pass parameter to lambda
"query": {
#foreach($queryParam in $input.params().querystring.keySet())
"$queryParam": "$util.escapeJavaScript($input.params().querystring.get($queryParam))" #if($foreach.hasNext),#end
#end
}
I got the following error:
{"message": "Could not parse request body into json: Unrecognized character escape \'\'\' (code 39)\n at [Source: [B#5b70c341; line: 29, column: 65]"}
I have also tried removing the double quotes from the mapping template, but that didn't work.
Be sure to add .replaceAll("\\'","'") to your request body passthrough template after .escapeJavaScript(data)
I found this bit from AWS's documentation to be very helpful for this issue:
$util.escapeJavaScript()
Escapes the characters in a string using JavaScript string rules.
Note: This function will turn any regular single quotes (') into escaped ones (\'). However, the escaped single quotes are not valid in JSON. Thus, when the output from this function is used in a JSON property, you must turn any escaped single quotes (\') back to regular single quotes ('). This is shown in the following example:
$util.escapeJavaScript(data).replaceAll("\\'","'")
I don't have a solution but I have narrowed the root cause. Lambda does not seem to like single quotes to be escaped with a single slash.
If you hardcode your mapping template to look like this:
{
  "query-fixed": {
    "param": "[[\"kurlo jack\\'s book\",\"Adventure Books\",8.8,1]]"
  }
}
my test Lambda invocation succeeds. However, if you hardcode the template to this:
{
  "query-fixed": {
    "param": "[[\"kurlo jack\'s book\",\"Adventure Books\",8.8,1]]"
  }
}
I get the same error message that you got above. Unfortunately, the second variation is what API Gateway produces for the Lambda invocation.
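The reason the second variation fails is that \' is simply not a legal JSON escape sequence (only \", \\, \/, \b, \f, \n, \r, \t and \uXXXX are), so any strict parser rejects it, while a plain, unescaped apostrophe is fine. A quick check outside of Lambda (Go is used here only as a convenient stand-in for whatever parses the event body):

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // \' is not a valid JSON escape, so this body is rejected...
    fmt.Println(json.Valid([]byte(`{"param":"kurlo jack\'s book"}`))) // false
    // ...while a plain, unescaped apostrophe is perfectly valid JSON.
    fmt.Println(json.Valid([]byte(`{"param":"kurlo jack's book"}`))) // true
}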
A workaround might involve using the template to replace single quotes escaped with a single backslash (\') with the double-backslash form (\\'). See Replace a Substring of a String in Velocity Template Language
I'll follow up with Lambda internally and update if I hear anything or have a functional workaround.
Try changing your encoding of ' to %27, as defined on this W3Schools page (ironically, their example does not encode the single quote either; I guess that's because it belongs to the "supported" ASCII set of characters).
The "query string" (the part in the hyperlink after ?) must be a string. Whatever you have constructing that must be appended to it like: https://invoke_url?a=x&b=y
In your Lambda code put:
if( event.hasOwnProperty( 'params' ) )
    if( event.params.hasOwnProperty( 'querystring' ) )
        params = event.params.querystring;
(obviously some extraneous checks, probably unnecessary but ehh)
In your API Gateway go to:
APIs -> api_name -> Resources -> invoke_url -> GET -> Method Execution
Under URL Query String Parameters "Add query string" a and b (or whatever)
When you hit www.com/invoke_url?a=x&b=y you can now access them with:
...
params = event.params.querystring;
console.log( params.a, params.b );
...

Go json.Unmarshal key with \u0000 \x00

Here is the Go playground link.
Basically there are some special characters ('\u0000') in my JSON string key:
var j = []byte(`{"Page":1,"Fruits":["5","6"],"\u0000*\u0000_errorMessages":{"x":"123"},"*_successMessages":{"ok":"hi"}}`)
I want to Unmarshal it into a struct:
type Response1 struct {
    Page   int
    Fruits []string
    Msg    interface{} `json:"*_errorMessages"`
    Msg1   interface{} `json:"\\u0000*\\u0000_errorMessages"`
    Msg2   interface{} `json:"\u0000*\u0000_errorMessages"`
    Msg3   interface{} `json:"\0*\0_errorMessages"`
    Msg4   interface{} `json:"\\0*\\0_errorMessages"`
    Msg5   interface{} `json:"\x00*\x00_errorMessages"`
    Msg6   interface{} `json:"\\x00*\\x00_errorMessages"`
    SMsg   interface{} `json:"*_successMessages"`
}
I tried a lot but it's not working.
This link might help golang.org/src/encoding/json/encode_test.go.
Short answer: With the current json implementation it is not possible using only struct tags.
Note: it's an implementation restriction of the json package, not a restriction of the struct tag specification.
Some background: you specified your tags with a raw string literal:
The value of a raw string literal is the string composed of the uninterpreted (implicitly UTF-8-encoded) characters between the quotes...
So no unescaping or unquoting happens in the content of the raw string literal by the compiler.
The convention for struct tag values quoted from reflect.StructTag:
By convention, tag strings are a concatenation of optionally space-separated key:"value" pairs. Each key is a non-empty string consisting of non-control characters other than space (U+0020 ' '), quote (U+0022 '"'), and colon (U+003A ':'). Each value is quoted using U+0022 '"' characters and Go string literal syntax.
What this means is that by convention tag values are a list of (key:"value") pairs separated by spaces. There are quite a few restrictions for keys, but values may be anything, and values (should) use "Go string literal syntax", this means that these values will be unquoted at runtime from code (by a call to strconv.Unquote(), called from StructTag.Get(), in source file reflect/type.go, currently line #809).
So there is no need for double quoting. Here is a simplified version of your example:
type Response1 struct {
    Page   int
    Fruits []string
    Msg    interface{} `json:"\u0000_abc"`
}
Now the following code:
t := reflect.TypeOf(Response1{})
fmt.Printf("%#v\n", t.Field(2).Tag)
fmt.Printf("%#v\n", t.Field(2).Tag.Get("json"))
Prints:
"json:\"\\u0000_abc\""
"\x00_abc"
As you can see, the value part for the json key is "\x00_abc" so it properly contains the zero character.
But how will the json package use this?
The json package uses the value returned by StructTag.Get() (from the reflect package), exactly what we did. You can see it in the json/encode.go source file, typeFields() function, currently line #1032. So far so good.
Then it calls the unexported json.parseTag() function, in json/tags.go source file, currently line #17. This cuts the part after the comma (which becomes the "tag options").
And finally the json.isValidTag() function is called with the previous value, in source file json/encode.go, currently line #731. This function checks the runes of the passed string, and (besides a set of pre-defined allowed characters "!#$%&()*+-./:<=>?@[]^_{|}~ ") rejects everything that is not a unicode letter or digit (as defined by unicode.IsLetter() and unicode.IsDigit()):
if !unicode.IsLetter(c) && !unicode.IsDigit(c) {
    return false
}
'\u0000' is not part of the pre-defined allowed characters, and as you can guess now, it is neither a letter nor a digit:
// Following code prints "INVALID":
c := '\u0000'
if !unicode.IsLetter(c) && !unicode.IsDigit(c) {
    fmt.Println("INVALID")
}
And since isValidTag() returns false, the name (which is the value for the json key, without the "tag options" part) will be discarded (name = "") and not used. So no match will be found for the struct field containing a unicode zero.
For an alternative solution use a map, or a custom json.Unmarshaler or use json.RawMessage.
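For completeness, here is a minimal sketch of the custom json.Unmarshaler route (the response type and its fields are made up for illustration): decode into a map[string]json.RawMessage first, where key names carry no restrictions, and then pick out the awkward key explicitly:

package main

import (
    "encoding/json"
    "fmt"
)

// response is a hypothetical wrapper type used only to sketch the idea.
type response struct {
    Page   int
    Fruits []string
    Msg    interface{} // value of the "\u0000*\u0000_errorMessages" key
}

func (r *response) UnmarshalJSON(data []byte) error {
    // Map keys are plain Go strings, so the \u0000 characters survive here.
    var raw map[string]json.RawMessage
    if err := json.Unmarshal(data, &raw); err != nil {
        return err
    }
    for key, dst := range map[string]interface{}{
        "Page":                    &r.Page,
        "Fruits":                  &r.Fruits,
        "\x00*\x00_errorMessages": &r.Msg,
    } {
        if v, ok := raw[key]; ok {
            if err := json.Unmarshal(v, dst); err != nil {
                return err
            }
        }
    }
    return nil
}

func main() {
    j := []byte(`{"Page":1,"Fruits":["5","6"],"\u0000*\u0000_errorMessages":{"x":"123"}}`)
    var r response
    if err := json.Unmarshal(j, &r); err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", r) // {Page:1 Fruits:[5 6] Msg:map[x:123]}
}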
But I would highly discourage using such ugly json keys. I understand you are likely just trying to parse such a json response and it may be out of your reach, but you should fight against using these keys, as they will just cause more problems later on (e.g. if they end up stored in a db, it will be very hard to spot by inspecting records that there are '\u0000' characters in them, as they may be displayed as nothing).
You cannot do it that way, due to the rules for struct types: http://golang.org/ref/spec#Struct_types
But you can unmarshal into a map[string]interface{} and then check the field names of that object with a regexp.
I don't think this is possible with struct tags. The best thing you can do is unmarshal it into map[string]interface{} and then get the values manually:
var b = []byte(`{"\u0000abc":42}`)
var m map[string]interface{}
err := json.Unmarshal(b, &m)
if err != nil {
panic(err)
}
fmt.Println(m, m["\x00abc"])
Playground: http://play.golang.org/p/RtS7Nst0d7.

Importing JSON into R with in-line quotation marks

I'm attempting to read the following JSON file ("my_file.json") into R, which contains the following:
[{"id":"484","comment":"They call me "Bruce""}]
using the jsonlite package (0.9.12), the following fails:
library(jsonlite)
fromJSON(readLines('~/my_file.json'))
receiving an error:
"Error in parseJSON(txt) : lexical error: invalid char in json text.
84","comment":"They call me "Bruce""}]
(right here) ------^"
Here is how R displays the file contents, with its own escaping:
readLines('~/my_file.json')
"[{\"id\":\"484\",\"comment\":\"They call me \"Bruce\"\"}]"
Removing the quotes around "Bruce" solves the problem, as in:
my_file.json
[{"id":"484","comment":"They call me Bruce"}]
But what is the issue with the escapement?
In R, string literals can be defined using single or double quotes.
e.g.
s1 <- 'hello'
s2 <- "world"
Of course, if you want to include double quotes inside a string literal defined using double quotes you need to escape (using backslash) the inner quotes, otherwise the R code parser won't be able to detect the end of the string correctly (the same holds for single quote).
e.g.
s1 <- "Hello, my name is \"John\""
If you print this string on the console (using cat¹), or write this string to a file, you will get the actual "face" of the string, not the R literal representation, that is:
> cat("Hello, my name is \"John\"")
Hello, my name is "John"
The JSON parser reads the actual "face" of the string, so, in your case, it reads:
[{"id":"484","comment":"They call me "Bruce""}]
not (the R literal representation) :
"[{\"id\":\"484\",\"comment\":\"They call me \"Bruce\"\"}]"
That being said, the JSON parser also needs double quotes to be escaped when you have quotes inside strings.
Hence, your string should be modified in this way:
[{"id":"484","comment":"They call me \"Bruce\""}]
If you simply modify your file by adding the backslashes you will be perfectly able to read the json.
Note that the corresponding R literal representation of that string would be:
"[{\"id\":\"484\",\"comment\":\"They call me \\\"Bruce\\\"\"}]"
in fact, this works:
> fromJSON("[{\"id\":\"484\",\"comment\":\"They call me \\\"Bruce\\\"\"}]")
id comment
1 484 They call me "Bruce"
¹ The default R print function (invoked also when you simply press ENTER on a value) returns the corresponding R string literal. If you want to print the actual string, you need to use print(quote=F, stringToPrint), or the cat function.
EDIT (on @EngrStudent's comment about the possibility of automating quote escaping):
A JSON parser cannot do quote escaping automatically.
I mean, try to put yourself in the computer's shoes and imagine you had to parse this (unescaped) string as JSON: { "foo1" : " : "foo2" : "foo3" }
I see at least three possible escapings giving valid JSON:
{ "foo1" : " : \"foo2\" : \"foo3" }
{ "foo1\" : " : "foo2\" : \"foo3" }
{ "foo1\" : \" : \"foo2" : "foo3" }
As you can see from this small example, escaping is really necessary to avoid ambiguities.
Maybe, if the string you want to escape has a really particular structure where you can recognize (without uncertainty) the double-quotes needing to be escaped, you can create your own automatic escaping procedure, but you need to start from scratch, because there's nothing built-in.