How to process a null value inside a JSON string using Lua?

I am using Lua in Asterisk PBX. I encountered the following problem while processing a JSON string:
the JSON "null" value is converted to a function type in Lua. Why does that happen?
And how do I handle this scenario? I am expecting nil, because no value means null in JSON and nil means nothing in Lua.
local json = require( "json" )
local inspect = require("inspect")
local myjson_str='{"Sms":{"key":"xxxxxxxxxxxxxxxxxxxxx","to":"{caller}","senderid":null,"type":"Simple","content":"Your request has been accepted in Previous Miss call. We get back to you very soon."}}'
local myjson_table = json.decode(myjson_str)
print(type(myjson_table["Sms"]["senderid"]))
print(myjson_table)
print(inspect(myjson_table))
print(json.encode(myjson_table))
The output for the above is:
function
table: 0xf5e770
{
  Sms = {
    content = "Your request has been accepted in Previous Miss call. We get back to you very soon.",
    key = "xxxxxxxxxxxxxxxxxxxxx",
    senderid = <function 1>,
    to = "{caller}",
    type = "Simple"
  }
}
{"Sms":{"type":"Simple","key":"xxxxxxxxxxxxxxxxxxxxx","senderid":null,"content":"Your request has been accepted in Previous Miss call. We get back to you very soon.","to":"{caller}"}}

It is up to the specific library to decide how to represent a null value.
Using nil has its own problem: it becomes impossible to tell whether the
original JSON had a key with a null value or no such key at all.
So some libraries return a unique sentinel value, and some provide
a way to pass this value in yourself, e.g. json.decode(str, NULL_VALUE).
So the answer is: read the docs/source of the library you use.
Most likely it provides something like a json.null value you can compare against
to check whether a value is null; see the sketch below. A function is a strange choice for
that sentinel, though, since Lua's rules for when two function values compare equal are not fully determined.
Or try another library.
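For example, here is a minimal sketch of the sentinel check, assuming the library you load as "json" exposes its null sentinel under the name json.null (lua-cjson, for instance, exposes cjson.null; check your library's documentation for the actual name):
local json = require("json")

-- shortened version of the JSON string from the question
local myjson_str = '{"Sms":{"senderid":null,"type":"Simple"}}'
local myjson_table = json.decode(myjson_str)

-- json.null is an assumed name for the library's null sentinel
local senderid = myjson_table["Sms"]["senderid"]
if senderid == json.null then
    senderid = nil  -- treat JSON null as "no value" on the Lua side
end
print(senderid, type(senderid))  --> nil  nil
If your library instead lets you pass your own null value to json.decode, supply a unique value (e.g. an empty table) there and compare against that instead.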

First of all, @moteus is right:
It is up to the specific library to decide how to represent a null value.
If you're using the JSON library by Jeffrey Friedl, the solution is to use a placeholder instead of null and to serialize the table structure to a JSON string using the corresponding encode options:
-- define a placeholder
NullPlaceholder = "\0"
-- use it in an internal table
tableStructure = {}
tableStructure['someNullValue'] = NullPlaceholder
-- pass the placeholder to the encode method
encode_options = { null = NullPlaceholder }
jsonString = JSON:encode(tableStructure, nil, encode_options)
which leads to
{"someNullValue": null}

Related

How can I convert a JSON response to a hash and extract a specific key-value pair?

Could someone advise me how to correctly parse a JSON response and convert it into a hash, please?
I would also like to extract a specific key-value pair.
The following is my JSON response.
{
  "rates"=>{
    "CAD"=>1.266485552,
    "HKD"=>7.7526138141,
    "DKK"=>6.1233226311,
    "HUF"=>294.047913065,
    "PLN"=>3.702560303
  },
  "base"=>"USD",
  "date"=>"2021-02-11"
}
Furthermore, I have the following methods that fetch and parse that JSON response. However, I would like to extract a specific key-value pair from 'rates', where the currency variable is my key.
def weather_url
  @weather = "#{API_URL}latest?base=USD"
end

def specific_currency(currency)
  @response_body ||= RestClient.get(@weather).body
  @hash_response = {}
  @hash_response = JSON.parse(@response_body).to_hash.assoc(currency)
end
You can use select to extract the expected key and value.
Please refer to Ruby's documentation for select.
def specific_currency(currency)
  @response_body ||= RestClient.get(@weather).body
  @hash_response = JSON.parse(@response_body)['rates'].select { |key, _value| key == currency }
end
I'll assume that your starting point is actually valid JSON, although what you've written above is not valid JSON...
First, there is no need to write this line:
@hash_response = {}
You're assigning the variable to a meaningless value, then immediately reassigning it to the real value.
There is also no need to call to_hash on the object; you've already got a hash.
I'm not entirely clear on what you mean by "extract a specific key and value", because you didn't say what you expect the method to return, but probably all you need to do is this:
def specific_currency(currency)
  @response_body ||= RestClient.get(@weather).body
  JSON.parse(@response_body)['rates'][currency]
end
Or if you prefer,
JSON.parse(@response_body).dig('rates', currency)

Unable to unmarshal json into protobuf message

My problem is pretty much the opposite of this one: Unable to unmarshal json to protobuf struct field
I have a message with several nested messages of the following form:
message MyMsg {
  uint32 id = 1;
  message Attribute {
    ...
  }
  repeated Attribute attrs = 2;
  message OtherAttribute {
    ...
  }
  OtherAttribute oAttr = 3;
  ...
}
Some external dependencies will send this message in JSON form, which then needs to be unmarshalled into a Go struct. When trying to use jsonpb like so, where resp is an *http.Response:
msg := &MyMsg{}
jsonpb.Unmarshal(resp.Body, msg)
The message is not fully decoded into the struct, i.e. some of the nested structs are missing. However, when the message is decoded simply using encoding/json like so:
msg := &MyMsg{}
json.NewDecoder(resp.Body).Decode(msg)
All attributes are successfully decoded into the struct.
As jsonpb is the official package to (un)marshal between protobuf and JSON, I was wondering whether anyone has any idea as to why this type of behaviour could occur. Do the default behaviours of jsonpb and encoding/json differ in a way that would explain one being able to unmarshal and the other not? If so, where would one configure the behaviour of jsonpb accordingly?
The default behaviour of encoding/json is the following:
Unknown fields are allowed, i.e. a field that does not match is simply ignored without an error being raised.
Before a field is ignored, the Decoder attempts to match it case-insensitively.
The behaviour in point 1 can be replicated in jsonpb by using the Unmarshaler struct and setting its AllowUnknownFields property to true:
var umrsh = jsonpb.Unmarshaler{}
umrsh.AllowUnknownFields = true
msg := &MyMsg{}
umrsh.Unmarshal(resp.Body, msg)
It does not seem to be possible to replicate the behaviour from point 2 within jsonpb.

How to validate if a JSON path exists in JSON

In a given JSON document, how do I validate whether a JSON path exists?
I am using jayway-jsonpath and have the code below:
JsonPath.read(jsonDocument, jsonPath)
The above code can potentially throw the exception below:
com.jayway.jsonpath.PathNotFoundException: No results for path:
$['a.b.c']
In order to mitigate this, I intend to validate that the path exists before trying to read it with JsonPath.read.
For reference, I went through the following two documents, but couldn't really find what I want:
http://www.baeldung.com/guide-to-jayway-jsonpath
https://github.com/json-path/JsonPath
Whilst it is true that you can catch an exception, as mentioned in the comments, there might be a more elegant way to check whether a path exists without writing try/catch blocks all over the code.
You can use the following configuration option with jayway-jsonpath:
com.jayway.jsonpath.Option.SUPPRESS_EXCEPTIONS
With this option active no exception is thrown. If you use the read method, it simply returns null whenever a path is not found.
Here is an example with JUnit 5 and AssertJ showing how you can use this configuration option, avoiding try / catch blocks just for checking if a json path exists:
@ParameterizedTest
@ArgumentsSource(CustomerProvider.class)
void replaceStructuredPhone(JsonPathReplacementArgument jsonPathReplacementArgument) {
    DocumentContext dc = jsonPathReplacementHelper.replaceStructuredPhone(
        JsonPath.parse(jsonPathReplacementArgument.getCustomerJson(),
            Configuration.defaultConfiguration().addOptions(Option.SUPPRESS_EXCEPTIONS)),
        "$.cps[5].contactPhoneNumber", jsonPathReplacementArgument.getUnStructuredPhoneNumberType());
    UnStructuredPhoneNumberType unstructRes = dc.read("$.cps[5].contactPhoneNumber.unStructuredPhoneNumber");
    assertThat(unstructRes).isNotNull();
    // this path does not exist, since it should have been deleted.
    Object structRes = dc.read("$.cps[5].contactPhoneNumber.structuredPhoneNumber");
    assertThat(structRes).isNull();
}
You can also create a JsonPath object or ReadContext with a Configuration if you have a use case to check multiple paths.
// Suppress errors thrown by JsonPath and instead return null if a path does not exist in a JSON blob.
Configuration suppressExceptionConfiguration = Configuration
.defaultConfiguration()
.addOptions(Option.SUPPRESS_EXCEPTIONS);
ReadContext jsonData = JsonPath.using(suppressExceptionConfiguration).parse(jsonString);
for (int i = 0; i < listOfPaths.size(); i++) {
    String pathData = jsonData.read(listOfPaths.get(i));
    if (pathData != null) {
        // do something
    }
}

JSON.parse and JSON.stringify are not idempotent and that is bad

This question has multiple parts:
(1a) JSON is fundamental to JavaScript, so why is there no JSON type? A JSON type would be a string that is formatted as JSON. It would be marked as parsed/stringified until the data was altered. As soon as the data was altered it would not be marked as JSON and would need to be re-parsed/re-stringified.
(1b) In some software systems, isn't it possible to (accidentally) attempt to send a plain JS object over the network instead of a serialized JS object? Why not make an attempt to avoid that?
(1c) Why can't we call JSON.parse on a straight up JavaScript object without stringifying it first?
var json = { // JS object in proper JSON format
"baz":{
"1":1,
"2":true,
"3":{}
}
};
var json0 = JSON.parse(json); //will throw a parse error...bad...it should not throw an error if json var is actually proper JSON.
So we have no choice but to do this:
var json0= JSON.parse(JSON.stringify(json));
However, there are some inconsistencies, for example:
JSON.parse(true); //works
JSON.parse(null); //works
JSON.parse({}); //throws error
(2) If we keep calling JSON.parse on the same object, eventually it will throw an error. For example:
var json = { //same object as above
"baz":{
"1":1,
"2":true,
"3":{}
}
};
var json1 = JSON.parse(JSON.stringify(json));
var json2 = JSON.parse(json1); //throws an error...why
(3) Why does JSON.stringify keep adding more and more slashes to the input? Not only is the result hard to read when debugging, it actually puts you in a dangerous state, because one JSON.parse call won't give you back a plain JS object; you have to call JSON.parse several times to get the plain JS object back. This is bad and means it is quite dangerous to call JSON.stringify more than once on a given JS object.
var json = {
"baz":{
"1":1,
"2":true,
"3":{}
}
};
var json2 = JSON.stringify(json);
console.log(json2);
var json3 = JSON.stringify(json2);
console.log(json3);
var json4 = JSON.stringify(json3);
console.log(json4);
var json5 = JSON.stringify(json4);
console.log(json5);
(4) What is the name for a function that we should be able to call over and over without changing the result (IMO how JSON.parse and JSON.stringify should behave)? The best term for this seems to be "idempotent" as you can see in the comments.
(5) Considering JSON is a serialization format that can be used for networked objects, it seems totally insane that you can't call JSON.parse or JSON.stringify twice or even once in some cases without incurring some problems. Why is this the case?
If you are someone who is inventing the next serialization format for Java, JavaScript or whatever language, please consider this problem.
IMO there should be two states for a given object. A serialized state and a deserialized state. In software languages with stronger type systems, this isn't usually a problem. But with JSON in JavaScript, if we call JSON.parse twice on the same object, we run into fatal exceptions. Likewise, if we call JSON.stringify twice on the same object, we can get into an unrecoverable state. Like I said, there should be two states and two states only: plain JS object and serialized JS object.
1) JSON.parse expects a string, and you are feeding it a JavaScript object.
2) Similar issue to the first one: json1 is already an object, and you feed it to JSON.parse, which needs a string.
3) JSON.stringify expects an object, but you are feeding it a string (the result of the previous call). It therefore escapes the quotes and slashes inside that string, just as it would for any string value, so that the special characters survive inside the resulting JSON string literal. That is why the slashes pile up with each call.
4) You can write your own function for this.
5) Because you are trying to do a conversion that is invalid. This is related to the first and second questions. As long as the correct types are fed in, you can call these functions as many times as you want. The only problem is the extra slashes, but that is in fact standard behaviour.
We'll start with this nightmare of your creation (IJSON being the hypothetical idempotent JSON you are asking for): string input and integer output.
IJSON.parse(IJSON.stringify("5")); //=> 5
The built-in JSON functions would not fail us this way: string input and string output.
JSON.parse(JSON.stringify("5")); //=> "5"
JSON must preserve your original data types
Think of JSON.stringify as a function that wraps your data up in a box, and JSON.parse as the function that takes it out of a box.
Consider the following:
var a = JSON.stringify;
var b = JSON.parse;
var data = "whatever";
b(a(data)) === data; // true
b(b(a(a(data)))) === data; // true
b(b(b(a(a(a(data)))))) === data; // true
That is, if we put the data in 3 boxes, we have to take it out of 3 boxes. Right?
If I put my data in 2 boxes and take it out of 1, I'm not holding my data yet, I'm holding a box that contains my data. Right?
b(a(a(data))) === data; // false
Seems sane to me...
JSON.parse unboxes your data. If it is not boxed, it cannot unbox it. JSON.parse expects a string input and you're giving it a JavaScript object literal.
The first valid call to JSON.parse would return an object. Calling JSON.parse again on that object output would result in the same failure as #1.
Repeated calls to JSON.stringify will "box" our data multiple times, so of course you have to use repeated calls to JSON.parse to get your data out of each "box".
Idempotence
No, this is perfectly sane. You can't triple-stamp a double-stamp.
You'd never make a mistake like this, would you?
var json = IJSON.stringify("hi");
IJSON.parse(json);
//=> "hi"
OK, that's idempotent, but what about
var json = IJSON.stringify("5");
IJSON.parse(json);
//=> 5
UH OH! We gave it a string each time, but the second example returns an integer. The input data type has been lost!
Would the JSON functions have failed us here?
var json = JSON.stringify("hi");
JSON.parse(json);
//=> "hi"
All good. And what about the "5" ?
var json = JSON.stringify("5");
JSON.parse(json);
//=> "5"
Yay, the types have been preserved! JSON works, IJSON does not.
Maybe a more real-life example:
OK, so you have a busy app with a lot of developers working on it. It makes
reckless assumptions about the types of your underlying data. Let's say it's a chat app that makes several transformations on messages as they move from point to point.
Along the way you'll have:
IJSON.stringify
data moves across a network
IJSON.parse
Another IJSON.parse because who cares? It's idempotent, right?
String.prototype.toUpperCase — because this is a formatting choice
Let's see the messages
bob: 'hi'
// 1) '"hi"', 2) <network>, 3) "hi", 4) "hi", 5) "HI"
Bob's message looks fine. Let's see Alice's.
alice: '5'
// 1) '5'
// 2) <network>
// 3) 5
// 4) 5
// 5) Uncaught TypeError: message.toUpperCase is not a function
Oh no! The server just crashed. You'll notice it's not even the repeated calling of IJSON.parse that failed here. It would've failed even if you called it once.
Seems like you were doomed from the start... Damned reckless devs and their careless data handling!
It would fail if Alice used any input that happened to also be valid JSON
alice: '{"lol":"pwnd"}'
// 1) '{"lol":"pwnd"}'
// 2) <network>
// 3) {lol:"pwnd"}
// 4) {lol:"pwnd"}
// 5) Uncaught TypeError: message.toUpperCase is not a function
OK, unfair example maybe, right? You're thinking, "I'm not that reckless, I
wouldn't call IJSON.stringify or IJSON.parse on user input like that!"
It doesn't matter. You've fundamentally broken JSON because the original
types can no longer be extracted.
If I box up a string using IJSON, and then unbox it, who knows what I will get back? Certainly not you, and certainly not the developer using your reckless function.
"Will I get a string type back?"
"Will I get an integer?"
"Maybe I'll get an object?"
"Maybe I will get cake. I hope it's cake"
It's impossible to tell!
You're in a whole new world of pain because you've been careless with your data types from the start. Your types are important so start handling them with care.
JSON.stringify expects an object type and JSON.parse expects a string type.
Now do you see the light?
I'll try to give you one reason why JSON.parse cannot be called multiple times on the same data without causing a problem.
You might not know it, but a JSON document does not have to be an object.
This is a valid JSON document:
"some text"
Let's store the representation of this document inside a JavaScript variable:
var JSONDocumentAsString = '"some text"';
and work on it:
var JSONdocument = JSON.parse(JSONDocumentAsString);
JSONdocument === 'some text';
This will cause an error, because this string is not the representation of a JSON document:
JSON.parse(JSONdocument);
// SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data
In this case, how could JSON.parse have guessed that JSONdocument (being a string) was a JSON document and that it should have been returned untouched?

A JSON text must at least contain two octets

I received this error, and I couldn't find any reasonable answer to this question, so I thought I'd write a summary of the problem.
If you run this snippet in irb:
JSON.parse( nil )
You'll see the following error:
TypeError: can't convert nil into String
I was kind of expecting the function to return nil, and not a TypeError. If you convert all input using to_s, then you'll see the octet error:
JSON::ParserError: A JSON text must at least contain two octets!
That's just fine and well. If you don't know what an octet is, read this post for a summary and solution:
What is a JSON octet and why are two required?
Solution
The variable you're passing in is an empty string. Don't attempt to use an empty string in the JSON.parse method.
Question
So, now I know the cause of the error; what pattern should I use to handle it? I'm a bit loath to monkey-patch the JSON library to allow nil values. Any suggestions would be greatly appreciated.
parsed = json && json.length >= 2 ? JSON.parse(json) : nil
But really, the library should be able to handle this case and return nil. Web browsers with built-in JSON support seem to work just as you'd expect, after all.
Or, to do it with an only slightly intrusive mini patch:
module JSON
  def self.parse_nil(json)
    JSON.parse(json) if json && json.length >= 2
  end
end
parsed = JSON.parse_nil(json)
data.presence && JSON.parse(data)
JSON.parse(data.presence || '{}')
According to json.org
JSON is built on two structures:
A collection of name/value pairs. In various languages, this is realized as an object, record, struct, dictionary, hash table, keyed list, or associative array.
An ordered list of values. In most languages, this is realized as an array, vector, list, or sequence.
So the minimum two octets (an octet is eight bits, i.e. one byte) required at the top level would be {} or [].
IMO, the best solution is to make sure the argument to JSON.parse is either a stringified object or a stringified array. :-)
hash = JSON.parse(json) rescue {}
array = JSON.parse(json) rescue []
string = JSON.parse(json) rescue ''