std.json - A bit confused with TRUE, FALSE and NULL values

I was looking over the std.json library as part of a program I am working on, and I'm a bit confused about how to get data out of JSONValues whose types are inferred as TRUE, FALSE or NULL.
For example, if I parse the following JSON:
{
"foo" : "bar"
}
I can then extract the string held in the attribute "foo" by doing something like:
import std.file : readText;
import std.json;
auto json = parseJSON(readText("/path/to/json/example.json"));
auto foo_attr = json["foo"].str;
But suppose instead that I had JSON like this:
{
"foo" : false,
"bar" : true,
"baz" : null
}
What would I need to do to get at the attribute values of "foo", "bar" and "baz"?

Look at the variable's type.
auto json = parseJSON(readText("/path/to/json/example.json"));
bool foo = json["foo"].type == JSON_TYPE.TRUE;
bool bar = json["bar"].type == JSON_TYPE.TRUE;
bool bazIsNull = json["baz"].type == JSON_TYPE.NULL;
Of course, if you expect that values might have other types, you'll need extra checks. (On newer compilers the enum is named JSONType, with members true_, false_, and null_.)
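If you want a plain bool out of such a value, here is a minimal helper sketch (the asBool name is mine; it reuses the JSON_TYPE checks above):
import std.json;

// Returns the stored boolean; throws for any non-boolean value.
bool asBool(JSONValue val)
{
    if (val.type == JSON_TYPE.TRUE)  return true;
    if (val.type == JSON_TYPE.FALSE) return false;
    throw new JSONException("expected a JSON boolean");
}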

Related

How to read a field from nested JSON?

This is my test JSON file:
{
"item" : {
"fracData" : [ ],
"fractimeData" : [ {
"number" : "1232323232",
"timePeriods" : [ {
"validFrom" : "2021-08-03"
} ]
} ],
"Module" : [ ]
}
}
This is how I read the JSON file:
starhist_test_df = spark.read.json("/mapr/xxx/yyy/ttt/dev/rawdata/Test.json", multiLine=True)
starhist_test_df.createOrReplaceTempView("v_test_df")
This query works.
df_test_01 = spark.sql("""
select item.fractimeData.number from v_test_df""")
df_test_01.collect()
Result
[Row(number=['1232323232'])]
But this query doesn't work.
df_test_01 = spark.sql("""
select item.fractimeData.timePeriods.validFrom from v_test_df""")
df_test_01.collect()
Error
cannot resolve 'v_test_df.`item`.`fractimeData`.`timePeriods`['validFrom']' due to data type mismatch: argument 2 requires integral type, however, ''validFrom'' is of string type.; line 3 pos 0;
What do I have to change, to read the validFrom field?
Dot notation for accessing values works on struct or array<struct> types.
The schema of the field number inside item.fractimeData is string, and accessing it via dot notation returns an array<string>, since fractimeData is an array.
Similarly, the schema of the field timePeriods inside item.fractimeData is array<struct<validFrom>>, and accessing it via dot notation wraps it in another array, giving a final schema of array<array<struct<validFrom>>>.
The error you get is because dot notation works on array<struct> but not on array<array>.
Hence, flatten the result of item.fractimeData.timePeriods to get back an array<struct<validFrom>>, and then apply the dot notation.
df_test_01 = spark.sql("""
select flatten(item.fractimeData.timePeriods).validFrom as validFrom from v_test_df""")
df_test_01.collect()
"""
[Row(validFrom=['2021-08-03', '2021-08-03'])]
"""

Groovy: get the JSON value of id based on a child value

I have a json as below:
[
{
"id":6619137,
"oid":"6fykq37gm60x",
"key":{
"key":"6619137"
},
"name":"Prod",
"planKey":{
"key":"PDP"
},
"environments":[
{
"id":6225923,
"key":{
"key":"6619137-6225923"
},
"name":"Production",
"deploymentProjectId":6619137,
}
],
},
{
"id":6619138,
"oid":"6fykq37gm60y",
"key":{
"key":"6619138"
},
"name":"QA",
"planKey":{
"key":"QDP"
},
"environments":[
{
"id":6225924,
"key":{
"key":"6619138-6225924"
},
"name":"QA",
"deploymentProjectId":6619138,
}
],
}
]
I can use the code below to extract the values of id and environments.id based on the name value: projectID will be 6619137 and environmentID will be 6225923.
def e = json.planKey.find{it.name=='Prod'}
def projectID = e.id
def environmentID = e.environments[0].id
However, when I try to extract the values of id and environments.id based on the planKey.key value (e.g. PDP or QDP), the same approach returns an error: java.lang.NullPointerException: Cannot get property 'id' on null object at Script60.run(Script60.groovy:52)
def e = json.planKey.find{it.planKey=='{key=PDP}'}
def projectID = e.id
def environmentID = e.environments[0].id
Is there a way I can get the projectID with the planKey key value?
Think of your JSON object as a key/value map with multiple levels. There is no such item as
json.find{it.planKey=='{key=PDP}'}
However, you can find with values at any level, like this:
def e = json.find{it.planKey.key == "PDP"}
If you have a structure where planKey may not exist, or it doesn't always have key, that's a bit different, but from your question it sounds like that's not the case here.
EDIT: corrected the syntax above.
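Putting it all together with JsonSlurper (a sketch; it assumes the JSON above is held in a String named text):
import groovy.json.JsonSlurper

def json = new JsonSlurper().parseText(text)

// match on the nested key rather than comparing against a map's string form
def e = json.find { it.planKey.key == 'PDP' }
def projectID = e.id                      // 6619137
def environmentID = e.environments[0].id  // 6225923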

A better way to extract the value of a field within an Option[JsValue] in Scala?

I have a value in a JsObject which I want to assign to a specific key in a Map, and I want to ask whether there is a better way to extract that value without using a case matcher.
I have access to a request variable which is a case class that has a parameter myData
myData is an Option[JsValue] which contains multiple fields, and I want to return the boolean value of one specific field in there, called "myField", as a string. The code below works, but I want to find a more succinct way of getting the value of "myField" without case matching.
val newMap =
  Map(
    "myNewKey" -> request.myData.map(_ match {
      case JsObject(fields) => fields.getOrElse("myField", "unknown").toString
      case _ => "unknown"
    })
  )
The output would then be
"myField": "true"
or
"myField": "false"
or if it isn't true or false, i.e. the field doesn't exist
"myField": "unknown"
Rather:
myMap ++ (request.myData \ "myField").validateOpt[String].asOpt
  .toSeq.collect { // to map entries
    case Some(str) => "keyInMap" -> str
  }
Do not use .toString to convert or pretty-print a JSON value.
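For the boolean field from the question specifically, a shorter sketch (assuming Play JSON, with myData and "myField" as described above; the "unknown" fallback matches the question):
import play.api.libs.json._

val myFieldValue: String =
  request.myData
    .flatMap(data => (data \ "myField").asOpt[Boolean]) // None if absent or not a boolean
    .map(_.toString)                                    // "true" / "false"
    .getOrElse("unknown")

val newMap = Map("myNewKey" -> myFieldValue)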

How to merge a dynamically named record with a static one in Dhall?

I'm creating an AWS Step Function definition in Dhall. However, I don't know how to create the common structure used for Choice states, such as the example below:
{
"Not": {
"Variable": "$.type",
"StringEquals": "Private"
},
"Next": "Public"
}
The Not is pretty straightforward using mapKey and mapValue. If I define a basic ComparisonOperator:
let ComparisonOperator =
      { Type =
          { Variable : Text
          , StringEquals : Optional Text
          }
      , default =
          { Variable = "foo"
          , StringEquals = None Text
          }
      }
And the types:
let ComparisonType = < And | Or | Not >
And adding a helper function to render the type as Text for the mapKey:
let renderComparisonType =
      \(comparisonType : ComparisonType) ->
        merge
          { And = "And"
          , Or = "Or"
          , Not = "Not"
          }
          comparisonType
Then I can use them in a function to generate the record halfway:
let renderRuleComparisons =
      \(comparisonType : ComparisonType) ->
      \(comparisons : List ComparisonOperator.Type) ->
        let keyName = renderComparisonType comparisonType
        let compare = [ { mapKey = keyName, mapValue = comparisons } ]
        in  compare
If I run that using:
let rando = ComparisonOperator::{ Variable = "$.name", StringEquals = Some "Cow" }
let comparisons = renderRuleComparisons ComparisonType.Not [ rando ]
in comparisons
Running that through dhall-to-json outputs the first part:
{
"Not": {
"Variable": "$.name",
"StringEquals": "Cow"
}
}
... but I've been struggling to merge that with "Next": "Public". I've tried the record merge operators like /\, //, etc., and they keep giving me type errors I don't fully understand yet.
First, I'll include an approach that does not type-check as a starting point to motivate the solution:
let rando = ComparisonOperator::{ Variable = "$.name", StringEquals = Some "Cow" }
let comparisons = renderRuleComparisons ComparisonType.Not [ rando ]
in comparisons # toMap { Next = "Public" }
toMap is a keyword that converts records to key-value lists, and # is the list concatenation operator. The Dhall CheatSheet has a few examples of how to use both of them.
The above solution doesn't work because # cannot merge lists with different element types. The left-hand side of the # operator has this type:
comparisons : List { mapKey : Text, mapValue : ComparisonOperator.Type }
... whereas the right-hand side of the # operator has this type:
toMap { Next = "Public" } : List { mapKey : Text, mapValue : Text }
... so the two Lists cannot be merged as-is due to the different types for the mapValue field.
There are two ways to resolve this:
Approach 1: Use a union whenever there is a type conflict
Approach 2: Use a weakly-typed JSON representation that can hold arbitrary values
Approach 1 is the simpler solution for this particular example and Approach 2 is the more general solution that can handle really weird JSON schemas.
For Approach 1, dhall-to-json will automatically strip non-empty union constructors (leaving behind the value they were wrapping) when translating to JSON. This means that you can transform both arguments of the # operator to agree on this common type:
List { mapKey : Text, mapValue : < State : Text | Comparison : ComparisonOperator.Type > }
... and then you should be able to concatenate the two lists of key-value pairs and dhall-to-json will render them correctly.
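A sketch of Approach 1, assuming the ComparisonOperator record and the comparisons list from the question are in scope (the Value union and the tag helper are illustrative names, not part of the original answer):
let Prelude = https://prelude.dhall-lang.org/v12.0.0/package.dhall

let Value = < State : Text | Comparison : ComparisonOperator.Type >

let tag =
      \(e : { mapKey : Text, mapValue : ComparisonOperator.Type }) ->
        { mapKey = e.mapKey, mapValue = Value.Comparison e.mapValue }

in  Prelude.List.map
      { mapKey : Text, mapValue : ComparisonOperator.Type }
      { mapKey : Text, mapValue : Value }
      tag
      comparisons
    # toMap { Next = Value.State "Public" }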
There is a second solution for dealing with weakly-typed JSON schemas that you can learn more about here:
Dhall Manual - How to convert an existing YAML configuration file to Dhall
The basic idea is that all of the JSON/YAML integrations recognize and support a weakly-typed JSON representation that can hold arbitrary JSON data, including dictionaries with keys of different shapes (like in your example). You don't even need to convert the entire expression to this weakly-typed representation; you only need to use it for the subset of your configuration where you run into schema issues.
What this means for your example, is that you would change both arguments to the # operator to have this type:
let Prelude = https://prelude.dhall-lang.org/v12.0.0/package.dhall
in List { mapKey : Text, mapValue : Prelude.JSON.Type }
The documentation for Prelude.JSON.Type also has more details on how to use this type.
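For completeness, a sketch of what the weakly-typed version of the original Choice state could look like (my construction, using only constructors from Prelude.JSON):
let Prelude = https://prelude.dhall-lang.org/v12.0.0/package.dhall

in  toMap
      { Not =
          Prelude.JSON.object
            ( toMap
                { Variable = Prelude.JSON.string "$.type"
                , StringEquals = Prelude.JSON.string "Private"
                }
            )
      , Next = Prelude.JSON.string "Public"
      }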

std.json - Any way to check if a JSONValue has a particular field

Suppose I have an unknown bit of JSON, and I want to check if it has a form similar to this:
{
"foo": stuff
"bar": stuff
}
where stuff is anything - integer, object, whatever. If I do something like this:
auto json = parseJSON("{}");
auto foo = json["foo"];
I will get a segfault. Is there any way to gracefully handle this (return null, throw exception, anything other than a segfault)?
Just use the D in operator, like with a D associative array:
auto foo = "foo" in json ? json["foo"].str : null;
If you're using DMD 2.065 or older, you need to use json.object for the in operator:
auto foo = "foo" in json.object ? json["foo"].str : null;
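The pointer returned by in can also be used directly, avoiding the second lookup (a small sketch, not from the original answer):
import std.json;

void main()
{
    auto json = parseJSON(`{"foo": "bar"}`);

    // "key" in json yields a pointer to the value, or null if the key is absent
    if (auto p = "foo" in json)
    {
        auto foo = p.str; // safe to access: the key exists
    }
}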