JSON Decoder in Elm 0.18

In Elm 0.18, I would like to build a JSON decoder for the following examples:
case 1:
{"metadata": {"signatures":[{"metadata": {"code": "1234"}},
{"metadata": {"code": "5678"}}]}}
-> { code = Just "1234" }
case 2:
{"metadata": {"signatures":[]}}
-> { code = Nothing }
case 3:
{"metadata": {"signatures":[{"metadata": null}]}}
-> { code = Nothing }
This is what I got working, but it fails for case 3.
type alias Code = { code : Maybe String }

let
    js =
        """{"metadata": {"signatures":[{"metadata": {"code": "1234"}},
{"metadata": {"code": "5678"}}]}}"""

    dec1 =
        Decode.at [ "metadata", "code" ] Decode.string

    dec0 =
        Decode.list dec1
            |> Decode.andThen
                (\v ->
                    if List.isEmpty v then
                        Decode.succeed Nothing
                    else
                        Decode.succeed <| List.head v
                )

    dec =
        decode Code
            |> optionalAt [ "metadata", "signatures" ] dec0 Nothing

    expected =
        Ok { code = Just "1234" }
in
Decode.decodeString dec js
    |> Expect.equal expected
A workaround would be to load all the data into the model and then read the info from there, but I would prefer not to add unnecessary data to my model. How can I improve this?

A simpler approach could use Json.Decode.index to force decoding at index zero, which fails if that element does not exist; wrapping it in Json.Decode.maybe then returns Nothing on failure.
dec0 =
    Decode.maybe (Decode.index 0 dec1)
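For reference, here is a minimal sketch of how that could slot into the decoder from the question (same dec1, with decode and optionalAt from Json.Decode.Pipeline). Decode.index 0 fails on an empty list and dec1 fails when metadata is null, so Decode.maybe covers cases 2 and 3:

import Json.Decode as Decode
import Json.Decode.Pipeline exposing (decode, optionalAt)

type alias Code =
    { code : Maybe String }

dec1 : Decode.Decoder String
dec1 =
    Decode.at [ "metadata", "code" ] Decode.string

dec : Decode.Decoder Code
dec =
    decode Code
        -- maybe + index 0 yields Nothing for an empty list or a null metadata
        |> optionalAt [ "metadata", "signatures" ]
            (Decode.maybe (Decode.index 0 dec1))
            Nothing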

Related

F# JSON Type Provider, do not serialize null values

Background
I am using the FSharp.Data JSON Type Provider with a sample that has an array of objects that may have different properties. Here is an illustrative example:
open FSharp.Data

[<Literal>]
let sample = """
{ "input": [
    { "name": "Mickey" },
    { "year": 1928 }
  ]
}
"""

type InputTypes = JsonProvider<sample>
The JSON Type Provider creates an Input type which has both an Optional Name and an Optional Year property. That works well.
Problem
When I try to pass an instance of this to the web service, I do something like this:
InputTypes.Root(
    [|
        InputTypes.Input(Some("Mouse"), None)
        InputTypes.Input(None, Some(2028))
    |]
)
The web service is receiving the following and choking on the nulls.
{
  "input": [
    {
      "name": "Mouse",
      "year": null
    },
    {
      "name": null,
      "year": 2028
    }
  ]
}
What I Tried
I find that this works:
InputTypes.Root(
    [|
        InputTypes.Input(JsonValue.Parse("""{ "name": "Mouse" }"""))
        InputTypes.Input(JsonValue.Parse("""{ "year": 2028 }"""))
    |]
)
It sends this:
{
  "input": [
    {
      "name": "Mouse"
    },
    {
      "year": 2028
    }
  ]
}
However, on my real project, the structures are larger and would require a lot more conditional JSON string building. It kind of defeats the purpose.
Questions
Is there a way to cause the JSON Type Provider to not serialize null properties?
Is there a way to cause the JSON Type Provider to not serialize empty arrays?
As a point of comparison, the Newtonsoft.JSON library has a NullValueHandling attribute.
I don't think there is an easy way to get the JSON formatting in F# Data to drop the null fields - I think the type does not clearly distinguish between what is null and what is missing.
You can fix that by writing a helper function to drop all null fields:
let rec dropNullFields = function
    | JsonValue.Record flds ->
        flds
        |> Array.choose (fun (k, v) ->
            if v = JsonValue.Null then None
            else Some(k, dropNullFields v))
        |> JsonValue.Record
    | JsonValue.Array arr ->
        arr |> Array.map dropNullFields |> JsonValue.Array
    | json -> json
Now you can do the following and get the desired result:
let json =
    InputTypes.Root(
        [|
            InputTypes.Input(Some("Mouse"), None)
            InputTypes.Input(None, Some(2028))
        |]
    )

json.JsonValue |> dropNullFields |> sprintf "%O"

Decoding polymorphic JSON objects into elm with andThen

My JSON looks similar to this:
{ "items" :
[ { "type" : 0, "order": 10, "content": { "a" : 10, "b" : "description", ... } }
, { "type" : 1, "order": 11, "content": { "a" : 11, "b" : "same key, but different use", ... } }
, { "type" : 2, "order": 12, "content": { "c": "totally different fields", ... } }
...
]
}
and I want to use the type value to decide which union type to create while decoding. So I defined alias types and decoders for all of the above in Elm:
import Json.Decode exposing (..)
import Json.Decode.Pipeline exposing (..)

type alias Type0Content = { a : Int, b : String }
type alias Type1Content = { a : Int, b2 : String }
type alias Type2Content = { c : String }

type Content = Type0 Type0Content | Type1 Type1Content | Type2 Type2Content

type alias Item = { order : Int, type : Int, content : Content }

decode0 = succeed Type0Content
    |> requiredAt ["content", "a"] int
    |> requiredAt ["content", "b"] string

decode1 = succeed Type1Content
    |> requiredAt ["content", "a"] int
    |> requiredAt ["content", "b"] string

decode2 = succeed Type2Content
    |> requiredAt ["content", "c"] string

decodeContentByType hint =
    case hint of
        0 -> Type0 decode0
        1 -> Type1 decode1
        2 -> Type2 decode2
        _ -> fail "unknown type"

decodeItem = succeed Item
    |> required "order" int
    |> required "type" int `andThen` decodeContentByType
Can't get the last two functions to interact as needed.
I've read through page 33 of The JSON Survival Kit by Brian Hicks, but that didn't put me on track either.
Any advice and reading suggestions appreciated!
It looks like the book was written targeting Elm 0.17 or below. In Elm 0.18, the backtick syntax was removed. You will also need to use a different field name for type since it is a reserved word, so I'll rename it type_.
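For reference, only the field name in that alias needs to change (a sketch; the other aliases stay exactly as in the question):

type alias Item =
    { order : Int, type_ : Int, content : Content }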
Some annotations might help narrow down bugs. Let's annotate decodeContentByType, because right now, the branches aren't returning the same type. The three successful values should be mapping the decoder onto the expected Content constructor:
decodeContentByType : Int -> Decoder Content
decodeContentByType hint =
    case hint of
        0 -> map Type0 decode0
        1 -> map Type1 decode1
        2 -> map Type2 decode2
        _ -> fail "unknown type"
Now, to address the decodeItem function. We need three fields to satisfy the Item constructor. The second field is the type, which can be obtained via required "type" int, but the third field relies on the "type" value to deduce the correct constructor. We can use andThen (with pipeline syntax as of Elm 0.18) after fetching the Decoder Int value using Elm's field decoder:
decodeItem : Decoder Item
decodeItem = succeed Item
    |> required "order" int
    |> required "type" int
    |> custom (field "type" int |> andThen decodeContentByType)
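If needed, a top-level decoder for the surrounding items array could then be written as follows (decodeItems is a name I made up, not something from the question):

decodeItems : Decoder (List Item)
decodeItems =
    field "items" (list decodeItem)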

How to handle errors when JSON data has an incorrect node

I'm expecting the following JSON format:
'{
  "teamId" : 9,
  "teamMembers" : [ {
    "userId" : 1000
  }, {
    "userId" : 2000
  }]
}'
If I test my code with the following format:
'{
  "teaXmId" : 9,
  "teamMembers" : [ {
    "usXerId" : 1000
  }, {
    "userXId" : 2000
  }]
}'
I'm parsing the JSON values as follows:
val userId = (request.body \\ "userId")
val teamId = (request.body \ "teamId")
val list = userId.toList
list.foreach(x => Logger.info("x val: " + x))
It doesn't throw any error that I can handle; code execution just goes on. Later, when I try to use teamId or userId, it of course doesn't work.
So how can I check whether parsing succeeded, or stop execution right away and notify the user to provide correctly formatted JSON?
If a value is not found when using \, then the result will be of type JsUndefined(msg). You can throw an error immediately by making sure you have the type you expect:
val teamId = (request.body \ "teamId").as[Int]
or:
val JsNumber(teamId) = request.body \ "teamId" //teamId will be BigDecimal
When using \\, if nothing is found, then an empty List is returned, which makes sense. If you want to throw an error when a certain key is not found on any object of an array, you might get the object that contains the list and proceed from there:
val teamMembers = (request.body \ "teamMembers").as[Seq[JsValue]]
or:
val JsObject(teamMembers) = request.body \ "teamMembers"
And then:
val userIds = teamMembers.map(v => (v \ "userId").as[Int])
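If throwing is not desirable, here is a minimal sketch (parseTeam is a hypothetical helper, not part of the answer) that uses asOpt so a bad payload can be reported back to the caller instead:

import play.api.libs.json._

// asOpt yields None instead of throwing, so a missing or misspelled key
// surfaces as an error message rather than as empty results downstream.
def parseTeam(body: JsValue): Either[String, (Int, Seq[Int])] = {
  val parsed = for {
    teamId  <- (body \ "teamId").asOpt[Int]
    members <- (body \ "teamMembers").asOpt[Seq[JsValue]]
    userIds = members.flatMap(m => (m \ "userId").asOpt[Int])
    if userIds.size == members.size // every member must carry a userId
  } yield (teamId, userIds)

  parsed.toRight("invalid JSON: expected teamId and teamMembers with userId fields")
}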

Scala 2.10: Array + JSON arrays to hashmap

After reading a JSON result from a web service response:
val jsonResult: JsValue = Json.parse(response.body)
Containing content something like:
{
result: [
["Name 1", "Row1 Val1", "Row1 Val2"],
["Name 2", "Row2 Val1", "Row2 Val2"]
]
}
How can I efficiently map the contents of the result array in the JSON with a list (or something similar) like:
val keys = List("Name", "Val1", "Val2")
To get an array of hashmaps?
Something like this?
This solution is functional and handles the None/Failure cases "properly" (by returning None):
import scala.util.parsing.json.JSON

val j = JSON.parseFull( json ).asInstanceOf[ Option[ Map[ String, List[ List[ String ] ] ] ] ]
val res = j.map { m ⇒
  val r = m get "result"
  r.map { ll ⇒
    ll.foldRight( List(): List[ Map[ String, String ] ] ) { ( l, acc ) ⇒
      Map( ( "Name" -> l( 0 ) ), ( "Val1" -> l( 1 ) ), ( "Val2" -> l( 2 ) ) ) :: acc
    }
  }.getOrElse(None)
}.getOrElse(None)
Note 1: I had to put double quotes around result in the JSON String to get the JSON parser to work
Note 2: the code could look nicer using more "monadic" sugar such as for comprehensions or using applicative functors
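For illustration, that sugar might look something like this (a sketch only; json is the same JSON string as above, and every row is assumed to hold three values):

import scala.util.parsing.json.JSON

val keys = List("Name", "Val1", "Val2")

// Zip each row with the expected keys to build one map per row.
val res: Option[List[Map[String, String]]] =
  for {
    parsed <- JSON.parseFull(json)
    m      <- Some(parsed.asInstanceOf[Map[String, List[List[String]]]])
    rows   <- m.get("result")
  } yield rows.map(row => keys.zip(row).toMap)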

Using RJSONIO and AsIs class

I am writing some helper functions to convert my R variables to JSON. I've come across this problem: I would like my values to be represented as JSON arrays; according to the RJSONIO documentation, this can be done using the AsIs class.
x = "HELLO"
toJSON(list(x = I(x)), collapse="")
"{ \"x\": [ \"HELLO\" ] }"
But say we have a list
y = list(a = "HELLO", b = "WORLD")
toJSON(list(y = I(y)), collapse="")
"{ \"y\": {\n \"a\": \"HELLO\",\n\"b\": \"WORLD\" \n} }"
The value found in y -> a is NOT represented as an array. Ideally I would have
"{ \"y\": [{\n \"a\": \"HELLO\",\n\"b\": \"WORLD\" \n}] }"
Note the square brackets. Also I would like to get rid of all "\n"s, but collapse does not eliminate the line breaks in nested JSON. Any ideas?
Try writing it as:
y = list(list(a = "HELLO", b = "WORLD"))
test<-toJSON(list(y = I(y)), collapse="")
When you write it to a file, it appears as:
{ "y": [
{
"a": "HELLO",
"b": "WORLD"
}
] }
I guess you could remove the \n as
test<-gsub("\n","",test)
or use the rjson package:
> rjson::toJSON(list(y = I(y)))
[1] "{\"y\":[{\"a\":\"HELLO\",\"b\":\"WORLD\"}]}"
The reason
> names(list(a = "HELLO", b = "WORLD"))
[1] "a" "b"
> names(list(list(a = "HELLO", b = "WORLD")))
NULL
Examining rjson::toJSON, you will find this snippet of code:
if (!is.null(names(x)))
return(toJSON(as.list(x)))
str = "["
so it would appear to need an unnamed list to treat it as a JSON array. Maybe RJSONIO is similar.
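A quick sketch to confirm that behaviour with rjson (the unnamed outer list is what turns the record into an array):

library(rjson)

rjson::toJSON(list(a = "HELLO"))        # named list   -> object: {"a":"HELLO"}
rjson::toJSON(list(list(a = "HELLO")))  # unnamed list -> array:  [{"a":"HELLO"}]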