I have code that parses JSON:
(Aeson.Object jsonObj) -> case (HashMap.lookup "one" jsonObj, HashMap.lookup "two" jsonObj, HashMap.lookup "three" jsonObj) of
(Just one, Just two, Just three) -> -- what's next?
    _ -> error "All three keys don't exist"
How do I retrieve the actual values from "one", "two" and "three"? All three are Data.Aeson.Value, whose data type is defined as follows, but I still can't figure out what to do next:
data Value
A JSON value represented as a Haskell value.
Constructors
Object !Object
Array !Array
String !Text
Number !Number
Bool !Bool
Null
You can check if the values are of the expected type right in the pattern matching like this:
case HashMap.lookup "one" jsonObj of
(Just (Aeson.String t)) -> -- do with string t what pleases you
_ -> error "key not present or not a string"
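For instance, adapted to the triple from the question, a minimal sketch (assuming an aeson version where Object is a HashMap, the same qualified Aeson and HashMap imports as in the question, Data.Text for Text, and, purely for illustration, that "one" holds a string and "three" a boolean):

extractTriple :: Aeson.Object -> Maybe (Text, Aeson.Value, Bool)
extractTriple jsonObj =
    case ( HashMap.lookup "one" jsonObj
         , HashMap.lookup "two" jsonObj
         , HashMap.lookup "three" jsonObj
         ) of
        (Just (Aeson.String one), Just two, Just (Aeson.Bool three)) ->
            -- "one" is unwrapped to Text, "three" to Bool;
            -- "two" is kept as a raw Aeson.Value here
            Just (one, two, three)
        _ ->
            Nothing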
You could adapt this to the triple in your code, as sketched above, or leave it as a separate function, whichever you see fit. But be aware that this style means you're not using the Parser monad Aeson offers, which is really nice for dissecting arbitrary JSON. Assume that in your example above, what you really want to achieve is something like this:
parseTriple :: ByteString -> Maybe (Text, Integer, Bool)
then, using parseMaybe, this can be expressed as simply as
parseTriple bs =
    decode bs >>= parseMaybe (\o -> do
        one   <- o .: "one"
        two   <- o .: "two"
        three <- o .: "three"
        return (one, two, three))
I think this nicely expresses what you're doing with the JSON and is easy to follow and maintain. Also, if you don't use parseMaybe but parseEither, you even get kind-of-useful error messages for free.
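For reference, here is a self-contained sketch of the parseEither variant (the name parseTripleE is just illustrative; it assumes aeson's parseEither from Data.Aeson.Types and withObject and (.:) from Data.Aeson):

{-# LANGUAGE OverloadedStrings #-}

import           Data.Aeson           (Value, decode, withObject, (.:))
import           Data.Aeson.Types     (parseEither)
import qualified Data.ByteString.Lazy as BL
import           Data.Text            (Text)

-- Same shape as parseTriple above, but parseEither keeps the error message
-- around instead of collapsing every failure to Nothing.
parseTripleE :: BL.ByteString -> Either String (Text, Integer, Bool)
parseTripleE bs =
    case (decode bs :: Maybe Value) of
        Nothing -> Left "input is not valid JSON"
        Just v  ->
            flip parseEither v $
                withObject "triple" $ \o -> do
                    one   <- o .: "one"
                    two   <- o .: "two"
                    three <- o .: "three"
                    return (one, two, three)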
I'm trying to parse JSON-LD, and one of the possible constructs is
"John" : {
"type": "person",
"friend": [ "Bob", "Jane" ],
}
I would like to decode into records of type
type alias Triple =
{ subject: String, predicate: String, object: String }
so the example above becomes:
Triple "John" "type" "person"
Triple "John" "friend" "Bob"
Triple "John" "friend" "Jane"
But "friend" in the JSON object could also be just a string:
"friend": "Mary"
in which case the corresponding triple would be
Triple "John" "friend" "Mary"
Any idea?
First, you'll need a way to list all key/value pairs from a JSON object. Elm offers the Json.Decode.keyValuePairs function for this purpose. It gives you a list of key names which you'll use for the predicate field, but you'll also have to describe a decoder for it to use for the values.
Since your values are either a string or a list of strings, you can use Json.Decode.oneOf to help. In this example, we'll just convert a string to a singleton list (e.g. "foo" becomes ["foo"]), just because it makes it easier to map over later.
stringListOrSingletonDecoder : Decoder (List String)
stringListOrSingletonDecoder =
JD.oneOf
[ JD.string |> JD.map (\s -> [ s ])
, JD.list JD.string
]
Since the output of keyValuePairs will be a list of (String, List String) values, we'll need a way to flatten those into a List (String, String) value. We can define that function like this:
flattenSnd : ( a, List b ) -> List ( a, b )
flattenSnd ( key, vals ) =
List.map (\val -> ( key, val )) vals
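For example, the "friend" pair from the question flattens like this:

flattenSnd ( "friend", [ "Bob", "Jane" ] )
    --> [ ( "friend", "Bob" ), ( "friend", "Jane" ) ]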
Now you can use these two functions to split up an object into a triple. This accepts a string argument which is the key to look up in your calling function (e.g. we need to look up the wrapping "John" key).
itemDecoder : String -> Decoder (List Triple)
itemDecoder key =
JD.field key (JD.keyValuePairs stringListOrSingletonDecoder)
|> JD.map
(List.map flattenSnd
>> List.concat
>> List.map (\( a, b ) -> Triple key a b)
)
See a working example here on Ellie.
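As a quick, hypothetical way to try it out in code (assuming elm/json's Json.Decode.decodeString plus the Triple alias and itemDecoder defined above; decoded is just an illustrative name):

decoded =
    JD.decodeString (itemDecoder "John")
        """{ "John": { "type": "person", "friend": [ "Bob", "Jane" ] } }"""
-- Ok [ Triple "John" "type" "person", Triple "John" "friend" "Bob", Triple "John" "friend" "Jane" ]
-- (possibly in a different order, see the note below)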
Note that the order of keys may not match how you listed them in the input JSON, but that is just how JSON works: an object is a lookup table, not an ordered list.
I'm writing some code to auto-generate JSON codecs for Elm data structures. There is a point in my code where a "sub-structure/sub-type" has already been encoded to a Json.Encode.Value, and I need to add another key-value pair to it. Is there any way to "destructure" a Json.Encode.Value in Elm? Or to combine two values of type Json.Encode.Value?
Here's some sample code:
type alias Entity record =
{ entityKey: (Key record)
, entityVal: record
}
jsonEncEntity : (record -> Value) -> Entity record -> Value
jsonEncEntity localEncoder_record val =
let
encodedRecord = localEncoder_record val.entityVal
in
-- NOTE: the following line won't compile, but this is essentially
-- what I'm looking for
Json.combine encodedRecord (Json.Encode.object [ ( "id", jsonEncKey val.entityKey ) ] )
You can decode the value into a list of key value pairs using D.keyValuePairs D.value and then append the new field. Here's how you'd do that:
module Main exposing (..)
import Json.Decode as D
import Json.Encode as E exposing (Value)
addKeyValue : String -> Value -> Value -> Value
addKeyValue key value input =
case D.decodeValue (D.keyValuePairs D.value) input of
Ok ok ->
E.object <| ( key, value ) :: ok
Err _ ->
input
> import Main
> import Json.Encode as E
> value = E.object [("a", E.int 1)]
{ a = 1 } : Json.Encode.Value
> value2 = Main.addKeyValue "b" E.null value
{ b = null, a = 1 } : Json.Encode.Value
If the input is not an object, this will return the input unchanged:
> Main.addKeyValue "b" E.null (E.int 1)
1 : Json.Encode.Value
If you want to do this, you need to use a decoder to unwrap the values by one level into a Dict String Value, then combine the dictionaries, and finally re-encode as a JSON value. You can unwrap like so:
unwrapObject : Value -> Result String (Dict String Value)
unwrapObject value =
Json.Decode.decodeValue (Json.Decode.dict Json.Decode.value) value
Notice that you have to work with Results from this point on because there's the possibility, as far as Elm is concerned, that your JSON value wasn't really an object (maybe it was a number or a string instead, for instance) and you have to handle that case. For that reason, it's not really best practice to do too much with JSON Values directly; if you can, keep things as Dicts or some other more informative type until the end of processing and then convert the whole result into a Value as the last step.
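Putting those steps together, here is a hedged sketch of the combine step (it reuses the unwrapObject above and keeps its Elm 0.18-era Result String error type; on elm/json 0.19 the error type would be Json.Decode.Error, and combineObjects is just an illustrative name):

import Dict
import Json.Encode exposing (Value)

combineObjects : Value -> Value -> Result String Value
combineObjects extra base =
    Result.map2
        (\extraDict baseDict ->
            -- on duplicate keys, entries from `extra` win
            Dict.union extraDict baseDict
                |> Dict.toList
                |> Json.Encode.object
        )
        (unwrapObject extra)
        (unwrapObject base)

In the question's terms, you would then call something like combineObjects (Json.Encode.object [ ( "id", jsonEncKey val.entityKey ) ]) encodedRecord and decide how to handle the Err case.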
I'd like to decode a JSON file that would look like this:
{ 'result': [
{'id': 1, 'model': 'online', 'app_label': 'some_app_users'},
{'id': 2, 'model': 'rank', 'app_label': 'some_app_users'},
]}
or like this:
{ 'result': [
{'id': 1, 'name': 'Tom', 'skills': {'key': 'value', ...}, {'key': 'value', ...}},
{'id': 1, 'name': 'Bob', 'skills': {'key': 'value', ...}, {'key': 'value', ...}},
]}
Basically, the content under result is a list of dicts with the same keys - but I don't know these keys in advance and I don't know their value types (int, string, dict, etc.).
The goal is to show database table content; the JSON contains the result of the SQL query.
My decoder looks like this (not compiling):
tableContentDecoder : Decode.Decoder (List dict)
tableContentDecoder =
Decode.at [ "result" ] (Decode.list Decode.dict)
I use it like this:
Http.send GotTableContent (Http.get url tableContentDecoder)
I'm getting that error:
Function list is expecting the argument to be:
Decode.Decoder (Dict.Dict String a)
But it is:
Decode.Decoder a -> Decode.Decoder (Dict.Dict String a)
What's the correct syntax to use the dict decoder? Will that work? I couldn't find any universal Elm decoder...
Decode.list is a function that takes a value of type Decoder a and returns a value of type Decoder (List a). Decode.dict is also a function that takes a value of type Decoder a, and it returns a Decoder (Dict String a). This tells us two things:
1. We need to pass a decoder value to Decode.dict before we pass it to Decode.list.
2. A Dict may not fit your use case, as Dicts can only map between two fixed types and do not support nested values like 'skills': {'key': 'value', ...}.
Elm doesn't provide a universal decoder. The motivation for this has to do with Elm's guarantee of "no runtime errors". When dealing with the outside world, Elm needs to protect its runtime from the possibility of external failures, mistakes, etc. Elm's primary mechanism for doing this is types: Elm only lets data in that is correctly described, and by doing so it eliminates the class of errors that a universal decoder would introduce.
Since your primary goal is to display content, something like Dict String String might work, but it depends on how deeply nested your data is. You could implement this with a small modification to your code: Decode.at [ "result" ] <| Decode.list (Decode.dict Decode.string).
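Spelled out as a decoder, that modification might look like this (a sketch; note it only succeeds if every value in the row objects is itself a JSON string, so the integer id in your first example would make it fail):

import Dict exposing (Dict)
import Json.Decode as Decode

tableContentDecoder : Decode.Decoder (List (Dict String String))
tableContentDecoder =
    Decode.at [ "result" ] <| Decode.list (Decode.dict Decode.string)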
Another possibility is using Decode.value and Decode.andThen to test for values that indicate which table we are reading from.
It's important that our decoder has a single consistent type, which means we would need to represent our possible results as a sum type.
-- represents the different possible tables
type TableEntry
= ModelTableEntry ModelTableFields
| UserTableEntry UserTableFields
| ...
-- we will use this alias as a constructor with `Decode.map3`
type alias ModelTableFields =
{ id : Int
, model : String
, appLabel : String
}
type alias UserTableFields =
{ id : Int
, ...
}
tableContentDecoder : Decoder (List TableEntry)
tableContentDecoder =
Decode.value
        |> Decode.andThen
            (\value ->
let
tryAt field =
Decode.decodeValue
(Decode.at ["result"] <|
Decode.list <|
Decode.at [field] Decode.string)
value
in
-- check the results of various attempts and use
-- the appropriate decoder based on results
        case ( tryAt "model", tryAt "name", ... ) of
            ( Ok _, _, ... ) ->
                Decode.at [ "result" ] decodeModel

            ( _, Ok _, ... ) ->
                Decode.at [ "result" ] decodeUser

            ...

            ( _, _, ..., _ ) ->
                Decode.fail "I don't know what that was!"
    )
-- example decoder for ModelTableEntry
-- Others can be constructed in a similar manner, but you might
-- want to use NoRedInk's Json.Decode.Pipeline for more complex data
decodeModel : Decoder (List TableEntry)
decodeModel =
    Decode.list <|
        Decode.map3
            (\id model appLabel ->
                ModelTableEntry (ModelTableFields id model appLabel)
            )
            (Decode.field "id" Decode.int)
            (Decode.field "model" Decode.string)
            (Decode.field "app_label" Decode.string)
decodeUser : Decoder (List TableEntry)
decodeUser =
...
It is fair to say that this is a lot more work than most other languages would make you do to parse JSON. However, this comes with the benefit of being able to use outside data without worrying about exceptions.
One way of thinking about it is that Elm makes you do all the work upfront. Other languages might let you get up and running faster, but they do less to help you get to a stable implementation.
I couldn't figure out how to get Decode.dict to work, so I changed my JSON and split the columns and results:
data={
'columns': [column.name for column in cursor.description],
'results': [[str(column) for column in record] for record in cursor.fetchall()]
}
I also had to convert all the results to String to keep things simple; the JSON will have 'id': "1", for example.
With the JSON shaped that way, the Elm code is really simple:
type alias QueryResult =
{ columns : List String, results : List (List String) }
tableContentDecoder : Decode.Decoder QueryResult
tableContentDecoder =
Decode.map2
QueryResult
(Decode.field "columns" (Decode.list Decode.string))
(Decode.field "results" (Decode.list (Decode.list Decode.string)))
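A quick sanity check of that decoder (hypothetical input; assumes elm/json's Json.Decode.decodeString, and check is just an illustrative name):

check =
    Decode.decodeString tableContentDecoder
        """{ "columns": ["id", "model"], "results": [["1", "online"], ["2", "rank"]] }"""
-- check == Ok { columns = [ "id", "model" ], results = [ [ "1", "online" ], [ "2", "rank" ] ] }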
Let's suppose I have a simple JSON array like this:
[
{
"name": "Alex",
"age": 12
},
{
"name": "Peter"
}
]
Notice that the second object doesn't have an age field.
I'm using JSON4S to query JSON (using the for-comprehension style to extract values):
for {
JArray(persons) <- json
JObject(person) <- persons
JField("name", JString(name)) <- person
JField("age", JInt(age)) <- person
} yield new Person(name, age)
The problem for me is that this expression will skip the second object (the one with the missing age field). I don't want to skip such objects; I need to get it as null or better as None.
This answer gives an example of how to deal with null values in JSON using custom extractors, but it works only if the field is present and if its value is null.
Deconstructing objects in json4s can be inconvenient, as you can no longer use the fancy \ and \\ queries.
I prefer to do something like this:
for {
JArray(persons) <- json
person @ JObject(_) <- persons
JString(name) <- person \ "name"
age = (person \ "age").extractOpt[Int]
} yield (name, age)
res7: List[(String, Option[Int])] = List(("Alex", Some(12)), ("Peter", None))
This example also illustrates two alternative ways object fields can be extracted (you could also use name = (person \ "name").extract[String] instead).
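For completeness, a self-contained sketch of the same approach (assuming json4s-native, though json4s-jackson works the same with the corresponding JsonMethods import; the object name is illustrative, and extractOpt needs an implicit Formats in scope):

import org.json4s._
import org.json4s.native.JsonMethods._

object TripleExample extends App {
  implicit val formats: Formats = DefaultFormats

  val json = parse("""[{"name": "Alex", "age": 12}, {"name": "Peter"}]""")

  val people = for {
    JArray(persons) <- json
    person @ JObject(_) <- persons
    JString(name) <- person \ "name"
    age = (person \ "age").extractOpt[Int]
  } yield (name, age)

  println(people) // List((Alex,Some(12)), (Peter,None))
}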
It may be a simple question, but I am new to Scala and not able to find the proper solution.
I am trying to create a JSON object from Option values. I want to check that a value is non-empty and only then add the corresponding key to the JSON object; if the value is None, I don't want to add the key at all. Without an else, the default else branch is Unit, which will fail to create the Json object:
Json.obj(if(position.nonEmpty) ("position" -> position.get),
if(place.nonEmpty) ("place" -> place.get),
if(country.nonEmpty) ("country" -> country.get))
I need to put in the if conditions so that the final JSON string looks like this:
{
"position": "M2",
"place": "place",
"country": "country"
}
val obj = for {
p <- position
o <- otherOption
...
} yield Json.obj(
"position" -> p,
"other" -> o)
This will only yield a Some of a JSON object if all the options are defined; otherwise it yields None.
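For illustration, a REPL-style sketch of this approach (assuming Play's play.api.libs.json, which is where Json.obj comes from; the sample values are just examples):

import play.api.libs.json._

val position: Option[String] = Some("M2")
val place: Option[String]    = Some("place")
val country: Option[String]  = Some("country")

val obj: Option[JsObject] = for {
  pos <- position
  pl  <- place
  c   <- country
} yield Json.obj(
  "position" -> pos,
  "place"    -> pl,
  "country"  -> c
)
// obj is Some({"position":"M2","place":"place","country":"country"});
// if any of the three Options were None, obj would be None (no partial object).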
Option is a monad, and there are a few convenient ways of using it.
First, if you want to extract the value, you should use the map or flatMap and getOrElse methods:
val res = position.map(value => Json.obj("position" -> value)).getOrElse(null)
Another way is to keep an Option of another type and use it later:
val jsonOption = position.map(value => Json.obj("position" -> value))
Afterwards, you can use it in a for comprehension with other options, or perform other transformations without extracting:
for (positionJson <- jsonOption; xJson <- xJsonOption) yield positionJson.toString + xJson.toString
jsonOption.map(_.toString).foreach(print(_))
And always try to avoid pattern matching on monads.
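Building on the idea of keeping each field as an Option of a JSON object and using it later, here is one hypothetical, REPL-style way to end up with only the defined keys (again assuming play.api.libs.json; this goes a small step beyond the answer above):

import play.api.libs.json._

val position: Option[String] = Some("M2")
val place: Option[String]    = None
val country: Option[String]  = Some("country")

val obj: JsObject =
  Seq(
    position.map(p => Json.obj("position" -> p)),
    place.map(p => Json.obj("place" -> p)),
    country.map(c => Json.obj("country" -> c))
  ).flatten.foldLeft(Json.obj())(_ ++ _)
// obj is {"position":"M2","country":"country"}; keys whose Option is None simply drop out.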