Haskell JSON parsing with Aeson

I have a JSON data source which looks like this:
{ "fields": [
    { "type": "datetime",
      "name": "Observation Valid",
      "description": "Observation Valid Time" },
    { "type": "datetime",
      "name": "Observation Valid UTC",
      "description": "Observation Valid Time UTC" },
    { "type": "number",
      "name": "Air Temperature[F]",
      "description": "Air Temperature at 2m AGL" },
    { "type": "number",
      "name": "Wind Speed[kt]",
      "description": "Wind Speed" },
    { "type": "number",
      "name": "Wind Gust[kt]",
      "description": "Wind Gust" },
    { "type": "number",
      "name": "Wind Direction[deg]",
      "description": "Wind Direction" }
  ],
  "rows": [
    ["2018-04-22T00:10:00", "2018-04-22T05:10:00Z", 50.0, 9.0, null, 50.0],
    ["2018-04-22T00:15:00", "2018-04-22T05:15:00Z", 50.0, 9.0, null, 60.0],
    ["2018-04-22T00:20:00", "2018-04-22T05:20:00Z", 50.0, 8.0, null, 60.0],
    ["2018-04-22T00:30:00", "2018-04-22T05:30:00Z", 50.0, 9.0, null, 60.0]
  ]
}
( https://mesonet.agron.iastate.edu/json/obhistory.py?station=TVK&network=AWOS&date=2018-04-22 )
And tried several data descriptions, lastly this:
data Entry = -- Data entries
  Entry { time      :: Text  -- Observation Valid Time
        , timeUTC   :: Text  -- Observation Valid Time UTC
        , airTemp   :: Float -- Air Temperature[F] at 2m AGL
        , wind      :: Float -- Wind Speed [kt]
        , gust      :: Float -- Wind Gust [kt]
        , direction :: Int   -- Wind Direction[deg]
        } deriving (Show, Generic)

data Field = -- Schema definition
  Field { ftype       :: String
        , name        :: String
        , description :: String
        } deriving (Show, Generic)

data Record =
  Record { fields :: [Field] -- schema
         , rows   :: [Entry] -- data
         } deriving (Show, Generic)

-- Instances to convert our type to/from JSON.
instance FromJSON Entry
instance FromJSON Field
instance FromJSON Record

-- Get JSON data and decode it
dat <- (eitherDecode <$> getJSON) :: IO (Either String Record)
which gives this error:
Error in $.fields[0]: key "ftype" not present
The (first) error comes from the field definitions (which I don't use). In the JSON, the entries are arrays of mixed types, but in the Haskell it is just a data structure, not an array; I'm not sure how to reconcile these.
No doubt a beginner error, but I haven't found any examples with this structure. Do I need to write a custom parser for this?

Three things prevent this from working as intended:
1. The JSON data contains a field named "type", and type is a reserved word in Haskell, so the record uses ftype instead. A custom FromJSON instance for the Field record type can bridge the two names.
2. The data in the Entry type is unnamed, so it is better represented as either a data declaration without field names or a tuple.
3. The Float representing the wind gust is sometimes null, so it should be a Maybe Float.
The code below contains all of these modifications and parses your example JSON data:
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.ByteString.Lazy as BSL
import Data.Text (Text)
import Data.Aeson
import GHC.Generics

-- Either this tuple definition of Entry or the data definition without
-- field names (commented out) will work.
type Entry = -- Data entries
  ( Text        -- Observation Valid Time
  , Text        -- Observation Valid Time UTC
  , Float       -- Air Temperature[F] at 2m AGL
  , Float       -- Wind Speed [kt]
  , Maybe Float -- Wind Gust [kt]
  , Int         -- Wind Direction[deg]
  )

-- data Entry = -- Data entries
--   Entry Text          -- Observation Valid Time
--         Text          -- Observation Valid Time UTC
--         Float         -- Air Temperature[F] at 2m AGL
--         Float         -- Wind Speed [kt]
--         (Maybe Float) -- Wind Gust [kt]
--         Int           -- Wind Direction[deg]
--   deriving (Show, Generic)
-- instance FromJSON Entry

data Field = -- Schema definition
  Field { ftype       :: String
        , name        :: String
        , description :: String
        } deriving (Show, Generic)

instance FromJSON Field where
  parseJSON = withObject "Field" $ \v -> Field
    <$> v .: "type"
    <*> v .: "name"
    <*> v .: "description"

data Record =
  Record { fields :: [Field] -- schema
         , rows   :: [Entry] -- data
         } deriving (Show, Generic)

instance FromJSON Record

getJSON :: IO BSL.ByteString
getJSON = BSL.readFile "json.txt"

main :: IO ()
main = do
  -- Get the JSON data and decode it
  dat <- (eitherDecode <$> getJSON) :: IO (Either String Record)
  case dat of
    Right parsed -> print parsed
    Left err     -> print err

Field has an ftype field, so Aeson looks for the key "ftype" in the JSON and can't find it (the JSON key is "type"). You can't name a Haskell record field type because it is a reserved word, so you need to make Aeson use a different name: set fieldLabelModifier in the Options you pass to genericParseJSON (or to Template Haskell's deriveJSON). Alternatively, writing the instance manually might be simpler.
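For reference, a minimal sketch of the fieldLabelModifier approach with generic deriving (no Template Haskell needed; deriveJSON accepts the same Options record). The sample field values in the comments are illustrative:

```haskell
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson
import Data.Aeson.Types (Options (..))
import GHC.Generics (Generic)

data Field = Field
  { ftype       :: String
  , name        :: String
  , description :: String
  } deriving (Show, Generic)

-- Map the Haskell field name "ftype" back to the JSON key "type" when
-- parsing; every other field name is left unchanged.
instance FromJSON Field where
  parseJSON = genericParseJSON defaultOptions
    { fieldLabelModifier = \f -> if f == "ftype" then "type" else f }
```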

Related

Aeson: converting a JSON object to a List of key, value type

I have some JSON fields stored in a database which contain a String -> Double mapping, e.g.:
{
  "some type of thing": 0.45,
  "other type of thing": 0.35,
  "something else": 0.2
}
I want to represent this as a ThingComposition:
data ThingType = ThingTypeSome
               | ThingTypeOther
               | ThingTypeUnknown Text

-- | Create a ThingType from a text representation
txtToThing :: Text -> ThingType
txtToThing "some type of thing" = ThingTypeSome
txtToThing "other type of thing" = ThingTypeOther
txtToThing s = ThingTypeUnknown s

-- Deserialise ThingType from JSON text
instance FromJSON ThingType where
  parseJSON val = withText "ThingType" (return . txtToThing) val

data ThingComposition = ThingComposition [(ThingType, Double)]
                      | InvalidThingComposition

instance FromJSON ThingComposition where
  parseJSON val = withObject "ThingComposition"
    _
    val
The _ is what I have no idea how to fill out. I've tried something like the following but I can't get the types to align and I can't work out the best way to do this, given that it's possible that the JSON representation won't match the types, but I don't want to create a list of [(Either String ThingType, Either String Double)]. How can I parse that the object at the top into the ThingComposition type?
_ = (return . ThingComposition) . map (bimap parseJSON parseJSON) . toList
I would make some supporting instances for your ThingType, then reuse the FromJSON (HashMap k v) instance.
-- added "deriving Eq" to your declaration; otherwise unchanged
data ThingType = ThingTypeSome
               | ThingTypeOther
               | ThingTypeUnknown Text
  deriving Eq

thingToTxt :: ThingType -> Text
thingToTxt ThingTypeSome = "some type of thing"
thingToTxt ThingTypeOther = "other type of thing"
thingToTxt (ThingTypeUnknown s) = s

instance FromJSONKey ThingType where
  fromJSONKey = FromJSONKeyText txtToThing

instance Hashable ThingType where
  hashWithSalt n = hashWithSalt n . thingToTxt
With that supporting code, you now have a FromJSON instance for HashMap ThingType Double, which is superior in many ways to a [(ThingType, Double)].
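One way to finish the ThingComposition instance on top of that HashMap instance might look like the sketch below. The derived Show and Eq, and the decision to never produce InvalidThingComposition here, are my additions for illustration:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson
import Data.Aeson.Types (FromJSONKey (..), FromJSONKeyFunction (..), Parser)
import Data.HashMap.Strict (HashMap)
import qualified Data.HashMap.Strict as HM
import Data.Hashable (Hashable (..))
import Data.Text (Text)

data ThingType
  = ThingTypeSome
  | ThingTypeOther
  | ThingTypeUnknown Text
  deriving (Eq, Show)

txtToThing :: Text -> ThingType
txtToThing "some type of thing" = ThingTypeSome
txtToThing "other type of thing" = ThingTypeOther
txtToThing s = ThingTypeUnknown s

thingToTxt :: ThingType -> Text
thingToTxt ThingTypeSome = "some type of thing"
thingToTxt ThingTypeOther = "other type of thing"
thingToTxt (ThingTypeUnknown s) = s

instance FromJSON ThingType where
  parseJSON = withText "ThingType" (pure . txtToThing)

instance FromJSONKey ThingType where
  fromJSONKey = FromJSONKeyText txtToThing

instance Hashable ThingType where
  hashWithSalt n = hashWithSalt n . thingToTxt

data ThingComposition
  = ThingComposition [(ThingType, Double)]
  | InvalidThingComposition
  deriving Show

-- Parse the whole object as a HashMap ThingType Double, then flatten
-- it into the association-list form the type expects.
instance FromJSON ThingComposition where
  parseJSON v =
    ThingComposition . HM.toList <$> (parseJSON v :: Parser (HashMap ThingType Double))
```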

How to use Aeson to get a vector of strings inside a deep JSON object?

Let's say I want to use Aeson to parse the following JSON object:
{
  "data": [
    [ "data", "more data" ],
    [ "data", "more data" ]
  ],
  "error": { "code": "" }
}
I can create the records for the JSON objects, then create the instances to parse the pieces out like the documentation describes. But, I'm really only interested in the Vector Text that's inside data. Is there a more direct way to get at this than creating the records? It's not obvious how to create the Parser that gets me this directly.
It appears that there is an Aeson tutorial documenting exactly this problem: Parsing without creating extra types
In your case, data has arrays of arrays, so I'm not sure if you want a Vector (Vector Text) or flatten all of it into one array, but adapting from the documentation:
justData :: Value -> Parser (Vector (Vector Text))
justData = withObject "structure with data" $ \o -> o .: "data"
justDataFlat :: Value -> Parser (Vector Text)
justDataFlat value = fmap join (justData value)
Also note that if your structure is deeper, like this:
{
  "data": {
    "deep": [ "data", "more data" ]
  }
}
you can use .: more than once:
deeperData :: Value -> Parser (Vector Text)
deeperData = withObject "structure with deeper data" $ \o -> do
  step1 <- o .: "data"
  step1 .: "deep"
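To actually run one of these Value -> Parser functions, one option is to decode the raw bytes to a Value first and then apply the parser with parseMaybe from Data.Aeson.Types. A small self-contained sketch (the helper name extractData is mine):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson (Value, decode, withObject, (.:))
import Data.Aeson.Types (Parser, parseMaybe)
import qualified Data.ByteString.Lazy as BSL
import Data.Text (Text)
import Data.Vector (Vector)

-- Same shape as the justData parser above.
justData :: Value -> Parser (Vector (Vector Text))
justData = withObject "structure with data" $ \o -> o .: "data"

-- Decode the raw bytes to a Value, then run the parser over it.
extractData :: BSL.ByteString -> Maybe (Vector (Vector Text))
extractData bytes = decode bytes >>= parseMaybe justData
```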

F# JSON Type Provider, do not serialize null values

Background
I am using the FSharp.Data JSON Type Provider with a sample that has an array of objects that may have different properties. Here is an illustrative example:
[<Literal>]
let sample = """
{ "input": [
    { "name": "Mickey" },
    { "year": 1928 }
  ]
}
"""
type InputTypes = JsonProvider<sample>
The JSON Type Provider creates an Input type which has both an Optional Name and an Optional Year property. That works well.
Problem
When I try to pass an instance of this to the web service, I do something like this:
InputTypes.Root(
    [|
        InputTypes.Input(Some("Mouse"), None)
        InputTypes.Input(None, Some(2028))
    |]
)
The web service is receiving the following and choking on the nulls.
{
  "input": [
    {
      "name": "Mouse",
      "year": null
    },
    {
      "name": null,
      "year": 2028
    }
  ]
}
What I Tried
I find that this works:
InputTypes.Root(
    [|
        InputTypes.Input(JsonValue.Parse("""{ "name": "Mouse" }"""))
        InputTypes.Input(JsonValue.Parse("""{ "year": 2028 }"""))
    |]
)
It sends this:
{
  "input": [
    {
      "name": "Mouse"
    },
    {
      "year": 2028
    }
  ]
}
However, on my real project, the structures are larger and would require a lot more conditional JSON string building. It kind of defeats the purpose.
Questions
Is there a way to cause the JSON Type Provider to not serialize null properties?
Is there a way to cause the JSON Type Provider to not serialize empty arrays?
As a point of comparison, the Newtonsoft.JSON library has a NullValueHandling attribute.
I don't think there is an easy way to get the JSON formatting in F# Data to drop the null fields - I think the type does not clearly distinguish between what is null and what is missing.
You can fix that by writing a helper function to drop all null fields:
let rec dropNullFields = function
  | JsonValue.Record flds ->
      flds
      |> Array.choose (fun (k, v) ->
           if v = JsonValue.Null then None
           else Some(k, dropNullFields v))
      |> JsonValue.Record
  | JsonValue.Array arr ->
      arr |> Array.map dropNullFields |> JsonValue.Array
  | json -> json
Now you can do the following and get the desired result:
let json =
    InputTypes.Root(
        [|
            InputTypes.Input(Some("Mouse"), None)
            InputTypes.Input(None, Some(2028))
        |]
    )

json.JsonValue |> dropNullFields |> sprintf "%O"

Traversing JSON in Haskell with wreq - key issues

I'm trying to traverse some JSON response I'm getting from the OpenWeatherMap API but I'm getting some issues to retrieve some values. Here is my code:
{-# LANGUAGE OverloadedStrings #-}

import Control.Lens
import Data.Aeson.Lens (_String, key)
import Network.Wreq

myAPIKey :: String
myAPIKey = "my_api_key_here"

conditionsQuery :: String -> String -> String -> String
conditionsQuery city country key =
  "https://api.openweathermap.org/data/2.5/forecast?q=" ++ city ++ "," ++ country ++ "&appid=" ++ key

main = do
  print "What's the city?"
  city <- getLine
  print "And the country?"
  country <- getLine
  r <- get (conditionsQuery city country myAPIKey)
  print $ r ^. responseBody . key "name" . _String
  print $ r ^. responseBody . key "cod" . _String
  print $ r ^. responseBody . key "id" . _String
The issue is that only the value of "cod" is returned ("200" in that case). The values for "name" and "id" come back as "" if we try with London,GB or Chicago,US (for instance). Yet the response body looks like:
{
...
"id": 2643743,
"name": "London",
"cod": 200
}
I first thought it was a type mismatch, but 200 is an Int there (unless I'm mistaken?), so I'm not sure where the issue lies. The "" seems to indicate that those two keys (id and name) do not exist, but they do.
Any ideas? Thanks in advance.
The response body does not look like that.
According to https://openweathermap.org/forecast5, the key "cod" appears at the outermost level of the JSON object, but "id" and "name" do not.
{
  "city": {
    "id": 1851632,
    "name": "Shuzenji",
    ...
  },
  "cod": "200",
  ...
}
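To illustrate, here is a small sketch of the corrected traversals run against a canned body with that nesting (the sample values come from the documentation snippet; note _Integer for the numeric "id", since _String on a number yields nothing):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Control.Lens ((^?))
import Data.Aeson.Lens (key, _Integer, _String)
import Data.ByteString.Lazy (ByteString)
import Data.Text (Text)

-- A canned body with the same nesting as the forecast response.
body :: ByteString
body = "{\"cod\":\"200\",\"city\":{\"id\":1851632,\"name\":\"Shuzenji\"}}"

cityName :: Maybe Text
cityName = body ^? key "city" . key "name" . _String

cityId :: Maybe Integer
cityId = body ^? key "city" . key "id" . _Integer

cod :: Maybe Text
cod = body ^? key "cod" . _String  -- "cod" really is a string in this payload
```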

Decoding polymorphic JSON objects into elm with andThen

My JSON looks similar to this:
{ "items" :
[ { "type" : 0, "order": 10, "content": { "a" : 10, "b" : "description", ... } }
, { "type" : 1, "order": 11, "content": { "a" : 11, "b" : "same key, but different use", ... } }
, { "type" : 2, "order": 12, "content": { "c": "totally different fields", ... } }
...
]
}
and I want to use the type value to decide what union type to create while decoding. So, I defined alias types and decoders for all the above in elm :
import Json.Decode exposing (..)
import Json.Decode.Pipeline exposing (..)

type alias Type0Content = { a : Int, b : String }
type alias Type1Content = { a : Int, b2 : String }
type alias Type2Content = { c : String }

type Content = Type0 Type0Content | Type1 Type1Content | Type2 Type2Content

type alias Item = { order : Int, type : Int, content : Content }

decode0 = succeed Type0Content
    |> requiredAt ["content", "a"] int
    |> requiredAt ["content", "b"] string

decode1 = succeed Type1Content
    |> requiredAt ["content", "a"] int
    |> requiredAt ["content", "b"] string

decode2 = succeed Type2Content
    |> requiredAt ["content", "c"] string

decodeContentByType hint =
    case hint of
        0 -> Type0 decode0
        1 -> Type1 decode1
        2 -> Type2 decode2
        _ -> fail "unknown type"

decodeItem = succeed Item
    |> required "order" int
    |> required "type" int `andThen` decodeContentByType
Can't get the last two functions to interact as needed.
I've read through page 33 of json-survival-kit by Brian Hicks, but that didn't get me on track either.
Any advice and reading material appreciated!
It looks like the book was written targeting Elm 0.17 or below. In Elm 0.18, the backtick syntax was removed. You will also need to use a different field name for type since it is a reserved word, so I'll rename it type_.
Some annotations might help narrow down the bugs. Let's annotate decodeContentByType, because right now the branches aren't returning the same type. The three success branches should map the decoder onto the expected Content constructor:
decodeContentByType : Int -> Decoder Content
decodeContentByType hint =
    case hint of
        0 -> map Type0 decode0
        1 -> map Type1 decode1
        2 -> map Type2 decode2
        _ -> fail "unknown type"
Now, to address the decodeItem function. We need three fields to satisfy the Item constructor. The second field is the type, which can be obtained via required "type" int, but the third field relies on the "type" value to deduce the correct constructor. We can use andThen (with pipeline syntax as of Elm 0.18) after fetching the Decoder Int value using Elm's field decoder:
decodeItem : Decoder Item
decodeItem = succeed Item
    |> required "order" int
    |> required "type" int
    |> custom (field "type" int |> andThen decodeContentByType)