Generic Codable types - json

I've got an idea that I'm trying to test out: I want to be able to have an array of different objects that are all Codable.
Here is the JSON:
{
    "cells": [
        {
            "header": "dummy header"
        },
        {
            "title": "dummy title"
        }
    ]
}
There's also a screenshot from Firestore, because I'm not sure I wrote that JSON out correctly.
Here's what I had so far, testing with generics:
struct Submission<Cell: Codable>: Codable {
    let cells: [Cell]
}

struct ChecklistCell: Codable {
    let header: String
}

struct SegmentedCell: Codable {
    let title: String
}
The overarching goal is to decode a document that has an array (of cells) whose elements can be different types, but are all Codable. I'm not sure if this is possible, or if there is an even better approach. Thanks.
Update:
I implemented Fogmeister's solution and got it working, but the outcome isn't the most desirable. It adds an extra layer to the JSON that ideally wouldn't be there. Any ideas?

I have done something similar to this in the past. Not with Firestore (although, more recently, I have) but with the CMS that we use.
As vadian pointed out, heterogeneous arrays are not supported by Swift.
Also... something else to point out.
When you have a generic type defined like...
struct Submission<Cell> {
    let cells: [Cell]
}
Then, by definition, cells is a homogeneous array of a single type. If you try to put different types into it, it will not compile.
You can get around this though by using an enum to bundle all your different Cells into a single type.
enum CellTypes {
    case checkList(ChecklistCell)
    case segmented(SegmentedCell)
}
Now your array would be a homogeneous array of [CellTypes], where each element is a case of the enum, which in turn contains the model of the cell inside it.
struct Submission: Codable {
    let cells: [CellTypes]
}
This takes some custom decoding to get it straight from JSON; there's a sketch of one possible approach below, after the JSON format is described.
Encoding and Decoding
Something to note from a JSON point of view. Your app will need to know which type of cell is being encoded/decoded. So your original JSON schema will need some updating to add this.
The automatic update from Firestore that you have shown is a fairly common way of doing this...
The JSON looks a bit like this...
{
    "cells": [
        {
            "checkListCell": {
                "header": "dummy header"
            }
        },
        {
            "segmentedCell": {
                "title": "dummy title"
            }
        }
    ]
}
Essentially, each item in the array is now an object with a single key, either checkListCell or segmentedCell, i.e. one of the cases of your enum. This key tells your app which type of cell the object is.
The object stored against that key is then the underlying cell itself.
This is probably the cleanest way of modelling this data.
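Since this keyed format is what the custom decoding mentioned earlier has to deal with, here's a rough sketch of one way to write it. It assumes the ChecklistCell and SegmentedCell structs from the question, and the checkListCell / segmentedCell keys shown above; treat it as a starting point rather than the definitive implementation.

import Foundation

// The enum from earlier, now with explicit Codable conformance.
enum CellTypes: Codable {
    case checkList(ChecklistCell)
    case segmented(SegmentedCell)

    // One key per cell type; this is the single key wrapped around each cell in the JSON.
    private enum CodingKeys: String, CodingKey {
        case checkListCell
        case segmentedCell
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        if let cell = try container.decodeIfPresent(ChecklistCell.self, forKey: .checkListCell) {
            self = .checkList(cell)
        } else if let cell = try container.decodeIfPresent(SegmentedCell.self, forKey: .segmentedCell) {
            self = .segmented(cell)
        } else {
            throw DecodingError.dataCorrupted(
                DecodingError.Context(codingPath: container.codingPath,
                                      debugDescription: "Unrecognised cell type")
            )
        }
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        switch self {
        case .checkList(let cell):
            try container.encode(cell, forKey: .checkListCell)
        case .segmented(let cell):
            try container.encode(cell, forKey: .segmentedCell)
        }
    }
}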
So, you might have two checklist cells and then a segmented cell and finally another checklist cell.
This will look like...
{
    "cells": [
        {
            "checkListCell": {
                "header": "First checklist"
            }
        },
        {
            "checkListCell": {
                "header": "Second checklist"
            }
        },
        {
            "segmentedCell": {
                "title": "Some segmented stuff"
            }
        },
        {
            "checkListCell": {
                "header": "Another checklist"
            }
        }
    ]
}
The important thing when analysing this JSON is not that it's harder for you (as a human being) to read, but that the extra key is required, and actually fairly easy, for your app to read and decode/encode.
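As a quick check that the pieces fit together (and assuming the Codable conformance sketched above), decoding a trimmed-down version of that JSON could look something like this; the variable names are just for illustration:

import Foundation

let json = """
{
    "cells": [
        { "checkListCell": { "header": "First checklist" } },
        { "segmentedCell": { "title": "Some segmented stuff" } }
    ]
}
""".data(using: .utf8)!

do {
    let submission = try JSONDecoder().decode(Submission.self, from: json)
    for cell in submission.cells {
        switch cell {
        case .checkList(let checklist):
            print("Checklist cell:", checklist.header)
        case .segmented(let segmented):
            print("Segmented cell:", segmented.title)
        }
    }
} catch {
    print("Decoding failed:", error)
}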
Hope that makes sense.

Related

JSON data, properties are sometimes arrays sometimes objects

I'm reading a JSON response from a third party, and I'm finding that some of the properties are returned as a single object when there is only one object to return, and as an array of objects when there are multiple objects for the property.
Example of a single object in the response:
{
    "data": {
        "property1": "value",
        "property2": "value",
        "property3": "value"
    }
}
Example of an array of objects in the response:
{
    "data": [
        {
            "property1": "value",
            "property2": "value",
            "property3": "value"
        },
        {
            "property1": "value",
            "property2": "value",
            "property3": "value"
        },
        {
            "property1": "value",
            "property2": "value",
            "property3": "value"
        },
        {
            "property1": "value",
            "property2": "value",
            "property3": "value"
        }
    ]
}
Why would the two different response formats be acceptable from the same endpoint?
This question bothered me as well whenever I saw it happening. I never really liked having to check the value in order to know how to access it.
One could argue that doing this saves some space in the payload: you save two bytes by omitting the [] when there's only a single value. But that argument is weak IMHO, and manipulating the data is harder, as we already know.
But looking at it in a different way, this seems to make some sense: it's optimizing for the more common result, a single value. I've seen my fair share of data formats where the structure was very strict. For example, a recursive dictionary-like structure where any property that would contain an object must be an array of that object. So in a deeply nested object, accessing a value may look like this:
root.data[0].aparent[0].thechild[0].myvalue
vs:
root.data.aparent.thechild.myvalue
If there were actually multiple values, then using an array would be appropriate.
I don't necessarily buy this, since you still have to do a check; you'd have to do some tests before consuming the data (like whether a response came back at all). This type of response might make more sense in languages that have some form of pattern matching.
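To make that check concrete (borrowing Swift from the first question above, purely as an illustration), the branching can be hidden inside a small Codable wrapper so the rest of the code always sees an array. A minimal sketch, with hypothetical type names:

import Foundation

// Wraps a property that the API sends either as a single object or as an array of objects.
struct OneOrMany<Element: Codable>: Codable {
    let values: [Element]

    init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        if let array = try? container.decode([Element].self) {
            values = array                                  // it was an array
        } else {
            values = [try container.decode(Element.self)]   // it was a single object
        }
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.singleValueContainer()
        try container.encode(values)                        // always re-encode as an array
    }
}

With something like this, "data" can be declared as OneOrMany<Item> (where Item is whatever model the objects map to), and the consuming code never has to care which shape the endpoint chose for a given response.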

How can I fix the JSON structure to help Spark read it properly? Different types for the same key

I'm receiving JSON, and I don't know in advance which keys the problem will appear on. When Spark sees different types for the same key, it puts the value into a string, but I need the data as an array type. I'm using Spark 2.4 with the JSON reader, so I read the JSON as:
spark.read.json("jsonfile")
I'm flattening my JSON schema to a format where the column names look like:
B__C
B__somedifferentColname
A sample JSON looks like this:
{
    "A": [
        {
            "B": {
                "C": "Hello There"
            }
        },
        {
            "B": [
                {
                    "C": "Hello"
                },
                {
                    "C": "Hi"
                }
            ]
        }
    ]
}
and I would like to have this JSON in a format like this:
{
    "A": [
        {
            "B": [
                {
                    "C": "Hello There"
                }
            ]
        },
        {
            "B": [
                {
                    "C": "Hello"
                },
                {
                    "C": "Hi"
                }
            ]
        }
    ]
}
So, as you can see, what I have changed is adding square brackets around the first B object.
But when one value is a struct type and another value is a list, Spark puts it into a string, so the column value will look like:
"[{"C":"Hello"},{"C":"Hi"}]"
whereas it should look like this:
B__C
Hello
Hi
Hello There
Is anyone able to suggest a trick I can use to resolve this issue?
The team which delivers the JSON to us said it's not possible to change this on their side, so we have to resolve it on our side.

Assign proper types to data read from JSON

I have a struct such as this one:
type Data struct {
    Id        string
    Value     string
    Custom    customtype1
    Special   customtype2
    TimeStamp Time
}
var model Data
I am reading data from a JSON object. Because the JSON is structured very differently, I can't just directly unmarshal the JSON into the struct. So I am trying to "match" the fields from the JSON objects to those of the struct one by one. I don't actually need to properly unmarshal the JSON data into the struct; all I really need is to be able to assign, for each field, the proper type to its value.
So I unmarshal the JSON to a generic interface, then convert it to a map[string]interface{} and iterate over that. For each field, I try to find a match among the field names in the model variable, which I get using reflect.
Now this all works fine, but the problem arises when I try to get the right type for the values.
I can get the Type for a certain field from the model using reflect, but then I can't use that to cast the type of the value I get from the JSON, because that is not a type. I can't use a switch statement either, because this is a simplified version of the situation and in reality I'm dealing with 1000+ different possible types. How can I convert the values I have for each field into their proper type?
The only way I can think of to solve this would be to recreate a JSON string that matches the format of the struct and then unmarshal that into its proper struct, but that seems way too convoluted. Surely there must be a simpler way?
Here's a sample JSON (I can not change this structure, unless I rework it within my Go program):
{
    "requestId": 101901,
    "userName": "test",
    "options": [1, 4],
    "request": {
        "timeStamp": {
            "Value1": "11/02/2018",
            "Value2": "11/03/2018"
        },
        "id": {
            "Value1": "123abcd",
            "Value2": "0987acd",
            "Value3": "a9c003"
        },
        "custom": {
            "Value1": "customtype1_value",
            "Value2": "customtype1_value"
        }
    }
}
I'd advise against your current approach. You haven't provided enough context to tell us why you're choosing to unmarshal things one by one, but Go's support for JSON is good enough that I'd guess it is capable of doing what you want.
Are you aware of encoding/json's support for struct tags? Those might serve the purpose you're looking for. Your struct would then look something more like:
type Data struct {
    Id        string      `json:"id"`
    Value     string      `json:"value"`
    Custom    customtype1 `json:"custom_type"`
    Special   customtype2 `json:"special_type"`
    TimeStamp Time        `json:"timestamp"`
}
If your problem is that the custom types don't know how to be unmarshalled, you can define custom unmarshalling functions for them.
This would then enable you to unmarshal an object like the following:
{
    "id": "foo",
    "value": "bar",
    "custom_type": "2342-5234-4b24-b23a",
    "special_type": "af23-af2f-rb32-ba23",
    "timestamp": "2018-05-01 12:03:41"
}

Using struct with API call in Go

I'm new to Go and working hard to follow its style and I'm not sure how to proceed.
I want to push a JSON object to a Geckoboard leaderboard, which I believe requires the following format based on the API doc and the one for leaderboards specifically:
{
    "api_key": "222f66ab58130a8ece8ccd7be57f12e2",
    "data": {
        "item": [
            { "label": "Bob", "value": 4, "previous_value": 6 },
            { "label": "Alice", "value": 3, "previous_value": 4 }
        ]
    }
}
My instinct is to build a struct for the API call itself and another called Contestants, which will be nested under item. But in order to use json.Marshal(Contestant1), the naming convention of my variables would not meet fmt's expectations:
// Contestant structure to nest into the API call
type Contestant struct {
    label         string
    value         int8
    previous_rank int8
}
This feels incorrect. How should I configure my Contestant objects and be able to marshal them into JSON without breaking convention?
To output a proper JSON object from a structure, you have to export the fields of that structure. To do that, just capitalize the first letter of each field name.
Then you can add struct tag annotations to tell the encoder how to name your JSON fields:
type Contestant struct {
    Label        string `json:"label"`
    Value        int8   `json:"value"`
    PreviousRank int8   `json:"previous_rank"`
}

Point of JSON-RPC vs simpler JSON

This is a JSON-RPC object I am implementing:
{
    "method": "create",
    "params": [
        {
            "nid": "69",
            "body": {
                "und": [
                    {
                        "value": "blah"
                    }
                ]
            }
        }
    ]
}
Here is how I would do it with "normal" JSON:
{
    "method": "create",
    "id": "69",
    "value": "blah"
}
Since JSON is parsed as a map or dictionary, this should be adequate regardless of the presence of nested JSON arrays and JSON objects within those arrays. Can someone explain why JSON-RPC is better, or desired at all?
Thanks!
Your JSON-RPC is invalid; id has to be at the top level, as it is in your "normal" JSON.
After correcting for the above, your JSON-RPC is still needlessly complex; params could just be [{"value":"blah"}]. That would leave your "normal" JSON very slightly less complex, but harder to parse (since you couldn't rely on a "params" key being there no matter what).
Your "normal" JSON would not allow for unnamed parameters (ones identified solely by position). Thus, the minimal added complexity buys you something which you might not need in your application, but others might.