Reusing type definitions with JsonProvider?

I'm using the JsonProvider from FSharp.Data to automatically create types for a web service that I'm consuming, based on sample responses from the service.
However, I'm a bit confused when it comes to types that are reused across the service: for example, one API method returns a single item of type X while another returns a list of X, and so on. Do I really have to generate multiple definitions for this, and won't that mean I end up with duplicate types for the same thing?
So I guess what I'm really asking is: is there a way to create composite types from types generated from JSON samples?

If you call JsonProvider separately with separate samples, then you will get duplicate types for the same things in the samples. Sadly, there is not much that the F# Data library can do about this.
One option you have is to pass multiple samples to JsonProvider at the same time (using the SampleIsList parameter). In that case, it tries to find one type for all the samples you provide, but it will also share types with the same structure among all the samples.
I assume you do not want to get one type for all your samples - in that case, you can wrap the individual samples in an additional JSON object like this (here, the real samples are the records nested under "one" and "two"):
type J = JsonProvider<"""
  [ { "one": { "person": {"name": "Tomas"} } },
    { "two": { "num": 42, "other": {"name": "Tomas"} } } ]""", SampleIsList=true>
Now you can wrap each sample in a new JSON object using "one" or "two" (depending on which sample you are processing) and run the Parse method:
let j1 = """{ "person": {"name": "Tomas"} }"""
let o1 = J.Parse("""{"one":""" + j1 + "}").One.Value
let j2 = """{ "num": 42, "other": {"name": "Tomas"} }"""
let o2 = J.Parse("""{"two":""" + j2 + "}").Two.Value
The "one" and "two" records are completely arbitrary (I just added them to have two separate names). We wrap the JSON before parsing it and then we access it using the One or Two property. However, it means that o1.Person and o2.Other are now of the same type:
o1.Person = o2.Other
This returns false because we do not implement equality on JSON values in F# Data, but it type checks - so the types are the same.
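To see the sharing pay off, you can write a single function over the common shape. A minimal sketch (the exact generated type name, here J.Person, is my assumption; check what the provider actually generates in your case):

// One function handles values from either sample, because the provider
// unified the records with the same structure into a single type.
let getName (p: J.Person) = p.Name

getName o1.Person = getName o2.Other  // true: both names are "Tomas"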
This is fairly complicated, so I would probably look for other ways of doing what you need - but it is one way to get shared types among multiple JSON samples.

Related

Validating against dynamic data - JSON Schema - Ajv

I'm trying to create a JSON Schema for something very dynamic. Say I have two pieces of data, and I want one (the source) to determine the validity of the other (the target). Both can change over time, but both will always be an array of objects with known properties. For example:
source.json
[
  { "id": 23, "active": true },
  { "id": 9, "active": false },
  { "id": 6, "active": true }
]
target.json
[
  { "identifier": 6 }
]
The schema I'm trying to create is this: For each active object in the source array, there should be an equivalent object in the target array. A little more formally, given an object in the source array where "active" equals true and "id" equals x, there should be an object in the target array where "identifier" equals x.
In the example above, the target would be invalid because it's missing an object like { "identifier": 23 }.
However, I want to statically define this schema (or something capable of generating it) in a JSON file ahead of time, and this feels pretty tough since the source array can change. I'm using Ajv, and I'm aware that it supports the $data reference, but I'm not sure that's enough to help me here. The other option I could see is creating some kind of schema-generator definition? In concept, it too would be a JSON object I define ahead of time, but at runtime it would be used to safely generate arbitrary schemas based on runtime data such as the source array. However, if a mechanism like this doesn't already exist, trying to implement it myself sounds like a great way to give myself a code-injection vulnerability.
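For what it's worth, the kind of generator I have in mind would be something like this sketch (TypeScript, assuming Ajv v8; contains and const are standard JSON Schema keywords, and since the schema is assembled from plain data rather than interpolated strings, there is nothing to inject code into):

import Ajv from "ajv";

const source = [
  { id: 23, active: true },
  { id: 9, active: false },
  { id: 6, active: true },
];
const target = [{ identifier: 6 }];

// Build one "contains" requirement per active source id: the target
// array must contain an object whose "identifier" equals that id.
function schemaFromSource(src: { id: number; active: boolean }[]) {
  return {
    type: "array",
    allOf: src
      .filter((s) => s.active)
      .map((s) => ({
        contains: {
          type: "object",
          required: ["identifier"],
          properties: { identifier: { const: s.id } },
        },
      })),
  };
}

const ajv = new Ajv();
const validate = ajv.compile(schemaFromSource(source));
console.log(validate(target)); // false: { "identifier": 23 } is missing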
Thanks for your time!

AWS Glue Crawler - DynamoDB Export - Get attribute names in schema instead of struct

I've defined a default crawler on the data directory of an export from DynamoDB. I'm trying to get it to give me a structured table instead of a table with a single column of type struct. What do I have to do to get the actual column names in there? I've tried adding custom classifiers and different path expressions, but nothing seems to work, and I feel like I'm missing something really obvious.
I'm using the crawler builder inside of Glue, which doesn't seem to offer much customization.
The schema from the table generated by the default crawler shows a single column, item, of type struct.
And here's one of the items that I've exported from dynamo:
{
  "Item": {
    "the_url": {
      "S": "/2021/07/06/****redacted****.html"
    },
    "as_of_when": {
      "S": "2021-09-01"
    },
    "user_hashes": {
      "SS": [
        "****redacted*****"
      ]
    },
    "user_id_hashes": {
      "SS": [
        "u3MeXDcpQm0ACYuUv6TMrg=="
      ]
    },
    "accumulated_count": {
      "N": "1"
    },
    "today_count": {
      "N": "1"
    }
  }
}
The way Athena interprets JSON data means that your data has only a single column, Item. Athena doesn't have any mechanism to map arbitrary parts of a JSON object to columns; it can only map top-level attributes to columns.
If you want other parts of the objects as columns you will either have to create a new table with transformed data, or create a view with the attributes as columns, e.g.
CREATE OR REPLACE VIEW attributes_as_top_level_columns AS
SELECT
  item.the_url.S AS the_url,
  CAST(item.as_of_when.S AS DATE) AS as_of_when,
  item.user_hashes.SS AS user_hashes,
  item.user_id_hashes.SS AS user_id_hashes,
  item.accumulated_count.N AS accumulated_count,
  item.today_count.N AS today_count
FROM items
In the example above I've also flattened away the data type keys (S, SS, N) and converted the date string to a date.
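If you want the counters as numbers rather than strings, the same CAST approach works; for example, the corresponding lines in the SELECT list could read:

  CAST(item.accumulated_count.N AS INTEGER) AS accumulated_count,
  CAST(item.today_count.N AS INTEGER) AS today_count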

How to properly use JSON.parse in kotlinjs with enums?

During my fresh adventures with kotlin-react I hit a hard stop when trying to parse some data from my backend which contains enum values.
Spring-Boot sends the object in JSON form like this:
{
  "id": 1,
  "username": "Johnny",
  "role": "CLIENT"
}
role in this case is the enum value and can have the two values CLIENT and LECTURER. If I were to parse this with a Java library, or let it be handled by Spring-Boot, role would be parsed to the corresponding enum value.
With kotlin-js' JSON.parse, that wouldn't work, and I would have a simple string value in there.
After some testing, I came up with this snippet:
val json = """{
    "id": 1,
    "username": "Johnny",
    "role": "CLIENT"
}"""
val member: Member = JSON.parse(json) { key: String, value: Any? ->
    if (key == "role") Member.Role.valueOf(value.toString())
    else value
}
in which I manually have to define the conversion from the string value to the enum.
Is there something I am missing that would simplify this behaviour?
(I am not referring to using ids in the JSON and then looking those up, etc. I am curious about some method in Kotlin-JS.)
I assume there is not, because the "original" JSON.parse in JS doesn't do this and Kotlin does not add any additional stuff there, but I still have hope!
As far as I know, no.
The problem
Kotlin/JS produces an incredibly weird type situation when deserializing using the embedded JSON class, which is actually a mirror for JavaScript's JSON class. While I haven't done much JavaScript, its type handling is near non-existent; only manual throws can enforce it, so JSON.parse doesn't care whether it returns a SomeCustomObject or a newly created object with the exact same fields.
As an example, if you have two different classes with the same field names (no inheritance) and a function that accepts one of them, it doesn't care which of those (or a third, for that matter) it receives, as long as the fields it tries to access on the object exist.
These type issues carry over into Kotlin. Consider this code:
val json = """{
"x": 1, "y": "yes", "z": {
"x": 42, "y": 314159, "z": 444
}
}""".trimIndent()
data class SomeClass(val x: Int, val y: String, val z: Struct)
data class Struct(val x: Int, val y: Int, val z: Int)
fun main(args: Array<String>) {
val someInstance = JSON.parse<SomeClass>(json)
if(someInstance.z::class != Struct::class) {
println("Incompatible types: Required ${Struct::class}, found ${someInstance.z::class}");
}
}
What would you expect this to print? The natural answer would be a Struct; after all, the type is explicitly declared.
Unfortunately, that is not the case. Instead, it prints:
Incompatible types: Required class Struct, found class Any
The point
The embedded JSON de/serializer isn't good with types. You might be able to fix this by using a different serializing library, but I'll avoid turning this into a "use [this] library".
Essentially, JSON.parse fails to parse objects as expected. If you entirely remove the reviver argument and try a raw JSON.parse(json) on the JSON in your question, you'll get a role that is a String, not the Role you might expect. And with JSON.parse doing no type conversion whatsoever, that means you have two options: using a library, or using your approach.
Your approach will unfortunately get complicated if you have nested objects, but with the types being changed, the only option you appear to have left is explicitly parsing the objects manually.
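For completeness, here is a minimal sketch of the fully manual route, assuming your Member type is declared roughly like this (the declarations are my guess at your types):

import kotlin.js.Json

// Assumed declarations matching the JSON in the question.
data class Member(val id: Int, val username: String, val role: Role) {
    enum class Role { CLIENT, LECTURER }
}

// Parse to the untyped Json view first, then build the Kotlin object
// by hand so every field (including the enum) gets its real type.
fun parseMember(json: String): Member {
    val raw = JSON.parse<Json>(json)
    return Member(
        id = raw["id"] as Int,
        username = raw["username"] as String,
        role = Member.Role.valueOf(raw["role"] as String)
    )
}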
TL;DR: your approach is fine.

Return nested JSON in AWS AppSync query

I'm quite new to AppSync (and GraphQL) in general, but I'm running into a strange issue when hooking up resolvers to our DynamoDB tables. Specifically, we have a nested Map structure for one of our item's attributes that is arbitrarily constructed (its complexity and form depend on the type of the parent item). It looks a little something like this:
"item" : {
"name": "something",
"country": "somewhere",
"data" : {
"nest-level-1a": {
"attr1a" : "foo",
"attr1b" : "bar",
"nest-level-2" : {
"attr2a": "something else",
"attr2b": [
"some list element",
"and another, for good measure"
]
}
}
},
"cardType": "someType"
}
Our accompanying GraphQL type is the following:
type Item {
  name: String!
  country: String!
  cardType: String!
  data: AWSJSON! ## note: it was originally String!
}
When we query the item we get the following response:
{
  "data": {
    "genericItemQuery": {
      "name": "info/en/usa/bra/visa",
      "country": "USA:BRA",
      "cardType": "visa",
      "data": "{\"tourist\":{\"reqs\":{\"sourceURL\":\"https://travel.state.gov/content/passports/en/country/brazil.html\",\"visaFree\":false,\"type\":\"eVisa required\",\"stayLimit\":\"30 days from date of entry\"},\"pages\":\"One page per stamp required\"}}"
    }
  }
}
The problem is we can't seem to get the Item.data field resolver to return a JSON object (even when we attach a separate field-level resolver to it on top of the general Query resolver). It always returns a String and, weirdly, if we change the expected field type to String!, the response will replace all : in data with =. We've tried everything with our response resolvers, including suggestions like How return JSON object from DynamoDB with appsync?, but we're completely stuck at this point.
Our current response resolver for our query has been reverted back to the standard response after none of the suggestions in the aforementioned post worked:
## 'Before' response mapping template on genericItemQuery query; same result as the 'After' listed below
#set($result = $ctx.result)
#set($result.data = $util.parseJson($ctx.result.data))
$util.toJson($result)

## 'After' response mapping template
$util.toJson($ctx.result)
We're trying to avoid a situation where we need to include supporting types for each nest level in data (since it changes based on parent Item type and in cases like the example I gave it can have three or four tiers), and we thought changing the schema type to AWSJSON! would do the trick. I'm beginning to worry there's no way to get around rebuilding our base schema, though. Any suggestions to the contrary would be helpful!
P.S. I've noticed in the CloudWatch logs that the appropriate JSON response exists under the context.result.data response field, but somehow there's the following transformedTemplate (which, again, I find very unusual considering we're not applying any mapping template except to transform the result into valid JSON):
"arn": ...
"transformedTemplate": "{data={tourist={reqs={sourceURL=https://travel.state.gov/content/passports/en/country/brazil.html, visaFree=false, type=eVisa required, stayLimit=30 days from date of entry}, pages=One page per stamp required}}, resIds=USA:BRA, cardType=visa, id=info/en/usa/bra/visa}",
"context": ...
Apologies for the lengthy question, but I'm stumped.
AWSJSON is a JSON string type, so you will always get back a string value (this is what your type definition must adhere to).
You could try to make a type for the data field that contains all possible fields and then resolve only the fields corresponding to the parent type, or alternatively you could try to implement GraphQL interfaces.
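A rough sketch of the interface idea, with hypothetical type names derived from your sample data (you would add one concrete type per cardType and resolve the right one at runtime):

interface ItemData {
  sourceURL: String
}

type VisaData implements ItemData {
  sourceURL: String
  visaFree: Boolean
  stayLimit: String
  pages: String
}

type Item {
  name: String!
  country: String!
  cardType: String!
  data: ItemData
}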

Translating JSON values using io.circe

I have a function in Scala that translates a value and produces a string:
strOut = translate(strIn)
Suppose the following JSON object:
{
  "id": "c730433b-082c-4984-3d56-855c243265f0",
  "standard": "stda",
  "timestamp": "tsx000",
  "stdparms": {
    "stdparam1": "a",
    "stdparam2": "b"
  }
}
and the following mapping provided by the translation function:
"stda" -> "stdb"
"tsx000" -> "tsy000"
"a" -> "f"
"b" -> "g"
What is the best way to translate the whole JSON object using the translate function? My goal is to obtain the following result:
{
  "id": "c730433b-082c-4984-3d56-855c243265f0",
  "standard": "stdb",
  "timestamp": "tsy000",
  "stdparms": {
    "stdparam1": "f",
    "stdparam2": "g"
  }
}
I must use the io.circe library due to project-related matters.
If you know beforehand which fields you want to translate, or what translations apply to each field, you can use cursors to traverse the JSON tree. Alternatively, if the fields themselves are fixed (you always know what fields to expect), optics may require less code. When you get to the right leaf, you apply the translation.
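For instance, here is a minimal sketch of the cursor route, with a stand-in translate function built from the mapping in the question:

import io.circe.Json
import io.circe.parser.parse

// Stand-in for the question's translation function (assumption).
val translate: String => String =
  Map("stda" -> "stdb", "tsx000" -> "tsy000", "a" -> "f", "b" -> "g")
    .withDefault(identity)

val input =
  """{
    |  "standard": "stda",
    |  "timestamp": "tsx000",
    |  "stdparms": { "stdparam1": "a", "stdparam2": "b" }
    |}""".stripMargin

// Walk to each known leaf with a cursor and rewrite the string in place.
val result: Option[Json] = parse(input).toOption.flatMap { doc =>
  doc.hcursor
    .downField("standard").withFocus(_.mapString(translate)).up
    .downField("timestamp").withFocus(_.mapString(translate)).up
    .downField("stdparms")
    .downField("stdparam1").withFocus(_.mapString(translate)).up
    .downField("stdparam2").withFocus(_.mapString(translate))
    .top
}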
However, when you don't know what could apply when or where, it might be easier to find/replace using string methods.
Note, by the way, that the JSON you provided as an example is not valid JSON.