How to convert JSON to GraphQL mutation format [duplicate]

I have a JSON object and I want to convert it into a GraphQL mutation query in order to invoke a GraphQL API.
My requirement is to create a Spring Boot project and expose a POST API that takes JSON as input, converts it into the GraphQL mutation query format,
and then consumes a GraphQL API with that mutation query. Can anyone please provide any pointers, as I am not able to proceed?

Pass your object as a variable. You can put it, in JSON format, in the GraphQL variables definitions.
If you look at the official tutorial on mutations, the query passes the ReviewInput as a variable, $review:
mutation CreateReviewForEpisode($ep: Episode!, $review: ReviewInput!) {
  createReview(episode: $ep, review: $review) {
    stars
    commentary
  }
}
Since the ReviewInput is, at least in my use case, a standard object from the model, used for both REST and GraphQL, the JSON can be pasted directly into the corresponding review field of the GraphQL variables:
{
  "ep": "JEDI",
  "review": {
    "stars": 5,
    "commentary": "This is a great movie!"
  }
}
{ "stars": 5, "commentary": "This is a great movie!" } being your JSON.
So, in short, no conversion is required, you just need your API to put the JSON in the correct mutation template.
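As a rough illustration of that last point, a Spring Boot controller can simply wrap the incoming JSON into the { query, variables } body that GraphQL servers accept over HTTP. This is a minimal sketch, assuming Jackson and RestTemplate; the endpoint path, target URL, and reuse of the CreateReviewForEpisode mutation are illustrative assumptions, not part of the original answer:

import java.util.Map;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
public class ReviewController {

    // The mutation template; only the variables change per request.
    private static final String MUTATION =
        "mutation CreateReviewForEpisode($ep: Episode!, $review: ReviewInput!) {"
        + " createReview(episode: $ep, review: $review) { stars commentary } }";

    private final RestTemplate restTemplate = new RestTemplate();

    @PostMapping("/reviews")
    public ResponseEntity<String> createReview(@RequestBody Map<String, Object> review) {
        // GraphQL over HTTP is a POST of {query, variables};
        // the incoming JSON goes into the variables untouched.
        Map<String, Object> body = Map.of(
            "query", MUTATION,
            "variables", Map.of("ep", "JEDI", "review", review));
        // Hypothetical GraphQL endpoint URL.
        return restTemplate.postForEntity("https://example.com/graphql", body, String.class);
    }
}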

Related

Read JSON data values

For example, if the data in the Kafka topic looks like this:
{
  "header": {
    "name": "jake"
  },
  "body": {
    "Data": "!#$%&&"
  }
}
So how do I read the value "!#$%&&" from my consumer application? I need to process the data once I receive it.
You'll need to consume the data using a String Serde, a JSON Serde, or one you define yourself.
If you define your own, you'd call value.getBody().getData() like on any other Java object, where value is the argument to mapValues, peek, filter, etc. in the Kafka Streams DSL.
For the others, the answer depends on which JSON library you're using, but it isn't unique to Kafka, so read that library's documentation on parsing strings.
Here's one example of consuming using the String Serde: https://github.com/confluentinc/kafka-streams-examples/blob/7.1.1-post/src/main/java/io/confluent/examples/streams/JsonToAvroExample.java#L118
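To make the String-Serde route concrete, here is a minimal sketch of a Kafka Streams topology that parses each record value with Jackson and pulls out body.Data. The topic name, application id, and bootstrap server are hypothetical:

import java.util.Properties;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;

public class JsonBodyReader {
    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        StreamsBuilder builder = new StreamsBuilder();

        builder.stream("my-topic", Consumed.with(Serdes.String(), Serdes.String()))
            // Parse each record value as JSON and pull out body.Data.
            .mapValues(value -> {
                try {
                    return mapper.readTree(value).path("body").path("Data").asText();
                } catch (Exception e) {
                    return null; // skip malformed records in this sketch
                }
            })
            .foreach((key, data) -> System.out.println("Data = " + data));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "json-body-reader");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        new KafkaStreams(builder.build(), props).start();
    }
}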

Sending raw JSON to GraphQL mutation

I need to upload, through a GraphQL mutation, an object whose schema is not known at design time. Basically it is a raw JSON blob that may or may not come from other, possibly non-GraphQL, services. I am using the JSON scalar from https://github.com/graphql-java/graphql-java-extended-scalars:
scalar JSON

input MyInput {
  jsonField: JSON
}
It works nicely except when the input blob contains keys like "$ref", which is the standard way to describe references in JSON. The field name $ref does not seem to be accepted, as it does not conform to GraphQL naming conventions.
Obvious solutions appear to be to encode and decode field names in some way, or to send the whole JSON as a String, perhaps with a custom scalar type. But is there perhaps a more elegant way to achieve this?
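For reference, the "send the whole JSON as a String" workaround mentioned in the question could look like the following with graphql-java and Jackson. This is a minimal sketch under the assumption that the schema declares jsonField: String; the argument name and fetcher are hypothetical:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import graphql.schema.DataFetcher;

public class RawJsonField {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // With `jsonField: String` in the SDL, keys like "$ref" never go through
    // GraphQL name validation; they only exist inside the parsed string.
    public static final DataFetcher<String> SAVE_BLOB = env -> {
        String raw = env.getArgument("jsonField"); // hypothetical argument name
        JsonNode blob = MAPPER.readTree(raw);      // "$ref" is just a map key here
        return blob.path("$ref").asText(null);     // e.g. read a reference back out
    };
}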

MongoDB Realm convert BSON to JSON

I am using MongoDB Realm webhooks to allow my React app to fetch data from my MongoDB database. Everything works fine; however, the data I receive is not plain JSON, in the sense that every integer field in the object has an additional property showing its data type. To clarify, here is an example of the data that is returned:
stats: {
  draws: { $numberInt: "1" },
  games: { $numberInt: "271" },
  goals: { $numberInt: "417" },
  losses: { $numberInt: "23" }
}
Is there a way where I am able to parse the data so it can be formatted without the datatype? For example:
stats: {
  draws: 1,
  games: 271,
  goals: 417,
  losses: 23
}
This would improve code readability on my frontend. I've tried to flatten the object manually using an object-flatten function, but I am dealing with a large nested object, so it is difficult.
Try BSON.Binary.fromText or similar functions. There's not a lot of documentation for the BSON utilities.
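The shape shown above is MongoDB Extended JSON, so another generic option is a recursive walk that unwraps the type markers. The asker's frontend is JavaScript, but the transformation is the same in any language; below is a minimal sketch in Java with Jackson, handling only $numberInt:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.JsonNodeFactory;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class ExtendedJsonFlattener {

    // Recursively replaces {"$numberInt": "1"} wrappers with plain numbers.
    // Only $numberInt is handled here; other wrappers ($numberLong, $date, ...)
    // would need analogous branches.
    public static JsonNode flatten(JsonNode node) {
        if (node.isObject()) {
            if (node.size() == 1 && node.has("$numberInt")) {
                return JsonNodeFactory.instance.numberNode(
                    Integer.parseInt(node.get("$numberInt").asText()));
            }
            ObjectNode out = JsonNodeFactory.instance.objectNode();
            node.fields().forEachRemaining(e -> out.set(e.getKey(), flatten(e.getValue())));
            return out;
        }
        if (node.isArray()) {
            ArrayNode out = JsonNodeFactory.instance.arrayNode();
            node.forEach(child -> out.add(flatten(child)));
            return out;
        }
        return node;
    }
}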

Can Loggly parse (derive values from) a JSON array (received using HTTP/S Bulk Endpoint)

I have a Loggly free trial account set up and receiving events.
The JSON is an array of objects, delivered over a webhook HTTPS POST. Each object is an event (as would seem reasonable for a bulk-load interface). Simplified example:
[
  {
    "msys": {
      "track_event": {
        "event_id": "319115158633969504",
        "friendly_from": "traffic.gen#example.com"
      }
    }
  },
  {
    "msys": {
      "track_event": {
        "event_id": "319115158633970211",
        "friendly_from": "traffic2.gen#example.com"
      }
    }
  }
]
Can Loggly parse the contents of the JSON and extract values (e.g. event_id)? I've tried using the "Create Derived Fields" dialog, but this seems to be regex/line-based rather than JSON-aware.
Logically, I thought passing a JSON array would work; however, after much trial and error, I found that if you pass multiple separate JSON objects separated by a newline character, they get parsed as separate events:
curl -H "content-type:application/json" -d $'{"timestamp":"2018-09-02T17:16:15.003123Z", "message":"test1"}\n{"timestamp":"2018-09-02T17:17:15.003123Z", "message":"test2"}' http://logs-01.loggly.com/bulk/<your-token>
A couple of things bother me about this method (e.g. it's not strictly valid application/json you are passing, despite the content type, and it's not a JSON array), but it may get you over the hump. I'm hoping there is a more elegant answer than this.
See the following for more information: Loggly - Bulk Endpoint and Loggly - Automated Parsing
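If you receive the array server-side first, converting it to the newline-delimited form is a one-liner per element. A minimal sketch in Java, assuming Jackson and the JDK 11 HttpClient; the sample payload is made up, and the token comes in as an argument:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LogglyBulkSender {
    public static void main(String[] args) throws Exception {
        String token = args[0]; // your Loggly customer token

        JsonNode array = new ObjectMapper().readTree(
            "[{\"msys\":{\"track_event\":{\"event_id\":\"1\"}}},"
            + "{\"msys\":{\"track_event\":{\"event_id\":\"2\"}}}]");

        // Re-serialize each array element onto its own line: the bulk
        // endpoint parses newline-delimited objects as separate events.
        String body = StreamSupport.stream(array.spliterator(), false)
            .map(JsonNode::toString)
            .collect(Collectors.joining("\n"));

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://logs-01.loggly.com/bulk/" + token))
            .header("content-type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
    }
}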

Dojo reading JSON data without an Identifier

I'm trying to parse JSON from a REST service. This service does not put the data into the format that I think ItemFileReadStore wants, but I cannot change it. Everything I have found in the Dojo library for reading JSON data requires an identifier, which my data does not have. This is the JSON data:
{"ChannelReadResponse":[
{"Event": {"#entityOrigin":"System","#entityId":"0x080e00000136ad8986520af104608052","Name":"Untitled","SymbolCode":"OHVPEV---------","TimeObserved":"2012-04-13T21:09:49.207Z","CreatedUser":"Helpdesk","ModifiedUser":"Helpdesk","CreatedTime":"2012-04-13T21:09:49.207Z","ModifiedTime":"2012-04-17T15:51:12.496Z"},
{"#entityOrigin":"System","#entityId":"0x080e00000136bb54ec770af104608028","Name":"My Event","SymbolCode":"OHVPE----------","Severity":"SIGACT","Outcome":"Effective","TimeObserved":"2012-04-16T14:34:29.796Z","CreatedUser":"Helpdesk","ModifiedUser":"Helpdesk","CreatedTime":"2012-04-16T14:34:29.796Z","ModifiedTime":"2012-04-17T15:50:52.499Z"}
]
,"Channel":{"#writable":"false","#connected":"true","#entityId":"0x080e00000136ad8500760af104608064","Name":"Ozone",
"Members":{"Member":[{"#entityOrigin":"System","#entityRef":"0x080e00000136ad8986520af104608052"},{"#entityOrigin":"System","#entityRef":"0x080e00000136bb54ec770af104608028"}]
}}},
{"Event": {"#entityOrigin":"System","#entityId":"0x080e00000136bc3c92d80af104608042","Name":"From2","SymbolCode":"OHVPE----------","TimeObserved":"2012-04-16T19:43:03.150Z","CreatedUser":"Helpdesk","ModifiedUser":"Helpdesk","CreatedTime":"2012-04-16T19:43:03.150Z","ModifiedTime":"2012-04-16T19:43:03.150Z"},
"Channel": {"#writable":"false","#connected":"true","#entityId":"0x080e00000136bc3c92d80af104608034","Name":"Ozone2",
"Members":{"Member":{"#entityOrigin":"System","#entityRef":"0x080e00000136bc3c92d80af104608042"}}}
]}
]}
Is there any way to work with this data? I specifically want all the Events out of it.
Just massage it into the form that the store wants. For example, if you get the data back in a variable called data, you could simply do:
var json = {
  identifier: "#entityId",
  items: data
};
Then just use the json object in the store.
I can only think of converting your JSON data to a JavaScript object literal, adding the ID and Name to it, and then converting it back into JSON before passing it to your Dojo store.
I have faced a similar issue, but I had the luxury of changing my service to return JSON with an identifier and name. I haven't tried what I wrote above.