How to cast an `_InternalLinkedHashMap` created from JSON?

Sometimes, when traversing complex JSON files in Dart, it would be nice if we could tell our editor what the expected structure is, so that we can make the best use of the editor's intelligent code completion features.
As a toy example, consider the script writer.dart:
import 'dart:convert' show json;

main() {
  Map<String, num> myMap = {"a": 1, "b": 2, "c": 3};
  print(json.encode(myMap));
}
Let's say we use writer.dart to create a json file:
dart writer.dart > data.json
We have another script, called reader.dart, that will read this file and ideally interpret the data as a Map<String, num> instance:
import 'dart:io' show File;
import 'dart:convert' show json;

main() async {
  Map<String, num> myMap = json.decode(await File("data.json").readAsString());
}
This script, however, throws an exception: type '_InternalLinkedHashMap<String, dynamic>' is not a subtype of type 'Map<String, num>'.
Naive attempts such as the following also don't work:
var myMap = json.decode(await File("data.json").readAsString()) as Map<String, num>;
Of course we could do something like:
var myMap = Map<String, num>();
(json.decode(await File("data.json").readAsString()) as Map).forEach((k, v) {
  myMap[k] = v;
});
But that's really ugly!
What is the best way to let the editor know what data structure to expect when parsing json?

The map returned by json.decode is a mutable Map<String, dynamic> (hence the _InternalLinkedHashMap<String, dynamic> in the error). In this case, you know that the values are all num instances, but the JSON decoder didn't know that. Even if it had, since the map is mutable, the decoder has to assume someone might want to add any kind of value to it later.
There are two basic ways to get a map with a stricter type from another map: Copying to a new map (and checking the stricter type while copying), or wrapping/adapting the original map (and checking the stricter type on every access).
You create a new map of whatever type you want by:
Map<String, num> newMap = Map<String, num>.from(originalMap);
The argument to Map.from is a Map<dynamic, dynamic>, so it really accepts any map, and it then copies each entry into a new map with the requested type.
If you change the original map after making the copy, the copy is unaffected.
The other option is to wrap using:
Map<String, num> wrappedMap = originalMap.cast<String, num>();
This does nothing up-front. Every time you try to look something up in the wrapped map, it checks the argument type and return value type against the types you asked for. If you change the original map, the wrapped map changes as well (which is another reason it can't just check the types once, it has to check every time in case the value has changed).
The two approaches have different efficiency trade-offs. Copying everything takes space and time. Checking types takes time too. If you plan to use the same map over and over again, or to modify it, then creating a new map is probably more efficient. If not, Map.cast is easier to use.
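As a minimal sketch contrasting the two (my own example, with inline data standing in for data.json):

import 'dart:convert' show json;

void main() {
  final decoded = json.decode('{"a": 1, "b": 2}') as Map<String, dynamic>;

  // Copy: every entry is type-checked once; the result is independent.
  final copied = Map<String, num>.from(decoded);

  // Wrap: no up-front work; every access is checked; changes show through.
  final wrapped = decoded.cast<String, num>();

  decoded['c'] = 3;
  print(copied.containsKey('c')); // false - the copy is unaffected
  print(wrapped['c']);            // 3 - the wrapped view sees the change
}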

(Although this answer seems obvious in retrospect, I was struggling with this for a while and only figured it out while composing the above question on this site. Just in case it is useful to anyone else, I went ahead and posted the question anyway.)
Just use the from constructor!
var myMap = Map<String, num>.from(json.decode(await File("data.json").readAsString()));

Related

How to split the data of NodeObject in Apache Flink

I'm using Flink to process the data coming from some data source (such as Kafka, Pravega etc).
In my case, the data source is Pravega, which provided me a flink connector.
My data source is sending me some JSON data as below:
{"key": "value"}
{"key": "value2"}
{"key": "value3"}
...
...
Here is my piece of code:
PravegaDeserializationSchema<ObjectNode> adapter = new PravegaDeserializationSchema<>(ObjectNode.class, new JavaSerializer<>());
FlinkPravegaReader<ObjectNode> source = FlinkPravegaReader.<ObjectNode>builder()
        .withPravegaConfig(pravegaConfig)
        .forStream(stream)
        .withDeserializationSchema(adapter)
        .build();
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<ObjectNode> dataStream = env.addSource(source).name("Pravega Stream");

dataStream.map(new MapFunction<ObjectNode, String>() {
        @Override
        public String map(ObjectNode node) throws Exception {
            return node.toString();
        }
    })
    .keyBy("word") // ERROR
    .timeWindow(Time.seconds(10))
    .sum("count");
As you see, I used the FlinkPravegaReader and a proper deserializer to get the JSON stream coming from Pravega.
Then I try to transform the JSON data into a String, KeyBy them and count them.
However, I get an error:
The program finished with the following exception:
Field expression must be equal to '*' or '_' for non-composite types.
org.apache.flink.api.common.operators.Keys$ExpressionKeys.<init>(Keys.java:342)
org.apache.flink.streaming.api.datastream.DataStream.keyBy(DataStream.java:340)
myflink.StreamingJob.main(StreamingJob.java:114)
It seems that KeyBy threw this exception.
Well, I'm not a Flink expert, so I don't know why. I've read the source code of the official WordCount example. In that example, there is a custom splitter, which is used to split the String data into words.
So I'm thinking if I need to use some kind of splitter in this case too? If so, what kind of splitter should I use? Can you show me an example? If not, why did I get such an error and how to solve it?
I guess you have read the documentation on how to specify keys:
Specify keys
The example code uses keyBy("word") because word is a field of the POJO type WC.
// some ordinary POJO (Plain Old Java Object)
public class WC {
    public String word;
    public int count;
}
DataStream<WC> words = // [...]
DataStream<WC> wordCounts = words.keyBy("word").window(/*window specification*/);
In your case, you put a map operator before keyBy, and the output of that map operator is a String. So there is obviously no word field there. If you actually want to group this String stream, you need to write it like this: .keyBy(String::toString)
Or you can even implement a customized keySelector to generate your own key.
Customized Key Selector
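For instance, here is a minimal sketch of the KeySelector approach, assuming Jackson's ObjectNode from the question and keying on the "key" field from your sample data (adjust the field name to your actual schema):

import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import com.fasterxml.jackson.databind.node.ObjectNode;

// Key the raw ObjectNode stream by the value of its "key" field.
KeyedStream<ObjectNode, String> keyed = dataStream
        .keyBy(new KeySelector<ObjectNode, String>() {
            @Override
            public String getKey(ObjectNode node) throws Exception {
                return node.get("key").asText();
            }
        });

With a KeySelector, the key type is explicit, so the field-expression restriction from the error no longer applies.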

How can I get JSON.parse result covered by the Flow type checker?

I'm new to working with flow. I'm trying to get as close to 100% flow coverage as possible on a project, and one thing I can't figure out is how to handle JSON.parse.
type ExampleType = {
  thingOne: boolean,
  thingTwo: boolean,
};

const exampleVariable: ExampleType = JSON.parse(
  '{"thingOne": true, "thingTwo": false}'
);
So I have a type, and I receive a string from another source, and I parse it and expect it to be of that type.
The whole JSON.parse(...) section is marked as "Not covered by flow".
Is there a way to get a file to 100% flow coverage if JSON.parse is used in that file? How? What exactly is flow saying when it says that line isn't covered?
The problem is that JSON.parse returns any. Here is the signature:
static parse(text: string, reviver?: (key: any, value: any) => any): any;
Flow cannot guarantee that the assignment of the parse result to the type ExampleType is correct because who knows what will come out when you parse incoming JSON?
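To make the gap concrete, here is a small sketch (my addition, not from the original question) of an assignment Flow happily accepts even though it is wrong at runtime:

// Flow raises no error here, because `any` is assignable to anything:
const bad: ExampleType = JSON.parse('"just a string"');
// At runtime, `bad` is a plain string, not an ExampleType.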
But you can get coverage to 100% if you parse with flow-validator instead. As far as Flow knows, a string you parse could have come from anywhere, so there is no static guarantee that the JSON data in the string has the shape that you expect. What flow-validator does is provide an API to describe a validation schema for your data instead of a type. The schema is checked at runtime while parsing. flow-validator automatically generates a static type from your schema, and assigns the result of a successful parse to that type. Here is what your example looks like with flow-validator:
import { boolean, object } from "flow-validator"

const ExampleSchema = object({
  thingOne: boolean,
  thingTwo: boolean
})

const exampleVariable = ExampleSchema.parse(
  '{"thingOne": true, "thingTwo": false}'
)
You can check and see that Flow infers the correct type for exampleVariable, and your Flow coverage is now at 100%. If the JSON data does not have the correct shape then ExampleSchema.parse will throw an error.
You can get a type from the schema like this:
type ExampleType = typeof ExampleSchema.type
This version of ExampleType is just like the one in your original example. Extracting a type automatically saves you from having to write the shape for your data structure twice, and it also guarantees that the static type stays in sync with the runtime validation schema.

JSON.parse and JSON.stringify are not idempotent and that is bad

This question is multi-part:
(1a) JSON is fundamental to JavaScript, so why is there no JSON type? A JSON type would be a string that is formatted as JSON. It would be marked as parsed/stringified until the data was altered. As soon as the data was altered it would not be marked as JSON and would need to be re-parsed/re-stringified.
(1b) In some software systems, isn't it possible to (accidentally) attempt to send a plain JS object over the network instead of a serialized JS object? Why not make an attempt to avoid that?
(1c) Why can't we call JSON.parse on a straight up JavaScript object without stringifying it first?
var json = { // JS object in proper JSON format
  "baz": {
    "1": 1,
    "2": true,
    "3": {}
  }
};
var json0 = JSON.parse(json); //will throw a parse error...bad...it should not throw an error if json var is actually proper JSON.
So we have no choice but to do this:
var json0= JSON.parse(JSON.stringify(json));
However, there are some inconsistencies, for example:
JSON.parse(true); //works
JSON.parse(null); //works
JSON.parse({}); //throws error
(2) If we keep calling JSON.parse on the same object, eventually it will throw an error. For example:
var json = { // same object as above
  "baz": {
    "1": 1,
    "2": true,
    "3": {}
  }
};
var json1 = JSON.parse(JSON.stringify(json));
var json2 = JSON.parse(json1); //throws an error...why
(3) Why does JSON.stringify infinitely add more and more slashes to the input? Not only is the result hard to read when debugging, it actually puts you in a dangerous state: one JSON.parse call won't give you back the plain JS object, you have to call JSON.parse several times to get it back. This is bad, and it means it is quite dangerous to call JSON.stringify more than once on a given JS object.
var json = {
  "baz": {
    "1": 1,
    "2": true,
    "3": {}
  }
};
var json2 = JSON.stringify(json);
console.log(json2);
var json3 = JSON.stringify(json2);
console.log(json3);
var json4 = JSON.stringify(json3);
console.log(json4);
var json5 = JSON.stringify(json4);
console.log(json5);
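For reference, the successive console.log calls print the following (output reconstructed by running the snippet, not part of the original post). Each stringify pass treats the previous result as an ordinary string and re-escapes its quotes and backslashes:

{"baz":{"1":1,"2":true,"3":{}}}
"{\"baz\":{\"1\":1,\"2\":true,\"3\":{}}}"
"\"{\\\"baz\\\":{\\\"1\\\":1,\\\"2\\\":true,\\\"3\\\":{}}}\""

json5 continues the pattern, roughly doubling the backslashes each time.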
(4) What is the name for a function that we should be able to call over and over without changing the result (IMO how JSON.parse and JSON.stringify should behave)? The best term for this seems to be "idempotent" as you can see in the comments.
(5) Considering JSON is a serialization format that can be used for networked objects, it seems totally insane that you can't call JSON.parse or JSON.stringify twice or even once in some cases without incurring some problems. Why is this the case?
If you are someone who is inventing the next serialization format for Java, JavaScript or whatever language, please consider this problem.
IMO there should be two states for a given object: a serialized state and a deserialized state. In software languages with stronger type systems, this isn't usually a problem. But with JSON in JavaScript, if we call JSON.parse twice on the same data, we run into fatal exceptions. Likewise, if we call JSON.stringify twice on the same object, we can get into an unrecoverable state. Like I said, there should be two states and two states only: plain JS object and serialized JS object.
1) JSON.parse expects a string; you are feeding it a JavaScript object.
2) Similar issue to the first one. You feed an object to a function that needs a string.
3) Stringify works on strings too. When you feed it a string that is already JSON, it escapes the quotes and slashes again, just as it would for any other string, so that the quotes and other special characters inside the string survive as string content.
4) You can write your own function for this.
5) Because you are trying to do a conversion that is illegal. This is related to the first and second questions. As long as the correct types are fed in, you can call it as many times as you want. The only problem is the extra slashes, but that is in fact the standard behavior.
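As a side note on points 1 and 5 (my addition, not part of the original answer): JSON.parse coerces its argument to a string before parsing, which explains the apparent inconsistencies from the question:

JSON.parse(true); // true - String(true) is 'true', which is valid JSON
JSON.parse(null); // null - String(null) is 'null', also valid JSON
JSON.parse({});   // SyntaxError - String({}) is '[object Object]', not valid JSON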
We'll start with this nightmare of your creation: string input and integer output.
IJSON.parse(IJSON.stringify("5")); //=> 5
The built-in JSON functions would not fail us this way: string input and string output.
JSON.parse(JSON.stringify("5")); //=> "5"
JSON must preserve your original data types
Think of JSON.stringify as a function that wraps your data up in a box, and JSON.parse as the function that takes it out of a box.
Consider the following:
var a = JSON.stringify;
var b = JSON.parse;
var data = "whatever";
b(a(data)) === data; // true
b(b(a(a(data)))) === data; // true
b(b(b(a(a(a(data)))))) === data; // true
That is, if we put the data in 3 boxes, we have to take it out of 3 boxes. Right?
If I put my data in 2 boxes and take it out of 1, I'm not holding my data yet, I'm holding a box that contains my data. Right?
b(a(a(data))) === data; // false
Seems sane to me...
1. JSON.parse unboxes your data. If it is not boxed, it cannot unbox it. JSON.parse expects a string input, and you're giving it a JavaScript object literal.
2. The first valid call to JSON.parse would return an object. Calling JSON.parse again on that object output would result in the same failure as #1.
3. Repeated calls to JSON.stringify will "box" our data multiple times. So of course you have to use repeated calls to JSON.parse to get your data out of each "box".
Idempotence
No, this is perfectly sane. You can't triple-stamp a double-stamp.
You'd never make a mistake like this, would you?
var json = IJSON.stringify("hi");
IJSON.parse(json);
//=> "hi"
OK, that's idempotent, but what about
var json = IJSON.stringify("5");
IJSON.parse(json);
//=> 5
UH OH! We gave it a string each time, but the second example returns an integer. The input data type has been lost!
Would the JSON functions have failed us here?
var json = JSON.stringify("hi");
JSON.parse(json);
//=> "hi"
All good. And what about the "5" ?
var json = JSON.stringify("5");
JSON.parse(json);
//=> "5"
Yay, the types have been preserved! JSON works, IJSON does not.
Maybe a more real-life example:
OK, so you have a busy app with a lot of developers working on it. It makes reckless assumptions about the types of your underlying data. Let's say it's a chat app that makes several transformations on messages as they move from point to point.
Along the way you'll have:
1. IJSON.stringify
2. data moves across a network
3. IJSON.parse
4. another IJSON.parse, because who cares? It's idempotent, right?
5. String.prototype.toUpperCase, because this is a formatting choice
Let's see the messages
bob: 'hi'
// 1) '"hi"', 2) <network>, 3) "hi", 4) "hi", 5) "HI"
Bob's message looks fine. Let's see Alice's.
alice: '5'
// 1) '5'
// 2) <network>
// 3) 5
// 4) 5
// 5) Uncaught TypeError: message.toUpperCase is not a function
Oh no! The server just crashed. You'll notice it's not even the repeated calling of IJSON.parse that failed here. It would've failed even if you called it once.
Seems like you were doomed from the start... Damned reckless devs and their careless data handling!
It would fail if Alice used any input that happened to also be valid JSON:
alice: '{"lol":"pwnd"}'
// 1) '{"lol":"pwnd"}'
// 2) <network>
// 3) {lol:"pwnd"}
// 4) {lol:"pwnd"}
// 5) Uncaught TypeError: message.toUpperCase is not a function
OK, maybe an unfair example, right? You're thinking, "I'm not that reckless, I wouldn't call IJSON.stringify or IJSON.parse on user input like that!"
It doesn't matter. You've fundamentally broken JSON because the original types can no longer be extracted.
If I box up a string using IJSON, and then unbox it, who knows what I will get back? Certainly not you, and certainly not the developer using your reckless function.
"Will I get a string type back?"
"Will I get an integer?"
"Maybe I'll get an object?"
"Maybe I will get cake. I hope it's cake"
It's impossible to tell!
You're in a whole new world of pain because you've been careless with your data types from the start. Your types are important so start handling them with care.
JSON.stringify expects an object type and JSON.parse expects a string type.
Now do you see the light?
I'll try to give you one reason why JSON.parse cannot be called multiple times on the same data without causing a problem.
You might not know it, but a JSON document does not have to be an object.
This is a valid JSON document:
"some text"
Let's store the representation of this document inside a JavaScript variable:
var JSONDocumentAsString = '"some text"';
and work on it:
var JSONdocument = JSON.parse(JSONDocumentAsString);
JSONdocument === 'some text';
This will cause an error, because this string is not the representation of a JSON document:
JSON.parse(JSONdocument);
// SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data
In this case, how could JSON.parse have guessed that JSONdocument (being a string) was already a parsed JSON document and that it should be returned untouched?
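To spell out the ambiguity (my addition): from JSON.parse's point of view, a plain string and a serialized JSON document are indistinguishable; all it can do is try to parse the characters it is given:

JSON.parse('"some text"'); // 'some text' - a valid JSON string document
JSON.parse('5');           // 5           - a valid JSON number document
JSON.parse('some text');   // SyntaxError - not a JSON document at all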

Return a JsonSlurper result from SoapUI

I'm using SOAPUI to mock out a web-api service, I'm reading the contents of a static json response file, but changing the contents of a couple of the nodes based on what the user has passed through in the request.
I can't create multiple responses as surcharge is calculated from the amount passed.
The toString() method of the object that gets returned by the slurper is replacing { with [, which invalidates my JSON response. I've included the important bits of the code below. Has anyone got a way around this, or is JsonSlurper not the right thing to use here?
def json = slurper.parseText(new File(path).text)

// set the surcharge on the two credit card nodes
// these are being set fine
json.AvailableCardTypeResponse.PaymentCards[0].Surcharge = "${sur_charge}"
json.AvailableCardTypeResponse.PaymentCards[1].Surcharge = "${sur_charge}"

response.setContentType("application/json;charset=utf-8")
response.setContentLength(length)

Tools.readAndWrite(new ByteArrayInputStream(json.toString().getBytes("UTF-8")), length, response.getOutputStream())

return new com.eviware.soapui.impl.wsdl.mock.WsdlMockResult(mockRequest)
You're slurping the json into a List/Map structure, then writing this List/Map structure out.
You need to convert your lists and maps back to json.
Change the line:
Tools.readAndWrite( new ByteArrayInputStream(json.toString().getBytes("UTF-8")), length,response.getOutputStream() )
to:
Tools.readAndWrite( new ByteArrayInputStream( new JsonBuilder( json ).toString().getBytes("UTF-8") ), length, response.getOutputStream() )
(You will need import groovy.json.JsonBuilder at the top of the script.)
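To see why the original code produced [ where { was expected, here is a small standalone sketch (my own, not from the answer): the slurper returns plain Groovy maps and lists, whose toString() uses Groovy's [key:value] notation rather than JSON, while JsonBuilder serializes them back to real JSON:

import groovy.json.JsonBuilder
import groovy.json.JsonSlurper

def parsed = new JsonSlurper().parseText('{"a": [1, 2]}')
println parsed.toString()                  // Groovy map notation, e.g. [a:[1, 2]]
println new JsonBuilder(parsed).toString() // {"a":[1,2]} - valid JSON again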

Map to String/String to Map conversion in Groovy

I have a JSON object that gets passed into a save function as:
{
  "markings": {
    "headMarkings": "Brindle",
    "leftForeMarkings": "",
    "rightForeMarkings": "sock",
    "leftHindMarkings": "sock",
    "rightHindMarkings": "",
    "otherMarkings": ""
  }
}
** EDIT **
The system parses it and passes it to my function as a map. I don't actually have the JSON; although it wouldn't be difficult to build up the JSON myself, it just seems like overkill.
** END EDIT **
The toString() function ends up putting the results into the database as
"[rightForeMarkings:, otherMarkings:, leftForeMarkings:sock, leftHindMarkings:sock, rightHindMarkings:, headMarkings:brindle]"
I then want to save that as a string (fairly easy) by calling
params.markings.toString()
From here, I save the info and return the updated information.
My issue is that since I am storing the object in the DB as a string, I can't seem to get the markings back out as a map (to then be converted to JSON).
I have tried a few different things to no avail, although it is completely possible that I went about something incorrectly with these...
Eval.me(Item.markings)
evaluate(Item.markings)
Item.markings.toList()
Thanks in advance for the help!
Here are my tests.
Using JSON converters in Grails, I think this should be the approach (in agreement with @JamesKleeh and @GrailsGuy):
def json = '''{
    "markings": {
        "headMarkings": "Brindle",
        "leftForeMarkings": "",
        "rightForeMarkings": "sock",
        "leftHindMarkings": "sock",
        "rightHindMarkings": "",
        "otherMarkings": ""
    }
}'''
def jsonObj = grails.converters.JSON.parse(json)
// This is your JSON object that should be passed in to the method
print jsonObj // [markings:[rightForeMarkings:sock, otherMarkings:, leftForeMarkings:, leftHindMarkings:sock, rightHindMarkings:, headMarkings:Brindle]]

def jsonStr = jsonObj.toString()
// This is the string which should be persisted in the db
assert jsonStr == '{"markings":{"rightForeMarkings":"sock","otherMarkings":"","leftForeMarkings":"","leftHindMarkings":"sock","rightHindMarkings":"","headMarkings":"Brindle"}}'

// Get back the JSON object from the JSON string
def getBackJsonObj = grails.converters.JSON.parse(jsonStr)
assert getBackJsonObj.markings.leftHindMarkings == 'sock'
If I understand correctly, you want to convert a String to a JSON object? You can actually bypass converting it to a map, and parse it directly as a JSON object:
import grails.converters.JSON
def json = JSON.parse(Item.markings)
This will give you your entire JSON object, and then you can just reference the values as you would a map.
Edit #2:
So apparently there is no "safe" way to convert that string back to a map without something custom. I would recommend saving the structure in the database as it originally comes in. If you can do that, then all you would need is JSON.parse()
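Putting that together, a minimal sketch (my own; it assumes the incoming map arrives as params.markings and that Item.markings is a String property, as in the question):

import grails.converters.JSON

// Serialize the incoming map to a real JSON string for storage,
// instead of relying on Map.toString():
String jsonString = (params.markings as JSON).toString()
// ... persist jsonString in Item.markings ...

// Later, parse the stored string back into a map-like JSONObject:
def markings = JSON.parse(Item.markings)
assert markings.headMarkings == 'Brindle'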