Preferred way to serialize/deserialize js-joda LocalDate?

We are using js-joda LocalDate to represent various dates in our model and are storing those dates in sessionStorage. Is there a generalized preferred way of storing those dates so that they can serialize/deserialize without adding special code to each object that contains them?
We have been using the standard JSON.stringify / JSON.parse to do this, but since LocalDate converts to an ISO string when stringified, we lose its LocalDate type when we parse it back.
Here's a summary of the problem:
const myObj = { a: "thing", d: LocalDate.parse('2019-01-20') };
const stringified = JSON.stringify(myObj);
const parsed = JSON.parse(stringified);
// this fails because d is no longer a LocalDate
console.log(parsed.d.year());
Our workaround now is that we have custom deserializers for any class that contains a LocalDate, but it seems a little kludgy.
I'm seeking a cleaner solution for this. Perhaps we could write a generalized serializer for LocalDate that outputs the same thing as the %o modifier in console.log?
mydate -> serialize -> "LocalDate { _year: 2019, _month: 1, _day: 20}"
Before we do that, I'm looking to see if this has already been done cleanly or if I'm just missing something obvious.

Answering my own question.
I'm surprised it hasn't come up, but the solution is right there in the definitions of JSON.stringify and JSON.parse.
This post pointed me to the solution when I needed to do the same thing with a Map.
JSON.parse(text[, reviver])
JSON.stringify(value[, replacer[, space]])
I needed to add replacers and revivers to do the custom serialization:
function myReviver(key: string, value: any) {
  if (value === undefined) return undefined;
  if (value === null) return null;
  if (typeof value === 'object') {
    switch (value.dataType) {
      case 'LocalDate':
        return LocalDate.parse(value.value);
      case 'LocalTime':
        return LocalTime.parse(value.value);
      case 'LocalDateTime':
        return LocalDateTime.parse(value.value);
      case 'Period':
        return Period.parse(value.value);
    }
  }
  return value;
}
function myReplacer(key, value) {
  const originalObject = this[key];
  if (originalObject instanceof LocalDate) {
    return {
      dataType: 'LocalDate',
      value: originalObject.toJSON()
    };
  } else if (originalObject instanceof LocalTime) {
    return {
      dataType: 'LocalTime',
      value: originalObject.toJSON()
    };
  } else if (originalObject instanceof LocalDateTime) {
    return {
      dataType: 'LocalDateTime',
      value: originalObject.toJSON()
    };
  } else if (originalObject instanceof Period) {
    return {
      dataType: 'Period',
      value: originalObject.toJSON()
    };
  } else {
    return value;
  }
}
Whenever I call stringify or parse, I add the above functions as their replacer/revivers.
JSON.stringify(mystuff, myReplacer);
JSON.parse(mystuff, myReviver);
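For reference, here's a minimal round-trip sketch using the functions above (the import path is an assumption; adjust to however LocalDate is pulled into your project):
import { LocalDate } from '@js-joda/core'; // assumption: the @js-joda/core entry point

// Serialize: the replacer tags each js-joda value with its dataType.
const original = { a: 'thing', d: LocalDate.parse('2019-01-20') };
const text = JSON.stringify(original, myReplacer);
// -> {"a":"thing","d":{"dataType":"LocalDate","value":"2019-01-20"}}

// Deserialize: the reviver turns the tagged value back into a LocalDate.
const restored = JSON.parse(text, myReviver);
console.log(restored.d.year()); // 2019 -- no longer fails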

Related

How to hint the type of a function I do not control?

When parsing a JSON-formatted string I get a linter error:
let mqttMessage = JSON.parse(message.toString())
// ESLint: Unsafe assignment of an `any` value. (@typescript-eslint/no-unsafe-assignment)
I control the content of message so I would like to tell TS that what comes out of JSON.parse() is actually an Object. How can I do that?
Note: I could silence the warning, but I would like to understand if there is a better way to approach the problem.
The problem is that JSON.parse returns an any type.
That's fair enough, right? TypeScript doesn't know whether it's going to parse out to a string, a number, or an object.
You have a linting rule saying 'Don't allow assigning variables as any'.
So yes, you could coerce the result of your JSON.parse:
type SomeObjectIKnowAbout = {
};
const result = JSON.parse(message.toString()) as SomeObjectIKnowAbout;
What I tend to like doing in this scenario is to create a specific parsing function that asserts at runtime that the object really is the shape you say it is, and does the type casting so you can treat it as that object while you're writing your code.
type SomeObjectIKnowAbout = {
  userId: string;
}

type ToStringable = {
  toString: () => string;
}

function parseMessage(message: ToStringable): SomeObjectIKnowAbout {
  const obj = JSON.parse(message.toString()); // I'm not sure why you are parsing after toStringing tbh.
  if (typeof obj === 'object' && obj.userId && typeof obj.userId === 'string') {
    return obj as SomeObjectIKnowAbout;
  } else {
    throw new Error("message was not a valid SomeObjectIKnowAbout");
  }
}
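A hedged usage sketch of that parsing function (the payload and error handling are illustrative, not from the original post):
// Anything with a toString() that yields JSON will do; an MQTT message buffer qualifies.
const incoming: ToStringable = { toString: () => '{"userId":"abc-123"}' };

try {
  const parsed = parseMessage(incoming); // typed as SomeObjectIKnowAbout
  console.log(parsed.userId);            // safe: the runtime check already passed
} catch (e) {
  console.error('message did not have the expected shape', e);
}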
JSON.parse isn't generic, so we can't supply a generic argument to do it.
You have a couple of options.
The simple thing is that since JSON.parse returns any, you can just define the type of what you're assigning it to:
let mqttMessage: MQTTMessage = JSON.parse(message.toString());
(I've used MQTTMessage as a stand-in for the appropriate type.)
That may not be typesafe enough for everyone, though, since it makes the assumption that the string defines what you expect it to define. And it has the problem that if you do it elsewhere, you repeat the assumption.
Instead, you could define a function:
function parseMQTTMessageJSON(json: string): MQTTMessage {
  const x: object = JSON.parse(json);
  if (x && /*...appropriate checks for properties here...*/ "someProp" in x) {
    return x as MQTTMessage;
  }
  throw new Error(`Incorrect JSON for 'MQTTMessage' type`);
}
Then your code is:
let mqttMessage = parseMQTTMessageJSON(message.toString());
As an alternative to type assertions and runtime wrapper functions, you can utilize declaration merging to augment the global JSON object with a generic overload for the parse method. This will allow you to pass through the expected type and give you improved IntelliSense in case you use a reviver when parsing:
interface JSON {
  parse<T = unknown>(text: string, reviver?: (this: any, key: keyof T & string, value: T[keyof T]) => unknown): T
}

type Test = { a: 1, b: "", c: false };

const { a, b, c } = JSON.parse<Test>(
  "{\"a\":1,\"b\":\"\",\"c\":false}",
  // k is "a" | "b" | "c", v is false | "" | 1
  (k, v) => v
);
Or, if you are relying on declaration files to augment global interfaces:
declare global {
  interface JSON {
    parse<T = unknown>(text: string, reviver?: (this: any, key: keyof T & string, value: T[keyof T]) => unknown): T
  }
}

How do I verify a json object loaded into a typescript class is correct?

I want to make sure the JSON I load to my typescript classes is valid.
Also, my classes have some logic, so I want them to remain classes and not become interfaces.
I also need type checks and required/not required checks.
Right now I just load an object to my constructor and manually check each field.
Is there a way to do it by using the type information from typescript?
I tried looking at the generated javascript files but it's just javascript and the type information is already gone there.
Perhaps there's a way to use it at compile time? At that point, TypeScript still knows the type.
I found a few libraries that do this, but none of them checks both the type of a property and whether it is required.
class MyObject {
  constructor(json: any) {
    if (typeof json.id == 'number') {
      this.id = json.id;
    } else {
      throw new Error('ID is required');
    }
    if (json.name != undefined) {
      if (typeof json.name == 'string') {
        this.name = json.name;
      } else {
        throw new Error('Name exists, but it is not a string!');
      }
    }
  }
  id: number; // required
  name?: string; // optional
}
try {
  let m1: MyObject = {
    id: 123
  }; // works but m1 is not a class
  let isMyObject = m1 instanceof MyObject ? '' : 'not ';
  console.log(`m1 is ${isMyObject}instance of MyObject`);
  let m2 = new MyObject({
    "id": 123
  });
  let m3 = new MyObject({
    "id": 123,
    "name": "Mickey"
  });
  let m4 = new MyObject({
    // this will throw an error
  });
  console.log('OK!');
} catch (error) {
  console.log("Error: " + error.message);
}

Can I define a GraphQL field to be any valid json? [duplicate]

Is it possible to specify that a field in GraphQL should be a blackbox, similar to how Flow has an "any" type? I have a field in my schema that should be able to accept any arbitrary value, which could be a String, Boolean, Object, Array, etc.
I've come up with a middle-ground solution. Rather than trying to push this complexity onto GraphQL, I'm opting to just use the String type and JSON.stringifying my data before setting it on the field. So everything gets stringified, and later in my application when I need to consume this field, I JSON.parse the result to get back the desired object/array/boolean/ etc.
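A minimal sketch of that middle ground, assuming a hypothetical String field named payload on the GraphQL type:
// Producer side: collapse arbitrary data into the String field before sending.
const variables = {
  payload: JSON.stringify({ flags: [true, false], count: 42, label: 'anything' }),
};

// Consumer side: parse the field back out when the query result arrives.
const fieldFromResponse = '{"flags":[true,false],"count":42,"label":"anything"}';
const data = JSON.parse(fieldFromResponse); // back to the original object/array/boolean/etc.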
@mpen's answer is great, but I opted for a more compact solution:
const { GraphQLScalarType } = require('graphql')
const { Kind } = require('graphql/language')

const ObjectScalarType = new GraphQLScalarType({
  name: 'Object',
  description: 'Arbitrary object',
  parseValue: (value) => {
    return typeof value === 'object' ? value
      : typeof value === 'string' ? JSON.parse(value)
      : null
  },
  serialize: (value) => {
    return typeof value === 'object' ? value
      : typeof value === 'string' ? JSON.parse(value)
      : null
  },
  parseLiteral: (ast) => {
    switch (ast.kind) {
      case Kind.STRING: return JSON.parse(ast.value)
      case Kind.OBJECT: throw new Error(`Not sure what to do with OBJECT for ObjectScalarType`)
      default: return null
    }
  }
})
Then my resolvers look like:
{
  Object: ObjectScalarType,
  RootQuery: ...
  RootMutation: ...
}
And my .gql looks like:
scalar Object

type Foo {
  id: ID!
  values: Object!
}
Yes. Just create a new GraphQLScalarType that allows anything.
Here's one I wrote that allows objects. You can extend it a bit to allow more root types.
import {GraphQLScalarType} from 'graphql';
import {Kind} from 'graphql/language';
import {log} from '../debug';
import Json5 from 'json5';

export default new GraphQLScalarType({
  name: "Object",
  description: "Represents an arbitrary object.",
  parseValue: toObject,
  serialize: toObject,
  parseLiteral(ast) {
    switch (ast.kind) {
      case Kind.STRING:
        return ast.value.charAt(0) === '{' ? Json5.parse(ast.value) : null;
      case Kind.OBJECT:
        return parseObject(ast);
    }
    return null;
  }
});
function toObject(value) {
  if (typeof value === 'object') {
    return value;
  }
  if (typeof value === 'string' && value.charAt(0) === '{') {
    return Json5.parse(value);
  }
  return null;
}

function parseObject(ast) {
  const value = Object.create(null);
  ast.fields.forEach((field) => {
    value[field.name.value] = parseAst(field.value);
  });
  return value;
}

function parseAst(ast) {
  switch (ast.kind) {
    case Kind.STRING:
    case Kind.BOOLEAN:
      return ast.value;
    case Kind.INT:
    case Kind.FLOAT:
      return parseFloat(ast.value);
    case Kind.OBJECT:
      return parseObject(ast);
    case Kind.LIST:
      return ast.values.map(parseAst);
    default:
      return null;
  }
}
For most use cases, you can use a JSON scalar type to achieve this sort of functionality. There are a number of existing libraries you can just import rather than writing your own scalar -- for example, graphql-type-json.
If you need a more fine-tuned approach, then you'll want to write your own scalar type. Here's a simple example that you can start with:
const { GraphQLScalarType, Kind } = require('graphql')

const Anything = new GraphQLScalarType({
  name: 'Anything',
  description: 'Any value.',
  parseValue: (value) => value,
  parseLiteral,
  serialize: (value) => value,
})
function parseLiteral (ast) {
  switch (ast.kind) {
    case Kind.BOOLEAN:
    case Kind.STRING:
      return ast.value
    case Kind.INT:
    case Kind.FLOAT:
      return Number(ast.value)
    case Kind.LIST:
      return ast.values.map(parseLiteral)
    case Kind.OBJECT:
      return ast.fields.reduce((accumulator, field) => {
        accumulator[field.name.value] = parseLiteral(field.value)
        return accumulator
      }, {})
    case Kind.NULL:
      return null
    default:
      throw new Error(`Unexpected kind in parseLiteral: ${ast.kind}`)
  }
}
Note that scalars are used both as outputs (when returned in your response) and as inputs (when used as values for field arguments). The serialize method tells GraphQL how to serialize a value returned in a resolver into the data that's returned in the response. The parseLiteral method tells GraphQL what to do with a literal value that's passed to an argument (like "foo", or 4.2 or [12, 20]). The parseValue method tells GraphQL what to do with the value of a variable that's passed to an argument.
For parseValue and serialize we can just return the value we're given. Because parseLiteral is given an AST node object representing the literal value, we have to do a little bit of work to convert it into the appropriate format.
You can take the above scalar and customize it to your needs by adding validation logic as needed. In any of the three methods, you can throw an error to indicate an invalid value. For example, if we want to allow most values but don't want to serialize functions, we can do something like:
if (typeof value == 'function') {
  throw new TypeError('Cannot serialize a function!')
}
return value
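In context, that check would sit inside the scalar's serialize method; here's a hedged sketch based on the Anything scalar above (the variable name is just for illustration):
const AnythingButFunctions = new GraphQLScalarType({
  name: 'Anything',
  description: 'Any value except functions.',
  parseValue: (value) => value,
  parseLiteral,
  serialize: (value) => {
    // Same check as above: refuse to serialize functions, pass everything else through.
    if (typeof value == 'function') {
      throw new TypeError('Cannot serialize a function!')
    }
    return value
  },
})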
Using the above scalar in your schema is simple. If you're using vanilla GraphQL.js, then use it just like you would any of the other scalar types (GraphQLString, GraphQLInt, etc.) If you're using Apollo, you'll need to include the scalar in your resolver map as well as in your SDL:
const resolvers = {
  ...
  // The property name here must match the name you specified in the constructor
  Anything,
}

const typeDefs = `
  # NOTE: The name here must match the name you specified in the constructor
  scalar Anything

  # the rest of your schema
`
Just send a stringified value via GraphQL and parse it on the other side, e.g. use this wrapper class.
export class Dynamic {
  @Field(type => String)
  private value: string;

  getValue(): any {
    return JSON.parse(this.value);
  }

  setValue(value: any) {
    this.value = JSON.stringify(value);
  }
}
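A hedged usage sketch of that wrapper (assuming the @Field decorator above comes from TypeGraphQL and the class is exposed as a regular object type field):
// Illustrative round trip: arbitrary data travels over GraphQL as a plain string.
const dynamic = new Dynamic();
dynamic.setValue({ nested: [1, 2, 3], flag: true }); // stored internally as a JSON string
// ... later, on the other side ...
const unwrapped = dynamic.getValue(); // { nested: [1, 2, 3], flag: true }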
For a similar problem I've created a schema like this:
"""`MetadataEntry` model"""
type MetadataEntry {
"""Key of the entry"""
key: String!
"""Value of the entry"""
value: String!
}
"""Object with metadata"""
type MyObjectWithMetadata {
"""
... rest of my object fields
"""
"""
Key-value entries that you can attach to an object. This can be useful for
storing additional information about the object in a structured format
"""
metadata: [MetadataEntry!]!
"""Returns value of `MetadataEntry` for given key if it exists"""
metadataValue(
"""`MetadataEntry` key"""
key: String!
): String
}
And my queries can look like this:
query {
  listMyObjects {
    # fetch meta values by key
    meta1Value: metadataValue(key: "meta1")
    meta2Value: metadataValue(key: "meta2")

    # ... or list them all
    metadata {
      key
      value
    }
  }
}
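A hedged resolver sketch backing that metadataValue field (the parent object's shape is an assumption; the field names match the SDL above):
const resolvers = {
  MyObjectWithMetadata: {
    // Look the key up in the same metadata entries the `metadata` field exposes.
    metadataValue: (
      parent: { metadata: { key: string; value: string }[] },
      args: { key: string }
    ): string | null => {
      const entry = parent.metadata.find((m) => m.key === args.key);
      return entry ? entry.value : null;
    },
  },
};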

typeof comparison NOT equal to fails (JAVASCRIPT)

I'm trying to convert any item within a JSON object to a string. JSON.stringify won't work because it doesn't convert the individual values. If it's an object or number, I want the entire item to be a string. How do I test if typeof is NOT a string? I can't figure out why this doesn't work...
if (typeof(value) !== 'string') {
  return String(value);
}
Any insights? Full example below:
var myjson = {
  "current_state": "OPEN",
  "details": "Apdex < .80 for at least 10 min",
  "severity": "WARN",
  "incident_api_url": "https://alerts.newrelic.com/api/explore/applications/incidents/1234",
  "incident_url": "https://alerts.newrelic.com/accounts/99999999999/incidents/1234",
  "owner": "user name",
  "policy_url": "https://alerts.newrelic.com/accounts/99999999999/policies/456",
  "runbook_url": "https://localhost/runbook",
  "policy_name": "APM Apdex policy",
  "condition_id": 987654,
  "condition_name": "My APM Apdex condition name",
  "event_type": "INCIDENT",
  "incident_id": 1234
};
function replacer(key, value) {
  if (typeof(value) !== 'string') {
    return String(value);
  }
  return value;
}
console.log(JSON.stringify(myjson, replacer));
This actually isn't a problem with the typeof comparison.
The replacer function is initially called with an empty key and a value representing the entire JSON object. Since the JSON object is not a string, the first thing your replacer function does is replace the whole JSON object with the string "[object Object]".
To fix this, check that the key does, in fact, exist. Your replacer function will then look like this:
function replacer(key, value) {
  if (key && (typeof(value) !== 'string')) {
    return String(value);
  }
  return value;
}
I have a working fiddle of it here as well.
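A minimal before/after sketch of why the key check matters (the values are illustrative):
const sample = { count: 42, label: 'ok' };

// Without the key check, the very first call (key === "") replaces the whole object:
const badReplacer = (key: string, value: any) =>
  typeof value !== 'string' ? String(value) : value;
console.log(JSON.stringify(sample, badReplacer)); // "[object Object]"

// With the key check, the root call passes through and only leaf values are converted:
const goodReplacer = (key: string, value: any) =>
  key && typeof value !== 'string' ? String(value) : value;
console.log(JSON.stringify(sample, goodReplacer)); // {"count":"42","label":"ok"}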

Can I stop Angular.js’s json filter from excluding properties that start with $?

Angular.js has a handy built-in filter, json, which displays JavaScript objects as nicely formatted JSON.
However, it seems to filter out object properties that begin with $ by default:
Template:
<pre>{{ {'name':'value', 'special':'yes', '$reallyspecial':'Er...'} | json }}</pre>
Displayed:
{
"name": "value",
"special": "yes"
}
http://plnkr.co/edit/oem4HJ9utZMYGVbPkT6N?p=preview
Can I make properties beginning with $ be displayed like other properties?
Basically you can't. It is "hard-coded" into the filter's behaviour.
Nonetheless, it is quite easy to build a custom JSON filter that behaves identically to Angular's built-in one, except that it doesn't filter out properties starting with '$'.
(Scroll further down for sample code and a short demo.)
If you take a look at the 1.2.15 version source code, you will find out that the json filter is defined like this:
function jsonFilter() {
  return function(object) {
    return toJson(object, true);
  };
}
So, it uses the toJson() function (the second parameter (true) means: format my JSON nicely).
So, our next stop is the toJson() function, that looks like this:
function toJson(obj, pretty) {
  if (typeof obj === 'undefined') return undefined;
  return JSON.stringify(obj, toJsonReplacer, pretty ? '  ' : null);
}
This function makes use of the "native" JSON.stringify() function, passing a custom replacer function (toJsonReplacer).
The toJsonReplacer() function handles some special cases: It checks if the key starts with $ and ignores it if it does (this is what we want to change) and it checks if the value is either a Window, a Document or a Scope object (in which case it converts it to a descriptive string in order to avoid "Converting circular structure to JSON" errors).
function toJsonReplacer(key, value) {
  var val = value;

  if (typeof key === 'string' && key.charAt(0) === '$') {
    val = undefined;
  } else if (isWindow(value)) {
    val = '$WINDOW';
  } else if (value && document === value) {
    val = '$DOCUMENT';
  } else if (isScope(value)) {
    val = '$SCOPE';
  }

  return val;
}
For the sake of completeness, the two functions that check for Window and Scope look like this:
function isWindow(obj) {
  return obj && obj.document && obj.location && obj.alert && obj.setInterval;
}

function isScope(obj) {
  return obj && obj.$evalAsync && obj.$watch;
}
Finally, all we need to do is to create a custom filter that uses the exact same code, with the sole difference that our toJsonReplacer() won't filter out properties starting with $.
app.filter('customJson', function () {
  function isWindow(obj) {
    return obj &&
           obj.document &&
           obj.location &&
           obj.alert &&
           obj.setInterval;
  }

  function isScope(obj) {
    return obj &&
           obj.$evalAsync &&
           obj.$watch;
  }

  function toJsonReplacer(key, value) {
    var val = value;
    if (isWindow(value)) {
      val = '$WINDOW';
    } else if (value && (document === value)) {
      val = '$DOCUMENT';
    } else if (isScope(value)) {
      val = '$SCOPE';
    }
    return val;
  }

  function toJson(obj, pretty) {
    if (typeof obj === 'undefined') { return undefined; }
    return JSON.stringify(obj, toJsonReplacer, pretty ? '  ' : null);
  }

  return function(object) {
    return toJson(object, true);
  };
});
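Usage in a template then mirrors the built-in filter (this line just echoes the question's example with the custom filter name swapped in):
<pre>{{ {'name':'value', 'special':'yes', '$reallyspecial':'Er...'} | customJson }}</pre>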
See also this short demo.
* The downside is that your custom JSON filter will not benefit from future improvements or enhancements to Angular's json filter, so you'll have to redefine yours to incorporate any changes. Of course, for such a basic and simple filter, one shouldn't expect frequent or extensive changes, but that doesn't mean there won't be any.