TypeScript: serialize BigInt in JSON

I'm looking for a way to force JSON.stringify to always print BigInts without complaining.
I know it's non-standard, and I know there's a package for that in pure JavaScript, but it doesn't fit my needs. I even know a fix in raw JavaScript: setting BigInt.prototype.toJSON. What I need is some way to override the normal JSON.stringify function globally in my TypeScript code.
I found the following code a year or so ago:
declare global
{
  interface BigIntConstructor
  {
    toJSON: () => BigInt;
  }
}
BigInt.toJSON = function() { return this.toString(); };
on some web page I can't manage to find again. It used to work in another project of mine, but it doesn't seem to work any more. I have no idea why.
No matter what I do to the lines above, if I try to print a JSON containing a BigInt, I get: TypeError: Do not know how to serialize a BigInt.
Any help is appreciated - many thanks in advance.

You could use the replacer argument for JSON.stringify like this:
const obj = {
  foo: 'abc',
  bar: 781,
  qux: 9n
}
JSON.stringify(obj, (_, v) => typeof v === 'bigint' ? v.toString() : v)
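If the same conversion is needed in several places, the replacer can be pulled out into a small helper (a sketch; the bigIntReplacer name is my own, not part of the original answer):
// Hypothetical helper: converts any BigInt values to decimal strings.
const bigIntReplacer = (_key: string, value: unknown): unknown =>
  typeof value === 'bigint' ? value.toString() : value;

JSON.stringify(obj, bigIntReplacer);
// '{"foo":"abc","bar":781,"qux":"9"}'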

This is what you're looking for:
BigInt.prototype.toJSON = function() { return this.toString() }
https://github.com/GoogleChromeLabs/jsbi/issues/30#issuecomment-953187833
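Note that in TypeScript this assignment will not typecheck on its own, because the built-in BigInt interface has no toJSON member. A minimal sketch of the global augmentation that makes it compile (the file name is just an assumption; any module in your project works):
// bigint-to-json.ts (hypothetical file name)
declare global {
  interface BigInt {
    toJSON(): string;
  }
}

// explicit `this` annotation keeps noImplicitThis happy
BigInt.prototype.toJSON = function (this: bigint) {
  return this.toString();
};

export {}; // keeps the file a module so `declare global` is allowed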

I needed to get JSON.stringify to work inside one of my dependencies, so I couldn't use the answer above. Instead, I created a patch.js file:
BigInt.prototype.toJSON = function() {
  return this.toString()
}
Then at the beginning of my TypeScript source I added:
require('./patch.js')
After that, JSON.stringify could handle BigInts without any problems.

Related

Determining the underlying type of a generic Type with TypeScript

Consider the following interface within TypeScript:
interface IApiCall<TResponse> {
  method: string;
  url: string;
}
It is then used within the following method:
const call = <TResponse>(api: IApiCall<TResponse>): TResponse => {
  // call the API via an ajax call
  // on the response, grab the data
  // use JSON.parse(data) to convert it to a JSON object
  return json as TResponse;
};
Now we use this for type safety within our methods so we know what objects are being returned from the API. However, when we are returning a single string from the API, JSON.parse converts the string '12345' into a number, which then breaks further down the line when we try to treat the value as a string and call value.trim(), since it has been turned into a number.
Any ideas on how to solve this so that we are not converting a string into a number?
How can we stop JSON.parse from converting a single string value into a number?
If we keep using JSON.parse, we could check the type of TResponse and compare it against the typeof of the parsed value:
if (typeof (json) !== typeof (TResponse)) ...
However, there doesn't seem to be an obvious way to determine the generic type at runtime.
Question 1: How can we stop JSON.parse() from converting a single string value into a number?
JSON is a text format, so in JSON.parse(x), x needs to be a string. But JSON text represents data of not-necessarily-string types. It sounds like you might be making a category mistake, by confusing a thing with its representation.
If you convert the number 12345 to JSON (JSON.stringify(12345)) you will get the string "12345". If you parse that string (JSON.parse("12345")), you will get the number 12345 back. If you want to get the string "12345", you need to encode it as JSON (JSON.stringify("12345")), which gives the string "\"12345\"". If you parse that (JSON.parse('"12345"')) you will get the string "12345" out.
So the straightforward answer to the question "How can we stop JSON.parse() from converting a single string value into a number" is "by properly quoting it". But maybe the real problem is that you are using JSON.parse() on something that isn't really JSON at all. If you are given the string "12345" and want to treat it as the string "12345", then you don't want to do anything at all to it... just use it as-is without calling JSON.parse().
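A quick sketch of the round trips described above:
JSON.stringify(12345);    // '12345'
JSON.parse('12345');      // 12345  (a number)
JSON.stringify('12345');  // '"12345"'
JSON.parse('"12345"');    // '12345'  (a string)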
Hope that helps. If for some reason either of those don't work for you, you should post more details about your use case as a Minimal, Complete, and Verifiable example.
Question 2: How do we determine that the returned JSON-parsed object matches the generic type?
In TypeScript, the type system exists only at design time and is erased in the emitted JavaScript code that runs later. So you can't access interfaces and type parameters like TResponse at runtime. The general solution to this is to start with the runtime solution (how would you do this in pure JavaScript) and help the compiler infer proper types at design time.
Furthermore, the interface type IApiCall
interface IApiCall<TResponse> {
  method: string;
  url: string;
}
has no structural dependence on TResponse, which is not recommended. So even if we write good runtime code and try to infer types from it, the compiler will never be able to figure out what TResponse is.
In this case I'd recommend that you make the IApiCall interface include a member which is a type guard function, and then you will have to write your own runtime test for each type you care about. Like this:
interface IApiCall<TResponse> {
  method: string;
  url: string;
  validate: (x: any) => x is TResponse;
}
And here's an example of how to create such a thing for a particular TResponse type:
interface Person {
  name: string;
  age: number;
}

const personApiCall: IApiCall<Person> = {
  method: "GET",
  url: "https://example.com/personGrabber",
  validate(x): x is Person {
    // the null check matters, since typeof null === "object"
    return (typeof x === "object") && (x !== null) &&
      ("name" in x) && (typeof x.name === "string") &&
      ("age" in x) && (typeof x.age === "number");
  }
}
You can see that personApiCall.validate(x) should be a good runtime check for whether or not x matches the Person interface. And then, your call() function can be implemented something like this:
const call = <TResponse>(api: IApiCall<TResponse>): Promise<TResponse | undefined> => {
  return fetch(api.url, { method: api.method })
    .then(r => r.json())
    .then(data => api.validate(data) ? data : undefined);
};
Note that call returns a Promise<TResponse | undefined> (API calls are probably asynchronous, right? and the undefined is there to return something if the validation fails... you can throw an exception instead if you want). Now you can call(personApiCall) and the compiler will automatically understand that the asynchronous result is a Person | undefined:
async function doPersonStuff() {
  const person = await call(personApiCall); // no <Person> needed here
  if (person) {
    // person is known to be of type Person here
    console.log(person.name);
    console.log(person.age);
  } else {
    // person is known to be of type undefined here
    console.log("File a missing Person report!")
  }
}
Okay, I hope those answers give you some direction. Good luck!
Type annotations only exist in TS (TResponse will be nowhere in the output JS), so you cannot use them as values. You have to use the type of the actual runtime value; here it should be enough to single out the string case, e.g.
if (typeof json == 'string')
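A small self-contained sketch of that runtime check (the sample data is made up):
const data = '12345';
const json: unknown = JSON.parse(data);

if (typeof json === 'string') {
  // narrowed to string, so string methods are safe
  console.log(json.trim());
} else if (typeof json === 'number') {
  // '12345' actually parses to the number 12345, which is the case the question hits
  console.log(json.toFixed(0));
}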

Asserting the entire response body in Postman

I recently started working on Spring Boot projects.
I am looking for a way to assert the entire response body of my API.
The intention of this is to reduce the testing time taken for the API.
I found a few solutions, mentioned below, but nothing helped me resolve the issue.
pm.test("Body matches string", function () {
pm.expect(pm.response.text()).to.include("string_you_want_to_search");
});
pm.test("Body is correct", function () {
pm.response.to.have.body("response_body_string");
});
When I put the entire response body as an argument, I get errors like:
Unclosed String
If you want to use the same type of quotes you defined the string with inside it, you have to escape them:
'string with "quotes"'
"string with 'quotes'"
'string with \'quotes\''
"string with \"quotes\""
You probably want to wrap your JSON in single quotes, since single quotes are not used by JSON itself (JSON strings always use double quotes).
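For example (the response body here is just an assumption based on the data used further down):
pm.test("Body is correct", function () {
  // single quotes around the whole body, so the double quotes inside the JSON need no escaping
  pm.response.to.have.body('[{"employeeName":"tushar","phNum":10101010}]');
});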
You could try setting the response as a variable and then assert against that?
var jsonData = pm.response.json()
pm.environment.set('responseData', JSON.stringify(jsonData))
From here you can get the data with JSON.parse(pm.environment.get('responseData')) and then use it within any test to assert against all of the values.
pm.test("Body is correct", () => {
var jsonData = pm.response.json()
pm.expect(jsonData).to.deep.equal(JSON.parse(pm.environment.get('responseData')))
})
My reasoning is that you're trying to assert against JSON anyway, but doing it as a plain text string.
Or you could assert against the values separately like this:
pm.test("Body is correct", () => {
var jsonData = pm.response.json()
pm.expect(jsonData[0].employeeName).to.equal("tushar")
pm.expect(jsonData[0].phNum).to.equal(10101010)
})
Depending on the JSON structure you may not need to access an array of data and the [0] can be dropped.
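For instance, if the response were a single object rather than an array (hypothetical shape):
pm.test("Body is correct", () => {
  var jsonData = pm.response.json()
  // no [0] indexing needed for a single object
  pm.expect(jsonData.employeeName).to.equal("tushar")
  pm.expect(jsonData.phNum).to.equal(10101010)
})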

Emojis in JSON, DataPower

I have an MPGW where the request is JSON.
I save the content in a context variable with JSON.stringify(json).
The problem is that when the JSON contains an emoji, e.g. \uD83D\uDE0D, the variable will no longer be a string; it becomes binary and the emojis are shown as dots.
I need to use the content of the variable later to calculate an HMAC, so it has to look exactly like the original JSON.
Is there any way to get around this?
Help would be much appreciated.
We are running firmware: IDG.7.5.2.9
/Jocke D
OK, from your comment I can conclude that it is JSON.stringify() that messes it up. This is according to the cookbook for escaping (there is an RFC describing this)...
Try adding your own stringify function that handles Unicode better:
function JSON_stringify(s, emit_unicode) {
  var json = JSON.stringify(s);
  return emit_unicode ? json : json.replace(/[\u007f-\uffff]/g,
    function (c) {
      return '\\u' + ('0000' + c.charCodeAt(0).toString(16)).slice(-4);
    }
  );
}
ctx.setVar('json', JSON_stringify(json, false));
Something like that...
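A quick check of what the escaping produces for the emoji from the question (hypothetical payload):
JSON_stringify({ msg: '\uD83D\uDE0D' }, false);
// -> '{"msg":"\ud83d\ude0d"}'  (every character above \u007f is escaped, so the value stays plain ASCII)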

ES6 Set does not serialize to array

I've noticed that the Set in ES2015 does not implement a simple toJSON function, such as serializing to an array. Below is the implementation I came up with that does just that:
Object.defineProperty(Set.prototype, 'toJSON', {
  enumerable: false,
  value: function () {
    return [...this];
  }
});
Is there any reason why a Set does not serialize to an array?
Are there any edge cases where this override for toJSON is a bad idea?
See this answer as to why there can't be a general toJSON case for Maps, and for similar reasons, Sets. Basically, keys and/or Set items can be anything, including objects and references to other things that can't be serialized into JSON (which, remember, is a specific format with specific, stricter rules than just "turn into intelligible data of another type"). What you want here is more like "toArray" anyhow. Your method already works for that inline, as would Array.from(yourSet), I think.
But if you wanted to add this sort of method to the prototype for your own internal usage, without risking possible problems if a similar (but not identical) method is ever added later, you could use a Symbol-keyed property.
var toArray = Symbol('toArray');

Object.defineProperty(Set.prototype, toArray, {
  enumerable: false,
  value: function () {
    return [...this];
  }
});

var g = new Set();
g.add(9);
g[toArray](); // -> [9]
If you do that, then you are guaranteed to not cause problems with anything other than your own code, since only your code will have access to the toArray Symbol key that references that method.
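If the goal is actual JSON output rather than an array, a replacer (the same mechanism as in the BigInt thread above) avoids touching Set.prototype at all; a minimal sketch:
const data = { ids: new Set([1, 2, 3]) };

// convert any Set encountered during stringification into an array
const json = JSON.stringify(data, (_key, value) =>
  value instanceof Set ? [...value] : value
);
// json === '{"ids":[1,2,3]}'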

Manually parse json data according to kendo model

Is there any built-in, ready-to-use solution in Kendo UI to parse JSON data according to schema.model?
Maybe something like kendo.parseData(json, model), which would return an array of objects?
I was searching for something like that and couldn't find anything built-in. However, Model.set apparently applies each field's parse logic, so I ended up writing this function, which works pretty well:
function parse(model, json) {
  // I initialize the model with the json data as a quick fix, since
  // setting the id field doesn't seem to work.
  var parsed = new model(json);
  var fields = Object.keys(model.fields);
  for (var i = 0; i < fields.length; i++) {
    parsed.set(fields[i], json[fields[i]]);
  }
  return parsed;
}
Where model is the kendo.data.Model definition (or simply datasource.schema.model), and json is the raw object. Using or modifying it to accept and return arrays shouldn't be too hard, but for my use case I only needed a single object to be parsed at a time.
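A usage sketch (the model definition and field names here are made up for illustration):
var Product = kendo.data.Model.define({
  id: "id",
  fields: {
    id: { type: "number" },
    name: { type: "string" },
    created: { type: "date" }
  }
});

// the raw JSON carries the date as a string; parse() runs each field's parse logic via set()
var raw = { id: 1, name: "Widget", created: "2024-01-15T00:00:00Z" };
var product = parse(Product, raw);
// product.created should now be a Date instance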
I actually saw your post the day you posted it but did not have the answer. I just needed to solve this problem myself as part of a refactoring. My solution is for DataSources, not for models directly.
kendo.data.DataSource.prototype.parse = function (data) {
  return this.reader.data(data);
  // Note that the original data will be modified. If that is not what you want,
  // change to the following commented line:
  // return this.reader.data($.extend({}, data));
}

// ...
someGrid.dataSource.parse(myData);
If you want to do it directly with a model, you will need to look at the DataReader class in kendo.data.js and use similar logic. Unfortunately, the DataReader takes a schema instead of a model, and the part dealing with the model is not extracted into its own method.