If I have a JSON schema saved in a file like f1040.json with content:
[
  {
    "name": "L1",
    "type": "String"
  },
  {
    "name": "C2",
    "type": "Boolean"
  },
  {
    "name": "L3",
    "type": "String"
  },
  ...
]
And I want to generate a type that looks like:
type F1040 = {
L1: string;
C2: boolean;
L3: string;
...
}
How can I generate this without specifying each field manually (there are hundreds of them)? My first attempt at a solution isn't valid TypeScript (because I'm providing more than one mapped type, I think), but hopefully this invalid example clarifies what I'm trying to do:
import { f1040 } from "./f1040.json";
const bools = f1040.filter(e => e.type === "Boolean").map(e => e.name) as string[];
const strings = f1040.filter(e => e.type === "String").map(e => e.name) as string[];
export type F1040 = {
  [key in (typeof bools)[number]]?: boolean;
  [key in (typeof strings)[number]]?: string;
};
My incorrect solution was inspired by an answer to a similar but different question: TS create keys of new type from array of strings
Edit1: The solution doesn't need to be dynamic, but if it's a build-time solution then it needs to play nice with rollup & the declarations bundler I'm using.
Edit2: Another unsuccessful attempt, this time utilizing @sinclair/typebox:
import { Static, Type } from "@sinclair/typebox";
import { f1040 } from "./f1040.json";
const F1040 = Type.Object({
  ...f1040
    .filter(e => e.type === "Boolean")
    .map(e => e.name)
    .reduce((res, f) => ({ ...res, [f]: Type.Boolean() }), {}),
  ...f1040
    .filter(e => e.type === "String")
    .map(e => e.name)
    .reduce((res, f) => ({ ...res, [f]: Type.String() }), {}),
});
export type F1040 = Static<typeof F1040>;
const f1040Data = {
  L1: true,
  C2: "Yea",
} as F1040
The above attempt builds fine without any errors, which is too bad, because the type assignments at the end are wrong. This should fail to build with a type error saying something like:
Types of property 'L1' are incompatible. Type 'boolean' is not comparable to type 'string'.
It cannot be done dynamically, because your TypeScript program is compiled into JavaScript before it is run, and in JavaScript all type information is removed. Types are only used during the TypeScript compilation process.
So you need to have the types available before TypeScript compilation, e.g. by using https://www.npmjs.com/package/json-schema-to-typescript
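If a ready-made generator doesn't fit your field-list format, a small codegen script run before compilation works too. Here is a minimal sketch, assuming a Node environment and made-up file names (generate-f1040.ts, F1040.ts): it reads the field list and emits a plain .ts file that rollup and the declarations bundler can treat like any other source:
// generate-f1040.ts (hypothetical build-time codegen, run before tsc/rollup)
import { readFileSync, writeFileSync } from "fs";

interface Field {
  name: string;
  type: "String" | "Boolean";
}

// Read the field list from the schema file described above.
const fields: Field[] = JSON.parse(readFileSync("./f1040.json", "utf8"));

// Emit one optional property per field, mapping the schema types to TS types.
const props = fields
  .map(f => `  ${f.name}?: ${f.type === "Boolean" ? "boolean" : "string"};`)
  .join("\n");

writeFileSync("./F1040.ts", `export type F1040 = {\n${props}\n};\n`);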
How can one write a function that picks only a few attributes from an object, in the most compact way possible in ES6?
I've come up with a solution using destructuring + a simplified object literal, but I don't like that the list of fields is repeated in the code.
Is there an even slimmer solution?
(v) => {
let { id, title } = v;
return { id, title };
}
Here's something slimmer, although it doesn't avoid repeating the list of fields. It uses "parameter destructuring" to avoid the need for the v parameter.
({id, title}) => ({id, title})
(See a runnable example in this other answer).
@EthanBrown's solution is more general. Here is a more idiomatic version of it which uses Object.assign, and computed property names (the [prop] part):
function pick(o, ...props) {
return Object.assign({}, ...props.map(prop => ({[prop]: o[prop]})));
}
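For example (sample data assumed):
const song = { title: 'Imagine', artist: 'John Lennon', year: 1971 };
pick(song, 'title', 'year'); // => { title: 'Imagine', year: 1971 }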
If we want to preserve the properties' attributes, such as configurable and getters and setters, while also omitting non-enumerable properties, then:
function pick(o, ...props) {
  var has = p => o.propertyIsEnumerable(p),
      get = p => Object.getOwnPropertyDescriptor(o, p);
  return Object.defineProperties({},
    Object.assign({}, ...props
      .filter(prop => has(prop))
      .map(prop => ({ [prop]: get(prop) })))
  );
}
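A quick sketch of the difference, with an assumed sample object: the getter is copied as a live getter rather than being evaluated once:
const user = {
  first: 'Ada',
  last: 'Lovelace',
  get full() { return `${this.first} ${this.last}`; }
};
const picked = pick(user, 'first', 'last', 'full');
picked.first = 'Grace';
console.log(picked.full); // "Grace Lovelace" (the getter itself was copied, not its value)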
I don't think there's any way to make it much more compact than your answer (or torazburo's), but essentially what you're trying to do is emulate Underscore's pick operation. It would be easy enough to re-implement that in ES6:
function pick(o, ...fields) {
return fields.reduce((a, x) => {
if(o.hasOwnProperty(x)) a[x] = o[x];
return a;
}, {});
}
Then you have a handy re-usable function:
var stuff = { name: 'Thing', color: 'blue', age: 17 };
var picked = pick(stuff, 'name', 'age');
The trick to solving this as a one-liner is to flip the approach: instead of starting from the original object orig, one can start from the keys they want to extract.
Using Array#reduce one can then store each needed key on the empty object which is passed in as the initialValue for said function.
Like so:
const orig = {
id: 123456789,
name: 'test',
description: '…',
url: 'https://…',
};
const filtered = ['id', 'name'].reduce((result, key) => { result[key] = orig[key]; return result; }, {});
console.log(filtered); // Object {id: 123456789, name: "test"}
alternatively...
const filtered = ['id', 'name'].reduce((result, key) => ({
...result,
[key]: orig[key]
}), {});
console.log(filtered); // Object {id: 123456789, name: "test"}
A tiny bit shorter solution using the comma operator:
const pick = (O, ...K) => K.reduce((o, k) => (o[k]=O[k], o), {})
console.log(
pick({ name: 'John', age: 29, height: 198 }, 'name', 'age')
)
ES6 was the latest spec at the time when the question was written. As explained in this answer, key picking is significantly shorter in ES2019 than in ES6:
Object.fromEntries(
Object.entries(obj)
.filter(([key]) => ['foo', 'bar'].includes(key))
)
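For instance, with an assumed sample object:
const obj = { foo: 1, bar: 2, baz: 3 };
const picked = Object.fromEntries(
  Object.entries(obj).filter(([key]) => ['foo', 'bar'].includes(key))
);
console.log(picked); // { foo: 1, bar: 2 }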
TC39's object rest/spread properties proposal (since standardized in ES2018) makes this pretty slick:
let { x, y, ...z } = { x: 1, y: 2, a: 3, b: 4 };
z; // { a: 3, b: 4 }
(It does have the downside of creating the x and y variables which you may not need.)
You can use object destructuring to unpack properties from the existing object and assign them to variables with different names - fields of a new, initially empty object.
const person = {
fname: 'tom',
lname: 'jerry',
age: 100,
}
let newPerson = {};
({fname: newPerson.fname, lname: newPerson.lname} = person);
console.log(newPerson);
There's currently a strawman proposal for improving JavaScript's object shorthand syntax, which would enable "picking" of named properties without repetition:
const source = {id: "68646", genre: "crime", title: "Scarface"};
const target = {};
Object.assign(target, {source.title, source.id});
console.log(target);
// {id: "68646", title: "Scarface"}
Unfortunately, the proposal doesn't seem to be going anywhere any time soon. Last edited in July 2017 and still a draft at Stage 0, suggesting the author may have ditched or forgotten about it.
ES5 and earlier (non-strict mode)
The most concise shorthand I can think of involves an ancient language feature nobody uses anymore:
Object.assign(target, {...(o => {
with(o) return { id, title };
})(source)});
with statements are forbidden in strict mode, making this approach useless for 99.999% of modern JavaScript. Bit of a shame, because this is the only halfway-decent use I've found for the with feature. 😀
I have a solution similar to Ethan Brown's, but even shorter: the pick function. Another function, pick2, is a bit longer (and slower), but allows renaming properties in a manner similar to ES6 destructuring.
const pick = (o, ...props) => props.reduce((r, p) => p in o ? {...r, [p]: o[p]} : r, {})
const pick2 = (o, ...props) => props.reduce((r, expr) => {
const [p, np] = expr.split(":").map( e => e.trim() )
return p in o ? {...r, [np || p]: o[p]} : r
}, {})
Here is the usage example:
const d = { a: "1", c: "2" }
console.log(pick(d, "a", "b", "c")) // -> { a: "1", c: "2" }
console.log(pick2(d, "a: x", "b: y", "c")) // -> { x: "1", c: "2" }
I required this solution, but I didn't know whether the proposed keys were available. So I took @torazaburo's answer and improved it for my use case:
function pick(o, ...props) {
  return Object.assign({}, ...props.map(prop => {
    // note: a truthy check, so keys with falsy values (0, '', false) are dropped too
    if (o[prop]) return { [prop]: o[prop] };
  }));
}
// Example:
var person = { name: 'John', age: 29 };
var myObj = pick(person, 'name', 'sex'); // { name: 'John' }
Some great solutions above; I didn't see one fleshed out for TypeScript, so here it goes. Based on @Ethan Brown's solution above.
const pick = <T extends object, K extends keyof T>(
  obj: T,
  ...keys: K[]
): Pick<T, K> =>
  keys.reduce<any>((r, key) => {
    r[key] = obj[key];
    return r;
  }, {});
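A usage sketch with an assumed sample object; the compiler constrains the keys to keyof T and types the result as Pick<T, K>:
const user = { id: 1, name: 'Ada', admin: true };
const slim = pick(user, 'id', 'name'); // typed as Pick<typeof user, 'id' | 'name'>
// pick(user, 'nope'); // compile error: 'nope' is not a key of user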
And as a bonus, here is a TS-friendly ES6 omit, plus one below that is much more performant, but less ES6.
const omit = <T extends object, K extends keyof T>(
  obj: T,
  ...keys: K[]
): Omit<T, K> =>
  keys.reduce((r, key) => (delete r[key], r), { ...obj });
Way more performant omit: http://jsben.ch/g6QCK
const omit = <T extends object, K extends keyof T>(
  obj: T,
  ...keys: K[]
): Omit<T, K> => {
  const r: any = {};
  const exclude = new Set<PropertyKey>(keys);
  // copy every property that is not in the exclude list
  for (const key in obj) {
    if (!exclude.has(key)) {
      r[key] = obj[key];
    }
  }
  return r;
};
Inspired by the reduce approach of https://stackoverflow.com/users/865693/shesek:
const pick = (orig, keys) => keys.reduce((acc, key) => ({...acc, [key]: orig[key]}), {})
or even slightly shorter using the comma operator (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Comma_Operator)
const pick = (obj, keys) => keys.reduce((acc, key) => ((acc[key] = obj[key]), acc), {});
usage:
pick({ model : 'F40', manufacturer: 'Ferrari', productionYear: 1987 }, 'model', 'productionYear')
results in:
{model: "F40", productionYear: 1987}
I am using the spread operator in my Redux reducer to hopefully maintain the state as immutable objects.
However, I am managing to make the simplest unit test fail.
I assume the error probably has to do with immutability, but am I not using the spread operator correctly?
Here is my unit test:
describe('app logic', () => {
it('initialises app', () => {
const newState = reducer(INITIAL_STATE, {type: "NEXT"})
const expectState = {
start: true,
render_component: null,
requests: {},
results: {},
}
console.log('newState', newState)
console.log('expected state', expectState)
expect(newState).to.equal(expectState)
})
})
and here is my reducer
export const INITIAL_STATE = {
start: false,
render_component: null,
requests: {},
results: {}
}
export const next = (state) => {
if (state === INITIAL_STATE) {
return {
...state,
start: true,
}
}
return state
}
export function reducer(state = INITIAL_STATE, action) {
switch (action.type) {
case 'NEXT':
return next(state)
default:
return state
}
}
I print the two objects, and they look the same.
I get the error:
1) app logic initialises app:
AssertionError: expected { Object (start, render_component, ...) } to equal { Object (start, render_component, ...) }
Not sure exactly which testing library you are using, but usually a name like .equal is used to test strict equality ( === ), which means (at least in the case of objects) that the two things being compared must actually reference the exact same object. So, for example,
const original = { a: 1 }; // creates a new object, assign it
const testMe = { a: 1 }; // creates another new object, assign it
console.log( original === testMe ) // false
evaluates to false, because while the objects have the same content, they do not reference the exact same object. They are separate, independently created objects that happen to have the same content. Compare that to
const original = {a: 1}; // create a new object
const testMe = original; // create another reference to the same object
console.log( original === testMe ); // true
So when you return
return {
...state,
start: true,
}
you are creating and returning a new object, so it naturally cannot reference the same object that you created and assigned to the variable name expectState.
If what you are interested in is not strict equality, but rather just that the content of the two objects is the same, there exist other methods than .equal, usually with deep in their name (since they go deep into the objects/arrays/whatever to check whether the values are the same).
Chai.js has examples of both expect(x).to.equal(y) and expect(x).to.deep.equal(y) in their docs: http://chaijs.com/api/bdd/#method_equal
Your testing library probably has very similar, if not identical, syntax.
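Assuming Chai (which the error format suggests), the fix for your test is a sketch like:
expect(newState).to.deep.equal(expectState); // compares content, not object identity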
I create a CSV from a JSON source, and I want to use it to populate a MemSQL database with the help of LOAD DATA INFILE.
I have written a TypeScript script for the conversion and use the library json2csv.
It leaves the values for nulled entries empty, though, creating a string like:
foo, bar, , barz, 11 ,
Yet I expect my output to be:
foo, bar, \N , barz, 11 , \N
for my nulled fields. Otherwise, my database will fill in different default values, such as 0 for a number that should be NULL.
I found myself doing:
const processedEntities = someEntities.map((entity: Entity) => {
  entity.foo = entity.foo === null ? '\\N' : entity.foo;
  entity.bar = entity.bar === null ? '\\N' : entity.bar;
  ...
  return entity;
});
So basically I am hardcoding my approach for each entity, and I am also prone to bugs, as I might forget to check a nullable property. And if I want to export another table, I have to repeat this all over again.
How can I generalize this, so I can use this on different entities where the script "discovers" the nullable fields and sets the marker accordingly?
I created a function that iterates over the record's own properties and sets the value to \N if the corresponding value is null:
const handleNullCases = (record: any): any => {
for (let key in record) {
if (record.hasOwnProperty(key)) {
const value = record[key];
if (value === null) {
record[key] = "\\N";
}
}
}
return record;
};
That way I can reuse that snippet for other entities as well:
const processedEntities = entities.map(handleNullCases);
const processedEntities2 = entities2.map(handleNullCases);
...
I find it a bit dirty, as I just type-hint any and set the value to a string even though it might have been declared as another type.
I'm going to assume all properties in Entity may be null. If so, this typing is a bit safer:
type Nullable<T> = {[K in keyof T]: T[K] | null};
type CSVSafe<T> = {[K in keyof T]: T[K] | '\\N'};
const handleNullCases = <E>(record: Nullable<E>): CSVSafe<E> => {
  // copy the record so the input isn't mutated
  const ret = Object.assign({}, record) as CSVSafe<E>;
  (Object.keys(ret) as (keyof E)[]).forEach(key => {
    if (record[key] === null) {
      ret[key] = '\\N';
    }
  });
  return ret;
};
type Entity = Nullable<{ a: number, b: string, c: boolean, d: number, e: string }>;
const entity: Entity = { a: 1, b: null, c: false, d: null, e: 'e' };
const safeEntity = handleNullCases(entity);
// type CSVSafe<{ a: number; b: string; c: boolean; d: number; e: string; }>
The handleNullCases function will take any object whose values might be null, and return a new object which is just the same except that null values have been replaced with "\\N". The output type will be a CSVSafe<> version of the Nullable<> input type.
Hope that helps.
I am working with a dataset that cannot be modified on the server side. So I am trying to set up the local data model on the client in a way that lets me easily traverse the model when updating parts of the data.
Therefore I am trying to create a multi-level Map from multi-level Maps that include Lists, which themselves include Maps, etc. (see the schematics at the end of this post).
What I am trying to get is a Map containing other Maps, with the key of each included Map being the value of the contained object (again, please see the schematics at the end of this post).
I got it to work on the first level:
const firstLevel = data.toMap().mapKeys((key, value) => value.get('value'));
See it in action here: https://jsfiddle.net/9f0djcb0/4/
But there is a maximum of 3 levels of nested data and I can't get my head around how to get the transformation done. Any help appreciated!
The schematic datasets:
// This is what I got
const dataset = [
{
field: 'lorem',
value: 'ipsum',
more: [
{
field: 'lorem_lvl1',
value: 'ispum_lvl1',
more: [
{
field: 'lorem_lvl2',
value: 'ispum_lvl2',
more: [
{
field: 'lorem_lvl3',
value: 'ispum_lvl3',
}
]
}
]
}
]
},
{
field: 'glorem',
value: 'blipsum'
},
{
field: 'halorem',
value: 'halipsum'
}
];
This is where I want to go:
// This is what I want
const dataset_wanted = {
ipsum: {
field: 'lorem',
value: 'ipsum',
more: {
lorem_lvl1: {
field: 'lorem_lvl1',
value: 'ispum_lvl1',
more: {
lorem_lvl2: {
field: 'lorem_lvl2',
value: 'ispum_lvl2',
more: {
lorem_lvl3: {
field: 'lorem_lvl3',
value: 'ispum_lvl3',
}
}
}
}
}
}
},
glorem: {
field: 'glorem',
value: 'blipsum'
},
halorem: {
field: 'halorem',
value: 'halipsum'
}
};
Retrieving nested structures using "getIn" is better:
const data = Immutable.fromJS(dataset[0]);
const firstLevel = data.getIn(['more']);
const twoLevel = firstLevel.getIn([0,'more']);
const threeLevel = twoLevel.getIn([0,'more']);
console.log(firstLevel.toJS(),twoLevel.toJS(),threeLevel.toJS());
As for a more general solution, I rewrote the previous answer using a recursive approach:
function mapDeep(firstLevel) {
return firstLevel.map((obj) => {
if (obj.has('more')) {
const sec = obj.get('more').toMap().mapKeys((key, value) => value.get('value'));
const objNext = mapDeep(sec);
obj = obj.set('more', objNext);
}
return obj;
});
}
The first level still needs to be mapped manually beforehand:
const firstLevel = data.toMap().mapKeys((key, value) => value.get('value'));
const secondLevel = mapDeep(firstLevel);
Again, see it in action: https://jsfiddle.net/9f0djcb0/12/
This is good enough for me for now. It still feels like this could be solved in a smarter (and more performant) way. Cheers :)
So after some time passed I came up with a solution that works for me:
let sec, third, objThird;
// 1st level: simple mapping
const firstLevel = data.toMap().mapKeys((key, value) => value.get('value'));
// 2nd level: walk through updated firstLevel's subobjects and do the mapping again:
const secondLevel = firstLevel.map((obj) => {
if (obj.has('more')) {
sec = obj.get('more').toMap().mapKeys((key, value) => value.get('value'));
// 3rd level: walk through updated secondLevel's subobjects and do the mapping again:
objThird = sec.map((o) => {
if (o.has('more')) {
third = o.get('more').toMap().mapKeys((key, value) => value.get('value'));
o = o.set('more', third);
}
return o;
});
obj = obj.set('more', objThird);
}
return obj;
});
See it in action here: https://jsfiddle.net/9f0djcb0/7/
This has been working nicely so far, though it's pretty hard-coded. If anyone has a more elegant solution to this, I am happy to learn about it!