Say I have a Map type that stores functions/methods inside of it, like so:
Map triggerHandler = {
'x' : (t) => hello(t)
};
Now, what I want to do next is to declare that the values of this Map must be functions.
I can always do this:
Map<String, dynamic> triggerHandler = {
'x' : (t) => hello(t)
};
But this doesn't stop programmers from putting non-functions into the Map values. The value for 'x' could then be a String or an int.
The reason I want to do this is that I have a function that should only accept Maps whose values are functions. Calling the entry for 'x' is done with triggerHandler['x'](t) when it is a function, rather than just triggerHandler['x']. I haven't tested whether this causes an error, but it doesn't seem semantically correct to me.
It would seem that the most logical way to do this would be to set the value type to a function type, such as giving the Map a value type of 'delegate':
Map<String, delegate> triggerHandler = { // Note, this won't work
'x' : (t) => hello(t)
};
What is the correct way to do this in Dart? (If there is a way)
Thank you.
You can use Function as the value type, or, if you also want to specify the arguments and return type of these functions, you can create a typedef.
Map<String, Function> triggerHandler = {
'x' : (t) => hello(t)
};
typedef int SomeName(SomeType arg1);
Map<String, SomeName> triggerHandler = {
'x' : (t) => hello(t)
};
When parsing a JSON-formatted string I get a linter error:
let mqttMessage = JSON.parse(message.toString())
// ESLint: Unsafe assignment of an `any` value. (#typescript-eslint/no-unsafe-assignment)
I control the content of message so I would like to tell TS that what comes out of JSON.parse() is actually an Object. How can I do that?
Note: I could silence the warning, but I would like to understand if there is a better way to approach the problem.
The problem is that JSON.parse returns an any type.
That's fair enough, right? TypeScript doesn't know whether it's going to parse out to a string, a number, or an object.
You have a linting rule saying 'Don't allow assigning variables as any'.
So yeah, you could coerce the result of your JSON.parse:
type SomeObjectIKnowAbout = {
};
const result = JSON.parse(message.toString()) as SomeObjectIKnowAbout;
What I tend to like doing in this scenario is to create a specific parsing function that asserts at runtime that the object really is of the shape you're claiming, and does the type cast so you can treat it as that object while you're writing your code.
type SomeObjectIKnowAbout = {
userId: string;
}
type ToStringable = {
toString: () => string;
}
function parseMessage(message: ToStringable): SomeObjectIKnowAbout {
const obj = JSON.parse(message.toString()); //I'm not sure why you are parsing after toStringing tbh.
if (typeof obj === 'object' && obj.userId && typeof obj.userId === 'string') {
return obj as SomeObjectIKnowAbout;
}
else {
throw new Error ("message was not a valid SomeObjectIKnowAbout");
}
}
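For example, used with a hypothetical incoming message (the JSON literal is just for illustration):
const incoming = { toString: () => '{"userId":"u123"}' }; // anything with a toString() satisfies ToStringable
const parsed = parseMessage(incoming);
// parsed is typed as SomeObjectIKnowAbout, so parsed.userId is a string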
JSON.parse isn't generic, so we can't supply a generic argument to do it.
You have a couple of options.
The simple thing is that since JSON.parse returns any, you can just define the type of what you're assigning it to:
let mqttMessage: MQTTMessage = JSON.parse(message.toString());
(I've used MQTTMessage as a stand-in for the appropriate type.)
That may not be typesafe enough for everyone, though, since it makes the assumption that the string defines what you expect it to define. And it has the problem that if you do it elsewhere, you repeat the assumption.
Instead, you could define a function:
function parseMQTTMessageJSON(json: string): MQTTMessage {
const x: object = JSON.parse(json);
if (x && /*...appropriate checks for properties here...*/"someProp" in x) {
return x as MQTTMessage;
}
throw new Error(`Incorrect JSON for 'MQTTMessage' type`);
}
Then your code is:
let mqttMessage = parseMQTTMessageJSON(message.toString());
As an alternative to type assertions and runtime wrapper functions, you can utilize declaration merging to augment the global JSON object with a generic overload for the parse method. This will allow you to pass through the expected type and give you improved IntelliSense in case you use a reviver when parsing:
interface JSON {
parse<T = unknown>(text: string, reviver?: (this: any, key: keyof T & string, value: T[keyof T]) => unknown): T
}
type Test = { a: 1, b: "", c: false };
const { a, b, c } = JSON.parse<Test>(
"{\"a\":1,\"b\":\"\",\"c\":false}",
//k is "a"|"b"|"c", v is false | "" | 1
(k,v) => v
);
Or, if you are relying on declaration files to augment global interfaces:
declare global {
interface JSON {
parse<T = unknown>(text: string, reviver?: (this: any, key: keyof T & string,
value: T[keyof T]) => unknown): T
}
}
Error:
Argument of type '(resp: Producto[]) => void' is not assignable to parameter of type '(value: Object) => void'.
Types of parameters 'resp' and 'value' are incompatible.
The 'Object' type is assignable to very few other types. Did you mean to use the 'any' type instead?
You can use any, but it is better to use the generic this.http.get<Producto[]>(), like this:
private cargarProductos(){
this.http.get<Producto[]>('https://angular-htm1-25cf9.firebaseio.com/productos_idx.json')
.subscribe((resp: Producto[]) => {
console.log(resp);
this.productos = resp;
this.cargando = false;
});
}
productos: Array<Producto> = [];
private cargarProductos(): any {
this.http.get('https://angular-htm1-25cf9.firebaseio.com/productos_idx.json')
.subscribe((resp:any)=>{
console.log(resp);
this.productos = resp;
this.cargando = false;
});
}
It's also better to move the HTTP call into a service rather than calling HttpClient directly in the component.
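For example, a minimal sketch of such a service (the class name is illustrative; the URL and Producto model come from the question):
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class ProductoService {
  constructor(private http: HttpClient) {}

  // Components subscribe to this instead of calling HttpClient directly.
  // Producto is assumed to be the model interface from the question.
  getProductos(): Observable<Producto[]> {
    return this.http.get<Producto[]>('https://angular-htm1-25cf9.firebaseio.com/productos_idx.json');
  }
}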
let arr = [1,2,3,4,5,6,7,8];
let a = arr.filter( data => {
return data > 5;
}).map ( (data,index) => {
// while in here -- is there a way to know that only 3 elements came out of the filter?
});
Yes, I know I can check a.length when this is over, but while IN the map -- can I find out how many items made it through the filter?
The map method's callback accepts a third argument that gives you the array being mapped; you can use the length property on that.
let arr = [1,2,3,4,5,6,7,8];
let a = arr.filter( data => {
return data > 5;
}).map((data, index, arr) => {
  console.log(arr.length); // logs 3: the length of the filtered array, not the original
});
Is it possible to specify that a field in GraphQL should be a blackbox, similar to how Flow has an "any" type? I have a field in my schema that should be able to accept any arbitrary value, which could be a String, Boolean, Object, Array, etc.
I've come up with a middle-ground solution. Rather than trying to push this complexity onto GraphQL, I'm opting to just use the String type and JSON.stringify my data before setting it on the field. So everything gets stringified, and later in my application, when I need to consume this field, I JSON.parse the result to get back the desired object/array/boolean/etc.
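A rough sketch of that round trip (the variable and field names here are made up for illustration):
// Anything goes in: object, array, boolean, etc.
const payload = { enabled: true, tags: ['a', 'b'], count: 3 };

// Before sending the mutation, stringify the value so it fits a String field.
const input = { name: 'example', data: JSON.stringify(payload) };

// Later, when consuming the field from a query result, parse it back.
const result = { data: input.data };       // stand-in for the String field returned by GraphQL
const restored = JSON.parse(result.data);  // back to the original object/array/boolean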
@mpen's answer is great, but I opted for a more compact solution:
const { GraphQLScalarType } = require('graphql')
const { Kind } = require('graphql/language')
const ObjectScalarType = new GraphQLScalarType({
name: 'Object',
description: 'Arbitrary object',
parseValue: (value) => {
return typeof value === 'object' ? value
: typeof value === 'string' ? JSON.parse(value)
: null
},
serialize: (value) => {
return typeof value === 'object' ? value
: typeof value === 'string' ? JSON.parse(value)
: null
},
parseLiteral: (ast) => {
switch (ast.kind) {
case Kind.STRING: return JSON.parse(ast.value)
case Kind.OBJECT: throw new Error(`Not sure what to do with OBJECT for ObjectScalarType`)
default: return null
}
}
})
Then my resolvers look like:
{
Object: ObjectScalarType,
RootQuery: ...
RootMutation: ...
}
And my .gql looks like:
scalar Object
type Foo {
id: ID!
values: Object!
}
Yes. Just create a new GraphQLScalarType that allows anything.
Here's one I wrote that allows objects. You can extend it a bit to allow more root types.
import {GraphQLScalarType} from 'graphql';
import {Kind} from 'graphql/language';
import Json5 from 'json5';
export default new GraphQLScalarType({
name: "Object",
description: "Represents an arbitrary object.",
parseValue: toObject,
serialize: toObject,
parseLiteral(ast) {
switch(ast.kind) {
case Kind.STRING:
return ast.value.charAt(0) === '{' ? Json5.parse(ast.value) : null;
case Kind.OBJECT:
return parseObject(ast);
}
return null;
}
});
function toObject(value) {
if(typeof value === 'object') {
return value;
}
if(typeof value === 'string' && value.charAt(0) === '{') {
return Json5.parse(value);
}
return null;
}
function parseObject(ast) {
const value = Object.create(null);
ast.fields.forEach((field) => {
value[field.name.value] = parseAst(field.value);
});
return value;
}
function parseAst(ast) {
switch (ast.kind) {
case Kind.STRING:
case Kind.BOOLEAN:
return ast.value;
case Kind.INT:
case Kind.FLOAT:
return parseFloat(ast.value);
case Kind.OBJECT:
return parseObject(ast);
case Kind.LIST:
return ast.values.map(parseAst);
default:
return null;
}
}
For most use cases, you can use a JSON scalar type to achieve this sort of functionality. There are a number of existing libraries you can just import rather than writing your own scalar -- for example, graphql-type-json.
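Wiring that library into an SDL-based setup might look roughly like this (a sketch; it assumes graphql-type-json's default export and an Apollo-style resolver map):
import GraphQLJSON from 'graphql-type-json';

const typeDefs = `
  scalar JSON

  type Foo {
    id: ID!
    values: JSON
  }
`;

const resolvers = {
  JSON: GraphQLJSON,
  // ...the rest of your resolvers
};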
If you need a more fine-tuned approach, then you'll want to write your own scalar type. Here's a simple example that you can start with:
const { GraphQLScalarType, Kind } = require('graphql')
const Anything = new GraphQLScalarType({
name: 'Anything',
description: 'Any value.',
parseValue: (value) => value,
parseLiteral,
serialize: (value) => value,
})
function parseLiteral (ast) {
switch (ast.kind) {
case Kind.BOOLEAN:
case Kind.STRING:
return ast.value
case Kind.INT:
case Kind.FLOAT:
return Number(ast.value)
case Kind.LIST:
return ast.values.map(parseLiteral)
case Kind.OBJECT:
return ast.fields.reduce((accumulator, field) => {
accumulator[field.name.value] = parseLiteral(field.value)
return accumulator
}, {})
case Kind.NULL:
return null
default:
throw new Error(`Unexpected kind in parseLiteral: ${ast.kind}`)
}
}
Note that scalars are used both as outputs (when returned in your response) and as inputs (when used as values for field arguments). The serialize method tells GraphQL how to serialize a value returned in a resolver into the data that's returned in the response. The parseLiteral method tells GraphQL what to do with a literal value that's passed to an argument (like "foo", or 4.2 or [12, 20]). The parseValue method tells GraphQL what to do with the value of a variable that's passed to an argument.
For parseValue and serialize we can just return the value we're given. Because parseLiteral is given an AST node object representing the literal value, we have to do a little bit of work to convert it into the appropriate format.
You can take the above scalar and customize it to your needs by adding validation logic as needed. In any of the three methods, you can throw an error to indicate an invalid value. For example, if we want to allow most values but don't want to serialize functions, we can do something like:
if (typeof value == 'function') {
throw new TypeError('Cannot serialize a function!')
}
return value
Using the above scalar in your schema is simple. If you're using vanilla GraphQL.js, then use it just like you would any of the other scalar types (GraphQLString, GraphQLInt, etc.). If you're using Apollo, you'll need to include the scalar in your resolver map as well as in your SDL:
const resolvers = {
...
// The property name here must match the name you specified in the constructor
Anything,
}
const typeDefs = `
# NOTE: The name here must match the name you specified in the constructor
scalar Anything
# the rest of your schema
`
Just send a stringified value via GraphQL and parse it on the other side, e.g. use this wrapper class.
// Assumes the @Field decorator comes from the type-graphql library.
import { Field, ObjectType } from 'type-graphql';

@ObjectType()
export class Dynamic {
  @Field(type => String)
  private value: string;

  getValue(): any {
    return JSON.parse(this.value);
  }

  setValue(value: any) {
    this.value = JSON.stringify(value);
  }
}
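A rough usage sketch of that wrapper (the values are just illustrative):
const wrapped = new Dynamic();
wrapped.setValue({ anything: [1, 2, 3], nested: { ok: true } }); // stored internally as a JSON string
const restored = wrapped.getValue(); // parsed back into the original shape on the consuming side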
For a similar problem I've created a schema like this:
"""`MetadataEntry` model"""
type MetadataEntry {
"""Key of the entry"""
key: String!
"""Value of the entry"""
value: String!
}
"""Object with metadata"""
type MyObjectWithMetadata {
"""
... rest of my object fields
"""
"""
Key-value entries that you can attach to an object. This can be useful for
storing additional information about the object in a structured format
"""
metadata: [MetadataEntry!]!
"""Returns value of `MetadataEntry` for given key if it exists"""
metadataValue(
"""`MetadataEntry` key"""
key: String!
): String
}
And my queries can look like this:
query {
listMyObjects {
# fetch meta values by key
meta1Value: metadataValue(key: "meta1")
meta2Value: metadataValue(key: "meta2")
# ... or list them all
metadata {
key
value
}
}
}
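A resolver for metadataValue might look roughly like this (a sketch; it assumes the parent object already carries its metadata entries):
const resolvers = {
  MyObjectWithMetadata: {
    // Look up a single entry in the parent's metadata list by key.
    metadataValue: (parent, args) => {
      const entry = parent.metadata.find((m) => m.key === args.key);
      return entry ? entry.value : null;
    },
  },
};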
From a JSON source I create a CSV that I want to use to populate a MemSQL database with the help of LOAD DATA INFILE.
I have written a TypeScript script for the conversion and use the json2csv library.
It leaves the values for nulled entries empty though, creating a string like:
foo, bar, , barz, 11 ,
Yet I expect my output to be:
foo, bar, \N , barz, 11 , \N
for my nulled fields. Otherwise, my database will fill in different default values, such as 0 for a number that should be NULL.
I found myself doing:
const processedEntities = someEntities.map((entity: Entity) => {
  entity.foo = entity.foo === null ? '\\N' : entity.foo;
  entity.bar = entity.bar === null ? '\\N' : entity.bar;
  ...
  return entity;
});
So basically I am hardcoding my approach to my entity, and I am also prone to bugs, as I might have forgotten to check a nullable property. And if I want to export another table, I have to repeat this all over again.
How can I generalize this, so I can use this on different entities where the script "discovers" the nullable fields and sets the marker accordingly?
I created a function that iterates over the record's own properties and sets a value to \N if it is null:
const handleNullCases = (record: any): any => {
for (let key in record) {
if (record.hasOwnProperty(key)) {
const value = record[key];
if (value === null) {
record[key] = "\\N";
}
}
}
return record;
};
That way I can reuse that snippet for other entities as well:
const processedEntities = entities.map(handleNullCases);
const processedEntities2 = entities2.map(handleNullCases);
...
I find it a bit dirty, in that I just type the record as any and assign a string value even though the property might have been declared as another type.
I'm going to assume all properties in Entity may be null. If so, this typing is a bit safer:
type Nullable<T> = {[K in keyof T]: T[K] | null};
type CSVSafe<T> = {[K in keyof T]: T[K] | '\\N'};
const handleNullCases = <E>(record: Nullable<E>): CSVSafe<E> => {
  const ret = Object.assign({}, record) as CSVSafe<E>; // shallow copy so the input isn't mutated
  (Object.keys(ret) as (keyof E)[]).forEach((key) => {
if (record[key] === null) {
ret[key] = '\\N';
}
});
return ret;
};
type Entity = Nullable<{ a: number, b: string, c: boolean, d: number, e: string }>;
const entity: Entity = { a: 1, b: null, c: false, d: null, e: 'e' };
const safeEntity = handleNullCases(entity);
// type CSVSafe<{ a: number; b: string; c: boolean; d: number; e: string; }>
The handleNullCases function will take any object whose values might be null, and return a new object which is just the same except that null values have been replaced with "\\N". The output type will be a CSVSafe<> version of the Nullable<> input type.
Hope that helps.