Parse value as int in HCL files - configuration

I am writing the template for a parameterized HashiCorp Nomad job. One of its parameters is priority, which is supposed to be an integer between 0 and 100.
Like other tools, Nomad supports variable interpolation, so a variable can be defined at one point and referenced later. Nomad also lets you define "meta" variables, which are passed at runtime and can be used within the HCL file.
What I'm trying to do looks as follows:
job "my-job" {
parametrized {
meta_required = ["TASK_PRIORITY"]
}
priority = "${NOMAD_META_TASK_PRIORITY}"
...
}
The only way I have found to read those variables is within strings. Since the priority stanza expects an integer, the following error is thrown:
error parsing 'job': 1 error(s) decoding: * cannot parse 'Priority' as int: strconv.ParseInt: parsing "${NOMAD_META_TASK_PRIORITY}": invalid syntax
Is there any way to "cast" the string to an integer? Or, alternatively, is there any other way of referencing the variable that would work?

I ended up raising an issue on GitHub. Their response is that it's not yet possible to interpolate the priority field. See the issue.

Related

Can you set dynamic json struct field tags? [duplicate]

How would I use a variable in a Go struct tag?
This works:
type Shape struct {
    Type string `json:"type"`
}
This does not work:
const (
    TYPE = "type"
)

type Shape struct {
    Type string fmt.Sprintf("json:\"%s\"", TYPE)
}
In the first example, I am using a string literal, which works. In the second example, I am building a string using fmt.Sprintf, and I seem to be getting an error:
syntax error: unexpected name, expecting }
Here it is on the Go playground:
https://play.golang.org/p/lUmylztaFg
How would I use a variable in a Go struct tag? You wouldn't; it's not allowed by the language. You can't use an expression that evaluates at runtime in place of a compile-time string literal as an annotation on a struct field. As far as I know, nothing of the sort works in any compiled language.
With the introduction of go generate, it is possible to achieve this.
However, go generate essentially makes compilation a two-phase process: phase 1 generates the new code, and phase 2 compiles and links it.
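For context, go generate is driven by directives embedded in the source files. A minimal sketch of the two-phase flow (the generator program in the directive is just a placeholder for whatever tool you actually run, not a real command):

// shape.go
package model

// Phase 1: `go generate ./...` runs the command in the directive below,
// which writes additional Go source files into the package.
// Phase 2: `go build` compiles the generated output together with everything else.
//go:generate go run ./cmd/gentags -type=Shape
type Shape struct {
    Type string `json:"type"`
}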
There are a few limitations with using go generate:
Your library will not be 'go get'-able unless you run go generate every time it is needed and check in the result, since go generate has to be run explicitly before go build.
This is a compile-time process, so you will not be able to do it at run time using run-time information. If you really must do this at run time, and in your case you are just adding JSON serialization annotations, you could consider using a map instead (see the sketch below).
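A minimal sketch of the map-based approach (the key and value here are illustrative): unlike struct tags, map keys are ordinary values, so they can come from constants, variables, or run-time input.

package main

import (
    "encoding/json"
    "fmt"
)

// TYPE is the key name we want to choose programmatically.
const TYPE = "type"

func main() {
    // A map instead of a tagged struct: the JSON key comes from a constant.
    shape := map[string]string{
        TYPE: "circle",
    }
    out, err := json.Marshal(shape)
    if err != nil {
        panic(err)
    }
    fmt.Println(string(out)) // {"type":"circle"}
}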
String constants/variables are not allowed in tag values in order to keep things simple, and I support that. With this limitation, however, we either need to use reflection to retrieve the tag value, which is costly, or type string literals everywhere in the project, which may lead to bugs caused by typos.
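For reference, the reflection-based tag lookup mentioned above looks roughly like this (a minimal sketch using the Shape struct from the question):

package main

import (
    "fmt"
    "reflect"
)

type Shape struct {
    Type string `json:"type"`
}

func main() {
    // Retrieve the json tag of the Type field at run time via reflection.
    field, ok := reflect.TypeOf(Shape{}).FieldByName("Type")
    if !ok {
        panic("field not found")
    }
    fmt.Println(field.Tag.Get("json")) // type
}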
Solution
We can generate the tag values as string constants and then use those constants throughout the project. This avoids reflection (saving its performance cost), is more maintainable, and removes the possibility of bugs caused by typos.
The go/ast package is a great tool for analysing and generating Go code. For example:
type user struct {
    Name string `json:"name"`
    Age  int    `json:"age"`
}
We can generate constants for the user struct as below:
const (
    UserNameJson = "name"
    UserAgeJson  = "age"
)
You may find tgcon helpful for generating field tag values as constants.
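As a rough usage sketch (the constant names are the ones shown above; the exact names a generator emits depend on how it is configured), the generated constants replace both the reflection lookup and the retyped string literals:

package main

import (
    "encoding/json"
    "fmt"
)

// Constants as a generator might emit them for the user struct above.
const (
    UserNameJson = "name"
    UserAgeJson  = "age"
)

func main() {
    // Build JSON keyed by the same names as the struct tags,
    // with no reflect.StructTag lookup and no retyped "name"/"age" literals.
    filter := map[string]interface{}{
        UserNameJson: "alice",
        UserAgeJson:  30,
    }
    out, _ := json.Marshal(filter)
    fmt.Println(string(out)) // {"age":30,"name":"alice"}
}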

How to simplify HTTP post of JSON to GraphQL mutation resolver

I would like to HTTP POST values directly as JSON to an addBook resolver already declared in my GraphQL Mutation.
However, the examples I've seen (and proven out) either serialise parameters from JSON into SDL or re-declare variables in SDL to bind them from a query variable.
Neither approach makes sense, because the addBook mutation already has all parameters and validation declared. Using these approaches would mean unnecessary query-serialisation logic has to be created, debugged and maintained.
I have well-formed JSON (edited and validated against a schema) being constructed in the browser, which conforms to the data of a declared GraphQLObjectType.
Can anyone explain how to avoid this unnecessary reserialisation or duplication when posting against a mutation resolver?
I've been experimenting with multiple ways of mapping a JSON data structure against the addBook mutation, but can't find an example of simply sending the JSON so that property names are bound to addBook parameter names without apparently pointless reserialisation or boilerplate.
The source code at https://github.com/cefn/graphql-gist/tree/master/mutation-map is a minimal reproducible example which demonstrates the problem. It has an addBook resolver which already has parameter names, types and nullability defined. I can't find a way to use JSON to simply POST parameters against addBook.
I'm using GraphiQL as a reference implementation to HTTP POST values.
I could write code to serialise JSON to SDL. It would end up looking like this, which works through GraphiQL:
mutation {addBook(id:"4", name:"Education Course Guide", genre: "Education"){
id
}}
Alternatively, I can write code to explicitly alias each parameter of addBook to a declared query variable, which then allows me to post values as a JSON query variable; this is also proven through GraphiQL:
mutation doAdd($id: String, $name: String!, $genre: String) {
  addBook(id: $id, name: $name, genre: $genre) {
    id
  }
}
...with the query variable...
{
  "name": "Jonathan Livingstone Seagull",
  "id": "6"
}
However, I am sure there's some way to directly post this JSON against addBook, telling it to take parameters from a Query Variable. I'm imagining something like...
mutation {addBook($*){
id
}}
I would like a mutation call against addBook to succeed, taking named values from a JSON Query Variable, but without reserialisation or redeclaration of the properties to parameter names.
This boils down to schema design. Instead of having three arguments on your field
type Mutation {
  addBook(id: ID, name: String!, genre: String!): Book
}
you can have a single argument that takes an input object type
type Mutation {
  addBook(input: AddBookInput!): Book
}

input AddBookInput {
  id: ID
  name: String!
  genre: String!
}
Then your query only has to provide a single variable:
mutation AddBook($input: AddBookInput!) {
  addBook(input: $input) {
    id
  }
}
and your variables look something like:
{
  "input": {
    "name": "Jonathan Livingstone Seagull",
    "genre": "Fable"
  }
}
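For completeness, a minimal sketch of the HTTP request itself, assuming the server follows the common GraphQL-over-HTTP convention of accepting a JSON body with query and variables fields (the endpoint URL is just a placeholder):

package main

import (
    "bytes"
    "encoding/json"
    "log"
    "net/http"
)

func main() {
    // The operation text and the variables travel together in one JSON document.
    body, err := json.Marshal(map[string]interface{}{
        "query": "mutation AddBook($input: AddBookInput!) { addBook(input: $input) { id } }",
        "variables": map[string]interface{}{
            "input": map[string]interface{}{
                "name":  "Jonathan Livingstone Seagull",
                "genre": "Fable",
            },
        },
    })
    if err != nil {
        log.Fatal(err)
    }

    resp, err := http.Post("http://localhost:4000/graphql", "application/json", bytes.NewReader(body))
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()
    log.Println(resp.Status)
}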
Variables have to be explicitly defined as part of the operation definition because GraphQL and JSON are not interchangeable. A JSON string value could be a String, an ID or some custom scalar (like DateTime) in GraphQL. The variable definitions tell GraphQL how to correctly serialize and validate the provided JSON values. Because variables can be used multiple times throughout a document, their types likewise cannot simply be inferred from the types of the arguments they are used with.
EDIT:
Variables are only declared once per document. Once declared, they may be referred to any number of times throughout the document. Imagine a query like
mutation MyMutation($id: ID!) {
  flagSomething(somethingId: $id)
  addPropertyToSomething(id: $id, property: "WOW")
}
We declare the variable once and tell GraphQL it's an ID scalar and it's non-nullable (i.e. required). We then use the variable twice -- once as the value of somethingId on flagSomething and again as the value of id on addPropertyToSomething. The same variable could also be used as the value of a directive's argument -- it's not limited to field arguments. Notice also that nothing says the variable name has to match the argument name -- this is typically only done out of convenience.
The other notable thing is that there are two validation steps happening here.
First, GraphQL will check if the provided variable (i.e. the JSON value) can be serialized into the type specified. Since we declared the variable as non-null (using !), GraphQL will also verify the variable actually exists and is not equal to null.
GraphQL will also verify that the type you specified for the variable matches the types of the arguments where it's actually used. So an Int variable will throw if it's passed to a String argument and so on. Moreover, nullability is checked here too. So an argument that is an Int! (non-null integer) will only accept variables that are also Int!. However, an argument that is Int (i.e. nullable) will accept either Int or Int! variables.
The syntax that exists is there for a reason. The kind of syntax you're imagining would only make sense in a specific scenario where you're querying a single root field, all the variables are used as arguments to that one field, the variable names match the argument names, and you don't need to dynamically set any directive arguments.

How to check null/empty or a value based on the response from a previous Sampler in JMeter?

I have extracted a value from a JSON response into the variable Key. It has two possible values, either:
Key=[]
or
Key=[{"combination":[{"code":"size","value":"Small"}]},{"combination":[{"code":"size","value":"Medium"}]}]
I need to check whether Key is [] or has some values. Could you please tell me what is wrong with the below implementation:
if ("${Key}"=="[]") {
vars.put('size', 'empty')
} else {
vars.put('size', 'notempty')
}
My Switch Controller is not navigating to the else part based on the above implementation. Any help is appreciated!
Don't ever inline JMeter functions or variables into a script body, as they may resolve into something which will cause a compilation failure or unexpected behaviour. Either use the "Parameters" section of the sampler, or use a vars.get('Key') statement instead.
Don't compare strings using ==; use the .equals() method instead.
See the Apache Groovy - Why and How You Should Use It article for more information on using Groovy scripting in JMeter tests.
If you already have an extracted value Key and you are using an If Controller with the default JavaScript, you can do the following in the condition field:
"${Key}".length > 0
The "${Key}" is evaluating to your JavaScript object which is an array. You can check the length of the array to see if there are objects in it.

Elixir - Capitalized keys in structs

I am trying to write a CLI client in Elixir for an API, so that I can log in to the API system, fetch the data I need for my calculation, and then log out. I have defined a Packet.Login struct that is supposed to be the internal data structure I end up with after parsing the JSON I receive.
I am using Poison to parse the JSON. The problem is that the API returns capitalised properties, so Poison gives me back these capitalised keys, and I can't seem to reference them when printing or parsing; it seems impossible to use the alias-style packet.Token syntax. If I try to use another syntax,
packet[:Token]
it still does not work and instead gives me an error, but this time about Packet.Login not implementing the Access behaviour. I can understand that part, but not the first issue. And I'm trying to keep the code stupid simple.
defmodule Packet.Login do
  defstruct [:Data, :Token]
end

defimpl String.Chars, for: Packet.Login do
  def to_string(packet) do
    "Packet:\n---Token:\t\t#{packet.Token}\n---Data:\t#{packet.Data}"
  end
end
loginPacket = Poison.decode!(json, as: %Packet.Login{})
IO.puts "#{loginPacket}"
When trying to compile the above I get this:
** (CompileError) lib/packet.ex:31: invalid alias: "packet.Token". If you wanted to define an alias, an alias must expand to an atom at compile time but it did not, you may use Module.concat/2 to build it at runtime. If instead you wanted to invoke a function or access a field, wrap the function or field name in double quotes
(elixir) expanding macro: Kernel.to_string/1
Is there a way for me to fix this somehow? I have thought of parsing the map and de-capitalizing all fields first, but I would rather not.
Why can't I have capitalized keys for a struct? It seems like I can though, as long as I don't try to use them.
In order to access a field of a map whose key is an atom starting with an uppercase letter, you need to either put the key in quotes, e.g. foo."Bar", or use the bracket syntax, e.g. foo[:Bar]. foo.Bar in Elixir is parsed as an alias. With structs, you cannot use the bracket syntax, so the easiest way is to use quotes around the field name. In your code, you'll therefore need to change:
"Packet:\n---Token:\t\t#{packet.Token}\n---Data:\t#{packet.Data}"
to:
"Packet:\n---Token:\t\t#{packet."Token"}\n---Data:\t#{packet."Data"}"
I could not find this documented clearly anywhere, but Elixir's source mentions it in a few places and also uses this syntax to access some functions in :erlang which have names that are not valid identifiers in Elixir, e.g. :erlang."=<".
Fun fact: you can define functions in Elixir that can only be called with this quote syntax as well:
iex(1)> defmodule Foo do
...(1)> def unquote(:"!##")(), do: :ok
...(1)> end
iex(2)> Foo."!##"()
:ok
