I am currently in the process of migrating an Express app to Heroku.
To keep sensitive information out of source control, Heroku uses config vars, which are assigned to environment variables of the same name on process.env.
Currently, I am loading my keys from a .json file, such as:
{
"key": "thisismykey",
"secret": "thisismysecret"
}
However, if I try to load the variables in via Heroku's format:
{
"key": process.env.KEY
"secret": process.env.SECRET
}
Obviously, I get an error here, since this isn't valid JSON. I would assume that it is possible to load these values into JSON somehow, but I'm not sure. How could I do this?
To generate JSON with these values, you would first create a JavaScript object and then use JSON.stringify to turn it into JSON:
var obj = { "key": process.env.KEY,
            "secret": process.env.SECRET };
var json = JSON.stringify(obj);
// => '{"key":"ABCDEFGH...","secret":"MNOPQRST..."}'
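If the goal is simply to use the values at runtime rather than to produce a JSON file, the config object can also be built directly from process.env. A minimal sketch (KEY and SECRET are assumed config var names, and the local fallbacks are illustrative):

```javascript
// config.js -- a minimal sketch. KEY/SECRET are assumed config var
// names, set on Heroku with `heroku config:set KEY=... SECRET=...`.
// The fallbacks are only for local development.
var config = {
  key: process.env.KEY || "local-dev-key",
  secret: process.env.SECRET || "local-dev-secret"
};

module.exports = config;
```

Other modules can then `require('./config')` and never touch the raw environment directly.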
I am attempting to automate downloading files from various sections of an FTP Server.
I'm thinking of having a JSON file that holds the relevant credentials, which would be iterated through to obtain the information and pass it as parameters to the FTP connection.
I've tried going through:
"Lookup" to the JSON file to obtain the array
"Set variable" for the JSON array
ForEach - to run through the JSON file
But I can't seem to get this to work.
I've followed the steps here as a starting point, but to no avail.
The main goal of this exercise is to iterate over the JSON file and pass the values through as parameters in a ForEach loop.
The JSON file is structured as follows (example):
{
"FilesToGet": [
{
"Description": "<Desc>",
"Username": "<Username>",
"Password": "<Password>",
"Subfolder": "<FTP_Subfolder>"
},
{
"Description": "<Desc>",
"Username": "<Username>",
"Password": "<Password>",
"Subfolder": "<FTP_Subfolder>"
}
]
}
This is the set up I have tried so far.
Another point to note is that I am thinking of storing the JSON file in Azure Key Vault (as it will contain sensitive info) - would iterating over this and passing the info to parameters still be viable?
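For reference, the Lookup → ForEach wiring described above can be sketched with Data Factory expressions like the following (LookupFilesToGet is an assumed activity name; with the default firstRowOnly setting, a Lookup over the JSON file returns the whole object as firstRow, so the FilesToGet array can be handed straight to the ForEach):

```
ForEach activity, Settings > Items:
    @activity('LookupFilesToGet').output.firstRow.FilesToGet

Inside the ForEach, each property of the current element:
    @item().Username
    @item().Password
    @item().Subfolder
```

If the file moves to Key Vault, the same loop still works as long as the secret's value is the same JSON document; the Lookup (or a Web activity fetching the secret) just needs to yield the FilesToGet array for the Items expression.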
Let's say that I have a JSON file called data.json in Github. I can view it in raw in a Github URL like this: https://raw.githubusercontent.com/data.json (This is a hypothetical URL. It's not real)
And let's say that URL contains JSON data like this:
{
"users_1": [
{
"id": 1234,
"name": "Bob"
},
{
"id": 5678,
"name": "Alice"
}
]
}
How do I extract the whole JSON data from that URL and store it in a variable in a Cypress test? I know that Cypress doesn't really use Promises, so I'm finding it difficult to implement this. So far I got this in Typescript:
let users; // I want this variable to store JSON data from the URL
const dataUrl = "https://raw.githubusercontent.com/data.json";
cy.request(dataUrl).then((response) => {
users = JSON.parse(response); // This errors out because response is type Cypress.Response<any>
})
I'm planning to do something like this in the future for my project when migrating from Protractor to Cypress. I have a Protractor test that extracts JSON data from a Github file and stores it in a variable by using a Promise. I want to do the same kind of task with Cypress.
I think you should use response.body; it will already have been deserialized for you.
A request body to be sent in the request. Cypress sets the Accepts request header and serializes the response body by the encoding option. (https://docs.cypress.io/api/commands/request#Usage)
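As a sketch of what that looks like in practice (the stubbed response object below just stands in for what cy.request yields, so the snippet can be read outside a Cypress runner):

```javascript
// In a real spec this would be:
//   cy.request(dataUrl).then((response) => {
//     users = response.body.users_1;
//   });
// The stub below mimics the shape of Cypress.Response<any> to show
// that body arrives already deserialized, so JSON.parse is not needed.
const response = {
  status: 200,
  body: {
    users_1: [
      { id: 1234, name: "Bob" },
      { id: 5678, name: "Alice" }
    ]
  }
};

const users = response.body.users_1; // already an object, not a string
```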
I'm trying to authenticate RADIUS Requests against a RESTful API (provided by Customer) using rlm_rest.
The problem I am facing is that the response JSON format (of the REST API provided by the customer) is different from the rlm_rest default format (indicated in etc/raddb/mods-enabled/rest).
My Virtual Server configuration as below:
Default
authorize {
...
...
rest
if (ok) {
update control {
Auth-Type := rest
}
}
}
mods-enabled/rest
authorize {
uri = "https://3rd-party-API/auth"
method = 'post'
body = 'json'
chunk = 0
tls = ${..tls}
data = '{
"code": 1,
"identifier": %I,
"avps": {
"User-Name": ["%{User-Name}"],
"NAS-IP-Address": ["%{NAS-IP-Address}"],
"Called-Station-Id": ["%{Called-Station-Id}"],
"Calling-Station-Id": ["%{Calling-Station-Id}"],
"NAS-Identifier": ["%{NAS-Identifier}"]
}
}'
}
Result
/sbin/radiusd -Xxx
HTTP response code
200
JSON Body
{
"code": "2",
"identifier": "91",
"avps": {
"Customer-Attributes": "Hello"
...
...
"Acct-Interim-Interval": "300"
}
}
The JSON structure is different from the example, so xlat only parses the top-level keys "code", "identifier", and "avps". And, of course, xlat finds no attributes that match the dictionary; it cannot match "avps" itself, so it won't dig deeper.
So I was wondering is there anyway to either
Define the response JSON structure for xlat to parsing
Insert a "is_json" or "do_xlat" flag into the JSON ("avps"), and hope xlat will then dig deeper
Save the JSON and parse with exec/rlm_exec (using JQ or any other bash/JSON tools)
Please advise if there is any workaround. Thanks!
In FreeRADIUS version 4, there's a rlm_json module, which implements a custom node query language based on xpath (jpath). It is extremely limited and only supports some very basic queries (feel free to enhance it via a PR :) ).
Below is an example I pulled out of my library of customer configurations. You can see here it's pulling out two keys (externalId and macAddress) from the root level of the JSON doc and assigning them to a couple of custom attributes (Subscriber-ID and Provisioned-MAC).
map json "%{rest_api:https://${modules.rest[rest_api].server}/admin/api/${modules.rest[rest_api].api_key}/external/getDeviceBySerialNumber?certificateSerialNumber=%{lpad:&TLS-Client-Cert-Serial 40 0}}" {
&Subscriber-ID := '$.externalId'
&Provisioned-MAC := '$.macAddress'
}
The xlat expansion can also be modified to send HTTP body data. Just put a space after the URL and pass your custom JSON blob.
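As a rough sketch of that (the URL, attribute name, and JSON body here are placeholders, not a tested configuration; quoting and escaping may need adjusting for your setup):

```
map json "%{rest_api:https://example.com/auth {\"User-Name\": \"%{User-Name}\"}}" {
        &Subscriber-ID := '$.externalId'
}
```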
I am trying to build a query filter as an array.
So, to make a GET call with some filters in Postman, I built a query like:
"query": [
{
"key": "type",
"value": 3
},
{
"key": "type",
"value": 4
},
{
"key": "type",
"value": 5
}]
That produced a URL with the filters, like
/api/3/vehicles/?type=3&type=4&type=5
But these filters should come from a previous API call.
So I built a script that builds the query as above and saves it in an environment variable.
var query = [];
for (var i = 0; i < data.length; i++) {
    query.push({'key': 'type', 'value': data[i].id});
}
postman.setEnvironmentVariable("query", query);
And, in the JSON file, I used it like:
"query" : {{query}}
But it seems Postman can't recognize it as an environment variable.
I can't even import the JSON file into Postman; I get a format error.
Have you faced this before? How can I solve this problem?
So when you check the environment variables the "query" variable is not there, right?
Also, I'm not sure about the formatting. For declaring an environment variable I use: pm.environment.set("query", query);
You can also add console.log(query) after your for loop, open your Postman console(Ctrl+Alt+C) and verify what query looks like. Maybe it will give you a hint.
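One thing worth checking (this is an assumption about the root cause): Postman environment variables hold strings, so an array needs to be serialized before it is stored, and only then referenced as {{query}}. A minimal sketch, with stand-in data for the previous API call:

```javascript
// Hypothetical data from the previous API call:
const data = [{ id: 3 }, { id: 4 }, { id: 5 }];

// Build the query array:
const query = data.map(function (d) {
  return { key: 'type', value: d.id };
});

// Serialize it, since environment variables can only store strings:
const serialized = JSON.stringify(query);

// In a Postman script this line would be:
//   pm.environment.set("query", serialized);
```

With the variable stored as a string, `"query": {{query}}` in the request body then expands to valid JSON.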
I need to validate some objects in my NodeJS app. I have already used an awesome library, express-validator, and it works perfectly, but now I need to validate different objects, not only requests. Express-validator leverages the validator library, which in turn doesn't support types other than strings.
I have found different options like jsonschema and Ajv.
They offer great features, but I need to be able to set an error message and then just catch an exception or parse it from the returned object, like this:
var schema = {
"id": "/SimplePerson",
"type": "object",
"properties": {
"name": {"type": "string", "error": "A name should be provided"},
"address": {"$ref": "/SimpleAddress"},
"votes": {"type": "integer", "minimum": 1}
}
};
So I can set an error message for every property.
Is there any existing solution to achieve this functionality ?
POSSIBLE SOLUTION
I have found a great library, JSEN. It provides the necessary features.
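For context, the per-property error message idea from the schema above can be sketched without any library (a toy validator, not production code; the rules and messages mirror the example schema):

```javascript
// Walk a schema's properties; whenever a check fails, collect that
// property's own "error" message (falling back to a generic one).
function validate(obj, schema) {
  const errors = [];
  for (const [prop, rule] of Object.entries(schema.properties)) {
    const value = obj[prop];
    if (rule.type === "string" && typeof value !== "string") {
      errors.push(rule.error || prop + " must be a string");
    }
    if (rule.type === "integer" &&
        (!Number.isInteger(value) ||
         (rule.minimum != null && value < rule.minimum))) {
      errors.push(rule.error || prop + " must be an integer >= " + rule.minimum);
    }
  }
  return errors;
}

const schema = {
  properties: {
    name: { type: "string", error: "A name should be provided" },
    votes: { type: "integer", minimum: 1 }
  }
};

const errors = validate({ votes: 0 }, schema);
// errors: ["A name should be provided", "votes must be an integer >= 1"]
```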
Three powerful and popular libraries you can use for JSON validation are:
AJV: https://github.com/epoberezkin/ajv
JOI: https://github.com/hapijs/joi
JSON validator: https://github.com/tdegrunt/jsonschema
All of these libraries allow you to validate different data types, do conditional validation, as well as set custom error messages.
One solution is to use the Joi library:
https://github.com/hapijs/joi
This library is well maintained, widely used, and offers lots of flexibility and possible actions.
Example :
const Joi = require('joi');
const schema = Joi.object().keys({
name: Joi.string().error(new Error('A name should be provided')),
address: Joi.ref('$SimpleAddress'),
votes: Joi.number().min(1),
});
// Return result.
const result = Joi.validate(yourObject, schema);
I use Json Pattern Validator
npm install jpv --save
Usage:
const jpv = require('jpv');
// your json object
var json = {
status: "OK",
id: 123,
type: {}
}
// validation pattern
var pattern = {
status: /OK/i,
id: '(number)',
type: '(object)'
};
var result = jpv.validate( json , pattern)
You can also try nonvalid, a library that supports callback-based validation with custom checks and errors (disclaimer: it is written by me).
I'm about to embark on validation of JSON submissions to my web service and will be using tcomb-validation. It's a lightweight alternative to JSON schema and is based on type combinators.
Example of 'intersections':
var t = require('tcomb-validation');
var Min = t.refinement(t.String, function (s) { return s.length > 2; }, 'Min');
var Max = t.refinement(t.String, function (s) { return s.length < 5; }, 'Max');
var MinMax = t.intersection([Min, Max], 'MinMax');
MinMax.is('abc'); // => true
MinMax.is('a'); // => false
MinMax.is('abcde'); // => false