How to access the contents of a JSON file without a key?

Basically, I am setting up a web server via Node.js and Express (I am a beginner at this) to retrieve data by reading a JSON file.
For example, this is my data.json file:
[
  {
    "color": "black",
    "category": "hue",
    "type": "primary"
  },
  {
    "color": "red",
    "category": "hue",
    "type": "primary"
  }
]
I am trying to retrieve all of the colors by implementing this code for it to display on localhost:
router.get('/colors', function (req, res) {
  fs.readFile(__dirname + '/data.json', 'utf8', function (err, data) {
    data = JSON.parse(data);
    res.json(data); // this displays all of the contents of data.json
  });
});
router.get('/colors:name', function (req, res) {
  fs.readFile(__dirname + '/data.json', 'utf8', function (err, data) {
    data = JSON.parse(data);
    for (var i = 0; i < data.length; i++) {
      res.json(data[i][1]); // trying to display the values of color
    }
  });
});
How do I go about doing this?

What you are trying to do is actually pretty simple once you break it into smaller problems. Here is one way to break it down:
1. Load your JSON data into memory for use by your API.
2. Define an API route which extracts only the colours from your JSON data and sends them to the client as JSON.
var fs = require('fs');

var data = [];
try {
  data = JSON.parse(fs.readFileSync('/path/to/json'));
} catch (e) {
  // Handle JSON parse errors, file-not-found errors, etc.
  data = [{
    "color": "black",
    "category": "hue",
    "type": "primary"
  },
  {
    "color": "red",
    "category": "hue",
    "type": "primary"
  }];
}
router.get('/colors', function (req, res, next) {
  var colors = data.map(function (item) {
    return item.color;
  }); // This will look like: ["black","red"]
  res.json(colors); // Send your array as a JSON array to the client calling this API
});
Some improvements in this method:
The file is read only once synchronously when the application is started and the data is cached in memory for future use.
Using Array.prototype.map to extract an array of colors from the data.
Note:
You can structure the array of colors however you like and send it down as JSON in that structure.
Examples:
var colors = data.map(function(item){return {color:item.color};}); // [{"color":"black"},{"color":"red"}]
var colors = {colors: data.map(function(item){return item.color;})} // { "colors" : ["black" ,"red"] }
Some gotchas in your code:
You are using res.json inside a for loop, which is incorrect: the response should only be sent once. Ideally, you would build the JS object in the structure you need by iterating over your data and then send the completed object once with res.json (which, I'm guessing, internally JSON.stringifys the object and sends it as the response after setting the correct headers). A sketch follows after these points.
Reading files is an expensive operation. If you can afford to read the file once and cache that data in memory, it will be efficient (provided your data is not prohibitively large, in which case using files to store the info might be inefficient to begin with).
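For illustration, a minimal sketch of the build-once-then-respond approach applied to the colors-by-name route from the question (the /colors/:name path is an assumption; it reuses the cached data variable from above):
router.get('/colors/:name', function (req, res) {
  // Build the complete result first, then call res.json exactly once
  var values = data.map(function (item) {
    return item[req.params.name]; // e.g. item.color for GET /colors/color
  });
  res.json(values);
});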

In Express, you can do it this way:
router.get('/colors/:name', (req, res) => {
  const key = req.params.name
  const content = fs.readFileSync(__dirname + '/data.json', 'utf8')
  const data = JSON.parse(content)
  const values = data.reduce((values, value) => {
    values.push(value[key])
    return values
  }, [])
  // values => ['black', 'red']
  res.send(values)
});
Then curl http://localhost/colors/color, and you will get ['black', 'red'].

What you're looking to do is:
res.json(data[i]['color']);
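Note that this line still sits inside the for loop from the question, so res.json would fire more than once. A minimal sketch that collects the values first and responds once (assuming the same data.json as above):
router.get('/colors/:name', function (req, res) {
  fs.readFile(__dirname + '/data.json', 'utf8', function (err, data) {
    data = JSON.parse(data);
    var colors = [];
    for (var i = 0; i < data.length; i++) {
      colors.push(data[i]['color']);
    }
    res.json(colors); // responds once with ["black","red"]
  });
});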

If you don't really want to use the keys in the JSON, you may want to use the Object.values function.
...
data = JSON.parse(data)
var values = []
for (var i = 0; i < data.length; i++) {
  values.push(Object.values(data[i])[0]) // 0 - color, 1 - category, 2 - type
}
res.json(values) // ["black","red"]
...

You should never use fs.readFileSync in production. Any sync function blocks the event loop until its execution is complete, delaying everything that comes after it (use with caution if deemed necessary). A few days back I had the worst experience myself and learned that the hard way.
In Express you can define a route with a param or query and use that to filter the contents inside the fs.readFile callback function.
/**
 * Get color by name
 *
 * @param {String} name name of the color
 * @return {Array} array of the color data matching param
 */
router.get('/colors/:name', (req, res) => {
  const color = req.params.name
  const filename = __dirname + '/data.json';
  fs.readFile(filename, 'utf8', (err, data) => {
    if (err) {
      return res.send([]); // handle any error returned by the readFile function here
    }
    try {
      data = JSON.parse(data); // parse the JSON string to an array
      let filtered = []; // initialise empty array
      if (data.length > 0) { // we got an ARRAY of objects, right? make your check here for the array or else any map, filter, reduce, forEach function will break the app
        filtered = data.filter((obj) => {
          return obj.color === color; // return the object if the condition is true
        });
      }
      return res.send(filtered); // send the response
    }
    catch (e) {
      return res.send([]); // handle any error returned from the JSON.parse function here
    }
  });
});
To summarise, use the asynchronous fs.readFile function so that the event loop is not clogged up. Inside the callback, parse the content and then return the response. The return is really important, or else you might end up getting Error: Can't set headers after they are sent.
DISCLAIMER: The code above is untested but should work. This is just to demonstrate the idea.
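As a side note, the same idea can be written with promises instead of a callback. A minimal sketch, assuming a Node version that ships fs.promises (10+):
const fs = require('fs').promises;

router.get('/colors/:name', async (req, res) => {
  try {
    const raw = await fs.readFile(__dirname + '/data.json', 'utf8');
    const data = JSON.parse(raw);
    const filtered = Array.isArray(data)
      ? data.filter((obj) => obj.color === req.params.name)
      : [];
    return res.send(filtered);
  } catch (e) {
    return res.send([]); // covers both read and parse errors
  }
});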

I don't think you can access JSON values without a key. You can iterate over the object's keys with a for...in loop, e.g. for (var name in object) {}; reading up on for...in may help you.
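For illustration, a minimal sketch of that idea applied to one entry of the data above (a hedged example, not part of the original answer):
var entry = { "color": "black", "category": "hue", "type": "primary" };
var values = [];
for (var name in entry) {
  if (entry.hasOwnProperty(name)) {
    values.push(entry[name]); // ends up as ["black", "hue", "primary"]
  }
}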

Related

Retrieve a request using the requestId after the response is received in Firefox extension

I am writing a Firefox extension using the WebRequest API. I am in a situation where, when I receive the response, I want to look back and find the request associated with this response in order to retrieve a custom request header. My current code is like this:
browser.webRequest.onBeforeSendHeaders.addListener(
  addCustomHeader, // here I add custom_header:value
  {urls: ["<all_urls>"]},
  ["blocking", "requestHeaders"]
);
...
browser.webRequest.onHeadersReceived.addListener(
  process_response, // here I want to get back to the request and retrieve the custom header value
  {urls: ["<all_urls>"]},
  ["blocking", "responseHeaders"]
);
The value of custom_header is set from a global variable, which changes with each request. When I receive the response of, say, request_1, I want to retrieve the header value from request_1 in the process_response() function. However, if I use the value directly, I find it may have been changed by subsequent requests, say request_2 or request_3.
I noticed the response has a requestId property, and I guess I can use it to find the corresponding request. However, I am not able to find any documentation or example that tells me how. I'd appreciate any hint!
Use a global map variable:
const reqMap = (() => {
  const data = new Map();
  const MAX_AGE = 10 * 60e3; // 10 minutes
  const cleanup = () => {
    const cutOff = performance.now() - MAX_AGE;
    data.forEach(({ time }, id) => time < cutOff && data.delete(id));
  };
  return {
    set(id, value) {
      cleanup();
      data.set(id, {value, time: performance.now()});
    },
    pop(id) {
      const {value} = data.get(id) || {};
      data.delete(id);
      return value;
    },
  };
})();

function onBeforeSendHeaders(details) {
  reqMap.set(details.requestId, {any: 'data'});
}

function onHeadersReceived(details) {
  const data = reqMap.pop(details.requestId);
  if (data) {
    // ............
  }
}
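For completeness, a minimal sketch of how these handlers could be registered, mirroring the listeners already shown in the question (the filter and extraInfoSpec values are taken from there):
browser.webRequest.onBeforeSendHeaders.addListener(
  onBeforeSendHeaders,
  {urls: ["<all_urls>"]},
  ["blocking", "requestHeaders"]
);
browser.webRequest.onHeadersReceived.addListener(
  onHeadersReceived,
  {urls: ["<all_urls>"]},
  ["blocking", "responseHeaders"]
);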

Unable to retrieve JSON array with AngularJS $resource

I've been trying to retrieve values from JSON and, so far, have been unsuccessful. It does get called on the front-end when I refresh the page, but the information is not being passed to the next method. I think the issue might be down to the promises.push... line, as I've tried to debug the method underneath and the information is not being passed on at all.
AngularJS:
var promises = [];
promises.push(SpringDataRestService.get({"collection": "subjects"}).$promise);
// Require each of these queries to complete before continuing
$q.all(promises).then(function (data) {
  // Grab the first result
  $scope.available = data[0].subjects;
  $scope.selected = [];
  // If this is an update, get the second result in set
  if (data.length > 1) {
    // For each permission that is assigned to this role, add ID (name) to selected
    for (var i = 0; i < data[1].data.subjects.length; i++) {
      var perm = data[1].data.subjects[i];
      $scope.selected.push(perm.name);
    }
  }
  $scope.tableEditOptions = new NgTableParams({}, {
    dataset: $scope.available
  });
  $scope.available, 'name');
}).catch(function (data) {
  // ERROR
});
JSON:
[
  {
    "name": "FWGWG",
    "description": "WGWGWG",
    "lockId": 0
  },
  {
    "name": "QFQFQF",
    "description": "QFQFQFQ",
    "lockId": 0
  }
]
I'm confident my for loop is wrong as well, due to how it assigns the values, since I don't think it should be data.subjects, but I understand these threads are only one issue per question. Any help would be greatly appreciated.
Use the query method for arrays:
var promise = SpringDataRestService.query({"collection": "subjects"}).$promise;
promise.then(function (dataArr) {
  console.log(dataArr);
  //...
}).catch(function (errorResponse) {
  console.log(errorResponse);
});
With the REST services, the get method returns a JavaScript object and the query method returns a JavaScript array.
From the Docs:
$resource Returns
A resource "class" object with methods for the default set of resource actions optionally extended with custom actions. The default set contains these actions:
{
  'get': {method: 'GET'},
  'save': {method: 'POST'},
  'query': {method: 'GET', isArray: true},
  'remove': {method: 'DELETE'},
  'delete': {method: 'DELETE'}
}
...
It is important to realize that invoking a $resource object method immediately returns an empty reference (object or array depending on isArray). Once the data is returned from the server the existing reference is populated with the actual data.
For more information, see
AngularJS $resource Service API Reference
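As an aside, a minimal sketch of how such a service might be defined with $resource (the module name and URL template here are assumptions, not taken from the question):
angular.module('app').factory('SpringDataRestService', ['$resource', function ($resource) {
  // ':collection' is filled in by the {"collection": "subjects"} argument used above
  return $resource('/api/:collection', {}, {
    get:   { method: 'GET' },
    query: { method: 'GET', isArray: true } // isArray: true is what makes query() return an array
  });
}]);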

Parse JSON file containing multiple objects

I have a JSON file that contains multiple objects of the same structure that look like this:
{
"id": "123",
"type": "alpha"
}
{
"id": "321",
"type": "beta"
}
I'm using node.js to read the file.
fs.readFile(__dirname + "/filename.json", 'utf8', function(err, data) {
var content = JSON.parse(JSON.stringify(data));
If I do a console.log(content) things look good. I see the content of the json file. I'm trying to iterate over each object but I'm not sure how to do that. I've tried using
for(var doc in content)
but the doc isn't each object as I was expecting. How do I loop over the content to get each object in a json format so that I can parse it?
If content is an array, you can use
content.forEach(function (obj, index) { /* your code */ })
See documentation for Array.prototype.forEach()
If you just need to iterate, a forEach loop would work, or a normal for loop:
for (var i = 0; i < content.length; i++) {
  // perform whatever you need on the following object
  var myobject = content[i];
}
Depending on the file, the two current answers (Osama and Daniel) assume you have a JSON array:
[
  {
    "id": "123",
    "type": "alpha"
  },
  {
    "id": "456",
    "type": "beta"
  }
]
In which case, you can use any array iterator:
var async = require('async'),
    content = require(__dirname + "/filename.json");

async.each(content, function (item, callback) {
  //...
});
But in your case it seems not to be valid JSON (no brackets to indicate an array, and no commas to separate the objects), so, in case JSON.parse throws an error on it, you'll need to isolate your objects first:
var fs = require('fs'),
    async = require('async');

fs.readFile(__dirname + "/filename.notjson", 'utf8', function (err, data) {
  // split on '}' and drop the trailing empty chunk, then restore the brace before parsing
  var content = data.split('}').filter(function (item) {
    return item.trim().length > 0;
  });
  async.map(content, function (item, callback) {
    callback(null, JSON.parse(item + '}'));
  }, function (err, content) {
    console.log(content);
  });
});
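Alternatively, a minimal sketch that turns the whole file into one valid JSON array and parses it in one go (assuming the objects are separated only by whitespace and contain no braces inside string values):
var fs = require('fs');

fs.readFile(__dirname + "/filename.notjson", 'utf8', function (err, data) {
  // insert commas between adjacent objects and wrap the result in brackets
  var content = JSON.parse('[' + data.trim().replace(/\}\s*\{/g, '},{') + ']');
  content.forEach(function (obj) {
    console.log(obj.id, obj.type);
  });
});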

How can I stream a JSON Array from NodeJS to postgres

I am trying to insert a couple of million records (with approximately 6 fields/columns) by receiving requests from clients with 10,000 records per bulk insert attempt (using sequelize.js and bulkCreate()).
This obviously was a bad idea, so I tried looking into node-pg-copy-streams.
However, I do not want to initiate a change on the client side, where a JSON array is sent as such:
# python
data = [
    {
        "column a": "a values",
        "column b": "b values",
    },
    ...
    # 10,000 items
    ...
]
request.post(data=json.dumps(data), url=url)
On the server side in Node.js, how would I stream the received request.body in the following skeleton?
.post(function(req, res){
  // old sequelize code
  /* table5.bulkCreate(
    req.body, {raw:true}
  ).then(function(){
    return table5.findAll();
  }).then(function(result){
    res.json(result.count);
  });*/

  // new pg-copy-streams code
  pg.connect(function(err, client, done) {
    var stream = client.query(copyFrom('COPY my_table FROM STDIN'));
    // My question is here, how would I stream or pipe the request body ?
    // ?.on('error', done);
    // ?.pipe(stream).on('finish', done).on('error', done);
  });
});
Here's how I solved my problem.
First, a function to convert my req.body dict to a TSV (not a part of the initial problem):
/**
 * Converts a dictionary and set of keys to a Tab Separated Value blob of text
 * @param {Dictionary object} dict
 * @param {Array of Keys} keys
 * @return {Concatenated Tab Separated Values} String
 */
function convertDictsToTSV(dicts, keys){
  // ...
}
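The body was omitted in the original post; a minimal sketch of what such a helper might look like (an assumption, not the author's actual code):
function convertDictsToTSV(dicts, keys) {
  // One line per dict, values in the order given by keys, separated by tabs
  return dicts.map(function (dict) {
    return keys.map(function (key) {
      return dict[key];
    }).join('\t');
  }).join('\n') + '\n';
}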
Second, the rest of my original .post function:
.post(function(req, res){
  // ...
  /* requires 'stream' as
   * var stream = require('stream');
   * var copyFrom = require('pg-copy-streams').from;
   */
  var read_stream_string = new stream.Readable();
  read_stream_string._read = function noop() {}; // implement _read as a no-op; data is pushed manually below
  var keys = [...]; // set of dictionary keys to extract from req.body
  read_stream_string.push(convertDictsToTSV(req.body, keys));
  read_stream_string.push(null);
  pg.connect(connectionString, function(err, client, done) {
    // ...
    // error handling
    // ...
    var copy_string = 'COPY tablename (' + keys.join(',') + ') FROM STDIN'
    var pg_copy_stream = client.query( copyFrom( copy_string ) );
    read_stream_string.pipe(pg_copy_stream).on('finish', function(finished){
      // handle finished and done appropriately
    }).on('error', function(errored){
      // handle errored and done appropriately
    });
  });
  pg.end();
});
Technically, there is no streaming here, not in terms of how NodeJS streaming works.
You are sending a chunk of 10,000 records each time and expect your server-side to insert those and return an OK to the client to send another 10,000 records. That's throttling/paging data in, not streaming.
Once your server has received the next 10,000 records, insert them (usually as a transaction), and then respond with OK back to the client so it can send the next 10,000 records.
Writing transactions with node-postgres isn't an easy task, as it is too low-level for that.
Below is an example of how to do that with the help of pg-promise:
function insertRecords(records) {
  return db.tx(t => {
    var inserts = [];
    records.forEach(r => {
      var query = t.none("INSERT INTO table(fieldA, ...) VALUES(${propA}, ...)", r);
      inserts.push(query);
    });
    return t.batch(inserts);
  });
}
Then inside your HTTP handler, you would write:
function myPostHandler(req, res) {
  // var records = get records from the request;
  insertRecords(records)
    .then(data => {
      // set response as success;
    })
    .catch(error => {
      // set response as error;
    });
}

node.js routes validate json body

I'm using express, body-parser and mongoose to build a RESTful web service with Node.js. I'm getting JSON data in the body of a POST request; that function looks like this:
router.route('/match')
  // create a match (accessed at POST http://localhost:3000/api/match)
  .post(function(req, res) {
    if (req._body == true && req.is('application/json') == 'application/json' ) {
      var match = new Match(); // create a new instance of the match model
      match.name = req.body.name; // set the match name and so on...
      match.host = req.body.host;
      match.clients = req.body.clients;
      match.status = req.body.status;
      // save the match and check for errors
      match.save(function(err) {
        if (err) {
          //res.send(err);
          res.json({ status: 'ERROR' });
        } else {
          res.json({ status: 'OK', Match_ID: match._id });
        }
      });
    } else {
      res.json({ status: 'ERROR', msg: 'not application/json type'});
    }
  });
The model I'm using for storing a match in the database looks like this:
// app/models/match.js
var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var MatchSchema = new Schema({
  name: String,
  host: String,
  clients: { type: [String] },
  date: { type: Date, default: Date.now },
  status: { type: String, default: 'started' }
});

module.exports = mongoose.model('Match', MatchSchema);
But how do I validate that the JSON data in the body of the POST request has the key/value fields I want? For clarification, I don't want to insert incomplete data into the database. If I skip a key/value pair in the JSON data, I get a missing field in the database, and when I try to read the req.body.MISSING_FIELD parameter in my code I get undefined. All fields except date in the model are required.
I'm using JSON strings like this to add matches to the database:
{"name": "SOCCER", "host": "HOST_ID1", "clients": ["CLIENT_ID1", "CLIENT_ID2"], "status": "started"}
I use a very simple function that takes an array of keys, then loops through it and ensures that req.body[key] is not a falsy value. It is trivial to modify it to accommodate only undefined values however.
In app.js
app.use( (req, res, next ) => {
  req.require = ( keys = [] ) => {
    keys.forEach( key => {
      // NOTE: This will throw an error for ALL falsy values.
      // If that is not the desired outcome, use
      // if( typeof req.body[key] === "undefined" )
      if( !req.body[key] )
        throw new Error( "Missing required fields" );
    });
  };
  next(); // pass control to the next middleware, or every request will hang here
});
In your route handler:
try {
  // Immediately throws an error if the provided keys are not in req.body
  req.require( [ "requiredKey1", "requiredKey2" ] );
  // Other code, typically async/await for simplicity
} catch( e ) {
  // Handle errors, or
  next( e ); // Use the error-handling middleware defined in app.js
}
This only checks to ensure that the body contains the specified keys. It won't validate the data sent in any meaningful way. This is fine for my use case, because if the data is totally borked then I'll just handle the error in the catch block and throw an HTTP error code back at the client (consider sending a meaningful payload as well).
If you want to validate the data in a more sophisticated way (like, say, ensuring that an email is in the correct format, etc.), you might want to look into validation middleware, like https://express-validator.github.io/docs/
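For reference, a minimal sketch of what that could look like with express-validator (assuming v6+, with rules matching the Match model above):
const { body, validationResult } = require('express-validator');

router.post('/match', [
  body('name').isString().notEmpty(),
  body('host').isString().notEmpty(),
  body('clients').isArray(),
  body('status').isString().notEmpty()
], (req, res) => {
  const errors = validationResult(req);
  if (!errors.isEmpty()) {
    return res.json({ status: 'ERROR', errors: errors.array() });
  }
  // all required fields are present and well-formed; safe to build and save the Match here
  res.json({ status: 'OK' });
});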