Search items in external JSON

I have the URL of a JSON file and I want to get all the items with the same value.
Example:
http://sampleurl.com has this JSON
`{
    "posts": [
        {
            "authors": [
                {
                    "name": "John",
                    "age": 30
                },
                {
                    "name": "John",
                    "age": 35
                }
            ]
        }
    ]
}`
What I want to do is to list all those authors with the same name together with their age.
I have tried this with no success:
`var allposts = "http://sampleurl.com";
$.each(allposts.posts.authors, function(i, v) {
    if (v.name == "John") {
        alert("Ok");
        return;
    }
});`
Thanks

You need to get the data via an Ajax call - $.getJSON:
const authors = {};
$.getJSON("http://sampleurl.com", data => {
    data.posts.authors.forEach(author => {
        authors[author.name] = authors[author.name] || [];
        authors[author.name].push(author);
    });
});
At the end you have an object keyed on unique author names, with each key containing as its value an array of the authors with that name. You can do further processing to transform that to the data structure you need.
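For instance, one possible further step, sketched here, lists every duplicated name together with the ages involved:
Object.entries(authors).forEach(([name, group]) => {
    if (group.length > 1) {
        console.log(name, group.map(a => a.age)); // e.g. John [ 30, 35 ]
    }
});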
Note that the code above doesn't deal with data coming back in a shape you don't expect. For example, if some author records are missing a name, you will end up with a key named undefined. And if the returned object has no posts key or no authors key, you will get an exception.
So you have to decide how your program should behave in those cases. Should it explode? Or return an empty object? If you want it to continue with an empty object:
const authors = {};
$.getJSON("http://sampleurl.com", data => {
    if (data.posts && data.posts.authors) {
        data.posts.authors.forEach(author => {
            const name = author.name || 'unknown';
            authors[name] = authors[name] || [];
            authors[name].push(author);
        });
    } else {
        console.log('Warning! Data from API did not contain posts.authors!');
    }
});
Note that neither of these examples deals with the Ajax call itself failing. For that you need to chain a .fail() handler:
const authors = {};
const url = "http://sampleurl.com";
$.getJSON(url, data => {
    if (data.posts && data.posts.authors) {
        data.posts.authors.forEach(author => {
            const name = author.name || 'unknown';
            authors[name] = authors[name] || [];
            authors[name].push(author);
        });
    } else {
        console.log('Warning! Data from API did not contain posts.authors!');
    }
}).fail(res => console.log(`Ajax call to ${url} failed with message ${res.responseText}!`));
10% of programming is getting it to work. The other 90% is coding for what happens when it doesn't work.

Related

I'm reading in a JSON of nested JavaScript Objects - I want to 'sort' it so objects that have a field with a certain value go to the top

I'm reading in a JSON that has a map of JavaScript objects. For example:
{
    offers: {
        "1": {"id":"1", "category":"a", "offerType":"LS"},
        "2": {"id":"2", "category":"a", "offerType":"EX"},
        "3": {"id":"3", "category":"a", "offerType":"EX"},
        "4": {"id":"4", "category":"a", "offerType":"LS"}
    }
}
When I read this JSON, I am storing it in local storage. I want to "sort" it so that all offers with an offerType of "LS" show up at the TOP of my object in local storage.
The reason I want to do this is so that when I display these offers on my site, the ones with offerType "LS" will display first.
I am doing this in Angular:
let offers = data.offers;
if (offers != null) {
    for (var index in offers) {
        var offer = offers[index];
        if (offer != undefined) {
            if (offer.offerType == 'LS') {
                offersLS = [...offersLS, offer];
            }
        }
    }
    if (offersLS != null) {
        offersLS.forEach(offerLS => {
            let key = offerLS['id'];
            listOffers = offers[key], listOffers;
        });
    }
    listOffers = listOffers, offers;
}
listOffers is what ends up getting saved as my local storage object. I have tried to do it like listOffers = [...offersLS, ...offers], but that obviously saves it in my localStorage as an array, and I need it to be a 'map' of these objects, or an object of objects... not exactly sure what the correct terminology would be.
First, you should know that sorting an object may be a bad idea; objects are not designed for that. Consider using arrays instead.
Anyway, the solution is quite simple: sort the keys, then rebuild an object based on the new key order:
const offers = {
    "1": {"id":"1", "category":"a", "offerType":"LS"},
    "2": {"id":"2", "category":"a", "offerType":"EX"},
    "3": {"id":"3", "category":"a", "offerType":"EX"},
    "4": {"id":"4", "category":"a", "offerType":"LS"}
};
const sortedOffers = Object.keys(offers)
    .sort((k1, k2) => {
        // your sort rules here: rank 'LS' offers before everything else
        const rank = k => (offers[k].offerType === 'LS' ? 0 : 1);
        return rank(k1) - rank(k2);
    })
    .reduce((o, k) => {
        o[k] = offers[k];
        return o;
    }, {});
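One caveat: JavaScript always enumerates integer-like keys such as "1" in ascending numeric order, regardless of insertion order, so the rebuilt object will not reflect the sort for this particular data. If the consumer can accept an array instead, a minimal sketch of the array-based alternative:
const sortedList = Object.values(offers)
    .sort((a, b) => (a.offerType === 'LS' ? 0 : 1) - (b.offerType === 'LS' ? 0 : 1));
// sortedList now holds the 'LS' offers first, ready to display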

Variable scope & Callback woes

This program reads through a nested object searching for a specific key and its values. Once the data is found, it has to invoke a callback to send the data back. The object looks like this:
{
    "name": "joel",
    "title": "CTO",
    "edu": {
        "school": "RMB",
        "college": "GNK",
        "pg": "CDAC",
        "extract": "This is a large text ..."
    }
}
Since I come from a synchronous programming background, I am not able to work out when I have to invoke the callback, and also how to ensure the variables are in scope:
function parseData(str, callback) {
    function recursiveFunction(obj) {
        var keysArray = Object.keys(obj);
        for (var i = 0; i < keysArray.length; i++) {
            var key = keysArray[i];
            var value = obj[key];
            if (value === Object(value)) {
                recursiveFunction(value);
            } else {
                if (key == 'title') {
                    var title = value;
                }
                if (key == 'extract') {
                    var extract = value.replace(/(\r\n|\n|\r)/gm," ");
                    callback(null, JSON.stringify({title: title, text: extract}));
                }
            }
        }
    }
    recursiveFunction(str, callback(null, JSON.stringify({title: title, text: extract})));
}
When this code is executed, we get the following error:
/parseData.js:29
recursiveFunction(str, callback(null, JSON.stringify({title: title, text: extract})));
^
ReferenceError: title is not defined
Okay. So you want a function that retrieves the first property named title and the first property named extract from a nested object, no matter how deeply nested these properties are.
"Extract a property value from an object" is basically is a task in its own right, we could write a function for it.
There are three cases to handle:
The argument is not an object - return undefined
The argument contains the key in question - return the associated value
Otherwise, recurse into the object and repeat steps 1 and 2 - return the result accordingly
It could look like this:
function pluck(obj, searchKey) {
    var val;
    if (!obj || typeof obj !== "object") return;
    if (obj.hasOwnProperty(searchKey)) return obj[searchKey];
    Object.keys(obj).forEach(function (key) {
        if (val) return;
        val = pluck(obj[key], searchKey);
    });
    return val;
}
Now we can call pluck() on any object with any key, and it will return the first value it finds anywhere in the object.
Now the rest of your task becomes very easy:
var obj = {
    "name": "joel",
    "title": "CTO",
    "edu": {
        "school": "RMB",
        "college": "GNK",
        "pg": "CDAC",
        "extract": "This is a large text ..."
    }
}
var data = {
    title: pluck(obj, "title"),
    text: pluck(obj, "extract")
};
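If you still want the callback interface from the question, a minimal sketch (keeping the same parseData signature) simply wraps the two pluck() calls:
function parseData(obj, callback) {
    callback(null, JSON.stringify({
        title: pluck(obj, "title"),
        text: (pluck(obj, "extract") || "").replace(/(\r\n|\n|\r)/gm, " ")
    }));
}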
This function that you've posted above has nothing to do with async programming, so I will respond in the context of the chunk of code that you've posted. The error happens because you are calling recursiveFunction(str, callback(null, JSON.stringify({title: title, text: extract}))); but the title variable is nowhere defined at that point. I can see a definition of title, but it is inside the recursiveFunction function. The variables that you define in there are not visible outside the scope of that function, and that's why you get this error.
You are trying to do something strange in this line:
recursiveFunction(str, callback(null, JSON.stringify({title: title, text: extract})));
This line invokes the callback immediately and passes its result into recursiveFunction. I would expect to see something like this instead:
recursiveFunction(str, callback);
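Putting both observations together, a sketch of a corrected version (assuming, as in the sample object, that title is visited before extract) hoists title into the shared scope and only fires the callback once extract is found:
function parseData(obj, callback) {
    var title;
    function recursiveFunction(node) {
        Object.keys(node).forEach(function (key) {
            var value = node[key];
            if (value === Object(value)) {
                recursiveFunction(value);
            } else if (key == 'title') {
                title = value;
            } else if (key == 'extract') {
                callback(null, JSON.stringify({
                    title: title,
                    text: value.replace(/(\r\n|\n|\r)/gm, " ")
                }));
            }
        });
    }
    recursiveFunction(obj);
}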

Decipher ES6 const destructuring declaration

Can someone help me decipher this ES6 statement?
const {
    isFetching,
    lastUpdated,
    items: posts
} = postsByReddit[selectedReddit] || {
    isFetching: true,
    items: []
}
I pulled it from the Redux async example - https://github.com/reactjs/redux/blob/master/examples/async/containers/App.js#L81
The code is simply declaring three constants, taking them from similarly named properties of postsByReddit[selectedReddit] when that value is truthy, and otherwise from an object literal that provides default values.
I trust that you are confused over the object like syntax rather than the const keyword.
var|let|const { ... } = ... is an object destructuring declaration.
var|let|const [ ... ] = ... is an array destructuring declaration.
Both are shorthand for "break down the right-hand side and assign it to the left-hand side".
Destructuring can be done on array or object using different brackets.
It can be part of a declaration or as stand-alone assignment.
const { isFetching } = obj; // Same as const isFetching = obj.isFetching
var [ a, b ] = ary; // Same as var a = ary[0], b = ary[1]
[ a ] = [ 1 ]; // Same as a = 1
For object destructuring, you can specify the property name.
For array, you can skip elements by leaving blank commas.
Destructuring can also form a hierarchy and be mixed.
const { items: posts } = obj; // Same as const posts = obj.items
var [ , , c ] = ary; // Same as var c = ary[2]
let { foo: [ { bar } ], bas } = obj; // Same as let bar = obj.foo[0].bar, bas = obj.bas
Destructuring null or undefined, or array-destructuring a non-iterable, throws a TypeError.
Otherwise, if a matching part cannot be found, its value is undefined, unless a default is set.
let { err1 } = null; // TypeError
let [ err3 ] = {}; // TypeError
let [ { err2 } ] = [ undefined ]; // TypeError
let [ no ] = []; // undefined
let { body } = {}; // undefined
let { here = this } = {}; // here === this
let { valueOf } = 0; // Surprise! valueOf === Number.prototype.valueOf
Array destructuring works on any "iterable" objects, such as Map, Set, or NodeList.
Of course, these iterable objects can also be destructured as objects.
const doc = document;
let [ a0, a1, a2 ] = doc.querySelectorAll( 'a' ); // Get first three <a> into a0, a1, a2
let { 0: a, length } = doc.querySelectorAll( 'a' ); // Get first <a> and number of <a>
Finally, don't forget that destructuring can be used in any declaration, not just inside a function body:
function log ({ method = 'log', message }) {
console[ method ]( message );
}
log({ method: "info", message: "This calls console.info" });
log({ message: "This defaults to console.log" });
for ( let i = 0, list = frames, { length } = frames ; i < length ; i++ ) {
console.log( list[ i ] ); // Log each frame
}
Note that the left-hand side dictates how the right-hand side is broken down. In a destructuring declaration the targets must be new variables, but in a stand-alone assignment the targets can also be object properties, e.g. ({ x: obj.prop } = src);.
Computed property names are likewise allowed on the left-hand side, e.g. const { [key]: value } = obj;.
As you have seen, destructuring is a simple shorthand concept that will help you do more with less code.
It is well supported in Chrome, Edge, Firefox, Node.js, and Safari,
so you can start learning and using it now!
For EcmaScript5 (IE11) compatibility, Babel and Traceur transpilers
can turn most ES6/ES7 code into ES5, including destructuring.
If it is still unclear, feel free to come to the StackOverflow JavaScript chatroom.
It is the second most popular room on SO, and experts are available 24/7 :)
This is an additional response to the already given. Destructuring also supports default values, which enables us to simplify the code:
const {
    isFetching = true,
    lastUpdated,
    items: posts = []
} = postsByReddit[selectedReddit] || {};
Basically:
var isFetching;
var lastUpdated;
var posts;
if (postsByReddit[selectedReddit]) {
    isFetching = postsByReddit[selectedReddit].isFetching;
    lastUpdated = postsByReddit[selectedReddit].lastUpdated;
    posts = postsByReddit[selectedReddit].items;
} else {
    isFetching = true;
    posts = [];
}
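A quick sanity check with hypothetical data shows the defaults kicking in when the entry is missing:
const postsByReddit = {};
const selectedReddit = 'reactjs';
const {
    isFetching = true,
    lastUpdated,
    items: posts = []
} = postsByReddit[selectedReddit] || {};
console.log(isFetching, lastUpdated, posts); // true undefined []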

Node.js: Make a flat JSON from a tree JSON

I was writing a Node.js script to combine all the JSON files in a directory and store the result as a new JSON file. It mostly does the job, but it has a few flaws.
A.json
[
    {
        "id": "addEmoticon1",
        "description": "Message to greet the user.",
        "defaultMessage": "Hello, {name}!"
    },
    {
        "id": "addPhoto1",
        "description": "How are youu.",
        "defaultMessage": "How are you??"
    }
]
B.json
[
    {
        "id": "close1",
        "description": "Close it.",
        "defaultMessage": "Close!"
    }
]
What I finally need is:
result.json
{
    "addEmoticon1": "Hello, {name}!",
    "addPhoto1": "How are you??",
    "close1": "Close!"
}
I wrote a node.js script:
var fs = require('fs');
function readFiles(dirname, onFileContent, onError) {
    fs.readdir(dirname, function(err, filenames) {
        if (err) {
            onError(err);
            return;
        }
        filenames.forEach(function(filename) {
            fs.readFile(dirname + filename, 'utf-8', function(err, content) {
                if (err) {
                    onError(err);
                    return;
                }
                onFileContent(filename, content);
            });
        });
    });
}
var data = {};
readFiles('C:/node/test/', function(filename, content) {
    data[filename] = content;
    var lines = content.split('\n');
    lines.forEach(function(line) {
        var parts = line.split('"');
        if (parts[1] == 'id') {
            fs.appendFile('result.json', parts[3] + ': ', function (err) {});
        }
        if (parts[1] == 'defaultMessage') {
            fs.appendFile('result.json', parts[3] + ',\n', function (err) {});
        }
    });
}, function(err) {
    throw err;
});
It extracts the 'id' and 'defaultMessage' but is not able to append correctly.
What I get:
result.json
addEmoticon1: addPhoto1: Hello, {name}!,
close1: How are you??,
Close!,
This output is different every time I run my script.
Aim 1: Surround items in double quotes,
Aim 2: Add curly braces at the top and at the end
Aim 3: No comma at the end of last element
Aim 4: Same output every time I run my script
I'll start with the finished solution...
There's a big explanation at the end of this answer. Let's try to think big-picture for a little bit first, though.
readdirp('.')
    .fmap(filter(match(/\.json$/)))
    .fmap(map(readfilep))
    .fmap(map(fmap(JSON.parse)))
    .fmap(concatp)
    .fmap(flatten)
    .fmap(reduce(createMap)({}))
    .fmap(data => JSON.stringify(data, null, '\t'))
    .fmap(writefilep(resolve(__dirname, 'result.json')))
    .then(filename => console.log('wrote results to %s', filename), err => console.error(err));
Console output
wrote results to /path/to/result.json
result.json (I added a c.json with some data to show that this works with more than 2 files)
{
    "addEmoticon1": "Hello, {name}!",
    "addPhoto1": "How are you??",
    "close1": "Close!",
    "somethingelse": "Something!"
}
Implementation
I made Promise-based interfaces for readdir, readFile, and writeFile:
import {readdir, readFile, writeFile} from 'fs';
const readdirp = dir =>
    new Promise((pass, fail) =>
        readdir(dir, (err, filenames) =>
            err ? fail(err) : pass(mapResolve(dir)(filenames))));
const readfilep = path =>
    new Promise((pass, fail) =>
        readFile(path, 'utf8', (err, data) =>
            err ? fail(err) : pass(data)));
const writefilep = path => data =>
    new Promise((pass, fail) =>
        writeFile(path, data, err =>
            err ? fail(err) : pass(path)));
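Each wrapper is then consumed like any other promise; for instance (illustrative):
readfilep('./a.json').then(text => console.log(text), err => console.error(err));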
In order to map functions to our Promises, we needed an fmap utility. Notice how we take care to bubble errors up.
Promise.prototype.fmap = function fmap(f) {
    return new Promise((pass, fail) =>
        this.then(x => pass(f(x)), fail));
};
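To see fmap in isolation (illustrative):
Promise.resolve(21).fmap(x => x * 2).then(x => console.log(x)); // logs 42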
And here's the rest of the utilities
const fmap = f => x => x.fmap(f);
const mapResolve = dir => map(x => resolve(dir, x));
const map = f => xs => xs.map(x => f(x));
const filter = f => xs => xs.filter(x => f(x));
const match = re => s => re.test(s);
const concatp = xs => Promise.all(xs);
const reduce = f => y => xs => xs.reduce((y, x) => f(y)(x), y);
const flatten = reduce(y => x => y.concat(Array.isArray(x) ? flatten(x) : x))([]);
Lastly, the one custom function that does your work
const createMap = map => ({id, defaultMessage}) =>
    Object.assign(map, {[id]: defaultMessage});
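A quick illustration of createMap folding two records, using the reduce utility defined above:
const m = reduce(createMap)({})([
    {id: 'a', defaultMessage: 'A!'},
    {id: 'b', defaultMessage: 'B!'}
]);
console.log(m); // { a: 'A!', b: 'B!' }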
And here's c.json
[
    {
        "id": "somethingelse",
        "description": "something",
        "defaultMessage": "Something!"
    }
]
"Why so many little functions ?"
Well despite what you may think, you have a pretty big problem. And big problems are solved by combining several small solutions. The most prominent advantage of this code is that each function has a very distinct purpose and it will always produce the same results for the same inputs. This means each function can be used other places in your program. Another advantage is that smaller functions are easier to read, reason with, and debug.
Compare all of this to the other answers given here; @BlazeSahlen's in particular. That's over 60 lines of code that's basically only usable to solve this one particular problem. And it doesn't even filter out non-JSON files. So the next time you need to create a sequence of actions on reading/writing files, you'll have to rewrite most of those 60 lines each time. That creates lots of duplicated code and hard-to-find bugs because of exhausting boilerplate. And all that manual error-handling... wow, just kill me now. And they thought callback hell was bad? Ha, they just created yet another circle of hell all on their own.
All the code together...
Functions appear (roughly) in the order they are used
import {readdir, readFile, writeFile} from 'fs';
import {resolve} from 'path';

// logp : Promise<Value> -> Void
const logp = p => p.then(x => console.log(x), x => console.error(x));

// fmap : Promise<a> -> (a->b) -> Promise<b>
Promise.prototype.fmap = function fmap(f) {
    return new Promise((pass, fail) =>
        this.then(x => pass(f(x)), fail));
};

// fmap : (a->b) -> F<a> -> F<b>
const fmap = f => x => x.fmap(f);

// readdirp : String -> Promise<Array<String>>
const readdirp = dir =>
    new Promise((pass, fail) =>
        readdir(dir, (err, filenames) =>
            err ? fail(err) : pass(mapResolve(dir)(filenames))));

// mapResolve : String -> Array<String> -> Array<String>
const mapResolve = dir => map(x => resolve(dir, x));

// map : (a->b) -> Array<a> -> Array<b>
const map = f => xs => xs.map(x => f(x));

// filter : (Value -> Boolean) -> Array<Value> -> Array<Value>
const filter = f => xs => xs.filter(x => f(x));

// match : RegExp -> String -> Boolean
const match = re => s => re.test(s);

// readfilep : String -> Promise<String>
const readfilep = path =>
    new Promise((pass, fail) =>
        readFile(path, 'utf8', (err, data) =>
            err ? fail(err) : pass(data)));

// concatp : Array<Promise<Value>> -> Promise<Array<Value>>
const concatp = xs => Promise.all(xs);

// reduce : (b->a->b) -> b -> Array<a> -> b
const reduce = f => y => xs => xs.reduce((y, x) => f(y)(x), y);

// flatten : Array<Array<Value>> -> Array<Value>
const flatten = reduce(y => x => y.concat(Array.isArray(x) ? flatten(x) : x))([]);

// writefilep : String -> Value -> Promise<String>
const writefilep = path => data =>
    new Promise((pass, fail) =>
        writeFile(path, data, err =>
            err ? fail(err) : pass(path)));

// -----------------------------------------------------------------------------

// createMap : Object -> Object -> Object
const createMap = map => ({id, defaultMessage}) =>
    Object.assign(map, {[id]: defaultMessage});

// do it!
readdirp('.')
    .fmap(filter(match(/\.json$/)))
    .fmap(map(readfilep))
    .fmap(map(fmap(JSON.parse)))
    .fmap(concatp)
    .fmap(flatten)
    .fmap(reduce(createMap)({}))
    .fmap(data => JSON.stringify(data, null, '\t'))
    .fmap(writefilep(resolve(__dirname, 'result.json')))
    .then(filename => console.log('wrote results to %s', filename), err => console.error(err));
Still having trouble following along?
It's not easy to see how these things work at first. This is a particularly squirrely problem because the data gets nested very quickly. Thankfully that doesn't mean our code has to be a big nested mess just to solve the problem! Notice the code stays nice and flat even when we're dealing with things like a Promise of an Array of Promises of JSON...
// Here we are reading directory '.'
// We will get a Promise<Array<String>>
// Let's say the files are 'a.json', 'b.json', 'c.json', and 'run.js'
// Promise will look like this:
// Promise<['a.json', 'b.json', 'c.json', 'run.js']>
readdirp('.')
// Now we're going to strip out any non-JSON files
// Promise<['a.json', 'b.json', 'c.json']>
.fmap(filter(match(/\.json$/)))
// call `readfilep` on each of the files
// We will get Promise<Array<Promise<JSON>>>
// Don't freak out, it's not that bad!
// Promise<[Promise<JSON>, Promise<JSON>, Promise<JSON>]>
.fmap(map(readfilep))
// for each file's Promise, we want to parse the data as JSON
// JSON.parse returns an object, so the structure will be the same
// except JSON will be an object!
// Promise<[Promise<Object>, Promise<Object>, Promise<Object>]>
.fmap(map(fmap(JSON.parse)))
// Now we can start collapsing some of the structure
// `concatp` will convert Array<Promise<Value>> to Promise<Array<Value>>
// We will get
// Promise<[Object, Object, Object]>
// Remember, we have 3 Objects; one for each parsed JSON file
.fmap(concatp)
// Your particular JSON structures are Arrays, which are also Objects
// so that means `concatp` will actually return Promise<[Array, Array, Array]>
// but we'd like to flatten that
// that way each parsed JSON file gets mushed into a single data set
// after flatten, we will have
// Promise<Array<Object>>
.fmap(flatten)
// Here's where it all comes together
// now that we have a single Promise of an Array containing all of your objects ...
// We can simply reduce the array and create the mapping of key:values that you wish
// `createMap` is custom tailored for the mapping you need
// we initialize the `reduce` with an empty object, {}
// after it runs, we will have Promise<Object>
// where Object is your result
.fmap(reduce(createMap)({}))
// It's all downhill from here
// We currently have Promise<Object>
// but before we write that to a file, we need to convert it to JSON
// JSON.stringify(data, null, '\t') will pretty print the JSON using tab to indent
// After this, we will have Promise<JSON>
.fmap(data=> JSON.stringify(data, null, '\t'))
// Now that we have a JSON, we can easily write this to a file
// We'll use `writefilep` to write the result to `result.json` in the current working directory
// I wrote `writefilep` to pass the filename on success
// so when this finishes, we will have
// Promise<Path>
// You could have it return Promise<Void>, like writeFile sends void to its callback. Up to you.
.fmap(writefilep(resolve(__dirname, 'result.json')))
// the grand finale
// alert the user that everything is done (or if an error occurred)
// Remember `.then` is like a fork in the road:
// the code will go to the left function on success, and the right on failure
// Here, we're using a generic function to say we wrote the file out
// If a failure happens, we write that to console.error
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
All done !
Assuming files is a list of arrays ([a, b, ...]):
var res = {};
files.reduce((a, b) => a.concat(b), []).forEach(o => res[o.id] = o.defaultMessage);
But you don't need to collect all the files at once. Just add this code to the onFileContent callback:
JSON.parse(fileContent).forEach(o => res[o.id] = o.defaultMessage);
Also, you should add a final callback to your readFiles. And in this callback:
fs.writeFile('result.json', JSON.stringify(res));
So, final solution for you:
var fs = require('fs');
function task(dir, it, cb) {
    fs.readdir(dir, (err, names) => {
        if (err) return cb([err]);
        var errors = [], c = names.length;
        names.forEach(name => {
            fs.readFile(dir + name, 'utf-8', (err, data) => {
                if (err) {
                    errors.push(err);
                } else {
                    try {
                        it(JSON.parse(data)); // We got a file's data!
                    } catch (e) {
                        errors.push('Invalid json in ' + name + ': ' + e.message);
                    }
                }
                if (!--c) cb(errors); // We are finished
            });
        });
    });
}
var res = {};
task('C:/node/test/', (data) => data.forEach(o => res[o.id] = o.defaultMessage), (errors) => {
    // Some files can be wrong
    errors.forEach(err => console.error(err));
    // But we write the received data anyway
    fs.writeFile('C:/node/test/result.json', JSON.stringify(res), (err) => {
        if (err) console.error(err);
        else console.log('Task finished. See result.json');
    });
});
This should do it once you have your JSON in variables a and b:
var a = [
    {
        "id": "addEmoticon1",
        "description": "Message to greet the user.",
        "defaultMessage": "Hello, {name}!"
    },
    {
        "id": "addPhoto1",
        "description": "How are youu.",
        "defaultMessage": "How are you??"
    }
];
var b = [
    {
        "id": "close1",
        "description": "Close it.",
        "defaultMessage": "Close!"
    }
];
var c = a.concat(b);
var res = {};
for (var i = 0; i < c.length; i++) {
    res[c[i].id] = c[i].defaultMessage;
}
console.log(res);
Here's my solution:
var fs = require('fs');
function readFiles(dirname, onFileContent, onError) {
    fs.readdir(dirname, function(err, filenames) {
        /**
         * We'll store the parsed JSON data in this array
         * @type {Array}
         */
        var fileContent = [];
        if (err) {
            onError(err);
        } else {
            filenames.forEach(function(filename) {
                // Reading the file (synchronously) and storing the parsed JSON output (parsing from string to JSON object)
                var jsonObject = JSON.parse(fs.readFileSync(dirname + filename, 'utf-8'));
                // Pushing the parsed JSON output into array
                fileContent.push(jsonObject);
            });
            // Calling the callback
            onFileContent(fileContent);
        }
    });
}
readFiles('./files/', function(fileContent) {
    /**
     * We'll store the final output object here
     * @type {Object}
     */
    var output = {};
    // Loop over the JSON objects
    fileContent.forEach(function(each) {
        // Looping within each object
        for (var index in each) {
            // Copying the `id` as key and the `defaultMessage` as value and storing in output object
            output[each[index].id] = each[index].defaultMessage;
        }
    });
    // Writing the file (synchronously) after converting the JSON object back to string
    fs.writeFileSync('result.json', JSON.stringify(output));
}, function(err) {
    throw err;
});
A notable difference is that I've not used the asynchronous readFile and writeFile functions, as they'd needlessly complicate the example. This example is meant to showcase the use of JSON.parse and JSON.stringify to do what the OP wants.
UPDATE:
var fs = require('fs');
function readFiles(dirname, onEachFilename, onComplete) {
    fs.readdir(dirname, function(err, filenames) {
        if (err) {
            throw err;
        } else {
            // Prepending the dirname to each filename
            filenames.forEach(function(each, index, array) {
                array[index] = dirname + each;
            });
            // Calling async.map which accepts these parameters:
            // filenames <-------- array of filenames
            // onEachFilename <--- function which will be applied to each filename
            // onComplete <------- function to call when all elements of the filenames array have been processed
            require('async').map(filenames, onEachFilename, onComplete);
        }
    });
}
readFiles('./files/', function(item, callback) {
    // Read the file asynchronously
    fs.readFile(item, function(err, data) {
        if (err) {
            callback(err);
        } else {
            callback(null, JSON.parse(data));
        }
    });
}, function(err, results) {
    /**
     * We'll store the final output object here
     * @type {Object}
     */
    var output = {};
    if (err) {
        throw err;
    } else {
        // Loop over the JSON objects
        results.forEach(function(each) {
            // Looping within each object
            for (var index in each) {
                // Copying the `id` as key and the `defaultMessage` as value and storing in output object
                output[each[index].id] = each[index].defaultMessage;
            }
        });
        // Writing the file (synchronously) after converting the JSON object back to string
        fs.writeFileSync('result.json', JSON.stringify(output));
    }
});
This is a simple asynchronous implementation of the same, using readFile. For more information, see async.map.

Serializing rows of key/value pairs into a JSON object

I need help transforming this table of data:
[
    {property: "key", content: "1"},
    {property: "key", content: "2"},
    {property: "key2", content: "3"},
    {property: "key2:subkey", content: "4"},
    {property: "key2:someother:key", content: "5"},
    {property: "foo", content: "6"},
    {property: "foo", content: "7"},
    {property: "foo:bar", content: "8"}
]
into a JSON object with the following structure:
{
    key: ["1", "2"],
    key2: {
        '': "3",
        subkey: "4",
        someother: {
            key: "5"
        }
    },
    foo: [
        "6",
        {
            '': "7",
            bar: "8"
        }
    ]
}
Here are the rules. Note: all rules apply to any level in the JSON object (json.levelOne, json.level.two, json.level.three.even, etc)
Each row.property like "a:b:c" should translate into json.a.b.c = row.content.
When row.property = "x" and json.x !== undefined, then json.x = [json.x, row.content].
Whenever typeof json.x === "string" and row.property = "x:y", then json.x = {'': json.x, y: row.content}.
Whenever Array.isArray(json.x) && typeof json.x[json.x.length-1] === "string" and row.property = "x:y", then json.x[json.x.length-1] = {'': json.x[json.x.length-1], y: row.content}.
Hopefully that gives you some idea as to the criteria of what I need to do to translate the data into this JSON object format.
Why?
I'm trying to take Open Graph metadata and serialize it into a JSON object. I feel like the format above would best reflect the Open Graph metadata structure. I need help writing this algorithm though. This is for an open source Node.js project that I'm working on.
All help is appreciated. Thanks!
edit
So there are some issues left for the parser. Arrays occur at leaf nodes in some cases.
Here is the project on GitHub: https://github.com/samholmes/node-open-graph. Feel free to fork it, build a better parser, and send me a pull request.
Updated per our discussion on IRC
var data = [
    {property: "key", content: "1"},
    {property: "key", content: "2"},
    {property: "key2", content: "3"},
    {property: "key2:subkey", content: "4"},
    {property: "key2:someother:key", content: "5"},
    {property: "foo", content: "6"},
    {property: "foo", content: "7"},
    {property: "foo:bar", content: "8"},
    {property: "foo:baz", content: "9"}
];

var transformed = {};

data.forEach(function (item) {
    var key, tmp,
        ptr = transformed,
        keys = item.property.split(':');
    // we want to leave one key to assign to, so we always use references
    // as long as there's one key left, we're dealing with a sub-node and not a value
    while (keys.length > 1) {
        key = keys.shift();
        if (Array.isArray(ptr[key])) {
            // the last index of ptr[key] should become
            // the object we are examining.
            tmp = ptr[key].length - 1;
            ptr = ptr[key];
            key = tmp;
        }
        if (typeof ptr[key] === 'string') {
            // if it's a string, convert it
            ptr[key] = { '': ptr[key] };
        } else if (ptr[key] === undefined) {
            // create a new key
            ptr[key] = {};
        }
        // move our pointer to the next subnode
        ptr = ptr[key];
    }
    // deal with the last key
    key = keys.shift();
    if (ptr[key] === undefined) {
        ptr[key] = item.content;
    } else if (Array.isArray(ptr[key])) {
        ptr[key].push(item.content);
    } else {
        ptr[key] = [ptr[key], item.content];
    }
});

console.log(transformed);
Outputs:
{
    key: ['1', '2'],
    key2: {
        '': '3',
        subkey: '4',
        someother: {
            key: '5'
        }
    },
    foo: ['6', {
        '': '7',
        bar: '8',
        baz: '9'
    }]
}