CSV File object to JSON in Angular

On the front end of my app, I want to parse some data from a CSV the user uploads. Through the file upload tool, I first get a FileList object and then pull the single file out of it.
I want to turn it into a JSON object that I can then iterate over. I was thinking of using csv-parser from Node, but I don't see a way to feed it a File object stored in memory.
How can I accomplish this?
At first I was doing:
let f = fileList.item(0);
let decoder = new window.TextDecoder('utf-8');
f.arrayBuffer().then(data => {
  let _data = decoder.decode(data)
  console.log("Dataset", data, _data)
});
That passes along the array buffer and decodes it into a string. While I could write a generic tool that processes this string data based on \n and ',', I wanted this to be a bit easier to read.
I wanted to do something like:
let json = csvParser(f)
Is there a way to use csv-parser from Node (3.0.0), or is there another tool I should leverage? I was thinking that relying on browser-provided modules (new window.TextDecoder(...)) is poor form, since it has the opportunity to fail.
Is there a tool that does this? I'm trying to create some sample data, and given a File picked from an input type="file", I would want this to be simple and straightforward.
The example below works, but the window dependency and a gut feeling make me think this is naive.
const f: File = fileList.item(0)
console.log("[FOO] File", f)
let decoder = new window.TextDecoder('utf-8');
f.arrayBuffer().then(data => {
  let _data = decoder.decode(data)
  console.log("Dataset", data, _data)
  let lines = _data.split("\n")
  let headers = lines[0].split(',')
  let results = []
  for (let i = 1; i < lines.length; i++) {
    let line = lines[i]
    let row = {}
    line.split(",").forEach((item, idx) => {
      row[headers[idx]] = item;
    })
    results.push(row)
  }
  console.log("JSON ARRAY", results)
})
The issue I run into when I stop and do ng serve is that the build does not like using the arrayBuffer function or accessing TextDecoder from window, since those functions/classes are not part of File and window, respectively, at build time.
Any thoughts?
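For what it's worth, one way to sidestep the explicit window lookup is to use the TextDecoder global directly. A minimal sketch, assuming your TypeScript lib settings include the DOM typings for TextDecoder and File.arrayBuffer():

const file: File | null = fileList.item(0);
if (file) {
  file.arrayBuffer().then(buf => {
    // TextDecoder is a browser global, so no window.* lookup is needed
    const text = new TextDecoder('utf-8').decode(buf);
    console.log('Decoded CSV text', text);
  });
}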

This is what I ended up doing, given the file input being passed into this function:
updateTranscoders(project: Project, fileList: FileList, choice: string = 'replace') {
  const f: File = fileList.item(0)
  // Reads a File into a string.
  function readToString(file): Promise<any> {
    const reader = new FileReader();
    const future = new Promise((resolve, reject) => {
      reader.addEventListener("load", () => {
        resolve(reader.result);
      }, false)
      reader.addEventListener("error", (event) => {
        console.error("ERROR", event)
        reject(event)
      }, false)
      reader.readAsText(file)
    });
    return future;
  }
  readToString(f).then(data => {
    let lines = data.split("\n")
    let headers = lines[0].split(',')
    let results = []
    for (let i = 1; i < lines.length; i++) {
      let line = lines[i]
      let row = {}
      line.split(",").forEach((item, idx) => {
        row[headers[idx]] = item;
      })
      results.push(row)
    }
    if (choice.toLowerCase() === 'replace') {
      let rows = project.csvListContents.toJson().rows.filter(row => row.isDeployed)
      rows.push(...results)
      project.csvListContents = CsvDataset.fromJson({ rows: rows })
    } else if (choice.toLowerCase() === 'append') {
      let r = project.csvListContents.toJson();
      r.rows.push(...results);
      project.csvListContents = CsvDataset.fromJson(r);
    } else {
      alert("Invalid option for Choice.")
    }
    this.saveProject(project)
  })
}
Now, the choice portion of the code is where I have a binary option either to do a hard replace of the CSV contents or to just append to them. I then save the project accordingly. This also assumes that the first row contains the column headers.
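For illustration, this is roughly what the header-based mapping above produces; the CSV text and values below are made up:

// Hypothetical CSV text; the first line is treated as column headers.
const csvText = "name,age\nAda,36\nAlan,41";

const lines = csvText.split("\n");
const headers = lines[0].split(",");
const rows = lines.slice(1).map(line => {
  const row: Record<string, string> = {};
  line.split(",").forEach((item, idx) => { row[headers[idx]] = item; });
  return row;
});

console.log(rows);
// [ { name: 'Ada', age: '36' }, { name: 'Alan', age: '41' } ]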

Related

Rescript manipulate JSON file

I have this JSON file.
Using ReScript I want to:
Read the file.
Extract data from the file.
Write the result to a new file.
{
  "name": "name",
  "examples": [
    {
      "input": [1, 2],
      "result": 1
    },
    {
      "input": [3, 4],
      "result": 3
    }
  ]
}
I was able to achieve this using JavaScript:
var file = Fs.readFileSync("file.json", "utf8");
var data = JSON.parse(file);
var name = data.name
var examples = data.examples
for (let i = 0; i < examples.length; i++) {
  let example = examples[i]
  let input = example.input
  let result = example.result
  let finalResult = `example ${name}, ${input[0]}, ${input[1]}, ${result} \n`
  Fs.appendFileSync('result.txt', finalResult)
}
These are my attempts at writing it in ReScript and the issues I ran into.
let file = Node.Fs.readFileSync("file.json", #utf8)
let data = Js.Json.parseExn(file)
let name = data.name //this doesn't work. The record field name can't be found
So I have tried a different approach (which is a little bit limited because I am specifying the type of the data that I want to extract).
#module("fs")
external readFileSync: (
~name: string,
[#utf8],
) => string = "readFileSync"
type data = {name: string, examples: array<Js_dict.t<t>>}
#scope("JSON") #val
external parseIntoMyData: string => data = "parse"
let file = readFileSync(~name="file.json", #utf8)
let parsedData = parseIntoMyData(file)
let name = parsedData.name
let example = parsedData.examples[0]
let input = parsedData.examples[0].input //this wouldn't work
I also tried to use Node.Fs.appendFileSync(...) and I get "The value appendFileSync can't be found in Node.Fs".
Is there another way to accomplish this?
It's not clear to me why you're using Js.Dict.t<t> for your examples, and what the t in that type refers to. You certainly could use a Js.Dict.t here, and that might make sense if the shape of the data isn't static, but then you'd have to access the data using Js.Dict.get. Seems you want to use record field access instead, and if the data structure is static you can do so if you just define the types properly. From the example you give, it looks like these type definitions should accomplish what you want:
type example = {
  input: (int, int), // or array<int> if it's not always two elements
  result: int,
}
type data = {
  name: string,
  examples: array<example>,
}

Write key and value to JSON file based on the data provided using Cypress

I have to write data captured from the application to a JSON file, like below:
let expectedKey = 'PaperCode';
cy.get('app-screen').find('#code-details').invoke('val').as('code');
cy.get('@code').then((code) => {
  cy.readFile('cypress/fixtures/applicationDetails.json').then((appDetails) => {
    if (expectedKey === 'StudentCode') {
      appDetails.StudentCode = code;
    }
    if (expectedKey === 'DepartmentCode') {
      appDetails.DepartmentCode = code;
    }
    if (expectedKey === 'PaperCode') {
      appDetails.PaperCode = code;
    }
    if (expectedKey === 'ResultsCode') {
      appDetails.ResultsCode = code;
    }
  })
})
Here, the key and its value are added to the JSON in multiple if blocks, and there are still many more if blocks to implement for the different codes. I want to remove the if blocks and add the key and its value to the JSON file based on expectedKey. Any help please?
Use bracket notation appDetails[expectedKey] to define the new property:
let expectedKey = 'studentCode';
cy.readFile('cypress/fixtures/applicationDetails.json').then((appDetails) => {
  cy.get('app-screen').find('#code-details').invoke('val')
    .then((code) => {
      appDetails[expectedKey] = code; // add (or overwrite) new key
      cy.writeFile('cypress/fixtures/applicationDetails.json', appDetails)
    })
})
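Outside of Cypress, the same bracket-notation idea looks like this (a minimal sketch with hypothetical values):

const appDetails: Record<string, string> = {};
const expectedKey = 'PaperCode'; // decided at runtime
const code = 'ABC-123';          // hypothetical captured value

appDetails[expectedKey] = code;  // equivalent to appDetails.PaperCode = code
console.log(appDetails);         // { PaperCode: 'ABC-123' }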
Clearing the file of all keys except the one you want
const expectedKey = 'studentCode';
const appDetails = {}
cy.get('app-screen').find('#code-details').invoke('val')
  .then((code) => {
    appDetails[expectedKey] = code; // add (or overwrite) new key
    cy.writeFile('cypress/fixtures/applicationDetails.json', appDetails)
  })

Getting different results when using Puppeteer page.evaluate()

Why is it that my script will produce the correct results when doing this:
let data = await page.evaluate(async () => {
  let multipleVideosUnorderedList = await document
    .querySelector('article > div')
    .querySelector('ul');
  let video = [];
  if (multipleVideosUnorderedList != null) {
    let multipleVideosList = multipleVideosUnorderedList.children;
    console.log(multipleVideosList);
    for (i = 0; i < multipleVideosList.length; i++) {
      let rightBtn = document.querySelector(
        'button > div.coreSpriteRightChevron'
      );
      if (rightBtn) {
        await rightBtn.parentNode.click();
      }
      let videoUrl = multipleVideosList[i].querySelector('video');
      if (videoUrl) {
        video.push(videoUrl.getAttribute('src'));
      }
    }
  } else {
    video.push(document.querySelector('video').getAttribute('src'));
  }
  return {
    video
  };
});
console.log(data);
But when I reduce it down to just this:
let er = await page.evaluate(() => {
  let multipleVideosUnorderedList = document.querySelector('article > div').querySelector('ul');
  return {
    multipleVideosUnorderedList
  }
});
console.log(er);
the result is undefined. I know there's a lot more code in the former, but I just wanted to see it produce the correct element before I move on to grabbing everything else.
The idea was to take the document.querySelector calls out of the code block and clean it up, and to try to use page.$(selector) instead.
Only serializable objects can go into and come out of page.evaluate; a NodeList and a Node, which are what querySelectorAll/querySelector return, are not such things.
You probably want to find an unordered list which may contain several videos. If this is the case, you could rewrite the code in the following way:
let outerVideos = await page.evaluate(() => {
  // convert the NodeList to an array
  let videos = [...document.querySelectorAll('article > div video')]
    // for each member of the array replace the video node with its src value
    .map(video => video.getAttribute('src'));
  return videos;
});
console.log(outerVideos);
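Since the question mentions wanting to use page.$(selector), a similar sketch with page.$$eval also works; the callback still runs inside the page, so it must return serializable data (the selector is taken from the question):

// $$eval runs the callback against all elements matching the selector, inside the page
const videoSrcs = await page.$$eval('article > div video', els =>
  els.map(el => el.getAttribute('src'))
);
console.log(videoSrcs); // e.g. [ 'https://.../video1.mp4', ... ]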

How to read a text file and return it as a JSON object in Node JS?

I have a text file. I need to read the file inside a function and return it as a JSON object. The following is throwing the error "Unexpected token V in JSON at position 0".
Server.js
fs.readFile('result.txt', 'utf8', function(err, data) {
  if (err) throw err;
  obj = JSON.parse(data);
  console.log(obj);
});
result.txt looks like the following
VO1: 10 5 2
VO2: 5 3 2
I think I cannot use JSON.parse directly. How do I proceed?
Assuming the following:
Every line is separated by a newline character (\n)
Every line contains a : where the part in front of it is the key and the part behind it is a (space-)separated string that represents the key's values as an array.
Below should work for your format:
fs.readFile('result.txt', 'utf8', function(err, data) {
  if (err) throw err;
  let obj = {};
  let splitted = data.toString().split("\n");
  for (let i = 0; i < splitted.length; i++) {
    let splitLine = splitted[i].split(":");
    obj[splitLine[0]] = splitLine[1].trim();
  }
  console.log(obj);
});
It could be an issue with the UTF-8 string format. I tried the code below and it works:
const resultBuffer = fs.readFileSync('result.txt');
const resultData = JSON.parse(resultBuffer.toString().trim());
Thanks to Baao for providing that answer.
As another flavor of solution, if you don't have any ":" (for, say, a list of files), you could always code in a key like so:
var data = fs.readFileSync(pathAndFilename);
var testData = {};
var splitList = data.toString().split('\r\n');
for (var i = 0; i < splitList.length; i++) {
  testData['fileNumber' + i.toString()] = splitList[i];
}
You need to parse the text file by yourself. You can use RegExp or some other means to extract the values, create an object out of that and then JSON.stringify it.
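As a rough sketch of that idea, assuming the result.txt format shown above (a key, a colon, then space-separated numbers):

const fs = require('fs');

fs.readFile('result.txt', 'utf8', (err, data) => {
  if (err) throw err;
  const obj: Record<string, number[]> = {};
  // match lines like "VO1: 10 5 2" and capture the key and the value list
  for (const [, key, values] of data.matchAll(/^(\w+):\s*(.+)$/gm)) {
    obj[key] = values.trim().split(/\s+/).map(Number);
  }
  console.log(JSON.stringify(obj)); // {"VO1":[10,5,2],"VO2":[5,3,2]}
});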
Improving upon @baao's answer:
const fs = require("fs")
fs.readFile('.czrc', 'utf8', function (err, data) {
  if (err) {
    console.error(err)
    throw "unable to read .czrc file.";
  }
  const obj = JSON.parse(data)
});
Your result.txt is not valid JSON.
Valid JSON would look like this:
{
  "VO1": [10, 5, 2],
  "VO2": [5, 3, 2]
}

Node JS: Make a flat json from a tree json

I was writing a Node.js script to combine all the JSON files in a directory and store the result as a new JSON file. I got the job done to a great extent, but it has a few flaws.
A.json
[
  {
    "id": "addEmoticon1",
    "description": "Message to greet the user.",
    "defaultMessage": "Hello, {name}!"
  },
  {
    "id": "addPhoto1",
    "description": "How are youu.",
    "defaultMessage": "How are you??"
  }
]
B.json
[
  {
    "id": "close1",
    "description": "Close it.",
    "defaultMessage": "Close!"
  }
]
What I finally need is:
result.json
{
  "addEmoticon1": "Hello, {name}!",
  "addPhoto1": "How are you??",
  "close1": "Close!"
}
I wrote a node.js script:
var fs = require('fs');
function readFiles(dirname, onFileContent, onError) {
  fs.readdir(dirname, function(err, filenames) {
    if (err) {
      onError(err);
      return;
    }
    filenames.forEach(function(filename) {
      fs.readFile(dirname + filename, 'utf-8', function(err, content) {
        if (err) {
          onError(err);
          return;
        }
        onFileContent(filename, content);
      });
    });
  });
}
var data = {};
readFiles('C:/node/test/', function(filename, content) {
  data[filename] = content;
  var lines = content.split('\n');
  lines.forEach(function(line) {
    var parts = line.split('"');
    if (parts[1] == 'id') {
      fs.appendFile('result.json', parts[3] + ': ', function (err) {});
    }
    if (parts[1] == 'defaultMessage') {
      fs.appendFile('result.json', parts[3] + ',\n', function (err) {});
    }
  });
}, function(err) {
  throw err;
});
It extracts the 'id' and 'defaultMessage' but is not able to append correctly.
What I get:
result.json
addEmoticon1: addPhoto1: Hello, {name}!,
close1: How are you??,
Close!,
This output is different every time I run my script.
Aim 1: Surround items in double quotes,
Aim 2: Add curly braces at the top and at the end
Aim 3: No comma at the end of last element
Aim 4: Same output every time I run my script
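For reference, here is a minimal synchronous sketch that would satisfy all four aims; the directory path and file layout are assumed from the question, and JSON.stringify takes care of the quoting, braces, and trailing commas:

const fs = require('fs');
const path = require('path');

const dir = 'C:/node/test/';
const output: Record<string, string> = {};

const files = fs.readdirSync(dir)
  .filter(name => name.endsWith('.json') && name !== 'result.json')
  .sort(); // stable order, so the output is the same every run

for (const name of files) {
  // each input file is an array of { id, description, defaultMessage } objects
  for (const entry of JSON.parse(fs.readFileSync(path.join(dir, name), 'utf-8'))) {
    output[entry.id] = entry.defaultMessage;
  }
}

// JSON.stringify emits quoted keys/values, surrounding braces, and no trailing comma
fs.writeFileSync(path.join(dir, 'result.json'), JSON.stringify(output, null, 2));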
I'll start with the finished solution...
There's a big explanation at the end of this answer. Let's try to think big-picture for a little bit first, though.
readdirp('.')
.fmap(filter(match(/\.json$/)))
.fmap(map(readfilep))
.fmap(map(fmap(JSON.parse)))
.fmap(concatp)
.fmap(flatten)
.fmap(reduce(createMap)({}))
.fmap(data=> JSON.stringify(data, null, '\t'))
.fmap(writefilep(resolve(__dirname, 'result.json')))
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
Console output
wrote results to /path/to/result.json
result.json (I added a c.json with some data to show that this works with more than 2 files)
{
  "addEmoticon1": "Hello, {name}!",
  "addPhoto1": "How are you??",
  "close1": "Close!",
  "somethingelse": "Something!"
}
Implementation
I made Promise-based interfaces for readdir and readFile and writeFile
import {readdir, readFile, writeFile} from 'fs';
const readdirp = dir=>
new Promise((pass,fail)=>
readdir(dir, (err, filenames) =>
err ? fail(err) : pass(mapResolve (dir) (filenames))));
const readfilep = path=>
new Promise((pass,fail)=>
readFile(path, 'utf8', (err,data)=>
err ? fail(err) : pass(data)));
const writefilep = path=> data=>
new Promise((pass,fail)=>
writeFile(path, data, err=>
err ? fail(err) : pass(path)));
In order to map functions to our Promises, we needed an fmap utility. Notice how we take care to bubble errors up.
Promise.prototype.fmap = function fmap(f) {
return new Promise((pass,fail) =>
this.then(x=> pass(f(x)), fail));
};
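A small usage sketch of fmap together with the readfilep helper above (the file name is hypothetical):

readfilep('a.json')
  .fmap(JSON.parse) // Promise<string> -> Promise<parsed value>
  .then(obj => console.log(obj), err => console.error(err));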
And here's the rest of the utilities
const fmap = f=> x=> x.fmap(f);
const mapResolve = dir=> map(x=>resolve(dir,x));
const map = f=> xs=> xs.map(x=> f(x));
const filter = f=> xs=> xs.filter(x=> f(x));
const match = re=> s=> re.test(s);
const concatp = xs=> Promise.all(xs);
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);
Lastly, the one custom function that does your work
const createMap = map=> ({id, defaultMessage})=>
Object.assign(map, {[id]: defaultMessage});
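And a quick sketch of what reduce(createMap)({}) produces, using sample entries taken from A.json and B.json above:

const entries = [
  { id: 'addEmoticon1', defaultMessage: 'Hello, {name}!' },
  { id: 'close1', defaultMessage: 'Close!' },
];

console.log(reduce(createMap)({})(entries));
// { addEmoticon1: 'Hello, {name}!', close1: 'Close!' }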
And here's c.json
[
  {
    "id": "somethingelse",
    "description": "something",
    "defaultMessage": "Something!"
  }
]
"Why so many little functions ?"
Well despite what you may think, you have a pretty big problem. And big problems are solved by combining several small solutions. The most prominent advantage of this code is that each function has a very distinct purpose and it will always produce the same results for the same inputs. This means each function can be used other places in your program. Another advantage is that smaller functions are easier to read, reason with, and debug.
Compare all of this to the other answers given here; @BlazeSahlen's in particular. That's over 60 lines of code that's basically only usable to solve this one particular problem. And it doesn't even filter out non-JSON files. So the next time you need to create a sequence of actions on reading/writing files, you'll have to rewrite most of those 60 lines each time. It creates lots of duplicated code and hard-to-find bugs because of exhausting boilerplate. And all that manual error-handling... wow, just kill me now. And he/she thought callback hell was bad? Haha, he/she just created yet another circle of hell all on his/her own.
All the code together...
Functions appear (roughly) in the order they are used
import {readdir, readFile, writeFile} from 'fs';
import {resolve} from 'path';
// logp: Promise<Value> -> Void
const logp = p=> p.then(x=> console.log(x), x=> console.error(x));
// fmap : Promise<a> -> (a->b) -> Promise<b>
Promise.prototype.fmap = function fmap(f) {
return new Promise((pass,fail) =>
this.then(x=> pass(f(x)), fail));
};
// fmap : (a->b) -> F<a> -> F<b>
const fmap = f=> x=> x.fmap(f);
// readdirp : String -> Promise<Array<String>>
const readdirp = dir=>
new Promise((pass,fail)=>
readdir(dir, (err, filenames) =>
err ? fail(err) : pass(mapResolve (dir) (filenames))));
// mapResolve : String -> Array<String> -> Array<String>
const mapResolve = dir=> map(x=>resolve(dir,x));
// map : (a->b) -> Array<a> -> Array<b>
const map = f=> xs=> xs.map(x=> f(x));
// filter : (Value -> Boolean) -> Array<Value> -> Array<Value>
const filter = f=> xs=> xs.filter(x=> f(x));
// match : RegExp -> String -> Boolean
const match = re=> s=> re.test(s);
// readfilep : String -> Promise<String>
const readfilep = path=>
new Promise((pass,fail)=>
readFile(path, 'utf8', (err,data)=>
err ? fail(err) : pass(data)));
// concatp : Array<Promise<Value>> -> Array<Value>
const concatp = xs=> Promise.all(xs);
// reduce : (b->a->b) -> b -> Array<a> -> b
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);
// flatten : Array<Array<Value>> -> Array<Value>
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);
// writefilep : String -> Value -> Promise<String>
const writefilep = path=> data=>
new Promise((pass,fail)=>
writeFile(path, data, err=>
err ? fail(err) : pass(path)));
// -----------------------------------------------------------------------------
// createMap : Object -> Object -> Object
const createMap = map=> ({id, defaultMessage})=>
Object.assign(map, {[id]: defaultMessage});
// do it !
readdirp('.')
.fmap(filter(match(/\.json$/)))
.fmap(map(readfilep))
.fmap(map(fmap(JSON.parse)))
.fmap(concatp)
.fmap(flatten)
.fmap(reduce(createMap)({}))
.fmap(data=> JSON.stringify(data, null, '\t'))
.fmap(writefilep(resolve(__dirname, 'result.json')))
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
Still having trouble following along?
It's not easy to see how these things work at first. This is a particularly squirrely problem because the data gets nested very quickly. Thankfully that doesn't mean our code has to be a big nested mess just to solve the problem ! Notice the code stays nice and flat even when we're dealing with things like a Promise of an Array of Promises of JSON...
// Here we are reading directory '.'
// We will get a Promise<Array<String>>
// Let's say the files are 'a.json', 'b.json', 'c.json', and 'run.js'
// Promise will look like this:
// Promise<['a.json', 'b.json', 'c.json', 'run.js']>
readdirp('.')
// Now we're going to strip out any non-JSON files
// Promise<['a.json', 'b.json', 'c.json']>
.fmap(filter(match(/\.json$/)))
// call `readfilep` on each of the files
// We will get <Promise<Array<Promise<JSON>>>>
// Don't freak out, it's not that bad!
// Promise<[Promise<JSON>, Promise<JSON>, Promise<JSON>]>
.fmap(map(readfilep))
// for each file's Promise, we want to parse the data as JSON
// JSON.parse returns an object, so the structure will be the same
// except JSON will be an object!
// Promise<[Promise<Object>, Promise<Object>, Promise<Object>]>
.fmap(map(fmap(JSON.parse)))
// Now we can start collapsing some of the structure
// `concatp` will convert Array<Promise<Value>> to Array<Value>
// We will get
// Promise<[Object, Object, Object]>
// Remember, we have 3 Objects; one for each parsed JSON file
.fmap(concatp)
// Your particular JSON structures are Arrays, which are also Objects
// so that means `concatp` will actually return Promise<[Array, Array, Array]>
// but we'd like to flatten that
// that way each parsed JSON file gets mushed into a single data set
// after flatten, we will have
// Promise<Array<Object>>
.fmap(flatten)
// Here's where it all comes together
// now that we have a single Promise of an Array containing all of your objects ...
// We can simply reduce the array and create the mapping of key:values that you wish
// `createMap` is custom tailored for the mapping you need
// we initialize the `reduce` with an empty object, {}
// after it runs, we will have Promise<Object>
// where Object is your result
.fmap(reduce(createMap)({}))
// It's all downhill from here
// We currently have Promise<Object>
// but before we write that to a file, we need to convert it to JSON
// JSON.stringify(data, null, '\t') will pretty print the JSON using tab to indent
// After this, we will have Promise<JSON>
.fmap(data=> JSON.stringify(data, null, '\t'))
// Now that we have a JSON, we can easily write this to a file
// We'll use `writefilep` to write the result to `result.json` in the current working directory
// I wrote `writefilep` to pass the filename on success
// so when this finishes, we will have
// Promise<Path>
// You could have it return Promise<Void> like writeFile sends void to the callback. up to you.
.fmap(writefilep(resolve(__dirname, 'result.json')))
// the grand finale
// alert the user that everything is done (or if an error occurred)
// Remember `.then` is like a fork in the road:
// the code will go to the left function on success, and the right on failure
// Here, we're using a generic function to say we wrote the file out
// If a failure happens, we write that to console.error
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
All done!
Assuming files is a list of arrays, i.e. [a, b, ...]:
var res = {};
files.reduce((a, b) => a.concat(b), []).forEach(o => res[o.id] = o.defaultMessage);
But you don't need to get all the files at once.
Just add this code to onFileContent callback.
JSON.parse(fileContent).forEach(o => res[o.id] = o.defaultMessage);
Also, you should add a final callback to your readFiles.
And in this callback:
fs.writeFile('result.json', JSON.stringify(res));
So, final solution for you:
var fs = require('fs');
function task(dir, it, cb) {
  fs.readdir(dir, (err, names) => {
    if (err) return cb([err]);
    var errors = [], c = names.length;
    names.forEach(name => {
      fs.readFile(dir + name, 'utf-8', (err, data) => {
        if (err) return errors.push(err);
        try {
          it(JSON.parse(data)); // We get the file's data!
        } catch (e) {
          errors.push('Invalid json in ' + name + ': ' + e.message);
        }
        if (!--c) cb(errors); // We are finished
      });
    });
  });
}
var res = {};
task('C:/node/test/', (data) => data.forEach(o => res[o.id] = o.defaultMessage), (errors) => {
  // Some files can be wrong
  errors.forEach(err => console.error(err));
  // But we write the received data anyway
  fs.writeFile('C:/node/test/result.json', JSON.stringify(res), (err) => {
    if (err) console.error(err);
    else console.log('Task finished. See result.json');
  })
});
This should do it once you have your JSON in variables a and b:
var a = [
  {
    "id": "addEmoticon1",
    "description": "Message to greet the user.",
    "defaultMessage": "Hello, {name}!"
  },
  {
    "id": "addPhoto1",
    "description": "How are youu.",
    "defaultMessage": "How are you??"
  }
];
var b = [
  {
    "id": "close1",
    "description": "Close it.",
    "defaultMessage": "Close!"
  }
];
var c = a.concat(b);
var res = {};
for (var i = 0; i < c.length; i++) {
  res[c[i].id] = c[i].defaultMessage;
}
console.log(res);
Here's my solution:
function readFiles(dirname, onFileContent, onError) {
  fs.readdir(dirname, function(err, filenames) {
    /**
     * We'll store the parsed JSON data in this array
     * @type {Array}
     */
    var fileContent = [];
    if (err) {
      onError(err);
    } else {
      filenames.forEach(function(filename) {
        // Reading the file (synchronously) and storing the parsed JSON output (parsing from string to JSON object)
        var jsonObject = JSON.parse(fs.readFileSync(dirname + filename, 'utf-8'));
        // Pushing the parsed JSON output into array
        fileContent.push(jsonObject);
      });
      // Calling the callback
      onFileContent(fileContent);
    }
  });
}
readFiles('./files/', function(fileContent) {
  /**
   * We'll store the final output object here
   * @type {Object}
   */
  var output = {};
  // Loop over the JSON objects
  fileContent.forEach(function(each) {
    // Looping within each object
    for (var index in each) {
      // Copying the `id` as key and the `defaultMessage` as value and storing in output object
      output[each[index].id] = each[index].defaultMessage;
    }
  });
  // Writing the file (synchronously) after converting the JSON object back to string
  fs.writeFileSync('result.json', JSON.stringify(output));
}, function(err) {
  throw err;
});
A notable difference is that I've not used the asynchronous readFile and writeFile functions, as they'd needlessly complicate the example. This example is meant to showcase the use of JSON.parse and JSON.stringify to do what the OP wants.
UPDATE:
var fs = require('fs');
function readFiles(dirname, onEachFilename, onComplete) {
  fs.readdir(dirname, function(err, filenames) {
    if (err) {
      throw err;
    } else {
      // Prepending the dirname to each filename
      filenames.forEach(function(each, index, array) {
        array[index] = dirname + each;
      });
      // Calling async.map which accepts these parameters:
      // filenames <-------- array of filenames
      // onEachFilename <--- function which will be applied on each filename
      // onComplete <------- function to call when all elements of the filenames array have been processed
      require('async').map(filenames, onEachFilename, onComplete);
    }
  });
}
readFiles('./files/', function(item, callback) {
  // Read the file asynchronously
  fs.readFile(item, function(err, data) {
    if (err) {
      callback(err);
    } else {
      callback(null, JSON.parse(data));
    }
  });
}, function(err, results) {
  /**
   * We'll store the final output object here
   * @type {Object}
   */
  var output = {};
  if (err) {
    throw err;
  } else {
    // Loop over the JSON objects
    results.forEach(function(each) {
      // Looping within each object
      for (var index in each) {
        // Copying the `id` as key and the `defaultMessage` as value and storing in output object
        output[each[index].id] = each[index].defaultMessage;
      }
    });
    // Writing the file (synchronously) after converting the JSON object back to string
    fs.writeFileSync('result.json', JSON.stringify(output));
  }
});
This is a simple asynchronous implementation of the same, using readFile. For more information, see async.map.