I download a JSON file with more than 10,000 records from the server and extract it, but I cannot read the JSON file, convert the data to objects, and save them in Realm. I searched npmjs a lot and found the modules below:
bfj
big-json
json stringify large object optimization
None of them worked for me in React Native.
Invalid fs.createReadStream():
const filepath = "./Basket.json";
const fs = require("fs");
var s = fs.createReadStream(filepath);
The error is: fs.createReadStream is not a function
The other way I tried:
const bfj = require("bfj");
// RNFetchBlob comes from the rn-fetch-blob (or react-native-fetch-blob) package
const filepath = "./Basket.json";

const stream = await RNFetchBlob.fs.readStream(filepath, "base64", 4095);
console.log(stream);
var b = await bfj.parse(stream);
The error: Invalid stream argument
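For context, bfj.parse expects a Node.js Readable stream, while RNFetchBlob.fs.readStream resolves to RNFetchBlob's own stream object, which only starts emitting after open() is called; that mismatch is consistent with the "Invalid stream argument" error. Below is a minimal sketch of how that RNFetchBlob stream is normally consumed (assuming the rn-fetch-blob package and utf8 encoding; the final JSON.parse still happens in one go, so this is not a true streaming parse):
import RNFetchBlob from "rn-fetch-blob"; // assumption: rn-fetch-blob is the package providing RNFetchBlob

async function readBasket(filepath) {
    // readStream resolves to an RNFetchBlob stream, not a Node.js Readable
    const ifstream = await RNFetchBlob.fs.readStream(filepath, "utf8", 4096);
    let raw = "";
    ifstream.open();
    ifstream.onData(chunk => {
        raw += chunk; // with "base64" encoding the chunks would need decoding first
    });
    ifstream.onError(err => console.log("read error", err));
    ifstream.onEnd(() => {
        const records = JSON.parse(raw);
        console.log("records:", records.length);
    });
}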
Related
I have an Express API backed by a JSON data file. Every time I run tests for the API routes (with supertest) it overwrites the data, so one of the solutions is to create a separate JSON file just for testing.
The controllers have routes for the CRUD API methods, and each route calls an object module to retrieve data from the db.json file.
The API runs on localhost:3000.
The API tests run on localhost:4000.
In my route I need to pass the path (req.get('host')) to the module, which has the following:
const fs = require("fs");
const fileDataPath = "./data.json"
const testFilePath = "./testData.json"
const saveData = (data) => {
    const stringifyData = JSON.stringify(data, null, 2)
    fs.writeFileSync(fileDataPath, stringifyData)
}

const getEntryData = () => {
    const jsonData = fs.readFileSync(fileDataPath, "utf-8")
    return JSON.parse(jsonData)
}
If I do module.exports = path and then import it again in the module, I get the following error:
"Accessing non-existent property 'path' of module exports inside circular dependency"
and path is undefined.
How can I run supertest against my API without affecting my JSON data file?
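One common approach (a sketch only, not necessarily how your project is wired) is to pick the data file from the environment instead of passing the host around, which also avoids the circular require; the paths below just mirror the ones in the question:
const fs = require("fs");

// Choose the file once, based on how the process was started
const dataPath = process.env.NODE_ENV === "test"
    ? "./testData.json"
    : "./data.json";

const saveData = (data) => {
    fs.writeFileSync(dataPath, JSON.stringify(data, null, 2));
};

const getEntryData = () => JSON.parse(fs.readFileSync(dataPath, "utf-8"));

module.exports = { saveData, getEntryData };
The test script would then run with NODE_ENV=test (for example via cross-env in package.json), so supertest only ever touches testData.json.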
I'm getting the error below while trying to read a CSV file in Cypress. The file has data, but somehow the xlsx plugin is not able to read it and convert it to JSON.
Below is the code:
const fs = require('fs');
const XLSX = require('xlsx');
const read = ({ file, sheet }) => {
    const buf = fs.readFileSync(file);
    const workbook = XLSX.read(buf, { type: 'buffer' });
    const rows = XLSX.utils.sheet_to_json(workbook.Sheets[sheet]);
    return rows;
};
And the error is given below:
From Node.js Internals:
TypeError: Cannot read property 'length' of undefined
    at Object.sheet_add_json (D:\project\Shobhnaautomation\Internal-ProofOfConcept-BackOffice-Remedy-Web\node_modules\xlsx\xlsx.js:22252:52)
    at read (D:\project\Shobhnaautomation\Internal-ProofOfConcept-BackOffice-Remedy-Web\tests\e2e\plugins\Read-xlsx.js:8:26)
    at invoke (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\task.js:41:15)
    at (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\util.js:41:15)
    at tryCatcher (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\node_modules\bluebird\js\release\util.js:16:24)
    at Function.Promise.attempt.Promise.try (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\node_modules\bluebird\js\release\method.js:39:30)
    at Object.wrapChildPromise (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\util.js:40:24)
    at Object.wrap (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\task.js:47:9)
    at execute (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\run_plugins.js:142:13)
    at EventEmitter. (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\run_plugins.js:235:6)
    at EventEmitter.emit (events.js:210:6)
    at process. (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\util.js:19:23)
    at process.emit (events.js:210:6)
    at process.emit (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\node_modules\source-map-support\source-map-support.js:495:22)
    at emit (internal/child_process.js:876:13)
    at processTicksAndRejections (internal/process/task_queues.js:81:22)
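The failing frames come out of sheet_to_json, which suggests workbook.Sheets[sheet] was undefined (for a CSV, xlsx typically exposes a single sheet whose name may not match the sheet argument). A hedged sketch of a more defensive read, falling back to the first sheet name; the function shape and file names just mirror the ones in the question:
const fs = require('fs');
const XLSX = require('xlsx');

const read = ({ file, sheet }) => {
    const buf = fs.readFileSync(file);
    const workbook = XLSX.read(buf, { type: 'buffer' });

    // Fall back to the first sheet if the requested name is not present
    const sheetName = workbook.Sheets[sheet] ? sheet : workbook.SheetNames[0];
    const worksheet = workbook.Sheets[sheetName];
    if (!worksheet) {
        throw new Error(`No sheet found in ${file}; available: ${workbook.SheetNames.join(', ')}`);
    }
    return XLSX.utils.sheet_to_json(worksheet);
};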
I'm attempting to parse a fairly large JSON file (~500 MB) in Node.js. My implementation is based on the async approach given in this answer:
var fileStream = require('fs');
var jsonObj;
fileStream.readFile('./data/exporttest2.json', fileCallback);
function fileCallback (err, data) {
    return err ? (console.log(err), !1) : (jsonObj = JSON.parse(data));
    // Process JSON data here
}
That's all well and good, but I'm getting hit with the following error message:
buffer.js:495
throw new Error('"toString()" failed');
^
Error: "toString()" failed
at Buffer.toString (buffer.js:495:11)
at Object.parse (native)
at fileCallback (C:\Users\1700675\Research\Experiments\NodeJS\rf_EU.js:49:18)
at FSReqWrap.readFileAfterClose [as oncomplete] (fs.js:445:3)
I understand from this answer that this is caused by the maximum buffer length in the V8 engine set at 256Mb.
My question, then, is this: is there a way I can asynchronously read my JSON file in chunks that do not exceed the 256 MB buffer limit, without manually splitting my JSON data into several files?
This is a common problem, and there are several modules that can help you with it:
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/json-stream
https://www.npmjs.com/package/json-parse-stream
https://www.npmjs.com/package/json-streams
https://www.npmjs.com/package/jsonparse
Example with JSONStream:
const JSONStream = require('JSONStream');
const fs = require('fs');
fs.createReadStream('./data/exporttest2.json')
    .pipe(JSONStream.parse('...'))...
See the docs for details of all of the arguments.
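For instance, if the file holds a top-level JSON array (an assumption about the data here), a sketch along these lines processes one element at a time without ever materializing the whole file as a single string:
const JSONStream = require('JSONStream');
const fs = require('fs');

// '*' makes JSONStream emit each element of a top-level array as its own 'data' event
fs.createReadStream('./data/exporttest2.json')
    .pipe(JSONStream.parse('*'))
    .on('data', (record) => {
        // process one record at a time here
    })
    .on('end', () => console.log('done'));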
Try using streams:
let fs = require("fs");
let s = fs.createReadStream('./a.json');

let data = [];
s.on('data', function (chunk) {
    data.push(chunk);
}).on('end', function () {
    let json = Buffer.concat(data).toString();
    console.log(JSON.parse(json));
});
I am trying to write a JSON object to a JSON file. The code executes without errors, but instead of the content of the object being written, all that gets written into the JSON file is:
[object Object]
This is the code that actually does the writing:
fs.writeFileSync('../data/phraseFreqs.json', output)
'output' is a JSON object, and the file already exists. Please let me know if more information is required.
You need to stringify the object.
fs.writeFileSync('../data/phraseFreqs.json', JSON.stringify(output));
I don't think you should use the synchronous approach; asynchronously writing data to a file is better. Also stringify the output if it is an object.
Note: if output is a string, then specify the encoding and remember the flag options as well:
const fs = require('fs');
const content = JSON.stringify(output);
fs.writeFile('/tmp/phraseFreqs.json', content, 'utf8', function (err) {
    if (err) {
        return console.log(err);
    }
    console.log("The file was saved!");
});
Here is the synchronous method of writing data to a file as well, but please consider your use case: Asynchronous vs synchronous execution, what does it really mean?
const fs = require('fs');
const content = JSON.stringify(output);
fs.writeFileSync('/tmp/phraseFreqs.json', content);
Make the json human readable by passing a third argument to stringify:
fs.writeFileSync('../data/phraseFreqs.json', JSON.stringify(output, null, 4));
When sending data to a web server, the data has to be a string. You can convert a JavaScript object into a string with JSON.stringify().
Here is a working example:
var fs = require('fs');
var originalNote = {
    title: 'Meeting',
    description: 'Meeting John Doe at 10:30 am'
};
var originalNoteString = JSON.stringify(originalNote);
fs.writeFileSync('notes.json', originalNoteString);
var noteString = fs.readFileSync('notes.json');
var note = JSON.parse(noteString);
console.log(`TITLE: ${note.title} DESCRIPTION: ${note.description}`);
Hope it helps.
Here's a variation, using the version of fs that uses promises:
const fs = require('fs');
await fs.promises.writeFile('../data/phraseFreqs.json', JSON.stringify(output)); // UTF-8 is default
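Since await needs an async context in CommonJS, a minimal wrapper around that call might look like this (same path as above; the sample object is only for illustration):
const fs = require('fs');

async function save(output) {
    // fs.promises.writeFile returns a promise; UTF-8 is the default encoding
    await fs.promises.writeFile('../data/phraseFreqs.json', JSON.stringify(output));
}

save({ example: 1 }).catch(console.error);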
I'd like to convert a CSV file to a JSON object using Node.js. The problem is that my CSV file is hosted on a special URL.
URL : My CSV here
var fs = require("fs");
var Converter = require("csvtojson").Converter;

var fileStream = fs.createReadStream("myurl");
var converter = new Converter({ constructResult: true });

converter.on("end_parsed", function (jsonObj) {
    console.log(jsonObj);
});

fileStream.pipe(converter);
Issue:
Error: ENOENT, open 'C:\xampp\htdocs\GestionDettes\http:\www.banque-france.fr\fileadmin\user_upload\banque_de_france\Economie_et_Statistiques\Changes_et_Taux\page3_quot.csv'
at Error (native)
Edit #1:
Request.get(myurl, function (error, Response, body) {
    var converter = new Converter({
        constructResult: true,
        delimiter: ';'
    });
    converter.fromString(body, function (err, taux) {
        console.log(taux); // it works
    });
});
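If you'd rather not buffer the whole body in memory, the converter is itself a stream, so (assuming the same request module and myurl variable as in the edit above) the HTTP response can be piped straight into it, just like the local file stream was:
const request = require("request"); // assumption: the request package used above
const Converter = require("csvtojson").Converter;

const converter = new Converter({ constructResult: true, delimiter: ';' });

converter.on("end_parsed", function (jsonObj) {
    console.log(jsonObj); // parsed rows, without holding the raw CSV body as one string
});

// request.get(url) returns a readable stream when no callback is given;
// myurl is the CSV URL from the question
request.get(myurl).pipe(converter);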
I did just that in a module that reads and writes different data formats over different protocols. I used request to get HTTP resources.
If you want, take a look at alinex-datastore. With this module it should work like this:
const DataStore = require('@alinex/datastore').default;

async function transform() {
    const ds = new DataStore('http://any-server.org/my.csv');
    await ds.load();
    await ds.save('file:/etc/my-app.json');
}
That should do it.
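For completeness, calling it is just a matter of awaiting the async function; the URL and target path above are the placeholders from the answer:
transform()
    .then(() => console.log('CSV fetched and saved as JSON'))
    .catch(console.error);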