Getting an error while reading Excel through Cypress - CSV

Getting the below error while trying to read a CSV file in Cypress. The file has data, but somehow the xlsx plugin is not able to read it and convert it to JSON.
Below is the code:
const fs = require('fs');
const XLSX = require('xlsx');

const read = ({ file, sheet }) => {
  const buf = fs.readFileSync(file);
  const workbook = XLSX.read(buf, { type: 'buffer' });
  const rows = XLSX.utils.sheet_to_json(workbook.Sheets[sheet]);
  return rows;
};
and the error is given below:
From Node.js Internals:
TypeError: Cannot read property 'length' of undefined
    at Object.sheet_add_json (D:\project\Shobhnaautomation\Internal-ProofOfConcept-BackOffice-Remedy-Web\node_modules\xlsx\xlsx.js:22252:52)
    at read (D:\project\Shobhnaautomation\Internal-ProofOfConcept-BackOffice-Remedy-Web\tests\e2e\plugins\Read-xlsx.js:8:26)
    at invoke (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\task.js:41:15)
    at (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\util.js:41:15)
    at tryCatcher (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\node_modules\bluebird\js\release\util.js:16:24)
    at Function.Promise.attempt.Promise.try (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\node_modules\bluebird\js\release\method.js:39:30)
    at Object.wrapChildPromise (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\util.js:40:24)
    at Object.wrap (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\task.js:47:9)
    at execute (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\run_plugins.js:142:13)
    at EventEmitter. (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\child\run_plugins.js:235:6)
    at EventEmitter.emit (events.js:210:6)
    at process. (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\lib\plugins\util.js:19:23)
    at process.emit (events.js:210:6)
    at process.emit (D:\Users\shobhnag\AppData\Local\Cypress\Cache\4.12.1\Cypress\resources\app\packages\server\node_modules\source-map-support\source-map-support.js:495:22)
    at emit (internal/child_process.js:876:13)
    at processTicksAndRejections (internal/process/task_queues.js:81:22)
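Given the stack trace, the most likely cause (an assumption from the error, not something stated in the question) is that workbook.Sheets[sheet] is undefined because the sheet name passed in doesn't exist in the file, so sheet_to_json receives undefined. A minimal guard sketch:

const fs = require('fs');
const XLSX = require('xlsx');

const read = ({ file, sheet }) => {
  const buf = fs.readFileSync(file);
  const workbook = XLSX.read(buf, { type: 'buffer' });
  const worksheet = workbook.Sheets[sheet];
  // Fail with a useful message instead of letting sheet_to_json
  // receive undefined when the sheet name doesn't match.
  if (!worksheet) {
    throw new Error(
      `Sheet "${sheet}" not found. Available sheets: ${workbook.SheetNames.join(', ')}`
    );
  }
  return XLSX.utils.sheet_to_json(worksheet);
};

Logging workbook.SheetNames from the Cypress task makes it easy to see whether the name being passed matches what is actually in the file.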

Related

How to test an Express API using a JSON data file different from the actual data retrieved as a response

I have an Express API with a JSON data file. Every time I run tests for the API routes (with supertest) it overwrites the data; one solution is to create a separate JSON file just for testing.
The controllers have routes for the CRUD API methods, and each route calls an object module to retrieve data from the db.json file.
The API runs on localhost:3000.
The API tests run on localhost:4000.
In my route I need to pass path = req.get('host') to the module, which has the following:
const fs = require("fs");

const fileDataPath = "./data.json";
const testFilePath = "./testData.json";

const saveData = (data) => {
  const stringifyData = JSON.stringify(data, null, 2);
  fs.writeFileSync(fileDataPath, stringifyData);
};

const getEntryData = () => {
  const jsonData = fs.readFileSync(fileDataPath, "utf-8");
  return JSON.parse(jsonData);
};
If I do module.exports = path and then import it again in the module, I get the following error:
"Accessing non-existent property 'path' of module exports inside circular dependency"
and path is undefined.
How can I run supertest against my API without affecting my JSON data file?
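One common approach (a sketch, not the original poster's code, and it assumes you can select the file by environment) is to let the data module pick its path from NODE_ENV instead of passing a path in from the routes, which also sidesteps the circular dependency entirely:

const fs = require("fs");

// NODE_ENV=test is an assumed convention here; adjust to your test runner.
const dataPath = process.env.NODE_ENV === "test"
  ? "./testData.json"
  : "./data.json";

const saveData = (data) => {
  fs.writeFileSync(dataPath, JSON.stringify(data, null, 2));
};

const getEntryData = () => JSON.parse(fs.readFileSync(dataPath, "utf-8"));

module.exports = { saveData, getEntryData };

Because nothing is exported from or imported back into the route module, there is no cycle, and the supertest run only ever touches testData.json.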

Read a JSON file more than 70 MB in size

Actually, I downloaded a JSON file with more than 10,000 records from the server and extracted it. But I cannot read the JSON file, convert the data to an object, and save it in Realm. I did a lot of searching on npmjs and found the modules below: bfj, big-json, and various JSON-stringify/large-object-optimization packages. But none of them work for me in React Native. First attempt, an invalid fs.createReadStream():
const filepath = "./Basket.json";
const fs = require("fs");

var s = fs.createReadStream(filepath);

The error is: fs.createReadStream is not a function
The other way:
const bfj = require("bfj");
const filepath = "./Basket.json";

const stream = await RNFetchBlob.fs.readStream(
  "./Basket.json",
  "base64",
  4095
);
console.log(stream);
var b = await bfj.parse(stream);

The error: Invalid stream argument
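A note on the second attempt: bfj.parse expects a Node-style readable stream, while RNFetchBlob.fs.readStream resolves to RNFetchBlob's own stream object with an open()/onData()/onEnd() callback API, which plausibly explains the "Invalid stream argument" error. A sketch using that callback API directly (assuming utf8 content and that accumulating the text in memory is acceptable for this file size):

import RNFetchBlob from "rn-fetch-blob";

async function readLargeJson(path) {
  // readStream(path, encoding, bufferSize) resolves to RNFetchBlob's
  // own stream type, not a Node stream, so it can't be handed to bfj.
  const stream = await RNFetchBlob.fs.readStream(path, "utf8", 4096);
  return new Promise((resolve, reject) => {
    let raw = "";
    stream.open();
    stream.onData((chunk) => { raw += chunk; });
    stream.onError(reject);
    stream.onEnd(() => resolve(JSON.parse(raw)));
  });
}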

NodeJS Async JSON parsing causing Buffer.toString() failure

I'm attempting to parse a fairly large JSON file (~500 MB) in NodeJS. My implementation is based on the Async approach given in this answer:
var fileStream = require('fs');
var jsonObj;
fileStream.readFile('./data/exporttest2.json', fileCallback);

function fileCallback(err, data) {
  return err ? (console.log(err), !1) : (jsonObj = JSON.parse(data));
  // Process JSON data here
}
That's all well and good, but I'm getting hit with the following error message:
buffer.js:495
throw new Error('"toString()" failed');
^
Error: "toString()" failed
at Buffer.toString (buffer.js:495:11)
at Object.parse (native)
at fileCallback (C:\Users\1700675\Research\Experiments\NodeJS\rf_EU.js:49:18)
at FSReqWrap.readFileAfterClose [as oncomplete] (fs.js:445:3)
I understand from this answer that this is caused by the maximum buffer length in the V8 engine, set at 256 MB.
My question then is this: is there a way I can asynchronously read my JSON file in chunks that do not exceed the buffer length of 256 MB, without manually splitting my JSON data into several files?
This is a common problem and there are several modules that can help you with that:
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/json-stream
https://www.npmjs.com/package/json-parse-stream
https://www.npmjs.com/package/json-streams
https://www.npmjs.com/package/jsonparse
Example with JSONStream:
const JSONStream = require('JSONStream');
const fs = require('fs');

fs.createReadStream('./data/exporttest2.json')
  .pipe(JSONStream.parse('...'))...
See the docs for details of all of the arguments.
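As a concrete (hypothetical) example, if the file were a top-level JSON array of records, the parse path would be '*' and each element would arrive as its own data event:

const JSONStream = require('JSONStream');
const fs = require('fs');

fs.createReadStream('./data/exporttest2.json')
  // '*' matches each element of a top-level array; adjust the path
  // to the actual shape of your JSON.
  .pipe(JSONStream.parse('*'))
  .on('data', (record) => {
    // Records are parsed one at a time, so no single buffer ever
    // approaches the 256 MB toString() limit.
    console.log(record);
  })
  .on('end', () => console.log('done'));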
Try using streams (note that this particular approach still concatenates the whole file into one string before parsing, so it can hit the same toString() limit for files near or above 256 MB):
let fs = require("fs");
let s = fs.createReadStream('./a.json');
let data = [];

s.on('data', function (chunk) {
  data.push(chunk);
}).on('end', function () {
  let json = Buffer.concat(data).toString();
  console.log(JSON.parse(json));
});

CSV to JSON using NodeJS

I'd like to convert a CSV file to a JSON object using NodeJS. The problem is that my CSV file is hosted on a special URL.
URL: My CSV here
var fs = require("fs");
var Converter = require("csvtojson").Converter;

var fileStream = fs.createReadStream("myurl");
var converter = new Converter({ constructResult: true });
converter.on("end_parsed", function (jsonObj) {
  console.log(jsonObj);
});
fileStream.pipe(converter);
Issue:
Error: ENOENT, open 'C:\xampp\htdocs\GestionDettes\http:\www.banque-france.fr\fileadmin\user_upload\banque_de_france\Economie_et_Statistiques\Changes_et_Taux\page3_quot.csv'
at Error (native)
Edit #1:
Request.get(myurl, function (error, Response, body) {
  var converter = new Converter({
    constructResult: true,
    delimiter: ';'
  });
  converter.fromString(body, function (err, taux) {
    console.log(taux); // it works
  });
});
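Since the old csvtojson Converter is itself a writable stream, the HTTP response can also be piped straight into it instead of buffering the whole body first; a sketch with the same request library and the delimiter from Edit #1:

var Request = require("request");
var Converter = require("csvtojson").Converter;

var converter = new Converter({ constructResult: true, delimiter: ";" });
converter.on("end_parsed", function (jsonObj) {
  console.log(jsonObj); // full result once the response has been parsed
});

// Pipe the HTTP response directly into the converter; no temp file needed.
Request.get(myurl).pipe(converter);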
I did just that in a module that reads and writes over different protocols in different data formats. I used request to get HTTP resources.
If you want, take a look at alinex-datastore. With this module it should work like this:
const DataStore = require('@alinex/datastore').default;

async function transform() {
  const ds = new DataStore('http://any-server.org/my.csv');
  await ds.load();
  await ds.save('file:/etc/my-app.json');
}
That should do it.

Using Streams in MySQL with Node

Following the example on Piping results with Streams2, I'm trying to stream results from MySQL to stdout in node.js.
Code looks like this:
connection.query('SELECT * FROM table')
  .stream()
  .pipe(process.stdout);
I get this error: TypeError: invalid data
Explanation
From this GitHub issue for the project:
.stream() returns stream in "objectMode". You can't pipe it to stdout or network socket because "data" events have rows as payload, not Buffer chunks
Fix
You can fix this using the csv-stringify module.
var stringify = require('csv-stringify');
var stringifier = stringify();

connection.query('SELECT * FROM table')
  .stream()
  .pipe(stringifier)
  .pipe(process.stdout);

Notice the extra .pipe(stringifier) before the .pipe(process.stdout).
There is another solution now with the introduction of pipeline in Node v10 (view documentation).
The pipeline method does several things:
Allows you to pipe through as many streams as you like.
Provides a callback once completed.
Importantly, it provides automatic cleanup, which is a benefit over the standard pipe method.
const fs = require('fs')
const mysql = require('mysql')
const { pipeline } = require('stream')
const stringify = require('csv-stringify')

const stringifier = stringify()
const output = fs.createWriteStream('query.csv')
const connection = mysql.createConnection(...)
const input = connection.query('SELECT * FROM table').stream()

// Pipe the row stream through the CSV stringifier into the file;
// the callback fires once on completion or on the first error.
pipeline(input, stringifier, output, err => {
  if (err) {
    console.log(err)
  } else {
    console.log('Output complete')
  }
})
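For what it's worth, newer Node versions (15+) also ship a promise-based pipeline in stream/promises; a minimal variant of the same flow (not part of the original answer):

const fs = require('fs')
const mysql = require('mysql')
const { pipeline } = require('stream/promises')
const stringify = require('csv-stringify')

async function exportQuery() {
  // Placeholder config; fill in host, user, password, database.
  const connection = mysql.createConnection({})
  const input = connection.query('SELECT * FROM table').stream()
  // Resolves when the write stream finishes, rejects on the first error,
  // with the same automatic cleanup as the callback form.
  await pipeline(input, stringify(), fs.createWriteStream('query.csv'))
  console.log('Output complete')
}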