I am trying to process a CSV file in NestJS using Multer and Papa Parse. I do not want to store the file locally. I just want to parse CSV files to extract some information.
However, I am unable to process it. I have tried two different ways. In the first one, I passed the file buffer to the Papa.parse function, but I get the error: ReferenceError: FileReaderSync is not defined
@Post('1')
@UseInterceptors(
  FileInterceptor('file', {})
)
async uploadFile(@UploadedFile() file: Express.Multer.File) {
  const csvData = papa.parse(file.buffer, {
    header: false,
    worker: true,
    delimiter: ",",
    step: function (row) {
      console.log("Row: ", row.data);
    }
  });
}
So I tried calling readFileSync() as shown below, but this time I got the error: ERROR [ExceptionsHandler] ENAMETOOLONG: name too long, open
@Post('2')
@UseInterceptors(
  FileInterceptor('file', {})
)
async uploadFile(@UploadedFile() file: Express.Multer.File) {
  const $file = readFileSync(file.buffer);
  const csvData = papa.parse($file, {
    header: false,
    worker: true,
    delimiter: ",",
    step: function (row) {
      console.log("Row: ", row.data);
    }
  });
}
I would appreciate any help resolving this issue.
As pointed out by @skink, the file buffer needs to be converted to a stream before it can be used by Papa Parse.
const { Readable } = require('stream');
And I updated the function, converting file.buffer to a stream before calling parse():
@Post('1')
@UseInterceptors(
  FileInterceptor('file', {})
)
async uploadFile(@UploadedFile() file: Express.Multer.File) {
  const stream = Readable.from(file.buffer);
  const csvData = papa.parse(stream, {
    header: false,
    worker: true,
    delimiter: ",",
    step: function (row) {
      console.log("Row: ", row.data);
    }
  });
}
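For completeness, here is a minimal sketch of how the same handler could collect the rows and only respond once parsing has finished; it assumes the same FileInterceptor setup as above and uses Papa Parse's complete and error config callbacks:

@Post('1')
@UseInterceptors(FileInterceptor('file', {}))
async uploadFile(@UploadedFile() file: Express.Multer.File) {
  const stream = Readable.from(file.buffer);
  // Wrap the event-based parse in a Promise so the handler can await all rows
  const rows = await new Promise<unknown[]>((resolve, reject) => {
    const collected: unknown[] = [];
    papa.parse(stream, {
      header: false,
      delimiter: ",",
      step: (row) => collected.push(row.data), // called once per parsed row
      complete: () => resolve(collected),      // stream fully consumed
      error: (err) => reject(err),             // parsing or stream error
    });
  });
  return { rowCount: rows.length };
}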
Related
gulp.task('default', function (done) {
  inquirer.prompt([{
    type: `input`,
    message: `Enter the path`,
    default: `./admin/admin.json`,
    name: `path`
  }]).then(function (answers) {
    console.log(answers.path);
    console.log('answers');
    mydefaultTaskTwo(null, answers.path).pipe(pipedFunction());
    done();
  })
});

function mydefaultTaskTwo(cb, path) {
  let data = '';
  try {
    data = fs.readFileSync(path, 'utf-8');
  } catch (e) {
    console.log(`Error: ${e}`);
  }
  return data;
}

function pipedFunction() {
  let object = JSON.parse(data);
  object['main'] = 'admin';
  data = JSON.stringify(object);
  const readable = Readable.from(data);
  return readable;
}
I understand that src returns a stream and pipe takes that stream and returns a stream, but how do you feed the stream into pipedFunction, which is called inside pipe? I am unsure how it works. I get the following error:
ReferenceError: data is not defined.
Is there something I am misunderstanding about gulp scripts?
Basically you define data as a local variable inside mydefaultTaskTwo and try to reach it from a different scope, where it's undefined. So you need to make use of the fact that data is returned and pass it along, like:
var data = mydefaultTaskTwo(null, answers.path);
data.pipe(pipedFunction(data));
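To make that concrete, here is a minimal sketch (not the answer author's exact code) where pipedFunction receives the file contents as a parameter and returns a readable stream; piping it to process.stdout is just for illustration:

const gulp = require('gulp');
const inquirer = require('inquirer');
const fs = require('fs');
const { Readable } = require('stream');

// pipedFunction now takes the data explicitly instead of reading an outer variable
function pipedFunction(data) {
  const object = JSON.parse(data);
  object['main'] = 'admin';
  return Readable.from(JSON.stringify(object));
}

function mydefaultTaskTwo(cb, path) {
  try {
    return fs.readFileSync(path, 'utf-8');
  } catch (e) {
    console.log(`Error: ${e}`);
    return '';
  }
}

gulp.task('default', function (done) {
  inquirer.prompt([{ type: 'input', message: 'Enter the path', default: './admin/admin.json', name: 'path' }])
    .then(function (answers) {
      const data = mydefaultTaskTwo(null, answers.path);
      // pipe the resulting readable stream wherever it is needed; stdout is just for illustration
      pipedFunction(data).pipe(process.stdout);
      done();
    });
});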
I have the following json:
.../src/app/assets/i18n/en.json
{
"TEST": "This is a some test data in a json file"
}
I have the following code:
const folderName = "assets/i18n/en.json";
knownFolders.currentApp().getFile(folderName).readText().then(a => console.log("json file: " + JSON.parse(a)));
It gives me the following error:
ERROR Error: Uncaught (in promise): SyntaxError: Unexpected end of JSON input
JS: SyntaxError: Unexpected end of JSON input
JS: at JSON.parse (<anonymous>)
I have tried:
Setting folderName to "/assets/i18n/en.json"
Rebuilding, and reconnecting my testing phone
Using HTTP with this.http.get("~/app/assets/i18n/en.json").toPromise().then(res => console.log("http???", res)).catch(err => console.log("err:", err));
Printing the file object without parsing it (it's empty...)
But the error stays the same.
Update 1:
It seems that the file, sadly, does not exist.
this code:
const folderName = "assets/i18n";
const fileName = "en.json";
console.log("exists?", File.exists(folderName + "/" + fileName));
returns false.
Although, see the picture provided of the project files.
(The code provided is in app.component.ts, in the constructor of AppComponent.)
What can be the problem here?
Update 2:
updated my webpack.config.js to copy .json files:
new CopyWebpackPlugin([
  { from: { glob: "fonts/**" } },
  { from: { glob: "**/*.jpg" } },
  { from: { glob: "**/*.json" } },
  { from: { glob: "**/*.png" } },
], { ignore: [`${relative(appPath, appResourcesFullPath)}/**`] }),
but still no luck. The file still does not exist...
Update 3:
This is getting ridiculous... the file can be imported as a standard JSON file, but the NativeScript lib still does not see it.
import { File } from "tns-core-modules/file-system";
import config from "./assets/i18n/hu.json";
....
const folderName = "./assets/i18n";
const fileName = "hu.json";
console.log("config:", config.test)
console.log("exists?", File.exists(folderName + "/" + fileName));
this produces the following output:
JS: config: This is a translated line
JS: exists? false
AFAIK, the path must be split; you can't request a file with a relative path directly.
const folderName = "assets/i18n";
const fileName = "en.json";
console.log(
knownFolders.currentApp().getFolder(folderName).getFile(fileName).readTextSync(),
);
I faced a similar situation while working on an app; the issue was that the file permissions were not granted.
Another way around it, which worked perfectly without needing any permission, was to fetch the JSON from a URL and work with it.
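A minimal sketch of that fetch-based approach, assuming the JSON is hosted at a hypothetical URL (NativeScript provides a global fetch implementation):

// Hypothetical URL standing in for wherever the JSON is hosted
fetch("https://example.com/i18n/en.json")
  .then(res => res.json())
  .then(config => console.log("TEST:", config.TEST))
  .catch(err => console.log("err:", err));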
I have JSON data stored in the variable 'data'.
I want to write this to a text file.
Can I do this with Node? I am a beginner.
You can use this NPM module: https://www.npmjs.com/package/jsonfile
var jsonfile = require('jsonfile')
var file = '/tmp/data.json'
var obj = {name: 'JP'}
jsonfile.writeFile(file, obj, function (err) {
console.error(err)
})
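If you would rather not add a dependency, a minimal sketch using the built-in fs module also works (assuming data holds a JSON-serializable value):

var fs = require('fs')

// JSON.stringify with a 2-space indent keeps the file human-readable
fs.writeFile('/tmp/data.json', JSON.stringify(data, null, 2), function (err) {
  if (err) console.error(err)
})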
I'm trying to convert a TSV file into JSON and write it to disk using text2json.
Input data
There is an empty line at the end
U+2B695 shī
U+2B699 pū
U+2B6DB zhī
U+2B6DE jué
U+2B6E2 níng
U+2B6F6 chì
U+2B6F8 tí
Test
I have this test running with AVA:
import fs from "fs";
import test from "ava";
import jsonfile from "jsonfile";
import dataminer from "../src/dataminer";
test("extractPronunciation()", t => {
const filepath = "src/codepoint-ruby.tsv";
const data = dataminer.convertToJson(filepath, { separator: " " });
t.is(data > 0);
});
Code
And this method based on text2json:
import jsonfile from "jsonfile";
import text2json from "text2json";
export default {
  convertToJson: (filepath, options) => {
    const data = [];
    const parse = new text2json.Parser({ ...options, hasHeader: true });
    parse
      .text2json(filepath)
      .on("row", row => {
        data.push(row);
      })
      .on("end", () => {
        console.log("Done >>>>>");
        return data;
      });
  },
};
Question
I see no trace of the end event being triggered, and convertToJson() returns nothing, so my test fails. Am I missing something?
In your approach, you're filling the data array by reading asynchronously from a stream, instead of putting the whole file in memory and doing it synchronously. (And that's why you have to use the row event to push data into your array.)
This means that, in the same way you use
parse.text2json(filepath).on("row", row => {
  data.push(row);
});
you also need to use an end event to log the final result:
parse.text2json(filepath)
  .on("row", row => {
    data.push(row);
  })
  .on('end', () => {
    console.log(data);
  });
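Since the test also needs the parsed rows, one option (a sketch assuming only the row and end events shown above) is to have convertToJson return a Promise that resolves in the end handler, and to await it in the AVA test:

// dataminer.js: resolve the collected rows when parsing finishes
import text2json from "text2json";

export default {
  convertToJson: (filepath, options) =>
    new Promise(resolve => {
      const data = [];
      const parse = new text2json.Parser({ ...options, hasHeader: true });
      parse
        .text2json(filepath)
        .on("row", row => data.push(row))
        .on("end", () => resolve(data));
    }),
};

// test: await the Promise before asserting on the result
test("extractPronunciation()", async t => {
  const data = await dataminer.convertToJson("src/codepoint-ruby.tsv", { separator: " " });
  t.true(data.length > 0);
});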
I'd like to convert a CSV file to a JSON object using NodeJS. The problem is that my CSV file is hosted on a special URL.
URL : My CSV here
var fs = require("fs");
var Converter = require("csvtojson").Converter;
var fileStream = fs.createReadStream("myurl");
var converter = new Converter({constructResult:true});
converter.on("end_parsed", function (jsonObj) {
console.log(jsonObj);
});
fileStream.pipe(converter);
Issue:
Error: ENOENT, open 'C:\xampp\htdocs\GestionDettes\http:\www.banque-france.fr\fileadmin\user_upload\banque_de_france\Economie_et_Statistiques\Changes_et_Taux\page3_quot.csv'
at Error (native)
Edit #1:
var Request = require("request");

Request.get(myurl, function (error, Response, body) {
  var converter = new Converter({
    constructResult: true,
    delimiter: ';'
  });
  converter.fromString(body, function (err, taux) {
    console.log(taux); // it works
  });
});
I did just that in a module that reads and writes different protocols and data formats. I used request to get HTTP resources.
If you want, take a look at alinex-datastore. With this module it should work like:
const DataStore = require('@alinex/datastore').default;

async function transform() {
  const ds = new DataStore('http://any-server.org/my.csv');
  await ds.load();
  await ds.save('file:/etc/my-app.json');
}
That should do it.