I am trying to import a JSON file into a Sample variable, but only the first few characters of Sample are displayed.
The sample.json file is 2,000,000 characters. When I print the Sample variable to the console, only the first 3,756 characters are printed. Is there any limitation on the number of characters that can be printed through console.log?
The complete data persists in the Sample variable; I verified this by searching for strings that occur at the end of the sample.json file.
var Sample = require('./sample.json');

export default class proj extends Component {
  constructor(props) {
    super(props);
    this.state = {
      locations: [],
    };
  }

  loadOnEvent() {
    console.log(Sample);
    // this.state = { locations: Sample };
  }
}
Is there any other way to print the data in the Sample variable?
You have to convert the JSON to a string using JSON.stringify before logging it:
/* ... */
loadOnEvent() {
  console.log(JSON.stringify(Sample));
  // this.state = { locations: Sample };
}
/* ... */
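If the single-line dump is hard to read, JSON.stringify also takes an indentation argument; a small sketch:

console.log(JSON.stringify(Sample, null, 2)); // pretty-print with two-space indentation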
Try another way to load the file: use fetch if the file is remote, or fs if it is local.
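For the local case, a minimal sketch using fs and JSON.parse (this assumes a plain Node environment, not React Native, and that sample.json sits next to the script):

const fs = require('fs');

fs.readFile('./sample.json', 'utf8', (err, text) => {
  if (err) throw err;
  const sample = JSON.parse(text); // the complete payload
  console.log(text.length);        // sanity-check that nothing was truncated
});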
If it is a memory problem, as @Shota suggested, consider processing requests to the JSON file on the server side. A good solution is to set up a microservice that loads the JSON file at startup and handles requests against the data structure parsed from it.
Answer for the webpack use case:
Configure webpack to use file-loader or copy-webpack-plugin specifically for this file, because it is big enough to matter. Consider loading it in parallel with the webpack bundle. If your application has big parts that are not needed in every case, they should be moved into separate bundles.
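For illustration, a minimal sketch with copy-webpack-plugin (assuming version 6+; the paths are placeholders):

// webpack.config.js
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  // ...the rest of your configuration...
  plugins: [
    new CopyWebpackPlugin({
      patterns: [{ from: 'src/sample.json', to: 'sample.json' }],
    }),
  ],
};

The file is then emitted next to the bundle, so the application can fetch it at runtime instead of shipping it inside the bundle itself.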
Using Nuxt 3, I am struggling to do something that seems simple: I would like to get a list of restaurants from an API served by nuxt/nitro in order to use it on the client side. My original file is a .csv file stored under assets/: assets/list.csv.
Here is what I have in my component .vue file:
//...
const { restaurants } = await useFetch('/api/restaurants')
//...
And the content of server/api/restaurants.js:
import csv from 'csvtojson'

export default defineEventHandler(async (event) => {
  const data = await csv().fromFile('~/assets/list.csv')
  return { data }
})
But I get a "[500] File does not exist" error. I've tried many variants but always get an error here or there. Could you help me figure out the solution? Thanks.
I actually solved this by realizing the following:
As the docs suggest, the assets/ directory is for assets that are processed by the bundler (Vite or Webpack). Nuxt won't serve files in the assets/ directory unless nuxt.config.ts is configured with an appropriate loader (e.g. a CSV loader); hence the 500 error.
Nuxt Content, on the other hand, automatically parses a .csv file located in the content/ directory:
In nuxt.config.ts:
modules: ["#nuxt/content"]
In the component .vue file, the following will expose the parsed csv in data.body:
const { data } = await useAsyncData("list", () => queryContent("/list").findOne())
The beauty of Nuxt is that you don't need to import anything; it does it for you.
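Putting the two snippets together, a minimal sketch of the component script (assuming the file is moved from assets/list.csv to content/list.csv):

// In <script setup> of the component:
const { data } = await useAsyncData("list", () => queryContent("/list").findOne())

// In script code the parsed rows are available as data.value.body;
// in the template the ref is unwrapped, so data.body works directly.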
I am using an npm library called json2csv in my Angular project to export JSON to a .csv (Excel) file.
Here is my code:
// assumes: import * as json2csv from 'json2csv';
const fields = ['id', 'name'];
const opts = { fields };

try {
  const parser = new json2csv.Parser(opts);
  const csv = parser.parse(this.data);
  console.log(csv);
} catch (err) {
  console.error(err);
}
The printed object contains the correct data, but no file is created.
Can anyone help me add a filename and path to my code?
The json2csv node module that you're trying to use in your Angular app is not supposed to be used there; it's meant to be used on your Node.js backend.
Your frontend/client is not responsible for writing files to the system; your backend/server is.
Ideally, you should create a REST API to which you pass the JSON to be written to a CSV file as the request payload.
Your Node.js backend can then respond to that request with the downloadable CSV file generated by the json2csv node module, as sketched below.
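A minimal sketch of such a backend route (Express is assumed; the route path and field names are placeholders):

const express = require('express');
const { Parser } = require('json2csv');

const app = express();
app.use(express.json());

app.post('/api/export-csv', (req, res) => {
  try {
    const parser = new Parser({ fields: ['id', 'name'] });
    const csv = parser.parse(req.body); // the JSON payload sent by the client
    res.header('Content-Type', 'text/csv');
    res.attachment('export.csv'); // Content-Disposition: triggers a download
    res.send(csv);
  } catch (err) {
    res.status(500).send(err.message);
  }
});

app.listen(3000);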
I want to keep my test data in a JSON file that I need to import into a cucumber-protractor custom framework. I read that we can directly require a JSON file or even use protractor params; however, that doesn't work. I don't see the JSON file listed when requiring from a particular folder.
testdata.json
{
  "name": "testdata",
  "version": "1.0.0",
  "username": "1020201",
  "password": "1020201"
}
Code in Config.js:
onPrepare: function() {
  var data = require('./testdata.json');
},
I don't see the testdata.json file when giving the path in require, though it is available at that location.
I wish to access the JSON data using data.name, data.version, etc.
You should make sure your JSON file is located in the same folder as your config file, since the path you are giving is relative: require('./testdata.json').
There are many ways of setting your data variables and accessing them globally in your test scripts.
1st method: The preferred way is to use Node's global object:
onPrepare: function() {
  global.data = require('./testdata.json');
},
Now you can access data anywhere in your scripts.
2nd method: Use protractor's params object:
exports.config = {
  params: {
    data: require('./testdata.json')
  }
};
You can then access it in the specs/test scripts using browser.params.data.
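For illustration, a minimal spec snippet exercising both approaches (the describe/it names are placeholders):

describe('login page', function () {
  it('uses the shared test data', function () {
    console.log(global.data.username);         // 1st method: Node's global object
    console.log(browser.params.data.username); // 2nd method: protractor params
  });
});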
I am currently working on a NodeJS (Express) project to edit images' metadata with ExifTool.
To edit an image's metadata with ExifTool, I have to create a JSON file containing all the metadata to modify, then execute the command:
exiftool -j=metadata.json pathToTheImage/image.jpg
The JSON file must look like this:
[{"SourceFile":"pathToTheImage/image.jpg","XMP-dc:Title":"Image's title"}]
Here's my code to do that:
const { exec } = require('child_process');
let fs = require('fs');

let uploadPath = "uploads";
let uploadName = "image.jpg";
...
app.post('/metadata/editor', (req, res) => {
  let jsonToImport = [...];

  fs.writeFileSync("metadata.json", JSON.stringify(jsonToImport));

  exec('exiftool -j=metadata.json ' + uploadPath + '/' + uploadName, (error, stdout, stderr) => {
    if (error) {
      console.error(error);
      return;
    }
    res.redirect('/metadata/checker/' + uploadName);
  });
});
The problem is at the level of writeFileSync/exec.
Independently, these two lines work well; that is, if I have just the first line, the JSON file is created correctly, and if I have just the second line, the image's metadata is updated correctly.
But when I execute the two lines together, the JSON file is created correctly while the exec line does "nothing" (or something I can't determine).
This code uses synchronous functions; I've tested it with asynchronous functions and the behavior is the same.
Currently, to do what I need, I must execute the code above to create the JSON file, then comment out the writeFileSync line and re-execute the code to update the image's metadata correctly.
It's really strange: I've tried reading the JSON file's content before the exec line and everything is OK. I've used asynchronous functions, with and without promises; nothing makes it work.
Thank you for your help.
I'll answer my own question:
The problem was that I use nodemon, and by default nodemon watches JSON files. My code created a JSON file in order to use it right after; nodemon saw the new file and restarted the node server, so the rest of the code never ran.
To fix this, I added an option to my package.json to ignore the created files:
"nodemonConfig": {
"ignore": [
"path/to/files/to/ingore/*"
]
}
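Equivalently, the pattern can be passed on the command line (server.js here stands for your actual entry point):

nodemon --ignore 'path/to/files/to/ignore/*' server.js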
The title says everything.
I want to get an .xls file from a third-party server (said service keeps fueling records, and they do not expose any kind of API, only the Excel file).
Then I want to parse that file with a library like node-excel-to-json and convert it into JSON I can use to import the data into Mongo.
I want to manipulate the file in memory, without writing it to disk.
So, say I am getting the file with this code,
parseFuelingReport() {
  let http = require('http');
  let fs = require('fs');
  // let excel2Json = require('node-excel-to-json');

  let file = fs.createWriteStream("document.xls");
  let request = http.get("http://www.everydayexcel.com/files/Excel_Test_Basic_1_cumulative_sum.xls", function (response) {
  });
},
I want to load the response into memory and parse it with something like:
excel2Json(/* this is supposed to be the path to the xls file */, {
  'convert_all_sheet': false,
  'return_type': 'File',
  'sheetName': 'survey'
}, function (err, output) {
  console.log('err, res', err, output);
});
I assume you are using https://github.com/kashifeqbal/node-excel-to-json, which is available as an npm package.
If you take a look at this line, you can see two things:
It calls XLSX.readFile(filePath);, which loads a file from disk. That is hard to call with an in-memory object.
Internally it uses the XLSX package, most likely this one: https://www.npmjs.com/package/xlsx
The XLSX API does not seem as convenient as excel2Json, but it provides a read() function that takes a JavaScript object instead of a file path:
/* Call XLSX */
var workbook = XLSX.read(bstr, {type:"binary"});
Hope this helps.
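For illustration, a minimal sketch of the full in-memory flow (assuming the xlsx package is installed; the URL is the one from the question):

const http = require('http');
const XLSX = require('xlsx');

http.get('http://www.everydayexcel.com/files/Excel_Test_Basic_1_cumulative_sum.xls', (response) => {
  const chunks = [];
  response.on('data', (chunk) => chunks.push(chunk));
  response.on('end', () => {
    // Parse the downloaded bytes directly, without writing to disk
    const workbook = XLSX.read(Buffer.concat(chunks), { type: 'buffer' });
    const sheetName = workbook.SheetNames[0];
    const rows = XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]);
    console.log(rows); // plain objects, ready to insert into Mongo
  });
});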