NodeJS: read a JSON file without newline characters

I am reading a JSON file using fs.readFileSync(fileName, 'utf8'), but the result includes newline and carriage-return characters, so the output looks like:
"{\r\n \"name\":\"Arka\",\r\n \"id\": \"13\"\r\n}"
How do I avoid these characters?
My local file looks like this:
{
 "name":"Arka",
 "id": "13"
}

It's unnecessary to read JSON using fs.readFileSync(). That approach also requires you to wrap the fs.readFileSync() call in a try/catch block and then call JSON.parse() on the file data. Instead, you can require JSON files in Node as if they were packages. They are parsed just as if you had read the file in as a string and called JSON.parse(), which simplifies reading JSON to one line.
let data = require(fileName)
console.log(data) // { name: 'Arka', id: '13' }
If you want to serialize the parsed JS object in data back to a file without the newline and carriage-return characters, you can write the JSON string to a file using JSON.stringify(), passing in only data.
const { promisify } = require('util')
const writeFile = promisify(require('fs').writeFile)
const data = require(fileName)

const serializeJSON = (dest, toJson) => writeFile(dest, JSON.stringify(toJson))

serializeJSON('./stringify-data.json', data)
  .then(() => console.log('JSON written successfully'))
  .catch(err => console.log('Could not write JSON', err))

You could read the file and then remove them with a regex:
var rawJson = fs.readFileSync(fileName, 'utf8');
rawJson = rawJson.replace(/\r|\n/g, ''); // strip carriage returns and newlines
Keep in mind though that for parsing JSON with JSON.parse, you don't need to do this. The result will be the same with and without the newlines.
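For instance, a quick sketch showing that JSON.parse treats both forms identically:
const withNewlines = "{\r\n \"name\":\"Arka\",\r\n \"id\": \"13\"\r\n}";
const withoutNewlines = "{\"name\":\"Arka\",\"id\":\"13\"}";

console.log(JSON.parse(withNewlines));    // { name: 'Arka', id: '13' }
console.log(JSON.parse(withoutNewlines)); // { name: 'Arka', id: '13' }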

TypeScript: load JSON from a URL and get access to an array of JSON objects

I just can't find a working solution that I can adapt to my format.
A JSON file is returned to me from a URL. Its format is:
{"success":true,
"data":[
{
"loadTimestamp":"2022-07-20T15:12:35.097Z",
"seqNum":"9480969",
"price":"45.7",
"quantity":"0.2",
"makerClientOrderId":"1658329838469",
"takerClientOrderId":"1658329934701"
},
{
"loadTimestamp":"2022-07-20T14:49:11.446Z",
"seqNum":"9480410",
"price":"46",
"quantity":"0.1",
"makerClientOrderId":"1658328403394",
"takerClientOrderId":"0"
}]
}
Because it is returned via the URL, it is not possible to use the object directly, for example:
myobj['data']['price']
I have either a string of data that I can convert using JSON.parse(), or an object right away.
But for some reason I can't use it directly.
As far as I understand, this is a JSON file that contains an array of JSON objects.
My goal is to display all the data from the array, taking two values as an example: price and quantity.
How can I access the values that I want to get?
OK, I found what I was looking for.
I get the result not as JSON but as text, with response.text().
After I did that, I created a new constant called res and put JSON.parse(data) into it:
const url = 'https://myurl.com/' + pub_key
const response = await fetch(url)
let data = ''
if (response.ok) {
  data = await response.text()
} else {
  console.log("Error HTTP: " + response.status)
}
const res = JSON.parse(data)
After all these manipulations I can use my data in two ways:
console.log(res["data"][0]["price"])
console.log(res.data[0].price)
Or I can make a cocktail from it by using my favorite blender :)))
if (res.success == true) {
  for (let item in res.data) {
    console.log(res.data[item].price, res.data[item].quantity)
  }
}
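As a side note, if the endpoint reliably returns JSON, fetch can parse the body in one step with response.json(), so the separate text() + JSON.parse() dance isn't needed; a minimal sketch:
const response = await fetch(url)
if (response.ok) {
  const res = await response.json() // parses the response body as JSON directly
  console.log(res.data[0].price, res.data[0].quantity)
} else {
  console.log("Error HTTP: " + response.status)
}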

Is it possible to iterate over a JSON object containing `\n` and `\` characters in TypeScript?

I have a JSON object which looks something like this:
"\"{\\n \\\"Foo\\\" : \\\"1234\\\",\\n}\""
Is it somehow possible to iterate through this JSON object? I tried, but my logic did not work, which I feel is basically because of these \n and \ characters; I am unable to iterate. How can I get rid of these unnecessary characters?
The string you've shown is double-encoded JSON if we assume that you've removed some of the content (it has a trailing , in the JSON, which JSON doesn't allow).
If you run it through JSON.parse, you get a string containing JSON for an object.
If you run that through JSON.parse, you get an object.
E.g.:
const parsedOnce = JSON.parse(str);
const obj = JSON.parse(parsedOnce);
Then you loop through that object's properties in the normal way (for-in, Object.keys, Object.entries, etc.).
Live Example:
const str = "\"{\\n \\\"Foo\\\" : \\\"1234\\\"\\n}\"";
const parsedOnce = JSON.parse(str);
const obj = JSON.parse(parsedOnce);
for (const key in obj) {
  console.log(`${key} = ${obj[key]}`);
}
That code is also valid TypeScript, though if you have a type you can apply to obj so it doesn't default to any, that would be good. (You could apply {[key: string]: any} to it at minimum.)
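For example, a minimal typed sketch (assuming all values are strings, as in the sample):
const obj: { [key: string]: string } = JSON.parse(JSON.parse(str));
for (const key of Object.keys(obj)) {
  console.log(`${key} = ${obj[key]}`);
}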

Converting XML text from a webpage to JSON

I'm trying to convert XML on a webpage to JSON.
I used axios to grab the data from the URL and then the npm package xml-js to try to convert the data to JSON.
let axios = require("axios");
let convert = require("xml-js");
let mtaURL = "http://advisory.mtanyct.info/eedevwebsvc/allequipments.aspx";
axios.get(mtaURL)
.then(response => {
let results = convert.xml2json(response, {compact: false, spaces: 4})
console.log(results);
})
It came back with the following:
Error: Text data outside of root node.
Line: 0
Column: 59
Char: x
You're trying to parse the Axios response object as XML.
You need to read the body of the response and treat that as XML.
response.data
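A corrected sketch of the .then handler, reusing the question's mtaURL and convert variables:
axios.get(mtaURL)
  .then(response => {
    // response.data holds the XML body; the response object itself is not XML
    let results = convert.xml2json(response.data, {compact: false, spaces: 4});
    console.log(results);
  })
  .catch(err => console.error(err));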

Why can't I parse JSON from a file?

We have an empty JSON file.
I want to write new JSON objects to this file and get back an array of JSON objects from it (and afterwards simply append new JSON objects to the array with push).
I write the incoming JSON object to the file:
fs.writeFileSync(tasks, updatedJsonStr, 'utf8');
where
updatedJsonStr = JSON.stringify({"isCompleted":false,"task":"dfgdfg","date":"25.06.2015"});
So we see the added object in the file.
Then we read our JSON objects back from the file:
tasksJsonObj = JSON.parse(fs.readFileSync("tasks.json", "utf-8"));
Then I append a new JSON object as a string and write it again:
updatedJsonStr = JSON.stringify(tasksJsonObj) + ',' + JSON.stringify(newJsonTask);
fs.writeFileSync(tasks, updatedJsonStr, 'utf8');
So we see 2 JSON objects in the file.
But when I try to read the file with 2 JSON objects, I get an error when parsing the JSON from the file ([SyntaxError: Unexpected token ,]):
try {
  tasksJsonObj = JSON.parse(fs.readFileSync(tasks, "utf-8"));
  console.log('aaaaa' + JSON.stringify(tasksJsonObj));
  return true;
} catch (err) {
  console.log("its not ok!");
  console.log(err);
  return false;
}
Your JSON formatting is wrong. This:
{"isCompleted":false,"task":"dfgdfg","date":"25.06.2015"}, {"newisCompleted":false,"task":"dfgdfg","date":"25.06.2015"}
is what you get after concatenating the two stringified objects, and it is not valid JSON.
Your JSON should look like this:
[ {"isCompleted":false,"task":"dfgdfg","date":"25.06.2015"}, {"newisCompleted":false,"task":"dfgdfg","date":"25.06.2015"} ]
For that you can do something like this:
var tasks = [];
tasks.push({"isCompleted":false,"task":"dfgdfg","date":"25.06.2015"});
so your updated JSON string would be:
updatedJsonStr = JSON.stringify( tasks );
Again, if you want to append a new JSON object, you can do it like this:
tasksJsonObj = JSON.parse(fs.readFileSync("tasks.json", "utf-8"));
tasksJsonObj.push( newJsonTask );
and then stringify it and write it back to the file.
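Putting the whole cycle together, a minimal sketch (the tasks.json path is from the question; the newJsonTask value here is hypothetical):
const fs = require("fs");

// hypothetical new task to append
const newJsonTask = {"isCompleted": true, "task": "new one", "date": "26.06.2015"};

// assumes tasks.json already contains a JSON array
const tasks = JSON.parse(fs.readFileSync("tasks.json", "utf-8"));
tasks.push(newJsonTask);
fs.writeFileSync("tasks.json", JSON.stringify(tasks), "utf8");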
If you are trying to "extend" the object:
https://lodash.com/docs#assign
deep extend (like jQuery's) for nodeJS
If you are trying to "push" new objects into an array, then you're almost there:
var updatedJsonStr = JSON.stringify([{"isCompleted":false,"task":"dfgdfg","date":"25.06.2015"}]);
fs.writeFileSync(tasks, updatedJsonStr, encoding='utf8');
var tasksJsonArr = JSON.parse(fs.readFileSync("tasks.json", "utf-8"));
tasksJsonArr.push(newJsonTask);
There's a little problem with what you are writing. If you want to have multiple JSON objects next to each other, you should consider putting them into an array (push the objects themselves, not pre-stringified strings, or they will be double-encoded):
updatedJsonArr = [];
updatedJsonArr.push(tasksJsonObj);
updatedJsonArr.push(newJsonTask);
and then write JSON.stringify(updatedJsonArr) to the file!

How to parse a large, newline-delimited JSON file with the JSONStream module in Node.js?

I have a large JSON file; it is newline-delimited JSON, where multiple standard JSON objects are separated by newlines, e.g.:
{"name":"1","age":5}
{"name":"2","age":3}
{"name":"3","age":6}
I am now using JSONStream in Node.js to parse the file; the reason I use JSONStream is that it is based on streams.
However, neither parse syntax from the examples helps me parse this file, which has a separate JSON object on each line:
var parser = JSONStream.parse(['rows', true]);
var parser = JSONStream.parse([/./]);
Can someone help me with that?
Warning: Since this answer was written, the author of the JSONStream library removed the emit root event functionality, apparently to fix a memory leak.
Future users of this library, you can use the 0.x.x versions if you need the emit root functionality.
Below is the unmodified original answer:
From the readme:
JSONStream.parse(path)
path should be an array of property names, RegExps, booleans, and/or functions. Any object that matches the path will be emitted as 'data'.
A 'root' event is emitted when all data has been received. The 'root' event passes the root object & the count of matched objects.
In your case, since you want to get back the JSON objects as opposed to specific properties, you will be using the 'root' event and you don't need to specify a path.
Your code might look something like this:
var fs = require('fs'),
    JSONStream = require('JSONStream');

var stream = fs.createReadStream('data.json', {encoding: 'utf8'}),
    parser = JSONStream.parse();

stream.pipe(parser);

parser.on('root', function (obj) {
  console.log(obj); // whatever you will do with each JSON object
});
JSONStream is intended for parsing a single huge JSON object, not many separate JSON objects. You want to split the stream at newlines, then parse each line as JSON.
The npm package split claims to do this splitting, and even has a feature to parse the JSON lines for you.
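A minimal sketch of that approach, using split's documented mapper argument to parse each line:
var fs = require('fs'),
    split = require('split');

fs.createReadStream('data.json')
  .pipe(split(JSON.parse)) // split on newlines, then parse each line as JSON
  .on('data', function (obj) {
    console.log(obj); // one parsed object per line
  })
  .on('error', function (err) {
    console.error(err); // invalid lines surface here
  });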
If your file is not too large, here is an easy but not performant solution:
const fs = require('fs');

let rawdata = fs.readFileSync('fileName.json');
let convertedData = String(rawdata)
  .replace(/\n/g, ',')  // turn line breaks into commas
  .slice(0, -1);        // drop the trailing comma left by the final newline
let jsonData = JSON.parse(`[${convertedData}]`);
I created a package @jsonlines/core which parses JSON lines as an object stream.
You can try the following code:
npm install @jsonlines/core
const fs = require("fs");
const { parse } = require("@jsonlines/core");
// create a duplex stream which parse input as lines of json
const parseStream = parse();
// read from the file and pipe into the parseStream
fs.createReadStream(yourLargeJsonLinesFilePath).pipe(parseStream);
// consume the parsed objects by listening to data event
parseStream.on("data", (value) => {
console.log(value);
});
Note that parseStream is a standard Node duplex stream.
So you can also use for await ... of or other ways to consume it.
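For example, a consumption sketch using for await ... of with the same parseStream:
async function consume() {
  // readable streams are async iterable in modern Node
  for await (const value of parseStream) {
    console.log(value); // each parsed JSON line
  }
}

consume().catch(console.error);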
Here's another solution for when the file is small enough to fit into memory. It reads the whole file in one go, converts it into an array by splitting it at the newlines (removing the blank line at the end), and then parses each line.
import fs from "fs";
const parsed = fs
.readFileSync(`data.jsonl`, `utf8`)
.split(`\n`)
.slice(0, -1)
.map(JSON.parse)