In my React project, I'm trying to convert XML data from an API call into JSON (using a library called xml-js).
As per the documentation, I'm importing the library in my parent component as follows:
const convert = require('xml-js')
and then attempting to convert the API data as follows:
const beerList = `<Product>
<Name>Island Life IPA</Name>
<Volume>300ml/473ml</Volume>
<Price>$10/$13</Price>
<ABV>6.3%</ABV>
<Handpump>No</Handpump>
<Brewery>Eddyline</Brewery>
<IBU/>
<ABV>6.3%</ABV>
<Image>islandlife.png</Image>
<Country>New Zealand</Country>
<Description>Fruited IPA</Description>
<Pouring>Next</Pouring>
<IBU/>
<TapBadge/>
<Comments/>
</Product>`
const beerJs = convert(beerList,{compact: true, spaces: 4})
The errors are telling me that 'convert' is not a function, which tells me that the library isn't being imported. So is the issue with using 'require' syntax, and if so, what alternative would work in React?
"which tells me that the library isn't imported"
No. If that were the case, you wouldn't even get that far; your require call would have thrown an error.
Instead, it tells you that convert is not a function - which it isn't! Look at it in a debugger or log it, and you'll see it's an object with several functions inside. You can't call an object like a function.
Take a look at the xml-js docs again:
This library provides 4 functions: js2xml(), json2xml(), xml2js(), and xml2json(). Here are the usages for each one (see more details in the following sections):
var convert = require('xml-js');
result = convert.js2xml(js, options); // to convert javascript object to xml text
result = convert.json2xml(json, options); // to convert json text to xml text
result = convert.xml2js(xml, options); // to convert xml text to javascript object
result = convert.xml2json(xml, options); // to convert xml text to json text
So the solution is to call convert.xml2json and not convert:
const beerJs = convert.xml2json(beerList, {compact: true, spaces: 4})
Or maybe you want an actual object rather than a JSON string; then you'd use convert.xml2js (in which case the spaces option is irrelevant):
const beerJs = convert.xml2js(beerList, {compact: true})
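Putting it together, here is a minimal runnable sketch of both variants (using a trimmed version of the same XML; the _text wrapper is what compact mode produces):
const convert = require('xml-js')

const beerXml = `<Product>
  <Name>Island Life IPA</Name>
  <ABV>6.3%</ABV>
</Product>`

// xml2json returns a JSON string
console.log(convert.xml2json(beerXml, {compact: true, spaces: 4}))

// xml2js returns a plain JavaScript object
const beerObj = convert.xml2js(beerXml, {compact: true})
console.log(beerObj.Product.Name._text) // "Island Life IPA"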
I was trying to use the method found here (see the most up-voted answer):
Google Apps Script Fastest way to find a row?
I currently use the code below. While it does work, I wanted to try the method linked above, yet when I replace the following code
function AutoPopulate (evalue)
{
// uses the Google Drive file iterator to read in the JSON file and parse it to a JavaScript object that we can work with
var iter = DriveApp.getFilesByName("units.json");
// iterate through all the files named units.json
while (iter.hasNext()) {
// define a File object variable for the next file in the iterator
var file = iter.next();
var jsonFile = file.getBlob().getDataAsString();
// log the contents of the file
//Logger.log(jsonFile);
}
var UnitDatabase = JSON.parse(jsonFile);
//Logger.log(UnitDatabase);
//Logger.log(UnitDatabase[1027]);
return UnitDatabase[evalue];
}
with this code:
function AutoPopulate (evalue)
{
// this method did not work for me, but should have according to the Stack Overflow answer linked above; I am trying to understand why, or how I can find out why, it may have thrown an error
var jsonFile = DriveApp.getFilesByName("units.json").next(),
UnitDatabase = UnitDatabase.getBlob().getDataAsString();
return UnitDatabase[evalue];
}
I get an error during execution indicating that there is a % at position 0 in the JSON. Between the methods I don't alter the JSON file in any way, so I don't understand why the top method works but the bottom one does not.
For further information: the idea behind the code is that I have a list of unit numbers and model numbers in a spreadsheet. I then convert this to a JSON file; this, however, is only done when a new unit is added to the fleet. As I learned, one can parse a whole JSON file into a JavaScript object, which makes working with the data set much faster. This JavaScript object is used so that when a user enters a UNIT#, the MODEL# is auto-populated based on the JSON file.
I cannot share the JSON file as it contains client information.
Your code does not work for two reasons:
You have a typo in the line UnitDatabase = UnitDatabase.getBlob()... - it should be UnitDatabase = jsonFile.getBlob()...
If you want to retrieve a nested object from a JSON file, you need to parse the JSON; otherwise it is treated as a string and you cannot access the nested structure.
Modified working code:
function AutoPopulate2 (evalue)
{
var jsonFile = DriveApp.getFilesByName("units.json").next();
var UnitDatabase = JSON.parse(jsonFile.getBlob().getDataAsString());
return UnitDatabase[evalue];
}
Mind that this code will only work if you have a "units.json" file on your Drive and if evalue is a valid first-level key of this JSON.
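For illustration only, assuming a hypothetical units.json that maps unit numbers to model numbers (the real file was not shared), the lookup behaves like this:
// hypothetical contents of units.json: {"1027": "MODEL-A", "1028": "MODEL-B"}
function testAutoPopulate() {
  var model = AutoPopulate2("1027");
  Logger.log(model); // logs "MODEL-A" if units.json contains that key
}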
I am getting output in the following format:
"[{"a":"a1"},{"a":"a2"}]"
I want to extract it into an actual array of JSON objects:
[
{
"a":"a1"
},
{
"a":"a2"
}
]
How do I convert it?
You have tagged this with Node-RED - so my answer assumes that is the environment you are working in.
If you are passing a message to the Debug node and that is what you see in the Debug sidebar, that indicates your msg.payload is a String with the contents of [{"a":"a1"},{"a":"a2"}] - the Debug sidebar doesn't escape quotes when displaying strings like that.
So you likely already have exactly what you want - it just depends what you want to do with it next.
If you want to access the contents you need to parse it to a JavaScript Object. You can do this by passing your message through a JSON node.
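If you'd rather do the parsing yourself in a Function node instead, a minimal sketch (assuming msg.payload holds that string):
// Function node: turn the JSON string into an array of JavaScript objects
msg.payload = JSON.parse(msg.payload);
return msg;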
Assuming your input contains the double quotes at the beginning and end, it is not possible to JSON.parse() the string directly.
In your case, you need to remove the first and last characters (the double quotes) from your string before parsing it.
const unformattedString = '"[{"a":"a1"},{"a":"a2"}]"'
const formattedString = unformattedString.substr(1, unformattedString.length - 2)
const json = JSON.parse(formattedString)
The variable json now contains your JSON object.
I would suggest a different method that will get your work done without using any third-party library.
var a = '[{"a":"a1"},{"a":"a2"}]';
var b = JSON.parse(a);
console.log(b); // logs the parsed array: [{a: "a1"}, {a: "a2"}]
Another way is the eval function, which is generally not recommended:
var d = eval(a);
If you want to use jQuery instead, use:
var c = $.parseJSON(a);
I often work with large JavaScript objects and instead of manually opening and closing "branches", I would like to simply search for a particular string and show any key or value that matches.
Sort of like "grepping" for a keyword in a JavaScript object. Is this possible (especially in Chrome DevTools)?
I was hoping I could at least try the JSON.stringify() trick and then search the raw JSON in a text editor, but unfortunately I get the following error:
Uncaught TypeError: Converting circular structure to JSON
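As an aside, here is one possible workaround for that error (my own sketch, not part of the answer below): pass JSON.stringify a replacer that drops repeated references, then search the resulting string.
function safeStringify(obj) {
  var seen = new WeakSet();
  return JSON.stringify(obj, function (key, value) {
    if (typeof value === "object" && value !== null) {
      if (seen.has(value)) return "[Circular]"; // replace repeated references
      seen.add(value);
    }
    return value;
  }, 2);
}

var a = {name: "a"};
a.self = a; // circular reference
safeStringify(a); // "self" becomes "[Circular]" instead of throwing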
You can look at the object's keys and match against them:
function grepKeys(o, query){
var ret = {};
Object.keys(o).filter(function(key){
return key.includes(query);
}).forEach(function(key){ // can reduce instead
ret[key] = o[key]; // copy over
});
return ret;
}
This lets you return a partial object with all the keys that contain the string you specified. Note that this will not show any prototype keys, but it can easily be extended to do so (by using a for...in loop instead of Object.keys, or by using recursion; see the sketch after the example below):
var o = grepKeys({buzz:5, fuzz:3, foo:4}, "zz");
o; // Object {buzz: 5, fuzz: 3}
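A rough recursive extension of the same idea (my sketch, not from the original answer) that also matches string values and walks nested objects:
function grepDeep(o, query, path) {
  path = path || "";
  Object.keys(o).forEach(function (key) {
    var value = o[key];
    var fullPath = path ? path + "." + key : key;
    // report keys or string values containing the query
    if (key.includes(query) || (typeof value === "string" && value.includes(query))) {
      console.log(fullPath + ":", value);
    }
    // recurse into nested objects (note: no guard against circular references)
    if (value !== null && typeof value === "object") {
      grepDeep(value, query, fullPath);
    }
  });
}

grepDeep({buzz: 5, nested: {fuzz: "zz inside"}}, "zz");
// buzz: 5
// nested.fuzz: zz inside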
I have the following code, which publishes the JSON data from the specified URL using MQTT. The initial data is retrieved over HTTP.
var request = require('request');
var JSONStream = require('JSONStream');
var es = require('event-stream');
var mqtt = require('mqtt');
request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
.pipe(JSONStream.parse('rows.*'))
.pipe(es.mapSync(function (data) {
console.info(data);
var client = mqtt.createClient(1883, 'localhost');
client.publish('NewTopic', JSON.stringify(data));
client.end();
return data;
}))
The following is the subscriber code, which subscribes to the data published (in the above code) through MQTT:
var mqtt = require('mqtt');
var client = mqtt.createClient();
client.subscribe('NewTopic');
client.on('message', function(topic, message) {
console.info(message);
});
In the above code, I get all the JSON data from the specified URL in 'message'. I need to extract 'id' and 'value' from the received data, combine them into a single JSON object, and publish that to MQTT, so that another client can subscribe to just the 'id' and 'value' as JSON data.
To convert a JSON text into an object, you can use the eval() function. eval() invokes the JavaScript compiler. Since JSON is a proper subset of JavaScript, the compiler will correctly parse the text and produce an object structure. The text must be wrapped in parens to avoid tripping on an ambiguity in JavaScript's syntax.
var myObject = eval('(' + message + ')');
The eval function is very fast. However, it can compile and execute any JavaScript program, so there can be security issues. The use of eval is indicated when the source is trusted and competent. It is much safer to use a JSON parser. In web applications over XMLHttpRequest, communication is permitted only to the same origin that provides that page, so it is trusted. But it might not be competent. If the server is not rigorous in its JSON encoding, or if it does not scrupulously validate all of its inputs, then it could deliver invalid JSON text that could be carrying dangerous script. The eval function would execute the script, unleashing its malice.
To defend against this, a JSON parser should be used. A JSON parser will recognize only JSON text, rejecting all scripts. In browsers that provide native JSON support, JSON parsers are also much faster than eval.
var myObject = JSON.parse(message);
And use it as an object:
myObject.id;
myObject.value;
Create an object with just id and value:
var idAndValueObj = {};
idAndValueObj.id = myObject.id;
idAndValueObj.value = myObject.value;
Convert to JSON string:
var jsonStr = JSON.stringify(idAndValueObj);
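To close the loop from the question, a rough sketch of publishing the reduced object on a second topic, mirroring the question's mqtt API usage (the topic name 'IdValueTopic' is made up):
var client = mqtt.createClient(1883, 'localhost');
client.publish('IdValueTopic', jsonStr); // hypothetical topic for the reduced data
client.end();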
I have a large JSON file; it is newline-delimited JSON, where multiple standard JSON objects are separated by newlines, e.g.
{"name":"1","age":5}
{"name":"2","age":3}
{"name":"3","age":6}
I am now using JSONStream in Node.js to parse the large JSON file; the reason I use JSONStream is that it is stream-based.
However, neither parse syntax from the examples helps me parse this JSON file, which has a separate JSON object on each line:
var parser = JSONStream.parse(['rows', true]);
var parser = JSONStream.parse([/./]);
Can someone help me with that?
Warning: Since this answer was written, the author of the JSONStream library removed the emit root event functionality, apparently to fix a memory leak.
Future users of this library, you can use the 0.x.x versions if you need the emit root functionality.
Below is the unmodified original answer:
From the readme:
JSONStream.parse(path)
path should be an array of property names, RegExps, booleans, and/or functions. Any object that matches the path will be emitted as 'data'.
A 'root' event is emitted when all data has been received. The 'root' event passes the root object & the count of matched objects.
In your case, since you want to get back the JSON objects as opposed to specific properties, you will be using the 'root' event and you don't need to specify a path.
Your code might look something like this:
var fs = require('fs'),
JSONStream = require('JSONStream');
var stream = fs.createReadStream('data.json', {encoding: 'utf8'}),
parser = JSONStream.parse();
stream.pipe(parser);
parser.on('root', function (obj) {
console.log(obj); // whatever you will do with each JSON object
});
JSONStream is intended for parsing a single huge JSON object, not many JSON objects. You want to split the stream at newlines, then parse each line as JSON.
The NPM package split claims to do this splitting, and even has a feature to parse the JSON lines for you.
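A minimal sketch of that approach, based on split's documented usage (data.json stands in for your file):
var fs = require('fs');
var split = require('split');

fs.createReadStream('data.json')
  .pipe(split(JSON.parse)) // split at newlines, then parse each line as JSON
  .on('data', function (obj) {
    console.log(obj); // one parsed object per line
  })
  .on('error', function (err) {
    console.error('invalid JSON line', err); // a blank or malformed line lands here
  });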
If your file is not too large, here is an easy but not performant solution:
const fs = require('fs');
let rawdata = fs.readFileSync('fileName.json');
let convertedData = String(rawdata)
.replace(/\n/gi, ',')
.slice(0, -1);
let JsonData = JSON.parse(`[${convertedData}]`);
I created a package, @jsonlines/core, which parses JSON Lines as an object stream.
You can try the following code:
npm install @jsonlines/core
const fs = require("fs");
const { parse } = require("@jsonlines/core");
// create a duplex stream which parse input as lines of json
const parseStream = parse();
// read from the file and pipe into the parseStream
fs.createReadStream(yourLargeJsonLinesFilePath).pipe(parseStream);
// consume the parsed objects by listening to data event
parseStream.on("data", (value) => {
console.log(value);
});
Note that parseStream is a standard Node duplex stream, so you can also consume it with for await ... of or in other ways; see the sketch below.
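For example, a small sketch of consuming it with for await ... of (reusing the placeholder path from the snippet above):
const fs = require("fs");
const { parse } = require("@jsonlines/core");

async function main() {
  const parseStream = parse();
  fs.createReadStream(yourLargeJsonLinesFilePath).pipe(parseStream);

  // each iteration yields one parsed JSON value
  for await (const value of parseStream) {
    console.log(value);
  }
}

main();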
Here's another solution for when the file is small enough to fit into memory. It reads the whole file in one go, converts it into an array by splitting it at the newlines (removing the blank line at the end), and then parses each line.
import fs from "fs";
const parsed = fs
.readFileSync(`data.jsonl`, `utf8`)
.split(`\n`)
.slice(0, -1)
.map(JSON.parse)