I have a JSON array in my Angular 4 application, and I need to find the unique space types in it.
JSON array:
[{"label":{"Space_Type":"Office_PrivateOffice"}},{"label":{"Space_Type":"Office_PrivateOffice"}},{"label":{"Space_Type":"Office_PrivateOffice"}},{"label":{"Space_Type":"Hospital_Lab"}},
{"label":{"Space_Type":"Office_OpenOffice"}},
{"label":{"Space_Type":"Office_PrivateOffice"}},
{"label":{"Space_Type":"Office_PrivateOffice"}},{},
{"label":{"Space_Type":"Office_PrivateOffice"}},
{"label":{"Space_Type":"Office_PrivateOffice"}},{"label":{"Space_Type":"Office_PrivateOffice"}},{"label":{"Space_Type":"Hospital_Lab"}},{"label":{"Space_Type":"Office_OpenOffice"}},{"label":{"Space_Type":"Office_PrivateOffice"}},{"label":{"Space_Type":"Office_OpenOffice"}},{}]
I want to remove all the duplicate space types. Can anyone tell me how to do this?
I want this result:
[{Idx:0,Label:"Office_PrivateOffice"},{Idx:1,Label:"Office_OpenOffice"}]
You can use the lodash method _.uniqWith:
var jsonarray = [
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Hospital_Lab"}},
  {"label":{"Space_Type":"Office_OpenOffice"}},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Hospital_Lab"}},
  {"label":{"Space_Type":"Office_OpenOffice"}},
  {"label":{"Space_Type":"Office_PrivateOffice"}},
  {"label":{"Space_Type":"Office_OpenOffice"}},
  {}
];
var filtered = _.uniqWith(jsonarray, _.isEqual);
console.log(filtered);
<script src='https://cdn.jsdelivr.net/lodash/4.17.2/lodash.min.js'></script>
The lodash method _.uniqBy(root, 'duplicateElement') can also be used. The advantage of this method is that you can tell lodash which property to deduplicate by.
var newJsonFile = _.uniqBy(jsonarray, 'label.Space_Type');
console.log(newJsonFile);
In Angular, you need to install the lodash package:
npm install lodash
Then import it
import * as _ from 'lodash';
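If you specifically want the {Idx, Label} shape shown in the question, here is a minimal plain-JavaScript sketch (no lodash; it assumes the jsonarray variable from above and skips the empty objects):
var seen = {};
var result = [];
jsonarray.forEach(function (item) {
  var type = item.label && item.label.Space_Type;
  if (type && !seen[type]) {
    seen[type] = true;
    result.push({ Idx: result.length, Label: type });
  }
});
console.log(result);
// [{Idx:0,Label:"Office_PrivateOffice"},{Idx:1,Label:"Hospital_Lab"},{Idx:2,Label:"Office_OpenOffice"}]
Note that Hospital_Lab appears as well, since it is present in the data.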
I have to put data from a JSON file into my reducer.
After mapping this file, I got an object which includes the data that I need.
export const datas = data.properties.map(data => (<store key={Math.floor(Math.random()*1234)} author={data.value} comment={data.group} rate={data.type} />));
This is the console log; I just need the props from it.
How do I get just a normal react-table, not an object that is not accepted by the reducer state?
Or how do I implement it in the reducer state to get the effect that I need?
I am sorry for the stupid questions, I hope you will help me :)
The code which you posted takes an array of objects from the variable data.properties and maps it to an array of JSX elements created by the component store. Perhaps it is more readable with line breaks:
export const datas = data.properties.map(
  data => (
    <store
      key={Math.floor(Math.random() * 1234)}
      author={data.value}
      comment={data.group}
      rate={data.type}
    />
  )
);
Your console.log of datas shows that array of React JSX element instances. In your array you have 5 objects, each of which is a JSX element ($$typeof: Symbol(react.element)) with type: "store" (the component type), the numeric key that you created for this element, and a props object containing all of the props that you passed: author, comment, and rate (key is a special prop, so it is not included there).
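Roughly, each entry of that logged array has this shape (an illustrative sketch, not the exact console output):
// one element of datas, approximately:
// {
//   $$typeof: Symbol(react.element),
//   type: "store",
//   key: "123",  // your random key, coerced to a string
//   props: { author: ..., comment: ..., rate: ... }
// }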
You are asking to create just an object with the props rather than a store element. We want to take the array data.properties and use the .map() method to map it to this props object.
The variable name that you use inside your .map() callback can be anything, but as a best practice I recommend not reusing the name data. Reusing a name is called variable shadowing; it won't cause errors, but it can make your code confusing. I will call it datum instead.
Here is the code that you want:
export const propsArray = data.properties.map(
  datum => ({
    key: Math.floor(Math.random() * 1234),
    author: datum.value,
    comment: datum.group,
    rate: datum.type,
  })
);
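If you then want these plain objects in your reducer state, a minimal Redux-style sketch could look like the following (the action type LOAD_DATA and the reducer name are hypothetical, not from your code):
const initialState = { items: [] };

export function dataReducer(state = initialState, action) {
  switch (action.type) {
    case "LOAD_DATA":
      // store the plain props objects, not JSX elements
      return { ...state, items: action.payload };
    default:
      return state;
  }
}

// usage: dispatch({ type: "LOAD_DATA", payload: propsArray });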
In my React project, I'm trying to convert XML data from an API call into JSON (using a library called xml-js).
As per the documentation, I'm importing the library in my parent component as follows:
const convert = require('xml-js')
and then attempting to convert the API data as follows:
const beerList = `<Product>
  <Name>Island Life IPA</Name>
  <Volume>300ml/473ml</Volume>
  <Price>$10/$13</Price>
  <ABV>6.3%</ABV>
  <Handpump>No</Handpump>
  <Brewery>Eddyline</Brewery>
  <IBU/>
  <ABV>6.3%</ABV>
  <Image>islandlife.png</Image>
  <Country>New Zealand</Country>
  <Description>Fruited IPA</Description>
  <Pouring>Next</Pouring>
  <IBU/>
  <TapBadge/>
  <Comments/>
</Product>`
const beerJs = convert(beerList,{compact: true, spaces: 4})
The errors are telling me that 'convert' is not a function, which tells me that the library isn't being imported. So is the issue with the 'require' syntax, and if so, what alternative would work in React?
which tells me that the library isn't imported
No. If that were the case, you wouldn't even get that far; your require call would throw an error.
Instead, it tells you that convert is not a function - which it isn't! Look at it in a debugger or log it, and you'll see it's an object with several functions inside. You can't call an object like a function.
Take a look at the xml-js docs again:
This library provides 4 functions: js2xml(), json2xml(), xml2js(), and xml2json(). Here are the usages for each one (see more details in the following sections):
var convert = require('xml-js');
result = convert.js2xml(js, options); // to convert javascript object to xml text
result = convert.json2xml(json, options); // to convert json text to xml text
result = convert.xml2js(xml, options); // to convert xml text to javascript object
result = convert.xml2json(xml, options); // to convert xml text to json text
So the solution is to call convert.xml2json and not convert:
const beerJs = convert.xml2json(beerList, {compact: true, spaces: 4})
Or maybe you want an actual object and not a JSON string; in that case, use convert.xml2js (where the spaces option is useless):
const beerJs = convert.xml2js(beerList, {compact: true})
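For reference, in xml-js's compact form, element text ends up under a _text key, so reading a single value would look roughly like this (a sketch based on my reading of the docs, not verified against your exact data):
const beerJs = convert.xml2js(beerList, { compact: true });
// in compact form, element text lives under the _text key
console.log(beerJs.Product.Name._text); // "Island Life IPA"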
I'm new to GDScript and am looking at how best to save data to a text file. to_json works well for basic types but I just get a reference id for any custom classes. I'd ideally like to pass a dictionary of data including some custom class elements to to_json and let it convert it all at once.
Like other languages provide a toString method for printing an object, is there anything that would let me specify how a class instance should be converted to JSON?
Yeah, you would just add something like the following to your class:
func to_json():
    var data = {} # build the result as a Dictionary (or Array)
    data["health"] = 5
    # serialize the Dictionary to a JSON string; in Godot 3 this is
    # JSON.print(data) (older Godot versions had Dictionary.to_json())
    var json = JSON.print(data)
    return json
I think it really is that simple :)
Please note: I have not tested this code.
I'm trying to load a JSON file from a URL and parse it within Dart. So I tried the following code, as suggested by some links I found when googling:
HttpRequest.getString("hellknight2.js").then((response) {
  var model = new JSON.parse(response);
});
However, it seems to not work anymore on Dart SDK version 0.4.3.5_r20602. What is the current best way to get a JSON file mapped to an object in Dart?
Simply use json from the dart:convert package. Here is an example:
import 'dart:convert';

main() {
  final myJsonAsString = '{"a": 1, "b": "c"}';
  final decoded = json.decode(myJsonAsString);
  // ...
}
See Parsing JSON for more details.
In my case, JSON.decode didn't work. Instead I had to use:
import 'dart:convert' as JSON;
final json = JSON.jsonDecode(myJsonAsString);
Here is my solution :) First, you need to import the convert package:
import 'dart:convert';
var res = json.decode(response.body);
Then you can get values by key, like below:
print(res["message"]);
It depends on a lot of things.
Is the JSON text you get an array or a map?
You can try:
Map model = parse(response);
Or:
List model = parse(response);
but you need to import JsonObject by Chris Buckett into your package:
import "package:json_object/json_object.dart";
You can install it by adding this dependency to your pubspec:
json_object
There's a new pub package for this:
Victor Savkin - Serializers.
I didn't use it, but it seems like it will suit you. Try it out.
You can try this package from pub: g_json
dynamic model = JSON.parse(JsonStringFromAnywhere);
final name = model['name'].stringValue;
// OR
final name = model.name;
I have a large JSON file; it is newline-delimited JSON, where multiple standard JSON objects are separated by newlines, e.g.
{"name":"1","age":5}
{"name":"2","age":3}
{"name":"3","age":6}
I am now using JSONStream in Node.js to parse the large JSON file; the reason I use JSONStream is that it is stream-based.
However, neither parse syntax from the examples helps me parse this file with a separate JSON object on each line:
var parser = JSONStream.parse(['rows', true]);
var parser = JSONStream.parse([/./]);
Can someone help me with that?
Warning: Since this answer was written, the author of the JSONStream library removed the emit root event functionality, apparently to fix a memory leak.
Future users of this library, you can use the 0.x.x versions if you need the emit root functionality.
Below is the unmodified original answer:
From the readme:
JSONStream.parse(path)
path should be an array of property names, RegExps, booleans, and/or functions. Any object that matches the path will be emitted as 'data'.
A 'root' event is emitted when all data has been received. The 'root' event passes the root object & the count of matched objects.
In your case, since you want to get back the JSON objects as opposed to specific properties, you will be using the 'root' event and you don't need to specify a path.
Your code might look something like this:
var fs = require('fs'),
    JSONStream = require('JSONStream');

var stream = fs.createReadStream('data.json', {encoding: 'utf8'}),
    parser = JSONStream.parse();

stream.pipe(parser);

parser.on('root', function (obj) {
  console.log(obj); // whatever you will do with each JSON object
});
JSONStream is intended for parsing a single huge JSON object, not many JSON objects. You want to split the stream at newlines, then parse each line as JSON.
The NPM package split claims to do this splitting, and even has a feature to parse the JSON lines for you.
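For example, a quick sketch based on split's README (the file name data.json is an assumption):
var fs = require('fs'),
    split = require('split');

fs.createReadStream('data.json', { encoding: 'utf8' })
  .pipe(split(JSON.parse)) // split on newlines and parse each line as JSON
  .on('data', function (obj) {
    console.log(obj); // one parsed object per line
  })
  .on('error', function (err) {
    console.error(err); // a JSON.parse failure on any line lands here
  });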
If your file is not too large, here is an easy but not performant solution:
const fs = require('fs');

let rawdata = fs.readFileSync('fileName.json');
let convertedData = String(rawdata)
  .replace(/\n/gi, ',')
  .slice(0, -1);
let JsonData = JSON.parse(`[${convertedData}]`);
I created a package, @jsonlines/core, which parses JSON Lines as an object stream.
You can try the following code:
npm install @jsonlines/core
const fs = require("fs");
const { parse } = require("#jsonlines/core");
// create a duplex stream which parse input as lines of json
const parseStream = parse();
// read from the file and pipe into the parseStream
fs.createReadStream(yourLargeJsonLinesFilePath).pipe(parseStream);
// consume the parsed objects by listening to data event
parseStream.on("data", (value) => {
console.log(value);
});
Note that parseStream is a standard node duplex stream.
So you can also use for await ... of or other ways to consume it.
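For instance, a minimal sketch of the for await variant (assuming a Node version with async-iterable streams):
async function consume() {
  // iterate the parsed objects one by one
  for await (const value of parseStream) {
    console.log(value);
  }
}

consume().catch(console.error);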
Here's another solution for when the file is small enough to fit into memory. It reads the whole file in one go, converts it into an array by splitting it at the newlines (removing the blank line at the end), and then parses each line.
import fs from "fs";

const parsed = fs
  .readFileSync(`data.jsonl`, `utf8`)
  .split(`\n`)
  .slice(0, -1)
  .map(JSON.parse);