Pushing multiple values into an array gives an error using Angular 6

I am working in Angular 6. Here I want to push multiple values into an array, but doing so gives me an error.
Here is my code:
this._model.NomineeList.push(
  {
    'FirstName': this._nomineemodel.FirstName,
    'CNIC': this._nomineemodel.CNIC,
    'MiddleName': this._nomineemodel.MiddleName,
    'LandlineNumber': this._nomineemodel.LandlineNumber,
    'LastName': this._nomineemodel.LastName,
    'MobileNumber': this._nomineemodel.MobileNumber,
    'PermanentAddress': this._nomineemodel.PermanentAddress,
    'PresentAddress': this._nomineemodel.PresentAddress,
    'RelationId': this._nomineemodel.RelationId,
    'RelationName': this._nomineemodel.RelationName,
    'UPermanentAddress': '',
    'UPresentAddress': ''
  });
How do I push into an array using Angular 6?

The error is self-explanatory: the model you are inserting into your array does not match the array's element type. There are two possible reasons:
1) Either you haven't initialised the array with an empty value, or
2) The model you are inserting is missing mandatory properties, has a few extra properties, or has a spelling mistake in a property name.
Check this stackblitz example.
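For illustration, here is a minimal sketch of the usual fix (the component shape and the addNominee method are assumptions; only the property names come from the question):
export class NomineeComponent {
  constructor() {
    // 1) Initialise the list before pushing into it.
    this._model = { NomineeList: [] };
  }
  addNominee(nominee) {
    // 2) Push an object whose properties exactly match the elements
    //    already in NomineeList (no missing, extra, or misspelled keys).
    this._model.NomineeList.push({
      'FirstName': nominee.FirstName,
      'LastName': nominee.LastName,
      'CNIC': nominee.CNIC
      // ...remaining properties as in the question
    });
  }
}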

Related

How can I query for multiple values after a wildcard?

I have a json object like so:
{
  _id: "12345",
  identifier: [
    {
      value: "1",
      system: "system1",
      text: "text!"
    },
    {
      value: "2",
      system: "system1"
    }
  ]
}
How can I use the XDevAPI SearchConditionStr to look for the specific combination of value + system in the identifier array? Something like this, but this doesn't seem to work...
collection.find("'${identifier.value}' IN identifier[*].value && '${identifier.system} IN identifier[*].system")
By using the IN operator, what happens underneath the covers is basically a call to JSON_CONTAINS().
So, if you call:
collection.find(":v IN identifier[*].value && :s IN identifier[*].system")
.bind('v', '1')
.bind('s', 'system1')
.execute()
What gets executed, in the end, is (simplified):
JSON_CONTAINS('["1", "2"]', '"1"') AND JSON_CONTAINS('["system1", "system1"]', '"system1"')
In this case, both those conditions are true, and the document will be returned.
The atomic unit is the document (not a slice of that document). So, in your case, regardless of the value of value and/or system, you are still looking for the same document (the one whose _id is '12345'). With such a statement, the document is returned if all the search values are part of it, and it is not returned if any of them is not.
For instance, the following would not yield any results:
collection.find(":v IN identifier[*].value && :s IN identifier[*].system")
.bind('v', '1')
.bind('s', 'system2')
.execute()
EDIT: Potential workaround
I don't think the CRUD API will allow you to perform this kind of "cherry-picking", but you can always use SQL. In that case, one strategy that comes to mind is to use JSON_SEARCH() to retrieve an array of paths (i.e. the array indexes) corresponding to each value in the scope of identifier[*].value and identifier[*].system, and then use JSON_OVERLAPS() to ensure they are equal.
session.sql(`select * from collection WHERE json_overlaps(json_search(json_extract(doc, '$.identifier[*].value'), 'all', ?), json_search(json_extract(doc, '$.identifier[*].system'), 'all', ?))`)
.bind('2', 'system1')
.execute()
In this case, the result set will only include documents where the identifier array contains at least one JSON object element whose value is equal to '2' and whose system is equal to 'system1'. The filter is effectively applied to individual array items, not in aggregate as with a plain IN operation.
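For completeness, a sketch of how that SQL workaround might be run from Node.js (the connection URI and the schema name are assumptions; the query itself is the one shown above):
const mysqlx = require('@mysql/xdevapi');

mysqlx.getSession('mysqlx://user:password@localhost:33060/mySchema')
  .then(session => session
    .sql(`select * from collection WHERE json_overlaps(json_search(json_extract(doc, '$.identifier[*].value'), 'all', ?), json_search(json_extract(doc, '$.identifier[*].system'), 'all', ?))`)
    .bind('2', 'system1')
    .execute())
  .then(result => {
    // Each row holds one matching document; log them for inspection.
    result.fetchAll().forEach(row => console.log(row));
  });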
Disclaimer: I'm the lead developer of the MySQL X DevAPI Connector for Node.js

Implement CSV download using current filters and sort

I need to implement a download feature. It will read the data in the react-data-grid (adazzle), respecting the current columns, filters and sort, and create a JSON array (or comma-separated strings) that I can then pass to the react-csv module.
I have a data structure populated from the backend, but it is not filtered or sorted. I need to be able to ask the grid for its data on a row-by-row basis. Can anyone point me in the right direction?
Without code or some context, I can't answer with certainty...
You supply the rowGetter prop with the collection to display, or the method to get the rows to display... I'm thinking that if you're filtering, then most likely you've got some sort of mechanism supporting that... Either way, you can use this property's value somehow to get exactly what you see in the grid.
If you literally want to interrogate the grid, you could try adding a reference to it and then see if you can ask it for the row data. I can't remember with certainty that I saw a rows prop among the grid's available props via the ref, but I imagine you should be able to (**,)
...
handleExport = async () => {
  const exportRows = rows;
  // const exportRows = getRows(initialRows, filters);
  // const exportRows = this.state.gridRef.CurrentRows
  // DISCLAIMER: CurrentRows is just to give the idea... check out the ref
  // yourself to see if it's possible to get the rows via the grid ref's props.
  downloadCSV(exportRows);
}
...
<ReactDataGrid
  ref={input => { this.state.gridRef = input; }}
  columns={columns}
  rowGetter={i => rows[i]} // or maybe rowGetter={i => getRows(initialRows, filters)[i]}
  rowsCount={rows.length}
  onGridSort={(sortColumn, sortDirection) =>
    setRows(sortRows(initialRows, sortColumn, sortDirection))
  }
/>
I've only ever set/initialised the this.state.gridRef prop in my constructor, but I guess you could also set/initialise it in componentDidMount as well...
Initialise it like this:
this.state.gridRef = React.createRef()
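The downloadCSV helper isn't defined above; here is a minimal sketch of what it could look like, assuming you just want to turn the exported row objects into a CSV string and trigger a browser download (the helper name matches the call above; everything else is an assumption rather than react-csv's API):
const downloadCSV = (rows, filename = 'export.csv') => {
  if (!rows.length) return;
  // Derive the columns from the first row's keys.
  const keys = Object.keys(rows[0]);
  const header = keys.join(',');
  const body = rows
    .map(row => keys.map(k => JSON.stringify(row[k] === undefined ? '' : row[k])).join(','))
    .join('\n');
  // Build a Blob and trigger the download via a temporary link.
  const blob = new Blob([header + '\n' + body], { type: 'text/csv;charset=utf-8;' });
  const link = document.createElement('a');
  link.href = URL.createObjectURL(blob);
  link.download = filename;
  link.click();
  URL.revokeObjectURL(link.href);
};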

Parse complex JSON string contained in Hadoop

I want to parse a string of complex JSON in Pig. Specifically, I want Pig to understand my JSON array as a bag instead of as a single chararray. I found that complex JSON can be parsed by using Twitter's Elephant Bird or Mozilla's Akela library. (I found some additional libraries, but I cannot use 'Loader' based approach since I use HCatalog Loader to load data from Hive.)
But the problem is the structure of my data; each value of the MAP structure contains the value part of a complex JSON document. For example:
1. My table looks like this (WARNING: the type of 'complex_data' is not STRING but a MAP of <STRING, STRING>!)
TABLE temp_table
(
user_id BIGINT COMMENT 'user ID.',
complex_data MAP <STRING, STRING> COMMENT 'complex json data'
)
COMMENT 'temp data.'
PARTITIONED BY(created_date STRING)
STORED AS RCFILE;
2. And 'complex_data' contains the following (the values that I want to get are marked with two *s, so basically #'d'#'f' from each PARSED_STRING(complex_data#'c')):
{ "a": "[]",
"b": "\"sdf\"",
"**c**":"[{\"**d**\":{\"e\":\"sdfsdf\"
,\"**f**\":\"sdfs\"
,\"g\":\"qweqweqwe\"},
\"c\":[{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"}]
},
{\"**d**\":{\"e\":\"sdfsdf\"
,\"**f**\":\"sdfs\"
,\"g\":\"qweqweqwe\"},
\"c\":[{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"}]
},]"
}
3. So, I tried... (same approach for Elephant Bird)
REGISTER '/path/to/akela-0.6-SNAPSHOT.jar';
DEFINE JsonTupleMap com.mozilla.pig.eval.json.JsonTupleMap();
data = LOAD 'temp_table' USING org.apache.hive.hcatalog.pig.HCatLoader();
values_of_map = FOREACH data GENERATE complex_data#'c' AS attr:chararray; -- IT WORKS
-- dump values_of_map shows correct chararray data per each row
-- eg) ([{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... }])
([{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... }]) ...
attempt1 = FOREACH data GENERATE JsonTupleMap(complex_data#'c'); -- THIS LINE CAUSES AN ERROR
attempt2 = FOREACH data GENERATE JsonTupleMap(CONCAT(CONCAT('{\\"key\\":', complex_data#'c'), '}')); -- IT ALSO DOES NOT WORK
I guessed that "attempt1" failed because the value doesn't contain full JSON. However, when I CONCAT as in "attempt2", I generate an additional \ mark (so each line starts with {\"key\": ). I'm not sure whether these additional marks break the parsing rule or not. In any case, I want to parse the given JSON string so that Pig can understand it. If you have any method or solution, please feel free to let me know.
I finally solved my problem by using the jyson library with a Jython UDF.
I know that I could solve it using Java or other languages.
But I think that Jython with jyson is the simplest answer to this issue.

What's the best way to send in multiple coordinates in a JSON to RethinkDB in order to create an r.polygon?

I'm using an Express server with RethinkDB, and I want to send multiple coordinates into my 'locations' table on RethinkDB and create an r.polygon(). I understand how to do the query via RethinkDB's Data Explorer, but I'm having trouble figuring out how to send it via JSON from the client to the server and insert it through my query there.
I basically want to do this:
r.db('places').table('locations').insert({
name: req.body.name,
bounds: r.polygon(req.body.bounds)
})
where req.body.bounds looks like this:
[long, lat],[long, lat], [long, lat]
I can't send it in as a string because then it gets read as one single input instead of three arrays. I'm sure there's a 'right in front of me' way, but I'm drawing a blank.
What's the best way to do this?
Edit: To clarify, my question is, what should my JSON look like and how should it be received on my server?
This is what RethinkDB wants in order to make a polygon:
r.polygon([lon1, lat1], [lon2, lat2], [lon3, lat3], ...) → polygon
As per the suggestion, I've added in r.args() to my code:
r.db('places').table('locations').insert({
name: req.body.name,
bounds: r.polygon(r.args(req.body.bounds))
})
Edit
Ok, I was dumb and had a typo in one of my coordinates!
Sending it as an array of arrays and wrapping it in r.args() on the server side works.
What you need is r.args to unpack the array into arguments for r.polygon. https://www.rethinkdb.com/api/javascript/args/
Assuming that req.body.bounds is:
[[long, lat], [long, lat], [long, lat]]
and that you are submitting a raw JSON string from the client, you first need to decode the JSON payload, get the bounds field, and wrap it with r.args as follows:
var body = JSON.parse(req.body)
r.db('places').table('locations').insert({
  name: body.name,
  bounds: r.polygon(r.args(body.bounds))
})
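For completeness, a sketch of what the client-side request might look like, so the bounds arrive as an array of arrays rather than a string (the '/locations' route and the example coordinates are assumptions):
fetch('/locations', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    name: 'My place',
    // An array of [long, lat] pairs, not a string.
    bounds: [[-122.4, 37.7], [-122.5, 37.8], [-122.3, 37.9]]
  })
});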

CSV parser through AngularJS

I am building a CSV file parser with Node and Angular. Basically, a user uploads a CSV file; on my server side, which is Node, the CSV file is traversed and parsed using node-csv. This works fine and returns me an array of objects based on the CSV file given as input. Now, on the Angular end, I need to display two tables: one is the CSV file data itself and the other is a cross-tabulation analysis. I am facing a problem while rendering the data: for a table like the one shown,
I am getting the parse response as shown,
and for cross-tabulation we need the data in tabular form.
I have an object array which I need to manipulate in the best possible way so as to make it easy to render on the HTML page. I am not seeing a way to do calculations on the data I get so as to store the cross-tabulation result. Any idea on how I should approach this?
The data JSON is:
[{"Sample #":"1","Gender":"Female","Handedness;":"Right-handed;"},{"Sample #":"2","Gender":"Male","Handedness;":"Left-handed;"},{"Sample #":"3","Gender":"Female","Handedness;":"Right-handed;"},{"Sample #":"4","Gender":"Male","Handedness;":"Right-handed;"},{"Sample #":"5","Gender":"Male","Handedness;":"Left-handed;"},{"Sample #":"6","Gender":"Male","Handedness;":"Right-handed;"},{"Sample #":"7","Gender":"Female","Handedness;":"Right-handed;"},{"Sample #":"8","Gender":"Female","Handedness;":"Left-handed;"},{"Sample #":"9","Gender":"Male","Handedness;":"Right-handed;"},{"Sample #":";"}
There are many ways you can do this and since you have not been very specific on the usage, I will go with the simplest one.
Assuming you have an object structure such as this:
[
  {gender: 'female', handedness: 'lefthanded', id: 1},
  {gender: 'male', handedness: 'lefthanded', id: 2},
  {gender: 'female', handedness: 'righthanded', id: 3},
  {gender: 'female', handedness: 'lefthanded', id: 4},
  {gender: 'female', handedness: 'righthanded', id: 5}
]
and in your controller you have exposed this with something like:
$scope.members = [the above array of objects];
and you want to display the total number of female members of this object, you could filter this in your HTML:
{{(members | filter:{gender:'female'}).length}}
Now, if you are going to make this a table, it will obviously produce some ugly and unreadable HTML. So, especially if you are going to reuse this, it would be a good case for making a directive that you can repeat anywhere, with the prerequisite of providing a scope object named tabData (or whatever you wish) in your parent scope:
.directive('tabbed', function () {
  return {
    restrict: 'E',
    template: '<table><tr><td>{{(tabData | filter:{gender:"female"}).length}}</td><td>{{(tabData | filter:{handedness:"lefthanded"}).length}}</td></tr></table>'
  };
});
You would use this in your html like so:
<tabbed></tabbed>
And there are of course many ways to improve this as you wish.
This is more of a general data structure/JS question than an Angular-related one.
Functional helpers from Lo-dash come in very handy here:
_(data) // Create a chainable object from the data to execute functions with
.groupBy('Gender') // Group the data by its `Gender` attribute
// map these groups, using `mapValues` so the named `Gender` keys persist
.mapValues(function(gender) {
// Create named count objects for all handednesses
var counts = _.countBy(gender, 'Handedness');
// Calculate the total of all handednesses by summing
// all the values of this named object
counts.Total = _(counts)
.values()
.reduce(function(sum, num) { return sum + num });
// Return this named count object -- this is what each gender will map to
return counts;
}).value(); // get the value of the chain
No need to worry about for-loops or anything of the sort, and this code also works without any changes for more than two genders (even for more than two handednesses - think of the aliens and the ambidextrous). If you aren't sure exactly what's happening, it should be easy enough to pick apart the single steps and their result values of this code example.
Calculating the total row for all genders will work in a similar manner.
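For illustration, that total row could be computed like this (a sketch in the same style, assuming the same data array and Handedness attribute used above):
// Count every row by its Handedness value, then sum those counts
// to get the grand total, mirroring the per-gender logic above.
var totals = _.countBy(data, 'Handedness');
totals.Total = _(totals)
  .values()
  .reduce(function (sum, num) { return sum + num; });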