Best way to compare the values of two JavaScript objects - ecmascript-6

I am writing a bunch of unit tests and was wondering what the best way is to compare the values of two JavaScript objects (the actual vs. the expected). Let's say I have the following:
ActualObject: {
    Val1: '1',
    Val2: '2',
    Val3: '3',
    Val4: '4'
}
ExpectedObject: {
    Val1: '1',
    Val2: '2',
    Val3: '3',
    Val4: '4'
}
Now I want to check whether the values of the properties in each of the objects are equal. What is the best way to do this? Currently I am comparing all the properties individually. The alternative I can think of is JSON.stringify, but I'm not sure whether it might reorder the properties unpredictably.

There isn't really a simple "one answer fits all" solution to this, especially as the parameters of your question are quite broad. For example, what would you consider equality? Strict? Loose? Equality of only own enumerable properties, or of all properties?
I would recommend steering clear of JSON.stringify, as 1) it is a fairly costly operation to serialise an object, especially if performed frequently, and 2) it transforms values in potentially dangerous ways for comparison's sake (e.g. JSON.stringify(NaN) === JSON.stringify(null) //=> true).
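It also doesn't address the ordering concern you raised: JSON.stringify serialises keys in insertion order, so two objects with the same properties can still stringify differently:

JSON.stringify({ a: 1, b: 2 }) === JSON.stringify({ b: 2, a: 1 }) //=> false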
You should use a library implementation such as lodash's isEqual, and save yourself the pain of reinventing the wheel.
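For example (a quick sketch, assuming lodash is installed and available as _):

const _ = require('lodash')

console.log(_.isEqual({ a: 1, b: 2 }, { b: 2, a: 1 })) //=> true (key order is irrelevant)
console.log(_.isEqual({ a: 1 }, { a: '1' }))           //=> false (values are compared strictly)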
However, for the sake of completeness, and to give you an idea of a simple, naive approach, you could loop over each property of ExpectedObject and check it for equality with the equivalent property on ActualObject, like so:
function isEqual (ExpectedObject, ActualObject) {
    // if ExpectedObject has a different number of
    // keys than ActualObject, the objects are not equal
    if (Object.keys(ExpectedObject).length !== Object.keys(ActualObject).length) {
        return false
    }

    for (const key in ExpectedObject) {
        // if ActualObject does not have a property that is
        // on ExpectedObject, the objects are not equal
        if (!(key in ActualObject)) {
            return false
        }

        // if a property's value on ActualObject is not equal,
        // under a strict comparison, to the equivalent property
        // on ExpectedObject, the objects are not equal
        if (ActualObject[key] !== ExpectedObject[key]) {
            return false
        }
    }

    return true
}

console.log(isEqual({ a: 1 }, { a: 1 }))       //=> true
console.log(isEqual({ a: 1 }, { a: "1" }))     //=> false
console.log(isEqual({ a: 1 }, { a: 1, b: 2 })) //=> false
Obviously you would need to introduce type-checking so that you know you're dealing with objects to begin with, but there's a basic idea for you to think about.
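For instance, a minimal guard (bothAreObjects is just a hypothetical helper name) could run at the top of isEqual, falling back to a plain strict comparison when either argument is not a non-null object:

// returns true only when both arguments are non-null objects,
// i.e. values that make sense for a key-by-key comparison
function bothAreObjects (a, b) {
    return typeof a === 'object' && a !== null &&
           typeof b === 'object' && b !== null
}

console.log(bothAreObjects({ a: 1 }, { a: 1 })) //=> true
console.log(bothAreObjects({ a: 1 }, null))     //=> false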

Related

How can I query for multiple values after a wildcard?

I have a JSON object like so:
{
    _id: "12345",
    identifier: [
        {
            value: "1",
            system: "system1",
            text: "text!"
        },
        {
            value: "2",
            system: "system1"
        }
    ]
}
How can I use the XDevAPI SearchConditionStr to look for the specific combination of value + system in the identifier array? Something like this, but this doesn't seem to work...
collection.find("'${identifier.value}' IN identifier[*].value && '${identifier.system} IN identifier[*].system")
By using the IN operator, what happens underneath the covers is basically a call to JSON_CONTAINS().
So, if you call:
collection.find(":v IN identifier[*].value && :s IN identifier[*].system")
.bind('v', '1')
.bind('s', 'system1')
.execute()
What gets executed, in the end, is (simplified):
JSON_CONTAINS('["1", "2"]', '"1"') AND JSON_CONTAINS('["system1", "system1"]', '"system1"')
In this case, both those conditions are true, and the document will be returned.
The atomic unit is the document (not a slice of that document). So, in your case, regardless of the value of value and/or system, you are still looking for the same document (the one whose _id is '12345'). With such a statement, the document is returned if all the search values are part of it, and not returned if any of them is not.
For instance, the following would not yield any results:
collection.find(":v IN identifier[*].value && :s IN identifier[*].system")
.bind('v', '1')
.bind('s', 'system2')
.execute()
EDIT: Potential workaround
I don't think the CRUD API allows this kind of "cherry-picking", but you can always use SQL. In that case, one strategy that comes to mind is to use JSON_SEARCH() to retrieve an array of paths corresponding to each value in the scope of identifier[*].value and identifier[*].system (i.e. the array indexes) and then use JSON_OVERLAPS() to ensure they are equal.
session.sql(`select * from collection WHERE json_overlaps(json_search(json_extract(doc, '$.identifier[*].value'), 'all', ?), json_search(json_extract(doc, '$.identifier[*].system'), 'all', ?))`)
    .bind('2', 'system1')
    .execute()
In this case, the result set will only include documents where the identifier array contains at least one JSON object element where value is equal to '2' and system is equal to system1. The filter is effectively applied over individual array items and not in aggregate, like on a basic IN operation.
Disclaimer: I'm the lead developer of the MySQL X DevAPI Connector for Node.js

Destructuring/list assignment with the `has` declarator

[I ran into the issues that prompted this question and my previous question at the same time, but decided the two questions deserve to be separate.]
The docs describe using destructuring assignment with my and our variables, but don't mention whether it can be used with has variables. But Raku is consistent enough that I decided to try, and it appears to work:
class C { has $.a; has $.b }
class D { has ($.a, $.b) }
C.new: :a<foo>; # OUTPUT: «C.new(a => "foo", b => Any)»
D.new: :a<foo>; # OUTPUT: «D.new(a => "foo", b => Any)»
However, this form seems to break attribute defaults:
class C { has $.a; has $.b = 42 }
class D { has ($.a, $.b = 42) }
C.new: :a<foo>; # OUTPUT: «C.new(a => "foo", b => 42)»
D.new: :a<foo>; # OUTPUT: «D.new(a => "foo", b => Any)»
Additionally, flipping the position of the default yields an error message that might offer some insight into what is going on (though not enough for me to understand whether the above behavior is correct).
class D { has ($.a = 42, $.b) }
# OUTPUT:
===SORRY!=== Error while compiling:
Cannot put required parameter $.b after optional parameters
So, a few questions: is destructuring assignment even supposed to work with has? If so, is the behavior with default values correct, and is there a way to assign default values when using destructuring assignment?
(I really hope that destructuring assignment is supported with has and can be made to work with default values; even though it might seem like a niche feature for someone using classes for true OO, it's very handy for someone writing more functional code who wants to use a class as a slightly-more-typesafe Hash with fixed keys. Being able to write things like class User { has (Str $.first-name, Str $.last-name, Int $.age) } is very helpful for that sort of code)
This is currently a known bug in Rakudo. The intended behavior is for has to support list assignment, which would make syntax very much like that shown in the question work.
I am not sure if the supported syntax will be:
class D { has ($.a, $.b = 42) }
D.new: :a<foo>; # OUTPUT: «D.new(a => "foo", b => 42)»
or
class D { has ($.a, $.b) = (Any, 42) }
D.new: :a<foo>; # OUTPUT: «D.new(a => "foo", b => 42)»
but, either way, there will be a way to use a single has to declare multiple attributes while also providing default values for those attributes.
The current expectation is that this bug will get resolved sometime after the RakuAST branch is merged.
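In the meantime, declaring the attributes one per has (as in your class C) is the reliable way to combine multiple attributes with defaults, since that form already behaves as expected:

class C { has $.a; has $.b = 42 }
C.new: :a<foo>; # OUTPUT: «C.new(a => "foo", b => 42)»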

DC.js: How to handle objects with different numbers of properties

Let's say I have two objects, each with the same properties, but one has an extra property middleName and the other does not.
How should I handle this in dc.js?
var objects = [{
    name: "De Smet",
    firstName: "Jasper",
    adress: "Borsbeke"
}, {
    name: "De Backer",
    firstName: "Dieter",
    middleName: "middleName",
    adress: "Borsbeke"
}, {
    name: "De Bondtr",
    firstName: "Simon",
    middleName: "OtherMiddleName",
    adress: "Denderleeuw"
}]
The wanted behaviour would be for the object without the property to be filtered out.
Here is a fiddle:
https://jsfiddle.net/mj92shru/41/
It seems to add the property middleName to the first object and assign it the next middleName it finds.
Adding the property to the first object with a placeholder value like "none" works, but it doesn't really produce the wanted behaviour.
I realize I could filter out the objects where middleName is set to "none", but this would be difficult in the actual application I am writing.
I've also found that adding the object without the property last causes it to crash.
Indeed, using undefined fields for your dimension or group keys can crash crossfilter because it does not validate its data. NaN, null, and undefined do not have well-defined sorting operations.
It's strange to see the value folded into another bin, but I suspect it's the same undefined behavior, rather than something you can depend on.
If you have fields which may be undefined, you should always default them, even if you don't want the value:
middleNameDimension = j.dimension(d => d.middleName || 'foo'),
I think you do want to filter your data, but not in the crossfilter sense where those rows are removed and do not influence other charts. Instead, it should just be removed from the group without affecting anything else.
You can use a "fake group" for this, and there is one in the FAQ which is suited perfectly for your problem:
function remove_bins(source_group) { // (source_group, bins...)
    var bins = Array.prototype.slice.call(arguments, 1);
    return {
        all: function () {
            return source_group.all().filter(function(d) {
                return bins.indexOf(d.key) === -1;
            });
        }
    };
}
Apply it like this:
.group(remove_bins(middleNameGroup, 'foo'))
Fork of your fiddle.
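Putting the pieces together (a rough sketch; j is your crossfilter instance as in the snippet above, and pieChart is a hypothetical stand-in for whichever chart you attach this to):

var middleNameDimension = j.dimension(function (d) { return d.middleName || 'foo'; }),
    middleNameGroup = middleNameDimension.group();

pieChart
    .dimension(middleNameDimension)
    .group(remove_bins(middleNameGroup, 'foo'));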
Be careful with this, because a pie chart implicitly adds up to 100%, and in this case it only adds up to 66%. This may be confusing for users, depending on how it is used.

reactivemongo - merging two BSONDocuments

I am looking for the most efficient and easiest way to merge two BSON documents. In case of collisions I already have handlers: for example, if both documents contain an integer I sum the values, the same for strings, and if the value is an array I add the elements of the other one, etc.
However, due to the immutable nature of BSONDocument, it is almost impossible to work with it directly. What would be the easiest and fastest way to do the merging?
I need to merge the following for example:
{
    "2013": {
        "09": {
            value: 23
        }
    }
}
{
    "2013": {
        "09": {
            value: 13
        },
        "08": {
            value: 1
        }
    }
}
And the final document would be:
{
    "2013": {
        "09": {
            value: 36
        },
        "08": {
            value: 1
        }
    }
}
There is the BSONDocument.add method; however, it doesn't check for uniqueness, which means I would end up with "2013" appearing twice as a root key, etc.
Thank you!
If I understand your inquiry, you are looking to aggregate field data via a composite id. MongoDB has a fairly slick aggregation framework. Part of that framework is the $group pipeline stage. It allows you to specify an _id to group by, which can be defined as a field or a document as in your example, as well as perform aggregation using accumulators such as $sum.
Here is a link to the manual for the operators you will probably need to use.
http://docs.mongodb.org/manual/reference/operator/aggregation/group/
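As a rough sketch of such a pipeline (assuming, for illustration, that the per-month values were stored as flat documents like { year: "2013", month: "09", value: 23 } in a hypothetical entries collection, rather than as the nested keys shown above):

db.entries.aggregate([
    { $group: {
        _id: { year: "$year", month: "$month" },  // composite id to group by
        value: { $sum: "$value" }                 // sum the per-month values
    } }
])

This groups all documents sharing the same year/month combination and adds up their value fields, which is the aggregate-side equivalent of the merge described above.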
Also, please remove the "merge" tag from your original inquiry to reduce confusion. Many MongoDB drivers include a Merge function as part of the BsonDocument representation as a way to consolidate two BsonDocuments into a single BsonDocument linearly or via element overwrites and it has no relation to aggregation.
Hope this helps.
ndh

apply different functions to each element of a Perl data structure

Given an arbitrarily nested data structure, how can I create a new data structure in which all the elements have been standardized by applying a function to each element depending on its type? For example, I might have:
$data = {
    name => 'some one',
    date => '2010-10-10 12:23:45',
    sale => [34, 22, 65],
    cust => {
        name => 'Jimmy',
        addr => '1 Foobar Way',
        amnt => 452.024,
        item => ['books', 'pens', 'post-it notes']
    }
}
and I want to convert all text values to upper case, all dates to UTC date times, find the square of all integers, round down all real numbers and add 1, and so on. So, in effect, I want to apply a different function to each element depending on the type of element.
In reality the data might arrive via a database query, in which case it is already a Perl data structure, or it might start life as a JSON object, in which case I can use JSON::from_json to convert it to a Perl data structure. The idea is to standardize all the values in the data structure based on the value type, and then spit the Perl data structure back out as a JSON object.
I read the answers to executing a function on every element of a data structure and feel that Data::Rmap might do the trick, but can't figure out how. Seems like Rmap works on all the keys as well, not just the values.
It's crazy straightforward with the Data::Rmap module you mentioned.
use Data::Rmap qw( rmap );
rmap { $_ = transform($_); } $data;
Regarding the question in the comments:
use Data::Rmap qw( rmap );
use Scalar::Util qw( looks_like_number );

# Transforms $_ in place.
sub transform {
    if (looks_like_number($_)) {
        if (...) {
            $_ *= 2;
        }

        $_ = 0+$_;  # Makes it look like a number to JSON::XS
    } else {
        ...
    }
}

&rmap(\&transform, $data);
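For instance, here is a self-contained sketch along the lines of the transformations you describe (uppercase the text values, square the numbers; the date handling and other rules are left out):

use strict;
use warnings;
use Data::Rmap qw( rmap );
use Scalar::Util qw( looks_like_number );

my $data = {
    name => 'some one',
    sale => [ 34, 22, 65 ],
};

# rmap aliases each leaf value to $_, so assigning to $_
# rewrites the value inside the original structure.
rmap {
    if ( looks_like_number($_) ) {
        $_ = $_ * $_;   # square numeric values
    }
    else {
        $_ = uc $_;     # uppercase text values
    }
} $data;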