iterate through Poco::JSON::Object in insertion order

It is possible to preserve insertion order when parsing a JSON struct with a
Poco::JSON::Parser( new Poco::JSON::ParseHandler( true ) ): the non-default ParseHandler parameter preserveObjectOrder = true is handed over to the Poco::JSON::Objects so that they keep a private list of keys sorted in insertion order.
An object can then be serialized via Object::stringify() to look just like the source JSON string. Fine.
What, however, is the official way to step through a Poco::JSON::Object and access its internals in insertion order? Object::getNames() and begin()/end() use the alphabetical order of keys, not insertion order -- is there another way to access the values, or do I have to patch Poco?

As you already said:
Poco::JSON::ParseHandler goes into the Poco::JSON::Parser-constructor.
Poco::JSON::Parser::parse() creates a Poco::Dynamic::Var.
From that you'll extract a Poco::JSON::Object::Ptr.
The Poco::JSON::Object has the method "getNames". Beginning with this commit it seems to preserve the order, if it was requested via the ParseHandler. (Poco::JSON::Object::getNames 1.8.1, Poco::JSON::Object::getNames 1.9.0)
So now it should work as expected to use:
for (auto const & name : object->getNames()) {
    auto const & value = object->get(name); // or one of the other get-methods
    // ... do things ...
}
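For completeness, a minimal end-to-end sketch of that flow (the JSON input is illustrative, and getNames() only preserves insertion order in Poco versions containing the commit mentioned above):
#include <iostream>
#include <Poco/JSON/Parser.h>
#include <Poco/JSON/ParseHandler.h>
#include <Poco/JSON/Object.h>
#include <Poco/Dynamic/Var.h>

int main()
{
    std::string json = "{\"b\": 1, \"a\": 2, \"c\": 3}"; // illustrative input

    // preserveObjectOrder = true keeps keys in insertion order
    Poco::JSON::Parser parser(new Poco::JSON::ParseHandler(true));
    Poco::Dynamic::Var result = parser.parse(json);
    Poco::JSON::Object::Ptr object = result.extract<Poco::JSON::Object::Ptr>();

    // prints b, a, c -- the insertion order, not the alphabetical one
    for (auto const & name : object->getNames())
        std::cout << name << " = " << object->get(name).toString() << "\n";
    return 0;
}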

Related

azure ADF - Get field list of .csv file from lookup activity

Context: Azure ADF. Brief process description:
Get a list of the fields defined in the first row of a .csv (blob) file. This is the first step: detect the fields.
The 2nd step would then be a comparison with the actual columns of an SQL table,
and the 3rd a stored procedure execution to perform the ALTER TABLE task, finishing with a (customized) table containing all fields needed to successfully load the .csv file into the SQL table.
To begin my ADF pipeline, I set up a Lookup activity that queries the first line of my blob file, with the "First row only" flag = ON. As a second pipeline activity, an "Append Variable" task, I would like to get all the .csv fields (first row) retrieved from the Lookup activity, as a list.
Here is where it turns into a nightmare.
As far as I know, with dynamic content I can get an object with all values (with a format like {"field1_name":"field1_value_1st_row", "field2_name":"field2_value_1st_row", etc.})
with something like @activity('Lookup1').output.firstRow,
or any element with @activity('Lookup1').output.firstRow.<element_name>,
but I can't figure out how to get a list of all the field names (keys?) of that object.
I will appreciate any advice, many thanks!
I would keep the Lookup activity part, since it seems that you are familiar with it.
You could use an Azure Function (HTTP trigger) to get the key list of the firstRow JSON object. For example, your JSON object looks like this, as you mentioned in your question:
{"field1_name":"field1_value_1st_row", "field2_name":"field2_value_1st_row"}
Azure Function code:
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');
    var array = [];
    for (var key in req.body) {
        array.push(key);
    }
    context.res = {
        body: { "keyValue": array }
    };
};
Test output (for the sample object above): {"keyValue":["field1_name","field2_name"]}
Then use an Azure Function activity to get the output:
@activity('<AzureFunctionActivityName>').output.keyValue
Use a ForEach activity to loop over the keyValue array:
@item()
Still based on the above sample input data, please refer to my sample code:
dct = {"field1_name": "field1_value_1st_row", "field2_name": "field2_value_1st_row"}
keys = []
for key in dct.keys():
    keys.append(key)
print(keys)
dicOutput = {"keys": keys}
print(dicOutput)
Have you considered doing this in ADF data flow? You would map the incoming fields to a SQL dataset without a target schema. Define a new table name in the dataset definition and then map the incoming fields from your CSV to a new target table schema definition. ADF will write the rows to a new table using that file's schema.

Setting a Lua table in Redis

I have a lua script, which simplified is like this:
local item = {}
local id = redis.call("INCR", "counter")
item["id"] = id
item["data"] = KEYS[1]
redis.call("SET", "item:" .. id, cjson.encode(item))
return cjson.encode(item)
KEYS[1] is a stringified json object:
JSON.stringify({name : 'some name'});
What happens is that because I'm using cjson.encode to add the item to the set, it seems to be getting stringified twice, so the result is:
{"id":20,"data":"{\"name\":\"some name\"}"}
Is there a better way to be handling this?
First, regardless of your question: you're using KEYS the wrong way, and your script isn't written according to the guidelines. You should not generate key names in your script (i.e. call SET with "item:" .. id as a key name) but rather use the KEYS input array to declare any keys involved a priori.
Secondly, instead of passing the stringified string with KEYS, use the ARGV input array.
Thirdly, you can do item["data"] = cjson.decode(ARGV[1]) to avoid the double encoding (Redis' Lua environment exposes the JSON library as cjson).
Lastly, perhaps you should learn about Redis' Hash data type - it may be more suitable to your needs.
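Putting the second and third points together, the script might look like this (a sketch; the counter key name and the "item:" prefix are illustrative, and the generated key name still technically violates the KEYS guideline):
-- called with e.g.: EVAL script 1 counter '{"name":"some name"}'
local item = {}
local id = redis.call("INCR", KEYS[1])
item["id"] = id
item["data"] = cjson.decode(ARGV[1]) -- decode once, so it is only encoded once below
local encoded = cjson.encode(item)
redis.call("SET", "item:" .. id, encoded)
return encoded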

Play + Slick: How to do partial model updates?

I am using Play 2.2.x with Slick 2.0 (with MYSQL backend) to write a REST API. I have a User model with bunch of fields like age, name, gender etc. I want to create a route PATCH /users/:id which takes in partial user object (i.e. a subset of the fields of a full user model) in the body and updates the user's info. I am confused how I can achieve this:
How do I use PATCH verb in Play 2.2.x?
What is a generic way to parse the partial user object into an update query to execute in Slick 2.0? I am expecting to execute a single SQL statement, e.g. update users set age=?, dob=? where id=?
Disclaimer: I haven't used Slick, so I am just going by their documentation about Plain SQL Queries for this.
To answer your first question:
PATCH is just another HTTP verb in your routes file, so for your example:
PATCH /users/:id controllers.UserController.patchById(id)
Your UserController could then be something like this:
val possibleUserFields = Seq("firstName", "middleName", "lastName", "age")

def patchById(id: String) = Action(parse.json) { request =>
    // build a fieldName='fieldValue' clause for each field present in the JSON body
    def addClause(fieldName: String) = {
        (request.body \ fieldName).asOpt[String].map { fieldValue =>
            s"$fieldName='$fieldValue'" // note: raw interpolation like this is open to SQL injection
        }
    }
    val clauses = possibleUserFields.flatMap(addClause)
    val updateStatement = "update users set " + clauses.mkString(",") + s" where id = $id"
    // TODO: Actually make the Slick call, possibly using the 'sqlu' interpolator (see the sketch below)
    Ok(s"$updateStatement")
}
What this does:
Defines the list of JSON field names that might be present in the PATCH JSON
Defines an Action that will parse the incoming body as JSON
Iterates over all of the possible field names, testing whether they exist in the incoming JSON
If so, adds a clause of the form fieldname='<newValue>' to a list
Builds an SQL update statement, comma-separating each of these clauses as required
I don't know if this is generic enough for you; there's probably a way to get the field names (i.e. the Slick column names) out of Slick, but like I said, I'm not even a Slick user, let alone an expert :-)
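To fill in that TODO, a minimal sketch of executing the assembled statement with Slick 2.0's plain SQL API (the connection settings are placeholders, and this assumes the injection caveat above is dealt with first):
import scala.slick.driver.MySQLDriver.simple._
import scala.slick.jdbc.{ StaticQuery => Q }

val db = Database.forURL("jdbc:mysql://localhost/mydb", driver = "com.mysql.jdbc.Driver") // placeholder settings

def runUpdate(updateStatement: String): Unit =
    db.withSession { implicit session =>
        Q.updateNA(updateStatement).execute // runs the dynamically built update string
    }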

Can't set index on ObjectChoiceField (slow JSON load)

I have a choice field whose contents I get from a JSON POST over HTTP. I try to set the initially selected index, but since the JSON is large and loads slowly, there is nothing in the list yet, so nothing gets selected.
public AppMainScreen() {
    loadLists();
    MySelect = new ObjectChoiceField("Select: ", new Object[0], 3);
    VerticalFieldManager vfm = new VerticalFieldManager(Manager.VERTICAL_SCROLL);
    vfm.add(MySelect);
    add(vfm);
}
This statement appears wrong to me:
new ObjectChoiceField("Select: ", new Object[0], 3);
The second parameter to this constructor is supposed to be an array of objects whose .toString() method will be used to populate the choices. In this case, you have given it a zero-length array, i.e. no objects. So there is nothing to choose. And then you have asked it to initially select the item at index 3, and of course there is no such item.
You should correct the code to actually supply an object array.
One option to make it easy is to have your JSON load actually create a String array with one entry per selectable item. Then you use the selected index to identify the chosen item.
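For illustration, a minimal sketch of that option (the choice strings are placeholders; in practice loadLists() would build the array from the parsed JSON before the field is created):
// inside your MainScreen subclass, after the JSON has been parsed
String[] choices = new String[] { "Alpha", "Beta", "Gamma", "Delta" }; // placeholder entries
// the initial index (here 3) must be smaller than choices.length
MySelect = new ObjectChoiceField("Select: ", choices, 3);
VerticalFieldManager vfm = new VerticalFieldManager(Manager.VERTICAL_SCROLL);
vfm.add(MySelect);
add(vfm);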

Best way to cache results of method with multiple parameters - Object as key in Dictionary?

At the beginning of a method I want to check if the method is called with these exact parameters before, and if so, return the result that was returned back then.
At first, with one parameter, I used a Dictionary, but now I need to check 3 parameters (a String, an Object and a boolean).
I tried making a custom Object like so:
var cacheKey:Object = { identifier:identifier, type:type, someBoolean:someBoolean };
//if key already exists, return it (not working)
if (resultCache[cacheKey]) return resultCache[cacheKey];
//else: create result ...
//and save it in the cache
resultCache[cacheKey] = result;
But this doesn't work, because the second time the function is called, the new cacheKey is not the same object as the first, even though its properties are the same.
So my question is: is there a datatype that will check the properties of the object used as key for a matching key?
And what else is my best option? Create a cache for the keys as well? :/
Note there are two aspects to the technical solution: equality comparison and indexing.
The Cliff Notes version:
It's easy to do custom equality comparison
In order to perform indexing, you need to know more than whether one object is equal to another -- you need to know which object is "bigger" than the other.
If all of your properties are primitives you should squash them into a single string and use an Object to keep track of them (NOT a Dictionary).
If you need to compare some of the individual properties for reference equality, you're going to have to write a function to determine which set of properties is bigger than the other, and then make your own collection class that uses the output of the comparison function to implement its own binary search tree based indexing.
If the number of unique sets of arguments is in the several hundreds or less AND you do need reference comparison for your Object argument, just use an Array and the some method to do a naive comparison to all cached keys. Only you know how expensive your actual method is, so it's up to you to decide what lookup cost (which depends on the number of unique arguments provided to the function) is acceptable.
Equality comparison
To address equality comparison it is easy enough to write some code to compare objects for the values of their properties, rather than for reference equality. The following function enforces strict set comparison, so that both objects must contain exactly the same properties (no additional properties on either object allowed) with the same values:
public static function propsEqual(obj1:Object, obj2:Object):Boolean {
    for (var key1:* in obj1) {
        if (obj2[key1] === undefined)
            return false;
        if (obj1[key1] != obj2[key1]) // compare the two objects' values, not obj2 to itself
            return false;
    }
    for (var key2:* in obj2)
        if (obj1[key2] === undefined)
            return false;
    return true;
}
You could speed it up by eliminating the second for loop with the tradeoff that {A:1, B:2} will be deemed equal to {A:1, B:2, C:'An extra property'}.
Indexing
The problem with this in your case is that you lose the indexing that a Dictionary provides for reference equality or that an Object provides for string keys. You would have to compare each new set of function arguments to the entire list of previously seen arguments, such as using Array.some. I use the field currentArgs and the compareToCurrent method to avoid generating a new closure every time.
private var cachedArgs:Array = [];
private var currentArgs:Object;

function yourMethod(stringArg:String, objArg:Object, boolArg:Boolean):* {
    currentArgs = { stringArg: stringArg, objArg: objArg, boolArg: boolArg };
    var iveSeenThisBefore:Boolean = cachedArgs.some(compareToCurrent);
    if (!iveSeenThisBefore)
        cachedArgs.push(currentArgs);
}

// Array.some passes (item, index, array) to its callback, so all three parameters are required
function compareToCurrent(obj:Object, index:int, arr:Array):Boolean {
    return someUtil.propsEqual(obj, currentArgs);
}
This means comparison will be O(n) time, where n is the ever increasing number of unique sets of function arguments.
If all the arguments to your function are primitive, see the very similar question In AS3, where do you draw the line between Dictionary and ArrayCollection?. The title doesn't sound very similar, but the solution in the accepted answer (yes, I wrote it) addresses the exact same technical issue -- using multiple primitive values as a single compound key. The basic gist in your case would be:
private var cachedArgs:Object = {};

function yourMethod(stringArg:String, objArg:Object, boolArg:Boolean):* {
    var argKey:String = stringArg + objArg.toString() + (boolArg ? 'T' : 'F');
    if (cachedArgs[argKey] === undefined)
        cachedArgs[argKey] = _yourMethod(stringArg, objArg, boolArg);
    return cachedArgs[argKey];
}

private function _yourMethod(stringArg:String, objArg:Object, boolArg:Boolean):* {
    // Do stuff
    return something;
}
If you really need to determine which reference is "bigger" than another (as the Dictionary does internally), you're going to have to wade into some ugly stuff, since Adobe has not yet provided any API to retrieve the "value" / "address" of a reference. The best thing I've found so far is this interesting hack: How can I get an instance's "memory location" in ActionScript?. Without doing a bunch of performance tests, I don't know if using this hack to compare references would kill the advantages gained by binary search tree indexing. Naturally it would depend on the number of keys.