How to update an array in Couchbase via the SDK

I looked through the docs and was wondering if there is a way to update a doc inside an array directly. I only found Append, Prepend and Insert, and for now my workaround is
bucket
  .mutateIn(docID)
  .remove(path + "[" + index + "]")
  .arrayInsert(path + "[" + index + "]", doc)
  .execute((err, result) => {});
This works, but I doubt it is ideal, as it causes two operations on the document (in my case actually three, since I have to find the index of this doc in the array before I can delete and insert again).

There is a sub-document command called replace that allows an element to be updated.
bucket
  .mutateIn(docID)
  .replace(path + "[" + index + "]", doc)
  .execute((err, result) => {});

It looks like this method is missing from the SDK; I will open a ticket for it. In the meantime, you could also use N1QL via ARRAY_REMOVE or ARRAY_REPLACE (thanks to vsr):
https://docs.couchbase.com/server/6.0/n1ql/n1ql-language-reference/arrayfun.html#fn-array-remove
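For reference, here is a rough sketch of that N1QL route from the Node SDK, assuming the same 2.x-style API used in the mutateIn example above. The bucket name, the items field, and the oldDoc/newDoc variables are placeholders, and it assumes positional parameters are accepted in these positions:
const couchbase = require('couchbase');

// ARRAY_REPLACE matches by value rather than by index, so there is no need
// to look up the element's index first.
const query = couchbase.N1qlQuery.fromString(
  'UPDATE `bucketName` USE KEYS $1 ' +
  'SET items = ARRAY_REPLACE(items, $2, $3)'
);

bucket.query(query, [docID, oldDoc, newDoc], (err, rows) => {
  if (err) console.error(err);
});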

Output my JSON data before I use Firebase $save or $add in console?

I've been trying for hours to debug a Firebase rule problem and was wondering if there is something easier available.
My problem is that I save my firebaseObject with $save (or create it with $add) and get a permission denied because of my rules. However, both the rules and the object are pretty complex, and there are dozens of rules involved. In my simulator I think I got it all, but I still get permission denied.
The problem is that I am not 100% sure how the JSON data that $save tries to send to Firebase actually looks. If I use the normal console.log(myObject), I of course get a list of all values and functions inside this object, but this isn't the same as the raw JSON I would expect (like { "name": "value" }).
Is there any way to display the actual plain JSON data $save sends, so I can copy it into the rule simulator and debug? Or is there any other way to see which exact permission is denied?
Otherwise, I have to go one by one, switching my permissions off and on, which would be a pretty long night for me. :(
If the value of the $firebaseObject is an object, the only difference (in addition to the prototype-wired methods) should be a number of $-prefixed properties (like $id and $resolved). So you should be able to see the actual JSON of what will be written to the database using something like this:
var written = {};
Object.keys(myObject).forEach(function (key) {
  if (key.charAt(0) !== "$") { written[key] = myObject[key]; }
});
console.log(JSON.stringify(written));
The $$hashKey entries mentioned in your comment are added by AngularJS. A more general mechanism could be used to remove/ignore all $-prefixed keys throughout the object:
console.log(JSON.stringify(myObject, function (key, val) {
  return key.charAt(0) === "$" ? undefined : val;
}));

Saving JSON data to DB in Zotonic

I'm trying to write a small app that retrieves a JSON file (it contains a list of items, which all have some properties), saves its contents to the DB and then displays some of it later on. I have Zotonic up and running, and generating some HTML is no problem.
At the moment I'm stuck trying to figure out how to define a custom resource and how to get the data from the JSON into the DB. Once the data is there I should be fine; that part seems covered well by the documentation.
I wrote some standalone Erlang scripts that fetch the data, and I noticed that Zotonic has a library for decoding JSON, so that part should be fine. Any tips on where to put which code, or where to look further?
The z_db module allows for creating custom tables by using:
z_db:create_table(Table, Cols, Context).
The Table variable is your table name which can be either an atom or a list containing a single atom.
Cols is a list of column definitions, which are defined by records. Currently the record definition (you can find this in include/zotonic.hrl) is:
-record(column_def, {name, type, length, is_nullable=true, default, primary_key}).
See the Erlang docs on records for more info.
Example code which I put in users/sites/[sitename]/models/m_[sitename].erl:
init(Context) ->
    case z_db:table_exists(?table, Context) of
        false ->
            z_db:create_table(tablename,
                [
                    #column_def{name=id, type="serial"},
                    #column_def{name=gid, type="integer", is_nullable=false},
                    #column_def{name=magnitude, type="real"},
                    #column_def{name=depth, type="real"},
                    #column_def{name=location, type="character varying"},
                    #column_def{name=time, type="integer"},
                    #column_def{name=date, type="integer"}
                ], Context);
        true -> ok
    end,
    ok.
Pay attention to which options of the record you specify; most of the errors I got came from, for example, specifying a length on the integer fields.
The models/m_sitename:init/1 function does not get called on site start, but sitename:init/1 does, so I call the init function from there to ensure the table exists. Example:
init(Context) ->
    m_sitename:init(Context).
Zotonic calls it automatically with the Context variable of the site. You can also get this variable manually with z:c(sitename). So if you call m_sitename:init/1 from somewhere else, you would do:
m_sitename:init(z:c(sitename)).
Next, insertion in the DB can be done with:
z_db:insert(Table, PropList, Context).
Where Table is again an atom or a list containing a single atom representing the table name. Context is the same as above.
PropList is a property list: a list of two-element tuples where the first element is an atom and the second is its associated value/property. Example:
PropList = [
    {row, Value},
    {anotherrow, AnotherValue}
].
Table = tablename.
Context = z:c(sitename).
z_db:insert(Table, PropList, Context).
See the Erlang docs on property lists for more info.
=== The dependencies have been updated, so if you build from source the step directly below is no longer needed ===
The JSON part is a bit more tricky. Included with Zotonic are mochijson2 and, as a secondary dependency, jiffy. The latest version of jiffy contains jiffy:decode/2, which allows you to specify maps as a return type. That is much more readable than the standard {struct, {struct, <<"">>}} monster. To update to the latest version, edit the line in deps/twerl/rebar.config that says
{jiffy, ".*", {git, "https://github.com/davisp/jiffy.git", {tag, "0.8.3"}}},
to
{jiffy, ".*", {git, "https://github.com/davisp/jiffy.git", {tag, "0.14.3"}}},
Now run z:m(). in the Zotonic shell (you must do this after every change to your code).
Then check in the Zotonic shell whether jiffy:decode/2 is available by typing jiffy:<tab>; it will show a list of available functions and their arities.
To retrieve a JSON file from the internet run:
{ok, {{_, 200, _}, _, Body}} = httpc:request(get, {"url-to-JSON-here", []}, [], []).
This will bind the variable Body to the contents. See the Erlang docs on the httpc client for more info on this call.
Next convert the contents of Body to Erlang terms with:
JsonData = jiffy:decode(Body, [return_maps]).
What you have to do next depends a lot on the structure of your JSON resource. Keep in mind that everything is now in UTF-8-encoded binary strings! If you print JsonData to the screen (just enter JsonData. in your Zotonic/Erlang shell), you will see a lot of entries like #{<<"key">> => <<"value">>}.
My data was nested so I had to extract the needed data like this:
[{_,ItemList}|_] = ListData.
This gave me a list of maps, and in order to deal with them as individual items I used the following function:
get_maps([]) ->
    done;
get_maps([First|Rest]) ->
    Map = maps:get(<<"properties">>, First),
    case is_map(Map) of
        true ->
            map_to_proplist(Map),
            get_maps(Rest);
        false -> done
    end,
    done;
get_maps(_) ->
    done.
As you might remember, the z_db:insert/3 function needs a property list to populate rows, and that is what the call to map_to_proplist/1 is for. How this function looks depends completely on how your data looks, but as an example here is what worked for me:
map_to_proplist(Map) ->
    case is_map(Map) of
        true ->
            {Value1, _} = string:to_integer(binary_to_list(maps:get(<<"key1">>, Map))),
            {Value2, _} = string:to_float(binary_to_list(maps:get(<<"key2">>, Map))),
            {Value3, _} = string:to_float(binary_to_list(maps:get(<<"key3">>, Map))),
            Value4 = binary_to_list(maps:get(<<"key4">>, Map)),
            {Value5, _} = string:to_integer(binary_to_list(maps:get(<<"key5">>, Map))),
            {Value6, _} = string:to_integer(binary_to_list(maps:get(<<"key6">>, Map))),
            PropList = [{rowname1, Value1}, {rowname2, Value2}, {rowname3, Value3},
                        {rowname4, Value4}, {rowname5, Value5}, {rowname6, Value6}],
            m_sitename:insert_items(PropList, z:c(sitename)),
            ok;
        false ->
            ok
    end.
See the documentation on string:to_integer/1 and string:to_float/1 as to why the tuples are needed when converting. The call to m_sitename:insert_items(PropList, z:c(sitename)) calls z_db:insert/3 in models/m_sitename.erl, wrapped in a catch:
insert_items(PropList, Context) ->
    (catch z_db:insert(?table, PropList, Context)).
Ok, quite a long post but this should get you up and running if you were looking for this answer.
The above was done with Zotonic 0.13.2 on Erlang/OTP 18.
A repost (except the JSON part) of my post in the Zotonic Developers group.

How to identify duplicate keys when parsing a json object with json-cpp?

I am trying to parse a JSON object read from a file.
I want to identify duplicate keys, as json-cpp doesn't like them (even though they are not illegal in JSON).
I need to be able to say: ERROR: your JSON file has duplicate keys and we don't like that.
Json::Reader reader(Json::Features::strictMode());
Using the reader in strict mode does not do the trick.
Set rejectDupKeys in the settings handled by void Json::CharReaderBuilder::setDefaults(Json::Value* settings); see the JsonCPP doc.
There is no way out of the box, but you can program that functionality.
Since JsonCPP uses a map to store object keys, you have to add some code to:
Value &Value::resolveReference(const char *key, bool isStatic)
First, you have to be sure you are parsing (and not accessing some Json::Value). Then, you have to add something (like an exception or a flag) to this if:
if (it != value_.map_->end() && (*it).first == actualKey)
{
    // key is already present: if parsing, throw!
    return (*it).second;
}
Open an issue. That could be added easily. (Sga's idea might be the best way.) We've done a lot of work recently to make it easier to add features while maintaining binary-compatibility.

How to enumerate the keys and values of a record in AppleScript

When I use AppleScript to get the properties of an object, a record is returned.
tell application "iPhoto"
properties of album 1
end tell
==> {id:6.442450942E+9, url:"", name:"Events", class:album, type:smart album, parent:missing value, children:{}}
How can I iterate over the key/value pairs of the returned record so that I don't have to know exactly what keys are in the record?
To clarify the question, I need to enumerate the keys and values because I'd like to write a generic AppleScript routine to convert records and lists into JSON which can then be output by the script.
I know it's an old question, but there are now ways to access the keys and the values (10.9+). In 10.9 you need to use a script library to make this run; in 10.10 you can use the code right inside Script Editor:
use framework "Foundation"
set testRecord to {a:"aaa", b:"bbb", c:"ccc"}
set objCDictionary to current application's NSDictionary's dictionaryWithDictionary:testRecord
set allKeys to objCDictionary's allKeys()
repeat with theKey in allKeys
    log theKey as text
    log (objCDictionary's valueForKey:theKey) as text
end repeat
This is no hack or workaround; it just uses the "new" ability to access Objective-C objects from AppleScript.
I found this question while searching for other topics and couldn't resist answering ;-)
Update to deliver JSON functionality:
Of course we can dive deeper into the Foundation classes and use the NSJSONSerialization object:
use framework "Foundation"
set testRecord to {a:"aaa", b:"bbb", c:"ccc"}
set objCDictionary to current application's NSDictionary's dictionaryWithDictionary:testRecord
set {jsonDictionary, anError} to current application's NSJSONSerialization's dataWithJSONObject:objCDictionary options:(current application's NSJSONWritingPrettyPrinted) |error|:(reference)
if jsonDictionary is missing value then
    log "An error occurred: " & (anError as text)
else
    log (current application's NSString's alloc()'s initWithData:jsonDictionary encoding:(current application's NSUTF8StringEncoding)) as text
end if
Have fun, Michael / Hamburg
If you just want to iterate through the values of the record, you could do something like this:
tell application "iPhoto"
repeat with value in (properties of album 1) as list
log value
end repeat
end tell
But it's not very clear to me what you really want to achieve.
Basically, what AtomicToothbrush and foo said. AppleScript records are more like C structs, with a known list of labels, than like an associative array, with arbitrary keys, and there is no (decent) in-language way to introspect the labels on a record. (And even if there were, you’d still have the problem of applying them to get values.)
In most cases, the answer is “use an associative array library instead.” However, you’re specifically interested in the labels from a properties value, which means we need a hack. The usual one is to force an error using the record, and then parse the error message, something like this:
set myRecord to {a:1, b:2}
try
    myRecord as string
on error e
    -- e will be the string “Can’t make {a:1, b:2} into type string”
end try
Parsing this, and especially parsing this while allowing for non-English locales, is left as an exercise for the reader.
ShooTerKo's answer is incredibly helpful to me.
I'll bring up another possibility I'm surprised I didn't see anyone else mention, though. I have to go between AppleScript and JSON a lot in my scripts, and if you can install software on the computers that need to run the script, then I highly recommend JSONHelper to basically make the whole problem go away:
https://github.com/isair/JSONHelper

mongodb translation for sql INSERT...SELECT

How can I simply duplicate documents from collectionABC and copy them into collectionB if they match a condition like {conditionB: 1}, adding a timestamp like ts_imported, without knowing the details contained within the original documents?
I could not find a simple MongoDB equivalent of MySQL's INSERT ... SELECT ....
You can use JavaScript from the mongo shell to achieve a similar result:
db.collectionABC.find({ conditionB: 1 }).forEach(function (i) {
    i.ts_imported = new Date();
    db.collectionB.insert(i);
});
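If the matching set is large, a variation of the same idea collects the documents first and writes them in one call. This is just a sketch; it assumes a shell or driver recent enough (3.2+) to provide insertMany, and whether to keep or drop the original _id values is a design choice left to you:
var docs = db.collectionABC.find({ conditionB: 1 }).toArray();
docs.forEach(function (d) {
    delete d._id;                 // optional: let collectionB assign fresh _ids
    d.ts_imported = new Date();   // stamp the import time
});
if (docs.length > 0) {
    db.collectionB.insertMany(docs);
}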
I realise that this is an old question, but there is a better way of doing it now. MongoDB now has something called the aggregation pipeline (v3.6 and above, maybe some older versions too; I haven't checked). The aggregation pipeline allows you to do more complex things like perform joins, add fields and save documents into a different collection. For the OP's case, the pipeline would look like this:
var pipeline = [
    { $match: { conditionB: 1 } },
    { $addFields: { ts_imported: ISODate() } },
    { $out: 'collectionB' }
];
// now run the pipeline
db.collectionABC.aggregate(pipeline)
Relevant docs:
Aggregation pipeline
$out stage
some important limits
MongoDB does not have that kind of querying ability whereby you can (inside the query) insert into another collection based upon variables from the first collection.
You will need to pull the documents out first and then operate on them.
You could technically use a map-reduce job for this, but I have a feeling it will not work for your scenario.
It seems that this works in the context of sequence generation:
http://docs.mongodb.org/manual/tutorial/create-an-auto-incrementing-field/
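For context, the pattern that tutorial describes keeps a counter document and increments it atomically with findAndModify, roughly like this (a sketch following that tutorial; the counters collection and seq field come from there, and "importid" is just a hypothetical sequence name):
function getNextSequence(name) {
    var ret = db.counters.findAndModify({
        query: { _id: name },
        update: { $inc: { seq: 1 } },
        new: true
    });
    return ret.seq;
}
// e.g. give copied documents their own sequential _id while importing
db.collectionB.insert({ _id: getNextSequence("importid"), ts_imported: new Date() });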