Is there an easy way to use a custom JavaScript library in a MapReduce view in Couchbase? Something like the following, where I would like to get all of a document's keys:
function (doc, meta) {
  if (doc.doctype && doc.doctype == "regions") {
    var keys = jQuery.map(doc, function (value, key) {
      return key;
    });
    emit(doc.id, keys);
  }
}
See this question for how to retrieve all keys (just add the additional logic for your doctype): How to extract, in a list, all keys from a bucket in Couchbase
As far as I know, you can't use libraries such as jQuery inside Couchbase MapReduce views.
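As a rough sketch, the same key extraction can be done with plain JavaScript inside the map function, no external library needed. Note this emits meta.id (the document ID) where the question's snippet used doc.id:
function (doc, meta) {
  if (doc.doctype && doc.doctype == "regions") {
    // Collect the document's own property names without jQuery.
    var keys = [];
    for (var key in doc) {
      if (doc.hasOwnProperty(key)) {
        keys.push(key);
      }
    }
    emit(meta.id, keys);
  }
}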
Hi there, I'm using an API which returns the following:
{"secret-finance":{"usd":0.04883396}}
The problem is I'm using Vue and retrieving the data like this:
async getCurrentSefiPrice() {
  await axios
    .get(
      "https://api.coingecko.com/api/v3/simple/price?ids=secret-finance&vs_currencies=usd"
    )
    .then(
      (response) =>
        (this.sefi_token_current_price = response.secret-finance.usd)
      // console.log(response)
    );
  console.log(this.sefi_token_current_price);
}
But when I use secret-finance to get the usd value, I get an error.
Thanks in advance.
@haseebsaeed
You'll need to reference it as
response["secret-finance"].usd
You need to use bracket notation rather than dot notation when a key contains a hyphen. Any key that contains characters not allowed in JavaScript identifiers has to be referenced this way.
A further example: if the secret-finance object had a property key of us-dollars rather than the current usd, you would then access it with
response["secret-finance"]["us-dollars"]
Does anyone have a suggestion for the best way to deep-convert a JS list of lists to nested ordered maps with Immutable.js?
You can create your own custom conversion. For example, to turn JS objects into Immutable.OrderedMap:
function fromJSOrdered(js) {
  return typeof js !== 'object' || js === null ? js :
    Array.isArray(js) ?
      Immutable.Seq(js).map(fromJSOrdered).toList() :
      Immutable.Seq(js).map(fromJSOrdered).toOrderedMap();
}
fromJS has a second parameter, reviver, which can be used for exactly this:
import Immutable from 'immutable';
const reviver = (key, value) =>
Immutable.Iterable.isKeyed(value) ? value.toOrderedMap() : value.toList();
const data = Immutable.fromJS(js, reviver);
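For illustration, with a made-up js object the reviver above preserves key order at every level (note this uses the Immutable.js v3 Iterable API):
// Hypothetical sample input.
const js = { b: [{ z: 1, a: 2 }], a: 3 };
const data = Immutable.fromJS(js, reviver);

console.log(data.keySeq().toArray());                  // ['b', 'a']
console.log(data.getIn(['b', 0]).keySeq().toArray());  // ['z', 'a']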
The answer by #Albert Olivé had problems in my use case, because of recursion and the lack of the main context. I tried passing a second context argument to map, again with problems.
Finally I realized that I didn't care about order in the submaps, just in the main map passed to the function, in order to maintain the order provided by the server for HTML lists. So I changed the function to be non-recursive, like this:
import { fromJS, OrderedMap } from 'immutable';

function fromJSOrdered(js) {
  if (typeof js !== 'object' || js === null || Array.isArray(js)) {
    return fromJS(js);
  }
  return new OrderedMap(fromJS(js));
}
I actually published a package recently that can transform an object, array or Map of objects into Immutable Lists and Records:
https://github.com/jhukdev/immutable-parsejs
You can take a look at the source if you don't want Records; it's easy to change.
Records are nice though, as you get direct property access, meaning that if you want to switch away from Immutable.js in future, it's an easier prospect.
I can implement MVVM with Knockout.js, but I want to use it with cross-browser (FF and Chrome) HTML5 offline storage.
I want to bind HTML objects to offline storage.
I haven't tried it, but there is a knockout.localStorage project on GitHub that seems to be what you are looking for.
With that plugin, you should be able to pass an object as a second argument when you create your observable, which saves the observable into localStorage.
From the documentation:
var viewModel = {
  name: ko.observable('James', {persist: 'name'})
};
ko.applyBindings(viewModel);
You can use a library such as amplify.js, which can serialize objects to localStorage (cross-browser) and falls back to older storage mechanisms for older browsers. First, unwrap the observables to a plain JSON object, then use amplify.store to serialize and store it. Later you can pull it back out and map it back to an observable object when you want to restore it; see the sketch after the links below.
http://amplifyjs.com/api/store/
http://craigcav.wordpress.com/2012/05/16/simple-client-storage-for-view-models-with-amplifyjs-and-knockout/
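As a rough sketch of that round trip (the storage key "viewModelState" and the use of ko.mapping from the knockout.mapping plugin are my assumptions, not part of the linked articles):
// Save: unwrap the observables to plain JS, then let amplify.store persist it.
function saveViewModel(viewModel) {
  var plain = ko.toJS(viewModel);          // strip observables
  amplify.store("viewModelState", plain);  // hypothetical storage key
}

// Restore: read the stored object back and re-apply it onto the observables.
function restoreViewModel(viewModel) {
  var stored = amplify.store("viewModelState");
  if (stored) {
    // ko.mapping.fromJS comes from the knockout.mapping plugin.
    ko.mapping.fromJS(stored, {}, viewModel);
  }
}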
His solution works!
I worked out a solution based on the subscribe feature of KnockoutJS. It takes a model and persists all the observable properties.
ko.persistChanges = function (vm, prefix) {
  if (prefix === undefined) {
    prefix = '';
  }
  for (var n in vm) {
    var observable = vm[n];
    var key = prefix + n;
    if (ko.isObservable(observable) && !ko.isComputed(observable)) {
      // track changes to the observable (ko.trackChange is defined in the linked post)
      ko.trackChange(observable, key);
      // force load
      observable();
    }
  }
};
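For context, a possible shape of the ko.trackChange helper this relies on (a sketch only; the full version, including loading persisted values back, is in the linked post):
// Sketch: persist an observable's value under `key` whenever it changes,
// and seed it from storage on startup if a value was saved earlier.
ko.trackChange = function (observable, key) {
  var stored = amplify.store(key);
  if (stored !== undefined) {
    observable(stored);
  }
  observable.subscribe(function (newValue) {
    amplify.store(key, newValue);
  });
};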
Check http://keestalkstech.com/2014/02/automatic-knockout-model-persistence-offline-with-amplify/ for code and JSFiddle example.
I'm populating a YUI DataTable via JSON, starting from the sample code DataTable + DataSource.Get + JSON Data. Despite its promising title, this sample uses JSONP, not straight JSON. In my case, I'm querying with a relative URL, so I don't need (or want) JSONP.
My code defines a data source and schema like this:
var dataSource = new Y.DataSource.Get({ source: "myLocalUrl.json" });
dataSource.plug(Y.Plugin.DataSourceJSONSchema, {
schema: { resultListLocator: "result.path.to.array", resultFields: ["key1", "key2"]}
});
Nowhere in here does it specify JSONP, but apparently that's the default behavior, despite the security warnings in the JSONP documentation. Perhaps I'm missing something obvious, but I've checked the API docs for Y.DataSource and Y.DataSource.Get, and neither is particularly enlightening.
I had better luck with DataSource.IO
var dataSource = new Y.DataSource.IO({ source: "myLocalUrl.json" });
dataSource.plug(Y.Plugin.DataSourceJSONSchema, {
schema: { resultListLocator: "result.path.to.array", resultFields: ["key1", "key2"]}
});
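To actually pull data through it, the usual pattern looks roughly like the sketch below. I'm assuming YUI 3's DataSource sendRequest API here; double-check the callback shape against your YUI version:
// Sketch: issue a request and read the normalized rows from the schema plugin.
dataSource.sendRequest({
  request: "",  // appended to the source URL; empty for a static JSON file
  callback: {
    success: function (e) {
      // e.response.results holds the array produced by resultListLocator/resultFields
      Y.log(e.response.results);
    },
    failure: function (e) {
      Y.log("Request failed");
    }
  }
});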
I would like to know if there is a better way to create multiple collections from a single big JSON file. I have a JSON file that looks like this:
{
"Languages": [...],
"ProductTypes": [...],
"Menus": [...],
"Submenus": [...],
"SampleOne": [...],
"SampleTwo": [...],
"SampleMore": [...]
}
I am using url/fetch to create each collection from each node of the JSON above:
var source = 'data/sample.json';
Languages.url = source;
Languages.fetch();
ProductTypes.url = source;
ProductTypes.fetch();
Menus.url = source;
Menus.fetch();
Submenus.url = source;
Submenus.fetch();
SampleOne.url = source;
SampleOne.fetch();
SampleTwo.url = source;
SampleTwo.fetch();
SampleMore.url = source;
SampleMore.fetch();
Any better solution for this?
Backbone is great when your application fits the mold it provides, but don't be afraid to go around it when that makes sense for your application; it's a very small library. Making repetitive, duplicate GET requests just to fit Backbone's mold is probably prohibitively inefficient. Check out jQuery.getJSON or your favorite basic AJAX library, paired with some basic metaprogramming as follows:
//Put your real collection constructors here. Just examples.
var collections = {
  Languages: Backbone.Collection.extend(),
  ProductTypes: Backbone.Collection.extend(),
  Menus: Backbone.Collection.extend()
};
function fetch() {
  $.getJSON("/url/to/your/big.json", function (response) {
    for (var name in collections) {
      //Grab the list of raw json objects by name out of the response
      //pass it to your collection's constructor
      //and store a reference to your now-populated collection instance
      //in your collection lookup object
      collections[name] = new collections[name](response[name]);
    }
  });
}
fetch();
Once you've called fetch() and the async callback has completed, you can do things like collections.Menus.at(0) to get at the loaded model instances.
Your current approach, in addition to being pretty long, risks retrieving the large file multiple times (browser caching won't always work here, especially if the first request hasn't completed by the time you make the next one).
I think the easiest option here is to go with straight jQuery, rather than Backbone, then use .reset() on your collections:
$.get('data/sample.json', function(data) {
  Languages.reset(data['Languages']);
  ProductTypes.reset(data['ProductTypes']);
  // etc
});
If you want to cut down on the redundant code, you can put your collections into a namespace like app and then do something like this (though it might be a bit too clever to be legible):
app.Languages = new LanguageCollection();
// etc
$.get('data/sample.json', function(data) {
  _(['Languages', 'ProductTypes', ... ]).each(function(collection) {
    app[collection].reset(data[collection]);
  });
});
I think you can solve this and still stay within the Backbone paradigm. An elegant solution, to my mind, is to create a Model that fetches the big JSON and uses it to populate all the Collections in its change event:
var App = Backbone.Model.extend({
  url: "http://myserver.com/data/sample.json",

  initialize: function( opts ){
    this.languages = new Languages();
    this.productTypes = new ProductTypes();
    // ...
    this.on( "change", this.fetchCollections, this );
  },

  fetchCollections: function(){
    this.languages.reset( this.get( "Languages" ) );
    this.productTypes.reset( this.get( "ProductTypes" ) );
    // ...
  }
});
var myApp = new App();
myApp.fetch();
You have access to all your collections through:
myApp.languages
myApp.productTypes
...
You can easily do this with a parse method. Set up a model and create an attribute for each collection. There's nothing saying your model attribute has to be a single piece of data and can't be a collection.
When you run your fetch, the entire response is handed to a parse method, which you can override by defining a parse function in your model. Something like:
parse: function(response) {
  var myResponse = {};
  _.each(response, function(value, key) {
    myResponse[key] = new Backbone.Collection(value);
  });
  return myResponse;
}
You could also create new collections at a global level or into some other namespace if you'd rather not have them contained in a model, but that's up to you.
To get them from the model later, you'd just do something like:
model.get('Languages');
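Putting the pieces together, here's a minimal sketch of this parse-based approach. The model name AppData is illustrative, and the url simply reuses the path from the question:
// Hypothetical model that turns each top-level key of the JSON into a collection.
var AppData = Backbone.Model.extend({
  url: 'data/sample.json',
  parse: function(response) {
    var myResponse = {};
    _.each(response, function(value, key) {
      myResponse[key] = new Backbone.Collection(value);
    });
    return myResponse;
  }
});

var appData = new AppData();
appData.fetch().done(function() {
  // Each attribute is now a Backbone.Collection.
  console.log(appData.get('Languages').length);
});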
backbone-relational provides a solution within Backbone (without using jQuery.getJSON), which might make sense if you're already using it. There's a short answer at https://stackoverflow.com/a/11095675/70987, which I'd be happy to elaborate on if needed.