I need some fairly general advice on how to update my DB.
I have a huge JSON payload (ugly and doubly nested) coming from the client, and what's worse, I do not know which particular parts have been modified, which means I can't use { $set: ... }.
I designed referenced documents in order to avoid doubly nested arrays.
Therefore I have 5 collections:
locations
floors
stores
restrooms
benches
e.g. a location has floors, and a floor has stores, restrooms, and benches, and so on.
Here is my simplified schema:
const Location = new Schema({
  floors: [{
    type: Schema.Types.ObjectId,
    ref: 'FloorSchema',
  }],
});

const FloorSchema = new Schema({
  stores: [{
    type: Schema.Types.ObjectId,
    ref: 'StoreSchema',
  }],
  restrooms: [{
    type: Schema.Types.ObjectId,
    ref: 'RestroomSchema',
  }],
  benches: [{
    type: Schema.Types.ObjectId,
    ref: 'BenchSchema',
  }],
});
From the client side, no API request is made until the changes are done. So the first request I get from the client is a huge map payload with no history of the changes recorded.
Do you think it is reasonable to drop the entire collections (except the locations collection) and save the whole new map data, instead of investigating what has been modified?
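To frame the question, here is a hedged sketch of the drop-and-re-insert idea using plain in-memory stand-ins for the collections (no MongoDB or Mongoose; all names and the id scheme are illustrative). Rather than diffing the incoming map, it wipes the dependent documents that belong to this one location and rebuilds them from the payload:

```javascript
// In-memory stand-ins for the dependent collections (illustrative only).
let nextId = 1;
const db = { floors: new Map(), stores: new Map() };
const location = { _id: 'loc1', floors: [] };

function replaceMap(location, payload) {
  // 1. Drop only this location's dependent documents, so that other
  //    locations in the same collections are untouched.
  for (const floorId of location.floors) {
    const floor = db.floors.get(floorId);
    if (floor) for (const sId of floor.stores) db.stores.delete(sId);
    db.floors.delete(floorId);
  }
  // 2. Insert the incoming map data and collect the fresh ids.
  location.floors = payload.floors.map(f => {
    const storeIds = f.stores.map(s => {
      const id = nextId++;
      db.stores.set(id, s);
      return id;
    });
    const floorId = nextId++;
    db.floors.set(floorId, { stores: storeIds });
    return floorId;
  });
}

replaceMap(location, { floors: [{ stores: [{ name: 'A' }, { name: 'B' }] }] });
console.log(location.floors.length); // 1
console.log(db.stores.size);         // 2
```

Note that deleting only the documents referenced by this location (rather than dropping whole collections) avoids destroying other locations' data while still skipping the diffing problem.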
Thank you,
======
I found one question & answer that seems very relevant.
I will try the first solution.
Related
I want to create a tree-like storage structure to be used with my app, but can't find enough documentation on how to create a tree model using Waterline attributes.
The case is simple: I need a set of folders that can have multiple levels of subfolders and, at the bottom, files. What you usually do in MySQL for this kind of data is add a parent_id field to your model as a foreign key to the model itself.
How can this be done using attributes in a Sails.js/Waterline model?
I've tried something like the following, which generates quite a bit of redundant and orphaned data:
--
attributes: {
  name: {
    type: 'string'
  },
  parentFolder: {
    model: 'Folder'
  },
  childFolders: {
    model: 'Folder',
    via: 'parentItem'
  }
}
--
Any ideas?
And by the way, if this is possible using, say, MySQL as a backend, how will it translate to, say, MongoDB?
This seemed to work:
name: {
  type: 'string',
  maxLength: 255,
  required: true
},
parent: {
  model: 'folder'
},
childs: {
  collection: 'folder',
  via: 'parent'
}
I believe the duplicates were being generated by posting data directly via GET in the browser. Posting data from a client via POST seems to work as expected (at least from what I see in MySQL).
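As an aside, once folders are stored in this adjacency-list layout (each row carrying its parent's id), the nested tree is usually reassembled in application code. A hedged plain-JavaScript sketch, with no Waterline involved and a row shape that just mirrors the model above ({id, name, parent}):

```javascript
// Turn flat adjacency-list rows into a nested tree. Every node is created
// first, then attached to its parent, so row order does not matter.
function buildTree(rows) {
  const byId = new Map(rows.map(r => [r.id, { ...r, childs: [] }]));
  const roots = [];
  for (const node of byId.values()) {
    if (node.parent == null) roots.push(node);      // top-level folder
    else byId.get(node.parent).childs.push(node);   // attach to its parent
  }
  return roots;
}

const rows = [
  { id: 1, name: 'root', parent: null },
  { id: 2, name: 'docs', parent: 1 },
  { id: 3, name: 'img',  parent: 1 },
  { id: 4, name: 'old',  parent: 2 },
];
const tree = buildTree(rows);
console.log(tree[0].childs.length); // 2
```

Because the layout is just rows with a parent id, the same reassembly works whether the rows come from MySQL or MongoDB.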
Is there an inverse of Mongoose.js validation that can inflate the subdocument when the parent is retrieved? I may have been looking at the docs so long I'm not recognizing an existing feature for what it is.
A beauty of MongoDB is that query specifications (e.g. {likes: {$gt: 10, $lte: 14}}) are themselves JavaScript objects, and until recently I had been storing them in a MongoDB instance as subdocuments.
However, after upgrading from MongoDB 2.4 to 2.6, these are no longer valid to store as such, and I am now getting the error: The dollar ($) prefixed field '$or' ... is not valid for storage.
I am thus in the situation described in this Google Groups discussion. The author there suggests flattening the document to a string. The same situation can also occur if the subdocuments have legitimate JavaScript attributes that contain embedded dots (e.g. {"802.11g": ...}).
That's easy enough to do by specifying JSON.parse and JSON.stringify as the getter/setter in Mongoose.js:
var ProjectSchema = new Schema({
  name: { type: String, required: false, default: "New project" },
  spec: { type: mongoose.Schema.Types.Mixed, set: JSON.stringify, get: JSON.parse },
});
But the getter only gets called if I explicitly ask for the attribute's value. The attribute is still a string underneath and gets passed along as such:
Project.findById(req.params.projectId, function(err, project) {
  console.log("......" + (typeof project.spec)); // project.spec is an object!
  res.send(project); // project.spec is a String!
});
Obviously I can call model.spec = JSON.parse(model.spec) within each Model.find(...) callback and for each flattened attribute, but it would be nice to do it in one central location.
https://groups.google.com/forum/?fromgroups=#!topic/mongoose-orm/8AV6aoJzdiQ
You can invoke your getter in res.send by adding the {toJSON: {getters: true}} option to the ProjectSchema definition. You'll probably want to enable that for the toObject option as well for cases like passing the doc to console.log.
var ProjectSchema = new Schema({
  name: { type: String, required: false, default: "New project" },
  spec: { type: mongoose.Schema.Types.Mixed, set: JSON.stringify, get: JSON.parse },
}, {
  toJSON: { getters: true },
  toObject: { getters: true }
});
Docs here.
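To see why that option fixes res.send, here is a hedged plain-JavaScript sketch (no Mongoose) of the mechanism: the stored value is a JSON string, the property getter inflates it on access, and a toJSON method stands in for the {toJSON: {getters: true}} behaviour, since JSON.stringify (which res.send uses) calls toJSON when it is present:

```javascript
// makeDoc is an illustrative stand-in for a Mongoose document with a
// JSON.stringify setter and JSON.parse getter on `spec`.
function makeDoc(spec) {
  const stored = JSON.stringify(spec); // "setter": flatten to a string
  return {
    get spec() { return JSON.parse(stored); }, // "getter": inflate on access
    toJSON() { return { spec: this.spec }; },  // like {toJSON: {getters: true}}
  };
}

const doc = makeDoc({ $or: [{ a: 1 }] });
console.log(typeof doc.spec);     // 'object' — the getter ran
console.log(JSON.stringify(doc)); // the getter also ran during serialization
```

Without the toJSON method, JSON.stringify would serialize the raw stored string, which is exactly the behaviour the question describes.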
We have an internal API that was specifically built to be used with a new piece of software I'm building that runs on Backbone. The API has a single URL and takes JSON as input to determine what it needs to return. It essentially allows me to build custom queries with JSON that return exactly what I'm looking for.
The thing is, this JSON can get pretty verbose and is often 3–4 levels deep, but sometimes it may just be a few lines and 1 level deep.
First question first: how do I send a string of JSON along with the ID when I do a fetch()? Do I have to set these parameters as the model's or collection's defaults?
Here is an example of a really simple string to get a specific user's info:
{
  "which": "object",
  "object": {
    "type": "customer",
    "place": "store",
    "customerID": "14"
  }
}
As others have suggested, it will likely be challenging to work with SOAP, but it shouldn't be impossible. Backbone models and collections communicate with the server through the sync operation, and you should be able to customize that. I think something along these lines might get the ball rolling (for models):
Backbone.SoapyModel = Backbone.Model.extend({
  sync: function(method, model, options) {
    // force POST for all SOAP calls
    method = 'create';
    options = _.extend(options, {
      // Setting the data property will send the model's state
      // to the server. Add whatever complexity is needed here:
      data: JSON.stringify({
        "which": "object",
        "object": model.toJSON()
      }),
      // Set the request's content type
      contentType: 'application/json'
    });
    // Defer the rest to Backbone
    return Backbone.sync.apply(this, [method, model, options]);
  }
});

var SoapyModelImpl = Backbone.SoapyModel.extend({
  url: '/test'
});

var soapTest = new SoapyModelImpl({
  id: 42,
  name: 'bob',
  address: '12345 W Street Dr',
  phone: '867 5304'
});

soapTest.fetch();
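For what it's worth, the request body that sync override would produce for the example model can be sketched without Backbone at all; modelState here is just the plain-object equivalent of model.toJSON():

```javascript
// The model's state wrapped in the API's JSON envelope, exactly as the
// overridden sync's `data` option builds it.
const modelState = { id: 42, name: 'bob', address: '12345 W Street Dr', phone: '867 5304' };
const body = JSON.stringify({ which: 'object', object: modelState });
console.log(body);
```

The server then sees one POST whose body carries both the "which" selector and the full model state, which matches the single-URL, JSON-driven API described in the question.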
I have a table called employee with two fields, empid and empname. I want to get data from that table through a web service into a Sencha Touch list view. My web service returns JSON (I converted the output to JSON). My JavaScript code is below:
Ext.data.JsonStore({
  proxy: new Ext.data.HttpProxy({
    url: 'http://localhost:58984/Service1.asmx/GetListData', // webservicename.asmx/webMethodName
    headers: {
      'content-type': 'application/json'
    }
  }),
  root: 'd',
  idProperty: 'empid', // provide the unique id of the row
  fields: [empname] // specify the array of fields
});

itemTpl: '{empname}'
But I am getting 2 errors:
1. empname is not defined
2. The following classes are not declared even if their files have been loaded: 'Acsellerate.view.Main'. Please check the source code of their corresponding files for possible typos: 'app/view/Main.js'
It would be good if you pointed out exactly where the errors are occurring, but for the first one the issue is probably that fields: [empname] should be fields: ['empname'].
I'm not sure if the two errors are related, but if fixing the first one doesn't solve your problem, you really need to show us your Main.js.
Hope this helps
What's the basic difference between JsonStore and JsonReader in the context of Ext.data?
I mean, when should I go for JsonStore and when should I use JsonReader, as to me both seem to provide the same solution.
They are actually two separate things. An Ext.data.JsonReader reads a given JSON object and returns data records (Ext.data.Record objects) that are then stored by the respective data store.
Ext.data.Store is the base class for all Ext stores and uses helper objects for retrieving data (Ext.data.DataProxy), writing data (Ext.data.DataWriter), and reading data (Ext.data.DataReader). These base classes come in different flavors:
Ext.data.DataProxy:
  Ext.data.DirectProxy
  Ext.data.HttpProxy
  Ext.data.MemoryProxy
  Ext.data.ScriptTagProxy
Ext.data.DataWriter:
  Ext.data.JsonWriter
  Ext.data.XmlWriter
Ext.data.DataReader:
  Ext.data.JsonReader
  Ext.data.XmlReader
This all builds up to a very extensible component that lets the developer configure exactly what he needs to tweak. To make it easier for developers (especially new ones), Ext comes with some preconfigured data stores:
Ext.data.ArrayStore to make reading from simple Javascript arrays easier
Ext.data.DirectStore, just a store preconfigured with an Ext.data.DirectProxy and an Ext.data.JsonReader
Ext.data.JsonStore, just a store preconfigured with an Ext.data.JsonReader
Ext.data.XmlStore, just a store preconfigured with an Ext.data.XmlReader
So an Ext.data.JsonStore is really just a convenience class to make things easier for the developer.
The following two snippets create the same (or comparable) stores:
var store = new Ext.data.JsonStore({
  url: 'get-images.php',
  root: 'images',
  idProperty: 'name',
  fields: ['name', 'url', {name: 'size', type: 'float'}, {name: 'lastmod', type: 'date'}]
});

// or

var store = new Ext.data.Store({
  url: 'get-images.php',
  reader: new Ext.data.JsonReader({
    root: 'images',
    idProperty: 'name',
    fields: ['name', 'url', {name: 'size', type: 'float'}, {name: 'lastmod', type: 'date'}]
  })
});
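The "convenience subclass" relationship can be illustrated without Ext at all. A hedged plain-JavaScript sketch (JsonReader, Store, and JsonStore here are toy stand-ins, not the real Ext classes): the subclass simply wires the reader in for you, while the base class accepts any reader.

```javascript
// Toy reader: pulls the record array out of a JSON payload by root key.
class JsonReader {
  constructor(cfg) { this.root = cfg.root; }
  read(json) { return json[this.root]; }
}

// Toy base store: delegates all reading to whatever reader it was given.
class Store {
  constructor(cfg) { this.reader = cfg.reader; }
  load(json) { this.records = this.reader.read(json); }
}

// Toy "JsonStore": a Store preconfigured with a JsonReader.
class JsonStore extends Store {
  constructor(cfg) { super({ reader: new JsonReader({ root: cfg.root }) }); }
}

const store = new JsonStore({ root: 'images' });
store.load({ images: [{ name: 'a.png' }] });
console.log(store.records.length); // 1
```

This mirrors the two equivalent Ext snippets above: constructing the subclass with a root is the same as constructing the base class with an explicit reader.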
A JsonReader reads JSON from a data source into an Ext store. JsonData is not a specifically defined Ext object, although maybe you've seen it used as a variable name? In what context are you encountering it?