Create a schemaless collection in MongoDB via Mongoose - json

I have the following Mongoose model in records.js:
var mongoose = require('mongoose');
module.exports = mongoose.model('lM', {
    any: mongoose.Schema.Types.Mixed,
}, 'mlr');
And in my code I am doing this:
var lm = require('../server/models/records');
new lm().save(lmr);
where lmr is a JSON object.
This produces a MongoDB database with the name I provided, but the documents inside that collection contain only:
_id: ObjectId
__v: 0
The JSON object is nowhere to be seen. How can I get the JSON object inside the any wrapper in the schema?

var lm = require('../server/models/records');
new lm({ 'any': lmr }).save();
You can optionally pass a callback to save() if you want to catch any error:
new lm({ 'any': lmr }).save(function (err) {
    if (err) {
        console.log(err);
    } else {
        console.log('saved');
    }
});
To create the schemaless collection you will have to set strict: false, which is true by default. The strict option ensures that values passed to your model constructor that were not specified in the schema do not get saved to the db.
strict option docs
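For reference, a minimal sketch of the model with strict disabled (the model and collection names mirror the question; the empty schema is one plausible way to write it):

```javascript
var mongoose = require('mongoose');

// strict: false lets fields that are not declared in the schema
// be saved to the collection, making it effectively schemaless.
var recordSchema = new mongoose.Schema({}, { strict: false });

// The third argument pins the collection name to 'mlr'.
module.exports = mongoose.model('lM', recordSchema, 'mlr');
```

With this in place, new lm(lmr).save() persists every property of lmr directly, without an any wrapper.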

Related

How to pull down a schema-less JSON document from CosmosDB in Xamarin App

I am creating an app in Xamarin and am having issues querying a general JSON document from my CosmosDB. I am able to query my DB with a known structure (very similar to what we see in the Xamarin ToDo Example):
public async static Task<List<ToDoItem>> GetToDoItems()
{
    var todos = new List<ToDoItem>();
    if (!await Initialize())
        return todos;
    var todoQuery = docClient.CreateDocumentQuery<ToDoItem>(
        UriFactory.CreateDocumentCollectionUri(databaseName, collectionName),
        new FeedOptions { MaxItemCount = -1, EnableCrossPartitionQuery = true })
        .Where(todo => todo.Completed == false)
        .AsDocumentQuery();
    while (todoQuery.HasMoreResults)
    {
        var queryResults = await todoQuery.ExecuteNextAsync<ToDoItem>();
        todos.AddRange(queryResults);
    }
    return todos;
}
The problem I see with this "code fixed scheme" approach is that if the scheme of your JSON file changes throughout development, older versions of code will overwrite the newer scheme in CosmosDB since writes to the DB are on a document level and not a property level. Instead, it would be helpful for older versions of the code to be able to pull down the latest scheme and work with the properties that it knows about without having to force the user to update.
Does anyone know how to query a schema-less JSON document out of CosmosDB? Thanks!
Use JObject as the type parameter to query a schema-less JSON document out of CosmosDB, and use the following code to pull the raw JSON data into your object:
JObject obj = JsonConvert.DeserializeObject<JObject>(jsonString);

Data from MongoDB arrives later than data from a JSON file on the server in a Next.js app

I have a form that currently uses data from a JSON file.
The data is retrieved like so:
const categoryList = require('../data/categories.json');
I am changing this to retrieve data from an API which fetches data from MongoDB:
http://localhost:3000/api/categories
The above URL gives me the same results as the JSON file.
import { getCategories } from '../../lib/hooks';
...
const PostEditor = () => {
    const [categories] = getCategories(); // this gives me all the categories from the api as an array
    console.log("categories from db");
    console.log(categories);
    const catList = require('../data/categories.json'); // same result as above
    console.log("cat list from json file");
    console.log(catList);
    ...
    ...
    return (
        <>
        ...
        ...
        <Autocomplete
            multiple
            options={catList} // Data from db (categories field) does not reach here.
            limitTags={2}
From the above code, both the console log from the db and the console log from the JSON file show the same values in the browser.
The console log in the command prompt shows the values from the db as undefined. I think the data from the db is not reaching the page in time, while the data from the JSON file does.
How do I solve this problem?
I use Next.js.
Edit:
I moved the data fetching from Api into a parent component (directly under pages) and passed the categories as props to the child component
<PostEditor categories={cats}/>
Still facing the same issue.
Since the MongoDB data is external, it arrives asynchronously, so categories is undefined on the first render. A React function component cannot be declared async, so instead guard against the missing value until the data arrives:
const PostEditor = () => {
    const [categories] = getCategories();
    if (!categories) return null; // or render a loading indicator
    ...
}
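If getCategories is a custom React hook, it cannot simply be awaited in the component body. The usual pattern is to fetch inside useEffect and render a fallback until the data arrives. A minimal sketch, assuming an /api/categories endpoint that returns an array (the hook name and endpoint here are illustrative):

```javascript
import { useEffect, useState } from 'react';

// Hypothetical hook: fetch the categories once on mount, expose them as state.
function useCategories() {
  const [categories, setCategories] = useState(null);
  useEffect(() => {
    let cancelled = false;
    fetch('/api/categories')
      .then((res) => res.json())
      .then((data) => { if (!cancelled) setCategories(data); })
      .catch(console.error);
    return () => { cancelled = true; }; // avoid setting state after unmount
  }, []);
  return categories; // null until the fetch resolves
}
```

In the component, const categories = useCategories(); followed by if (!categories) return null; keeps the Autocomplete from rendering with undefined options.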

Is Mongoose query result read-only?

How can I modify an object returned by a Mongoose query?
Assume we have the following schema:
var S = new mongoose.Schema( { 'name': String, 'field': String } );
I do the following query and modification to the result:
var retrieve = function(name, callback) {
    S.findOne({ name: name }).exec(function (err, obj) {
        if (err) return handleError(err);
        obj['field'] = 'blah';
        callback(obj);
    });
};
The obj.field will not contain blah but the original value returned by the query, as if it were read-only. What is going on?
Note: my environment is Node.js, Express, Mongoose and MongoDB
So this is a little confusing, but Mongoose returns MongooseDocument objects, not plain JSON objects. Add the .lean() call to the query, which makes it return plain JavaScript objects, and from there you can alter the result as you wish.
With thanks to Ze Jibe.
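A sketch of the .lean() version of the query above (same schema S as in the question; handleError is the question's own placeholder):

```javascript
// With .lean(), the callback receives a plain JavaScript object
// instead of a MongooseDocument, so assignments stick.
var retrieve = function (name, callback) {
    S.findOne({ name: name }).lean().exec(function (err, obj) {
        if (err) return handleError(err);
        obj['field'] = 'blah'; // obj is now a plain, mutable object
        callback(obj);
    });
};
```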
The doc object returned from Mongoose is effectively read-only. To get a writeable object from it you must run:
var writeableObject = doc.toObject();
writeableObject['field'] = 'blah';
res.send(writeableObject);
Scroll down to "Transform" in the mongoose documentation to read more: link

NodeJS + Mongo - how to get the contents of a collection?

I'm writing a simple NodeJS app with mongo. For connecting to mongo I use:
var mongo = require('mongodb'),
    Server = mongo.Server,
    Db = mongo.Db,
    ObjectID = mongo.ObjectID;
db.open(function (err, db) { ... });
So, I have database "docs", and I've created a collection called "companies". Now it has 4 objects (records) in it. I want to get the full contents of this collection as an array and show them line-by-line:
//get companies list
app.get('/companies', function (req, res) {
    db.collection("companies", function (err, collection) {
        collection.find({}, function (err, companies) {
            companies.forEach(function (err, company) {
                console.log(company);
            });
        });
    });
});
However, Node returns me such error:
TypeError: Object #<Cursor> has no method 'forEach'
Any ideas? Thanks in advance.
The companies parameter that's passed into your find callback is a Cursor object, not an array. Call toArray on it to convert the query results to an array of objects, or each to process the results one at a time.
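The route can be restructured around toArray. Pulling the cursor handling into its own function (purely for illustration; the function name is made up) makes the shape clear:

```javascript
// Runs the query, converts the cursor to a plain array, and hands the
// companies to the callback Node-style: done(err, companies).
function listCompanies(collection, done) {
    collection.find({}).toArray(function (err, companies) {
        if (err) return done(err);
        companies.forEach(function (company) { // companies is a real array now
            console.log(company);
        });
        done(null, companies);
    });
}
```

Inside the route, call db.collection("companies", ...) as before and pass the collection to listCompanies, then send the resulting array in the response.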
Use .each on companies instead of forEach.

Backbone multiple collections fetch from a single big JSON file

I would like to know if there is a better way to create multiple collections fetched from a single big JSON file. I have a JSON file that looks like this:
{
    "Languages": [...],
    "ProductTypes": [...],
    "Menus": [...],
    "Submenus": [...],
    "SampleOne": [...],
    "SampleTwo": [...],
    "SampleMore": [...]
}
I am using url/fetch to create each collection from each node of the JSON above:
var source = 'data/sample.json';
Languages.url = source;
Languages.fetch();
ProductTypes.url = source;
ProductTypes.fetch();
Menus.url = source;
Menus.fetch();
Submenus.url = source;
Submenus.fetch();
SampleOne.url = source;
SampleOne.fetch();
SampleTwo.url = source;
SampleTwo.fetch();
SampleMore.url = source;
SampleMore.fetch();
Any better solution for this?
Backbone is great when your application fits the mold it provides. But don't be afraid to go around it when it makes sense for your application; it's a very small library. Making repetitive, duplicate GET requests just to fit Backbone's mold is probably prohibitively inefficient. Check out jQuery.getJSON or your favorite basic AJAX library, paired with some basic metaprogramming as follows:
//Put your real collection constructors here. Just examples.
var collections = {
    Languages: Backbone.Collection.extend(),
    ProductTypes: Backbone.Collection.extend(),
    Menus: Backbone.Collection.extend()
};
function fetch() {
    $.getJSON("/url/to/your/big.json", function (response) {
        for (var name in collections) {
            //Grab the list of raw json objects by name out of the response,
            //pass it to your collection's constructor,
            //and store a reference to your now-populated collection instance
            //in your collection lookup object
            collections[name] = new collections[name](response[name]);
        }
    });
}
fetch();
Once you've called fetch() and the async callback has completed, you can do things like collections.Menus.at(0) to get at the loaded model instances.
Your current approach, in addition to being pretty long, risks retrieving the large file multiple times (browser caching won't always work here, especially if the first request hasn't completed by the time you make the next one).
I think the easiest option here is to go with straight jQuery, rather than Backbone, then use .reset() on your collections:
$.get('data/sample.json', function(data) {
    Languages.reset(data['Languages']);
    ProductTypes.reset(data['ProductTypes']);
    // etc
});
If you wanted to cut down on the redundant code, you can put your collections into a namespace like app and then do something like this (though it might be a bit too clever to be legible):
app.Languages = new LanguageCollection();
// etc
$.get('data/sample.json', function(data) {
    _(['Languages', 'ProductTypes', ... ]).each(function(collection) {
        app[collection].reset(data[collection]);
    });
});
I think you can solve your need and still stay within the Backbone paradigm. An elegant solution, in my view, is to create a Model that fetches the big JSON and uses it to populate all the collections in its change event:
var App = Backbone.Model.extend({
    url: "http://myserver.com/data/sample.json",
    initialize: function( opts ){
        this.languages = new Languages();
        this.productTypes = new ProductTypes();
        // ...
        this.on( "change", this.fetchCollections, this );
    },
    fetchCollections: function(){
        this.languages.reset( this.get( "Languages" ) );
        this.productTypes.reset( this.get( "ProductTypes" ) );
        // ...
    }
});
var myApp = new App();
myApp.fetch();
You have access to all your collections through:
myApp.languages
myApp.productTypes
...
You can easily do this with a parse method. Set up a model and create an attribute for each collection; there's nothing saying a model attribute has to be a single piece of data and can't be a collection.
When you run fetch, it will hand the entire response to a parse method, which you can override by defining a parse function in your model. Something like:
parse: function(response) {
    var myResponse = {};
    _.each(response.data, function(value, key) {
        myResponse[key] = new Backbone.Collection(value);
    });
    return myResponse;
}
You could also create new collections at a global level or into some other namespace if you'd rather not have them contained in a model, but that's up to you.
To get them from the model later you'd just have to do something like:
model.get('Languages');
backbone-relational provides a solution within backbone (without using jQuery.getJSON) which might make sense if you're already using it. Short answer at https://stackoverflow.com/a/11095675/70987 which I'd be happy to elaborate on if needed.