When should I use MongoDB instead of a relational database (MySQL)?

I've followed several courses on Udemy about Node.js/Express and React.js/Redux, and some of them are MERN stack courses; I've really enjoyed them. But the one thing the instructors never really explain is the choice of MongoDB. Before taking these courses I always used either PostgreSQL or MariaDB for the persistence layer of all my projects. Now I've just used MongoDB in all these courses without knowing in which kinds of situations it is the best choice and in which it isn't. All the instructors say is that, since data is stored as JSON documents and arrays, it's easy to parse and naturally compatible with JavaScript, like any JSON format. I would really like to understand the technical reasons for using Mongo instead of a relational database (or not) based on a given project's requirements, rather than just using it because the instructors do.

I have worked in both SQL and MongoDB.
MongoDB
In my experience, MongoDB is the better fit when you follow agile methodologies and don't yet know exactly what your JSON objects will have to contain.
SQL
SQL is the better fit where the shape of the JSON object is not going to change, for example in banking projects, where you know the exact format of the data up front and it is never going to change.
DIFF
The reason I say this is that in SQL, whenever you need to add some extra fields or link fields together with primary and foreign keys, it can be painful and it is hard to know what might break.
In MongoDB, if you need to reference some other document you just add a key with a ref to that field and you are done.
MongoDB
MongoDB works well when a key from one JSON object needs to be referenced from multiple other JSON objects: you just add that field name in a ref (e.g. ref: 'user').
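As an illustration of the ref approach described above, here is a minimal Mongoose sketch; the User/Post model names and the author field are made up for the example:

```javascript
const mongoose = require('mongoose');

// Hypothetical models: a Post stores only the ObjectId of its author.
const userSchema = new mongoose.Schema({ name: String });

const postSchema = new mongoose.Schema({
  title: String,
  // `ref` tells Mongoose which model to use when the reference is resolved.
  author: { type: mongoose.Schema.Types.ObjectId, ref: 'User' }
});

const User = mongoose.model('User', userSchema);
const Post = mongoose.model('Post', postSchema);

// Later, resolve the reference in one call instead of writing a join:
// const post = await Post.findOne({ title: 'Hello' }).populate('author');
```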

From my personal experience, I would use SQL when the project is something like a social media app that has to maintain posts, comments, likes, blocking and unblocking. I hope you get the idea.
MongoDB can also be used for that kind of project, but its aggregations become painful and unreadable (see the sketch below).
MongoDB is good for most other kinds of projects, just not social-media-like ones. Once again, this is from my experience.
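To make the point about painful aggregations concrete, here is roughly what "posts with their comments" looks like in both systems; the collection, table and field names are hypothetical:

```javascript
// Hypothetical social-media example: fetch each post together with its comments.
// The SQL equivalent is a single join:
//   SELECT p.*, c.* FROM posts p LEFT JOIN comments c ON c.post_id = p.id;
// In MongoDB the same result needs an aggregation pipeline with $lookup
// (run inside an async function with `db` already connected):
const postsWithComments = await db.collection('posts').aggregate([
  {
    $lookup: {
      from: 'comments',        // collection to join against
      localField: '_id',       // field on the posts side
      foreignField: 'postId',  // field on the comments side
      as: 'comments'           // name of the resulting array field
    }
  }
]).toArray();
```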

My take is that SQL is preferred when:
Relational integrity (data integrity) is essential
Data is updated frequently
Clear and/or strong relations exist between entities
MongoDB is better for:
Horizontal scaling / sharding
Replicating SQL data to avoid costly queries in production (some alternatives may be better)
I would say that when unsure use SQL. It's much easier to migrate from SQL to NoSQL than the other way around when you change your mind.

When you have a large volume of schema-less data.
When your data and users are going to grow indefinitely and you need a scalable cloud database solution.
When you need a performance boost in certain aspects of your CRUD operations.
When your software stack is JS based.
You can check the other reasons in this YouTube video: https://youtu.be/K8xsuFgCRkU

Related

How to migrate data from MongoDB to MySQL?

I am currently working on an analytics-like application. It has an AngularJS app that communicates with a Spring REST client app, from which the user creates a token (trackingID); the user then puts a generated script with this ID on his website to collect information about visitors' actions through another Spring REST tracking app. For the tracking app I am using MongoDB to collect visitor actions and visitor info, because inserts are fast; the REST client app uses MySQL for user/account details.
My question is: how do I migrate the Mongo data from the tracking app to MySQL, perhaps to gain the possibility of joins as the easiest and fastest way to analyze the data with any kind of filter from the AngularJS client app? Should I write workers myself that periodically transfer the data from the last checkpoint to the present state from Mongo to MySQL, or are there existing tools that can be set up for this transfer?
There is no official library to do this.
But you can use MongoDB's mongoexport feature to export the data in CSV format and mysqlimport to import it into MySQL.
Here are links to the documentation: MySQL import and MongoDB export.
One more option is to write a program in your favorite language that reads from MongoDB and writes into MySQL (a sketch follows).
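If you go the "small program" route, a minimal Node.js sketch could look like the following. It assumes a tracking database with a visits collection and a matching MySQL visits table (all names are hypothetical) and uses the mongodb and mysql2 packages:

```javascript
// Sketch: copy documents from a MongoDB collection into a MySQL table.
// Assumes the target table already exists, e.g.:
//   CREATE TABLE visits (tracking_id VARCHAR(64), action VARCHAR(64), created_at DATETIME);
const { MongoClient } = require('mongodb');
const mysql = require('mysql2/promise');

async function migrate() {
  const mongo = await MongoClient.connect('mongodb://localhost:27017');
  const sql = await mysql.createConnection({ host: 'localhost', user: 'root', database: 'analytics' });

  // Stream every tracking document and insert it row by row.
  const cursor = mongo.db('tracking').collection('visits').find();
  for await (const doc of cursor) {
    await sql.execute(
      'INSERT INTO visits (tracking_id, action, created_at) VALUES (?, ?, ?)',
      [doc.trackingId, doc.action, doc.createdAt]
    );
  }

  await mongo.close();
  await sql.end();
}

migrate().catch(console.error);
```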
MySQL 5.7 has a new JSON data type that can be very convenient.
You can create a table in MySQL to receive the JSON messages as is, and then either use SQL to query it or do post-processing to load the data into a structured set of database tables.
Check this out: https://dev.mysql.com/doc/refman/5.7/en/json.html
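For instance, with the mysql2 driver from Node.js, the "receive the JSON messages as is" idea could look like the sketch below; the table and column names are made up, and the ->> operator requires MySQL 5.7.13 or later:

```javascript
// Sketch: land raw MongoDB documents in a MySQL 5.7 JSON column, then query them with SQL.
const mysql = require('mysql2/promise');

async function demo() {
  const conn = await mysql.createConnection({ host: 'localhost', user: 'root', database: 'analytics' });

  await conn.query('CREATE TABLE IF NOT EXISTS raw_events (id INT AUTO_INCREMENT PRIMARY KEY, doc JSON)');

  // Store a document exactly as it came out of MongoDB.
  await conn.execute('INSERT INTO raw_events (doc) VALUES (?)',
    [JSON.stringify({ trackingId: 'abc123', action: 'click', page: '/home' })]);

  // Query inside the JSON with the ->> operator (unquoted JSON_EXTRACT).
  const [rows] = await conn.execute(
    "SELECT doc->>'$.page' AS page, COUNT(*) AS hits FROM raw_events GROUP BY page"
  );
  console.log(rows);

  await conn.end();
}

demo().catch(console.error);
```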
I realise this question is a few years old - but recently I've had a number of people enquiring whether a tool I developed (https://virtual.blue/apps/json-converter) can do exactly what the OP is asking (convert MongoDB to SQL) so I am guessing it is still something people want. Keep reading to find out why I am honestly not surprised by this.
The short answer to whether the tool can help you is: perhaps. If your existing data relationships are not too complicated, and your database is not enormous, it may well be worth a try.
However, I thought it might help to try and explain what the issues are with this kind of conversion, since all the answers I have seen so far are along the lines of "try tool X" or "first convert to format Y and then you can slurp it into MySQL using utility Z", i.e. with no thought to whether what you get at the end is going to make sense in terms of data relationships and integrity.
For example, you could just stick your entire database dump in a single field of a single SQL table (OK, size limitations might prevent this in reality, but hopefully you get my point). Then your database would be "in MySQL format", but it would be absolutely no use to anyone.
The point is, what you actually want is a fully defined database model, correctly encapsulating all of the intrinsic data relationships. ("Database normalization" as it is known.) If your conversion process gets those relationships wrong, then you have a broken model, and any queries you try to run over it are likely to return nonsense. Unfortunately there is no magic tool that is just going to "know" the best way to represent your data in MySQL, and closing your eyes and shovelling it into a bunch of random tools is unlikely to miraculously get you what you want.
And herein lies the fundamental problem with the "NoSQL" philosophy (fad). They sold people the bogus notion of "non-relational data". My first thought when I heard this was, "How does that work? Surely all data is relational?" By the looks of things we are steadily getting more and more evidence that my instincts were right. ("NoSQL? Why stop there? I go with 'NoDatabase'. It returns no results at all, but it sure is fast!")
The NoSQL madness throws several important fundamental engineering principles to the wind. We shouted "don't hard code!", "DRY!" (Don't Repeat Yourself) because these actions infuse inflexibility into systems. Traditional wisdom makes precisely the same flexibility argument when it advises "create a fully described model with all the data relationships represented". Then you can execute any arbitrary query over it and expect meaningful results. "Yes but there are a whole bunch of queries we are never going to need to run," says the NoSQL proponent. But surely we learnt our lesson on things we are "never going to need to do"? ("I hard code liberally, because I know I am never going to want to change my code." Hmm...)
The arguments about speed are largely moot. Say it turns out you are frequently doing a complex 9 table join, with unsurprisingly sluggish performance. So create an index. Cache it. Swap some disk space for speed. The NoSQL philosophy is to swap data integrity for speed, which makes no sense at all.
When you generate your fast lookup index (cache/table/map/whatever) what you are really doing is creating a view over your model. If your model changes, you can readily update your view. Going from a model to a view is easy - it's a one to many operation and you are on the right side of entropy.
However, when you went with MongoDB you effectively decided to create views without bothering to describe your fundamental model. Now you discover there are queries you want to run, but can't - and so it's no wonder you want to move over to SQL and actually have your data modelled correctly. The problem is you now want to go from a view to a model. Now you're on the wrong side of entropy. Your view is a lossy representation of the model's fundamental relationships. You can't expect a tool to "translate" your database, because you are asking it to insert new relationships which were not originally defined. These are real world relationships that are not machine-guessable. The tool cannot know what relationships were intended.
In short the only way you can do this reliably is to get your hands dirty. An intelligent human, with complete understanding of the system you are modelling needs to sit down and carefully come up with (possibly a substantial amount of) code which effectively picks through the data and resolves all of the insufficiently represented data relationships. If your data is complex then it's going to be a headache and there is no way to cheat.
If your data is still relatively simple then I would suggest making the conversion as soon as possible, before it becomes difficult. In this case my tool (https://virtual.blue/apps/json-converter) may be able to help.
(They really should have asked a Physicist before they came up with all this nonsense...!)
You can download a trial version of Studio 3T for MongoDB and export your database to SQL (or JSON) directly.

Using MySQL and MongoDB together

I have never used NoSQL before; generally the applications I write require relations. However, I have encountered something I don't know how to approach. So far I am only designing the database, and for now my main logic is in the MySQL database. I have static content that I will host through a CDN. However, I also have dynamic content that will be updated very rarely but read on almost every request, like phone numbers, email addresses, postal addresses and additional info. This data will not be used for searching, but it is unstructured: a user can have multiple email addresses, phone numbers and addresses, and they would be needed for multiple tables. So a relational database fails my needs here (I don't want to create an Entity-Attribute-Value table for this), and since I know it doesn't affect the logic and is only used as "metadata", I want to keep it in a JSON format. After Googling for some time, I found out that MongoDB stores "documents" as JSON, which sounded like the perfect solution. However, I have one question regarding this: how do I connect these databases together? Do I just add a user_id or organization_id "column"/field to a document on create/update and do a "select" query (whatever the equivalent is in MongoDB) to retrieve the metadata? Or is there a different way?
I'll present my opinion here. What you're trying to do is called "polyglot persistence". If you introduce Mongo, you'll have two architectures for storing your data, different in strengths, APIs, design and so on, and that has its price.
MongoDB is a great product; I've used it myself with great success, but you have to understand that it doesn't provide all the features you would expect from an RDBMS like MySQL. For example, it lacks multi-document transactions.
Moreover, if you store data in both MySQL and Mongo you'll have to take care of data integrity yourself (what happens if, as part of a logical transaction, the MySQL transaction succeeds but Mongo fails to store the data?); there is no rollback across the two...
I believe you get my point.
Yes, Mongo really does let you query by various JSON fields; in fact it has a whole query language that resembles SQL to some extent, but it's not a relational query engine, because Mongo is not a relational database, so you don't have JOINs, for example. But you said yourself that you are not going to search by these fields, so I don't quite understand what benefit you would get from using Mongo. Maybe this is only about terminology, but that statement confuses me a little.
Where Mongo really shines is when you have a lot of data (it's a big-data product after all); then you get features like replica sets and sharding. But the question is whether you really need that: do you really have "big data", a genuinely huge number of objects to store?
As an alternative, you could use a text column to store the JSON as is. Some databases even have a native JSON type (MySQL has had one since 5.7), in which case you can also perform some operations on these JSON values (append, partial update and so on).
Of course the choice is yours; all I'm saying is that you should think about whether using two persistence engines gives you more benefits or just makes your project more complicated.
Hope this helps
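To address the practical part of the question (how the two stores get "connected"): there is no foreign key between them; you simply store the MySQL id in the Mongo document and query on it yourself. A minimal sketch, assuming a contacts collection and a numeric userId field (both made up):

```javascript
const { MongoClient } = require('mongodb');

// Sketch: keep a user's unstructured contact details in MongoDB,
// keyed by the id of the corresponding MySQL row.
async function getContactInfo(mysqlUserId) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const contacts = client.db('app').collection('contacts');

  // On create/update, write the MySQL id into the document, e.g.:
  // await contacts.updateOne({ userId: mysqlUserId },
  //   { $set: { emails: ['a@example.com'], phones: ['+1 555 0100'] } },
  //   { upsert: true });

  // The "select" equivalent is a plain find on that field:
  const doc = await contacts.findOne({ userId: mysqlUserId });
  await client.close();
  return doc;
}
```

Note that your application code has to keep the MySQL row and the Mongo document in sync itself, which is exactly the integrity concern raised above.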

Which database suits my application, MySQL or MongoDB? (using Node.js, Backbone, Now.js)

I want to make an application like docs.google.com (without its API, completely on my own server) using
frontend: Backbone
backend: Node
Which database do you think is better, MySQL or MongoDB? It should support good scalability.
I am familiar with MySQL from PHP, and I will be happy if the answer is MySQL.
But many of the tutorials I've seen used MongoDB. Why did they use MongoDB rather than MySQL?
What should I use?
Can anyone give me a link to a sample application (with source) built using Backbone, Node and MySQL (or Mongo), or at least an app with Node and MySQL?
Thanks
With MongoDB, you can just store JSON objects and retrieve them fully-formed, so you don't really need an ORM layer and you spend less CPU time translating your data back-and-forth. The developers behind MongoDB have also made horizontally scaling the database a higher priority and let you run arbitrary Javascript code to pre-process data on the DB side (allowing map-reduce style filtering of data).
But you lose some things for these gains: you can't join records. Actually, the JSON structure you store could only be produced via joins in SQL, but in MongoDB you only have that one structure for your data, while in SQL you can query differently and get your data represented in alternate ways much more easily; so if you need to do a lot of analytics on your database, MongoDB will make that harder.
The query language in MongoDB is "rougher", in my opinion, than SQL's, partly because it's less familiar, and partly because the querying features feel haphazardly put together: partly to stay valid JSON, and partly because there are literally a couple of ways of doing the same thing, some of them older ways that aren't as useful or as regularly formatted as the others. And there's the added complexity of the array and sub-object types over SQL's simple row-based design, so the syntax has to be able to handle querying for arrays that contain some of the values you defined, contain all of the values you defined, contain only the values you defined, and contain none of the values you defined. The same distinctions apply to object keys and their values, and this makes the query syntax harder to grasp. (And while I can see the need for edge cases, the $where query parameter, which takes a JavaScript function that is run on every record and returns a boolean, is a siren song: it lets you easily define which objects you want returned, but it has to run on every record in the database and no indexes can be used.)
So, it depends on what you want to do, but since you say it's for a Google Docs clone, you probably don't care about any representation but the document representation, itself, and you're probably only going to query based on document ID, document name, or the owner's ID/name, nothing too complex in the querying.
Then, I'd say being able to take the JSON representation of the document your user is editing, and just throw it into the database and have it automatically index these important fields, is worth the price of learning a new database.
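As a hypothetical illustration of "throw the JSON in and index the important fields", using the Node.js driver (collection, field names and index choice are all assumptions, not taken from the question):

```javascript
const { MongoClient } = require('mongodb');

async function demo() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const docs = client.db('docsApp').collection('documents');

  // Index only the fields you actually query on (owner and name).
  await docs.createIndex({ ownerId: 1, name: 1 });

  // The editor's JSON state is stored as is, without mapping it to tables.
  await docs.insertOne({
    ownerId: 42,
    name: 'Quarterly report',
    content: { blocks: [{ type: 'paragraph', text: 'Hello' }] }
  });

  const myDocs = await docs.find({ ownerId: 42 }).toArray();
  console.log(myDocs);
  await client.close();
}

demo().catch(console.error);
```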
I was also struggling with this choice, given the hype around using MongoDB for tasks it was not built for. So my 2 cents are:
Storing and retrieving hierarchical objects, which your documents probably are, is easier in MongoDB, as David says. It becomes more complicated if you want to store documents bigger than 16 MB, though; MongoDB's answer to that is GridFS.
Organising documents in folders, groups, keeping track of which user owns which documents and who he/she provided access to them is definitely easier with MySQL - you have the advantage of powerful SQL queries with joins etc., built in EXPLAIN optimization, triggers, functions, stored procedures, etc. MongoDB is nowhere near.
So what prevents you from using both MySQL to organize the documents and MongoDB to store one collection of documents identified by id (or several collections - one for each document type)? It seems to me the best choice and using two databases in one application is not a problem, really.
MySQL will store users, groups, folders, permissions - whatever you fancy - and for each document it will store a reference to the collection and the document id (MongoDB has a special format for it - DBRefs). MongoDB will store documents themselves in collections, if they are all less than 16MB, or the previews and metadata of documents in collections and the whole documents in GridFS.
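If individual documents can exceed the 16 MB BSON limit, the GridFS route mentioned above looks roughly like this with the Node.js driver; the bucket name and file path are made up for the sketch:

```javascript
const fs = require('fs');
const { MongoClient, GridFSBucket } = require('mongodb');

// Sketch: stream an oversized document body into GridFS; MongoDB splits it
// into chunks stored in docFiles.files / docFiles.chunks.
async function upload(path) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const bucket = new GridFSBucket(client.db('docsApp'), { bucketName: 'docFiles' });

  await new Promise((resolve, reject) => {
    fs.createReadStream(path)
      .pipe(bucket.openUploadStream(path))
      .on('finish', resolve)
      .on('error', reject);
  });

  await client.close();
}

upload('./big-document.json').catch(console.error);
```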
David provided a good answer. A few things to add to it.
MongoDB's flexible nature permits for easy agile / iterative development.
MongoDB's Node.js driver, like Node.js itself, is asynchronous in nature and works very well within asynchronous environments.
Mongoose is a good ODM (object-document mapper) that makes working with MongoDB from Node.js feel very natural. Unlike most ORMs, it is a very thin layer.
For Google Docs-like functionality, the flexibility and very rich data structures provided by MongoDB feel like a much better fit.
You can find some good example posts by searching for mongoose, node and MongoDB.
Here's one that also uses backbone.js and looks good http://mattkopala.com/blog/2012/02/12/getting-started-with-nodejs/

Combining MySQL and MongoDB

I am building a web application that requires to be scalable. In a nutshell:
We have users; users have friends, so they have a friend list. Users can create messages, and messages from your friends are displayed on the homepage. Each message is linked to a location, and messages can be filtered by date: for example, I want to display all the messages from my friends that were posted yesterday, or all messages from location X.
I am now building the application fully on MongoDB; however, I am running into trouble. For example:
On the main page we have the message list of the user's friends; no problem, we use:
$db->messages->find(array('users._id' => array('$in' => $userFriendListGoesHere)));
That gives us our messages. However, each message has a location, so I have to loop through all the messages and get the location from another collection; also, multiple users can be bound to a single message, so we have to get all the user data from another collection as well. In MySQL this is simply a join query; in MongoDB it is two loops. So this is my first question: is this a problem? Does the looping require a lot of resources?
So my idea is to split things between MySQL and MongoDB: use MongoDB to store all the locations (since there are over 350,000 of them and I use lat/long calculations) and MySQL for the messages, users and the users' friends. Second question: can you help me with my decision? Should I keep using MongoDB with the loops, or use a combination?
Thanks for reading and your time.
.. in MySQL this is simply a join query; in MongoDB it is two loops, and this is my first question: is this a problem?
This is par for the course with MongoDB; in fact, it's a core MongoDB trade-off.
MongoDB is based on the precept that joins do not scale. So it has no joins and leaves you to "roll your own". Some libraries like Morphia (for Java) provide built-in logic for loading references.
PHP has the Doctrine project, which should help with some of this.
Does the looping require a lot of resources?
Kind of. This will really depend on the implementation.
It's obviously going to involve a bunch of back and forth with the DB, but it may be less network traffic than the SQL version. You will need memory space for all of the data coming back. But again, that's not terribly different from SQL.
Really, it's up to you to make all of the trade-offs about how this is implemented and who is keeping what in memory.
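For what it's worth, the "roll your own join" usually collapses into one extra round trip per referenced collection rather than one query per message. A hedged sketch with hypothetical field and collection names:

```javascript
// Sketch: resolve message -> location references with one extra $in query
// instead of one query per message.
async function loadMessagesWithLocations(db, friendIds) {
  const messages = await db.collection('messages')
    .find({ 'users._id': { $in: friendIds } }).toArray();

  // Collect the distinct location ids, then fetch them in a single batch.
  const locationIds = [...new Set(messages.map(m => m.locationId).filter(Boolean))];
  const locations = await db.collection('locations')
    .find({ _id: { $in: locationIds } }).toArray();

  // Attach each location to its message in memory ("the loop").
  const byId = new Map(locations.map(l => [String(l._id), l]));
  return messages.map(m => ({ ...m, location: byId.get(String(m.locationId)) }));
}
```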
Should I keep using MongoDB with the loops?
MongoDB is a great idea when your data is not inherently relational.
In the example you provided, it kinda seems like your data is relational. MySQL and other relational DBs (such as Postgres) are better data stores than MongoDB for relational data. This blog post covers this topic in more detail.
In summary, I'd recommend the following:
Please spend some time analyzing whether your data is inherently relational or not.
If it is not, then MongoDB can give you benefits over using MySQL.
If it is relational, then MySQL is the better solution.
Using both is, of course, possible - but it will create additional work & complexity for you. In the long term - is that worth the effort? Only you will know the answer.
Best of luck with your web app!

Where to use MongoDB and where MySQL?

I am thinking about using one of two databases: MySQL or MongoDB. I am planning to store text and numeric data, and I will be building my app in RoR.
So I don't know which database system would be better for this purpose. Can you help me, please, with the criteria on which to decide?
Let me cast this question in a more general setting and put it into some historical perspective.
In the 60s the question was whether to use a hierarchical or a network database
In the 70s the debate was relational against network
In the 80s relational turned into SQL databases, so the question mutated to SQL vs. network
In the 90s it was SQL against object databases
In the 00s it was SQL against XML databases
Today we have SQL vs. NoSQL
Do you see a pattern here? Would you still bet money on an SQL competitor, especially if it's nothing more than a glorified hash table?
I have also used MySQL, and MongoDB with Mongoid, in my projects, and I can say that if you want to keep binary data like images, MP3s and other such things in your database, try Mongo; for everything else you can use SQL databases. MongoDB enforces no structure: you are working with a hash, so you can dynamically add and remove keys/columns.
In your case I would use MySQL.
In my opinion you should base your decision on the purpose of your application. Do you want to search through your text data? How will you define keys? There is little use in going for MySQL if you have to fetch each record and scan it yourself. Even if there is functionality to do text scans in MySQL (does it have that?), MongoDB will probably do the job more efficiently. The other way around: if you are not going to use MongoDB's strong points, then you might as well go for MySQL.
Another factor might be the deadline for implementing something. If you need it fast, don't waste time on learning something new. If you have time to experiment, figure out the key features you will most likely rely upon in your application.
I think that if you need a rigid structure you should use MySQL, because that is its nature, but if you need something more dynamic, with no fixed structure at all (schema-less), you should use MongoDB. I've never used MongoDB, but I know it's more object/document oriented.
It would be helpful if you could provide some more detail. Would your data easily fit into a schema, or do you need the flexibility that a document store offers? What about auto-sharding, etc? Without more information, no one can give you advice that fits your needs. Lacking that, you can't hope for feedback any better than people's personal preferences, which is little more than a flamewar waiting to happen.