How to migrate a MySQL database to Firestore

I'm searching for the best way to migrate my MySQL database to Firebase's new Cloud Firestore.
What are the steps? First of all, I'm trying to convert the tables and relations in my relational DB to a document-oriented structure.
I read about the Cloud Firestore REST API because I have more experience with REST than with sockets, but I'm not sure if that's the right approach.
Is it a good idea to create a script starting from this sample and run it on Node.js?
Has anyone already done this?
Thanks

I couldn't find a ready-made way to do that. What I suggest is to use a programming language (Python is preferable) and write a script that turns all the data in your MySQL database into a JSON file, structured the way you want to store it in Firestore. Then use the Firestore API to populate the documents. This will definitely work!
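A minimal sketch of that approach in Python, assuming the `pymysql` and `firebase_admin` packages, a hypothetical `users` table, and a Firebase service-account key file:

```python
import json
import pymysql
import firebase_admin
from firebase_admin import credentials, firestore

# Read every row of a hypothetical `users` table from MySQL.
conn = pymysql.connect(host="localhost", user="root", password="secret",
                       database="mydb", cursorclass=pymysql.cursors.DictCursor)
with conn.cursor() as cur:
    cur.execute("SELECT * FROM users")
    rows = cur.fetchall()

# Intermediate step: dump the data to a JSON file, structured per table.
with open("users.json", "w") as f:
    json.dump(rows, f, default=str, indent=2)

# Write each row as a document in a `users` collection in Firestore.
cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred)
db = firestore.client()
for row in rows:
    # Round-trip through JSON so dates and decimals become plain strings.
    doc = json.loads(json.dumps(row, default=str))
    db.collection("users").document(str(doc["id"])).set(doc)
```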

You can convert your MySQL database to a CSV file, and then convert that CSV to JSON.
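For the CSV-to-JSON step, a short Python sketch (the `users.csv` file name and header row are assumptions):

```python
import csv
import json

# Read a CSV export whose first row is the header, one dict per row.
with open("users.csv", newline="") as csv_file:
    rows = list(csv.DictReader(csv_file))

# Write the rows out as a JSON array.
with open("users.json", "w") as json_file:
    json.dump(rows, json_file, indent=2)
```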

Related

How to import a JSON file to Cloud Firestore

I developed a web app on a MySQL database and now I am switching to Android mobile development, but I have a large amount of data to be exported into Firebase's Cloud Firestore. I could not find a way to do so; I have the MySQL data stored in JSON and CSV.
Do I have to write a script? If so, can you share the script, or is there some sort of tool?
I have large amounts of data to be exported into Firebase's Cloud Firestore; I could not find a way to do so
If you're looking for a "magic" button that can convert your data from a MySQL database to a Cloud Firestore database, please note that there isn't one.
Do I have to write a script?
Yes, you have to write code in order to convert your actual MySQL database into a Cloud Firestore database. Please note that the two types of databases are built on different concepts. For instance, a Cloud Firestore database is composed of collections and documents; there are no tables in the NoSQL world.
So I suggest you read the official Get started with Cloud Firestore documentation.
If so, can you share the script, or is there some sort of tool?
There is no ready-made script or tool for that. You'll have to create your own mechanism.
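As a rough starting point, here is a minimal sketch in Python of what such a script could look like, assuming the `firebase_admin` package, a service-account key, and a hypothetical `users.json` export where each row has a primary-key field `id`:

```python
import json
import firebase_admin
from firebase_admin import credentials, firestore

# Authenticate with a service-account key and open a Firestore client.
cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred)
db = firestore.client()

# Each MySQL table becomes a collection; each exported row becomes a document.
with open("users.json") as f:
    rows = json.load(f)

for row in rows:
    db.collection("users").document(str(row["id"])).set(row)
```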

Is there a good way to transfer bulk data from MySQL to MongoDB?

I am using MongoDB for the first time here. Is there a good way to transfer bulk data from MySQL into MongoDB? I tried searching in different ways but did not find anything.
You would have to manually map all your MySQL tables to documents in MongoDB, but you can use an already developed tool.
You can try: Mongify (http://mongify.com/)
It's a super simple way to transform your data from MySQL to MongoDB. It has a ton of support for changing your existing schema into a schema that would work better with MongoDB.
Mongify will read your MySQL database and build a translation file for you; all you have to do is map how you want your data transformed.
It supports:
Updating internal IDs (to BSON ObjectID)
Updating referencing IDs
Typecasting values
Embedding Tables into other documents
Before filters (to change data manually before import)
and much much more...
There is also a short 5 min video on the homepage that shows you how easy it is.
Try it out and tell me what you think.
Please up vote if you find this answer helpful.
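If you would rather not depend on a tool, a minimal hand-rolled version of that table-to-document mapping in Python (assuming the `pymysql` and `pymongo` packages and a hypothetical `users` table) could look like this:

```python
import pymysql
import pymongo

# Read all rows of a hypothetical `users` table from MySQL as dicts.
mysql_conn = pymysql.connect(host="localhost", user="root", password="secret",
                             database="mydb",
                             cursorclass=pymysql.cursors.DictCursor)
with mysql_conn.cursor() as cur:
    cur.execute("SELECT * FROM users")
    rows = cur.fetchall()

# Insert each row as a document; MongoDB assigns a BSON ObjectId to _id.
# Reference IDs, embedding and type conversions would still be up to you.
mongo = pymongo.MongoClient("mongodb://localhost:27017/")
users = mongo["mydb"]["users"]
if rows:
    users.insert_many(rows)
```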

How to export data from a database to a JSON file?

In my Haskell application I'm using a database; for now it's PostgreSQL, but in the future it might also be MySQL or SQLite3. I'm looking for a way to export data from my database to a JSON file via my application, that is, without using the export capabilities of the database itself. How can I do that? Is there a relatively easy way? I can do it in a straightforward way, of course, if there are no other options. Any suggestions?

Grails with CSV (No DB)

I have been building a Grails application for quite a while with dummy data using a MySQL server; this was eventually supposed to be connected to Greenplum DB (a PostgreSQL cluster).
But this is not feasible anymore due to firewall issues.
We were contemplating connecting Grails to a CSV file on a shared drive (which is constantly updated by the Greenplum DB; data is only appended hourly).
These CSV files are fairly large (3 MB, 30 MB, and 60 MB). The last file has 550,000+ rows.
Quick questions:
Is this even feasible? Can a CSV file be treated as a database, and can Grails directly access this CSV file and run queries on it, similar to a DB?
Assuming this is feasible, how much rework will be required in the Grails code in the DataSource, controller, and index? (Currently, we are connected to MySQL and we filter data in the controller and index using SQL queries and AJAX calls using remoteFunction.)
Will the constant reading (CSV -> Grails) and writing (Greenplum -> CSV) corrupt the CSV file or bring up any other problems?
I know this is not a very robust method, but I really need to understand the feasibility of this idea. Can Grails function without any DB, with merely a CSV file on a shared drive accessible to multiple users?
The short answer is: no. This won't be a good solution.
No.
It would be nearly impossible, if possible at all, to rework this.
Concurrent access to a file like that in any environment is a recipe for disaster.
Grails is not suitable for a solution like this.
Update:
Have you considered using the built-in H2 database, which can be packaged with the Grails application itself? This way you can distribute the database engine along with your Grails application within the WAR. You could even have it populate its database from the CSV you mention the first time it runs, or periodically, depending on your requirements.

Is it possible to store JSON in a database as a JSON type and do CRUD operations in databases other than PostgreSQL?

I am currently developing a project to do CRUD operations on JSON data in a database using CodeIgniter. I don't want to be database-specific, so I don't want to rely only on PostgreSQL. Is there an efficient way to manipulate JSON data (JSON schema and JSON form structure data) in a database?
What would be the procedure?
I am a newbie in this, so please help.
Thanks in advance.
I assume you are talking about storing JSON data in PostgreSQL. You can use the JSON datatype and the JSON functions that are built into the latest versions. Here are some links that might help you find detailed info about this:
http://wiki.postgresql.org/wiki/What's_new_in_PostgreSQL_9.2#JSON_datatype
http://www.postgresql.org/docs/9.3/static/functions-json.html
And here's a previously asked question about the same topic - How do I query using fields inside the new PostgreSQL JSON datatype?
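To make the idea concrete, a small Python sketch using `psycopg2` against PostgreSQL 9.3+ (the table and column names here are hypothetical):

```python
import json
import psycopg2

conn = psycopg2.connect("dbname=mydb user=postgres password=secret")
cur = conn.cursor()

# A json column stores the document; PostgreSQL validates the JSON on insert.
cur.execute("CREATE TABLE IF NOT EXISTS forms (id serial PRIMARY KEY, data json)")
cur.execute("INSERT INTO forms (data) VALUES (%s)",
            [json.dumps({"title": "signup", "fields": ["name", "email"]})])

# The ->> operator extracts a field as text, so you can filter on JSON contents.
cur.execute("SELECT data->>'title' FROM forms WHERE data->>'title' = %s", ["signup"])
print(cur.fetchall())

conn.commit()
cur.close()
conn.close()
```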
The only database I know of where this is possible without a super effort is PostgreSQL version 9.3.
In SQL Server you could create a CLR function for working with JSON, or create and parse JSON objects manually.
Some useful links:
SQL Server SELECT to JSON function
JSON.SQL: A CLR-resident JSON serializer/deserializer for SQL Server
Consuming JSON Strings in SQL Server - see the resulting function(!!!)
SOLVED!! A NoSQL database is the best fit for JSON data; I recommend MongoDB.
To work with it the way you would with MySQL, use the Robomongo GUI to manipulate MongoDB.
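To illustrate (not specific to CodeIgniter), a quick CRUD sketch against MongoDB using Python's `pymongo`; the collection and field names are hypothetical:

```python
import pymongo

client = pymongo.MongoClient("mongodb://localhost:27017/")
forms = client["mydb"]["forms"]

# Create: store a JSON document as-is.
inserted_id = forms.insert_one({"title": "signup", "fields": ["name", "email"]}).inserted_id

# Read: query by any field inside the document.
doc = forms.find_one({"title": "signup"})

# Update: modify fields of the stored JSON in place.
forms.update_one({"_id": inserted_id}, {"$set": {"title": "signup-v2"}})

# Delete: remove the document.
forms.delete_one({"_id": inserted_id})
```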