Creating mysql tables from json file in django - mysql

I have exported data from Firebase into one JSON file, and I want to create MySQL tables from this JSON file. Is there any way to do so?

Try MySQL Workbench. It appears to support importing JSON format.
https://dev.mysql.com/doc/workbench/en/wb-admin-export-import.html

I guess you could convert your JSON into CSV and import that into MySQL as described here: Importing JSON into Mysql
However, I am not really sure how that would be connected to Django (since you mentioned it in your question's title). It might be helpful if you described in more detail what you want to do.
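The JSON-to-CSV step above can be done in a few lines of Python. This is a minimal sketch assuming the export is a flat list of objects (the field names `id` and `name` are invented for illustration; nested Firebase data would need flattening first):

```python
import csv
import io
import json

def json_to_csv(json_text, fieldnames):
    """Flatten a list of flat JSON objects into CSV text,
    ready for MySQL's LOAD DATA INFILE or Workbench's CSV import."""
    records = json.loads(json_text)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        writer.writerow(rec)
    return out.getvalue()

sample = '[{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]'
print(json_to_csv(sample, ["id", "name"]))
```

For a real Firebase export you would first inspect the structure, since the top level is usually a dict keyed by push IDs rather than a list.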

You can store data with a very dynamic schema in Firebase. MySQL is a traditional RDBMS, so you have to analyze the structure of your dataset and decide which parts you want to convert to relational format.
There's good news, however: several packages support JSON fields on models: https://www.djangopackages.com/grids/g/json-fields/
There's also a MySQL-specific one: https://django-mysql.readthedocs.org/en/latest/model_fields/json_field.html
Such JSON data is queryable and updatable, so if your dataset has some varying-schema parts, you can store those in JSON format.
Starting from MySQL 5.7, JSON data is even natively supported by the DB. Read more here:
https://django-mysql.readthedocs.org/en/latest/model_fields/json_field.html
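The "decide which parts to convert" step can be mechanised: keep the fields every record shares as relational columns, and push the varying remainder into a JSON field (such as django-mysql's JSONField). A minimal sketch of that split, with the column names invented for illustration:

```python
import json

# Fields you decide to promote to real relational columns;
# everything else stays in a JSON blob column.
FIXED_COLUMNS = {"id", "name", "email"}

def split_record(record):
    """Split a Firebase-style document into relational column values
    and a JSON remainder for a model's JSON field."""
    fixed = {k: v for k, v in record.items() if k in FIXED_COLUMNS}
    extra = {k: v for k, v in record.items() if k not in FIXED_COLUMNS}
    return fixed, json.dumps(extra, sort_keys=True)

doc = {"id": 7, "name": "alice", "email": "a@example.com",
       "preferences": {"theme": "dark"}, "tags": ["a", "b"]}
fixed, extra_json = split_record(doc)
```

In a Django model the `fixed` keys would map to ordinary fields and `extra_json` to a JSON field, which stays queryable with the packages linked above.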

Related

convert mongoDB Collection to mySQL Database

I created my project in Spring 4 MVC + Hibernate with MongoDB. Now I have to convert it to Hibernate with MySQL. My problem is that I have too many collections in MongoDB, in BSON and JSON format. How can I convert those into MySQL table format? Is that possible?
MongoDB is a non-relational database, while MySQL is relational. The key difference is that the non-relational database contains documents (JSON objects) which can have a hierarchical structure, whereas the relational database expects the objects to be normalised and broken down into tables. It is therefore not possible to simply convert the BSON data from MongoDB into something which MySQL will understand. You will need to write some code that reads the data from MongoDB and then writes it into MySQL.
The documents in your MongoDB collections represent serialised forms of some classes (POJOs, domain objects etc.) in your project. Presumably, you read this data from MongoDB, deserialise it into its class form and use it in your project, e.g. display it to end users, use it in calculations, generate reports from it etc.
Now, you'd prefer to host that data in MySQL so you'd like to know how to migrate the data from MongoDB to MySQL but since the persistent formats are radically different you are wondering how to do that.
Here are two options:
Use your application code to read the data from MongoDB, deserialise it into your classes and then write that data into MySQL using JDBC or an ORM mapping layer etc.
Use mongoexport to export the data from MongoDB (in JSON format) and then write some kind of adapter which is capable of mapping this data into the desired format for your MySQL data model.
The non-functionals (especially for the read and write aspects) will differ between these approaches, but fundamentally both are quite similar; they both (1) read from MongoDB; (2) map the document data to the relational model; (3) write the mapped data into MySQL. The trickiest aspect of this flow is no. 2, and since only you understand your data and your relational model, there is no tool which can magically do this for you. How would a third-party tool be sufficiently aware of your document model and your relational model to perform this transformation for you?
You could investigate a MongoDB JDBC driver or use something like Apache Drill to facilitate JDBC queries onto your MongoDB. Since these could return java.sql.ResultSet, you would be dealing with a result format which is better suited for writing to MySQL, but it's likely that this still wouldn't match your target relational model, and hence you'd still need some form of transformation code.
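Step (2), the document-to-relational mapping, is where the real work lives. As a sketch of that logic, here is a hypothetical mapper that normalises an "order" document with an embedded "items" array into rows for two tables (the document shape, field and table names are invented for illustration, and it is shown in Python for brevity; in a Spring project the same mapping would be Java code reading via the MongoDB driver and writing via JDBC or an ORM):

```python
def map_order_document(doc):
    """Map one MongoDB 'order' document to rows for two relational
    tables: the hierarchical 'items' array is normalised into a
    child table keyed by order_id."""
    order_row = {
        "order_id": str(doc["_id"]),
        "customer": doc["customer"],
    }
    item_rows = [
        {"order_id": str(doc["_id"]), "sku": item["sku"], "qty": item["qty"]}
        for item in doc.get("items", [])
    ]
    return order_row, item_rows

doc = {"_id": "abc123", "customer": "alice",
       "items": [{"sku": "X1", "qty": 2}, {"sku": "Y9", "qty": 1}]}
order_row, item_rows = map_order_document(doc)
```

The same function works whether the documents come from the live database (option 1) or from a mongoexport dump (option 2); only the reading side changes.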

Is there any best way to transfer bulk data from Mysql to Mongodb?

I am using MongoDB for the first time here. Is there any good way to transfer bulk data from MySQL into MongoDB? I tried searching in different ways but did not find one.
You would have to manually map all your MySQL tables to documents in MongoDB, but you can use an already-developed tool.
You can try: Mongify (http://mongify.com/)
It's a super simple way to transform your data from MySQL to MongoDB. It has a ton of support for changing your existing schema into one that works better with MongoDB.
Mongify will read your MySQL database and build a translation file for you; all you have to do is specify how you want your data transformed.
It supports:
Updating internal IDs (to BSON ObjectID)
Updating referencing IDs
Typecasting values
Embedding Tables into other documents
Before filters (to change data manually before import)
and much much more...
There is also a short 5 min video on the homepage that shows you how easy it is.
Try it out and tell me what you think.
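If you'd rather not add a tool, the core of such a migration (the "Embedding Tables into other documents" item above) is straightforward to hand-roll. A minimal sketch, with the `users`/`addresses` tables invented for illustration (a real migration would also convert the IDs to BSON ObjectIds and insert the documents with a MongoDB driver):

```python
def rows_to_documents(user_rows, address_rows):
    """Embed an 'addresses' child table into its parent 'users' rows,
    producing MongoDB-style documents."""
    by_user = {}
    for addr in address_rows:
        by_user.setdefault(addr["user_id"], []).append(
            {"city": addr["city"], "zip": addr["zip"]})
    docs = []
    for user in user_rows:
        docs.append({
            "_id": user["id"],  # a real migration would map this to an ObjectId
            "name": user["name"],
            "addresses": by_user.get(user["id"], []),
        })
    return docs

users = [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]
addresses = [{"user_id": 1, "city": "Oslo", "zip": "0150"}]
docs = rows_to_documents(users, addresses)
```

Mongify automates exactly this kind of mapping via its translation file, plus the ID conversion and reference updates listed above.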

Correct way to store data in SQL database when columns are unknown

So the situation I have is: I am developing a form-builder-like application which needs to be customizable for all users. The form is hosted and responses are collected in a database. Now, what is the correct way to do this in a MySQL-like database?
For example, assume two forms: one with a text field, and another with a radio button and a text field. Also, once that model is created, is there any way to use Django forms, or will I have to go some other way?
Recently MySQL introduced JSON fields.
As of MySQL 5.7.8, MySQL supports a native JSON data type that enables
efficient access to data in JSON (JavaScript Object Notation)
documents. The JSON data type provides these advantages over storing
JSON-format strings in a string column:
Even if you don't have the latest version of MySQL, it's still possible to save JSON data in a varchar field; this is quite a popular solution, supported by many third-party libraries that provide JSON support for Django.
The reason a third-party library is needed is that Django doesn't have a built-in JSONField. One was added recently for PostgreSQL, but MySQL is still lagging behind.
An alternative that does not involve MySQL is to use Redis. Django has excellent support for Redis, and as you know, Redis hashes are very similar to Python dictionaries. ORM support requires third-party libraries, as with MySQL JSON fields. However, it's simpler to think of Redis as a Python dictionary that can be persisted across sessions and queried very fast. Last but not least, the hash is just the tip of the iceberg.
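Whichever storage you pick, the form-builder pattern is the same: persist each user's form definition as JSON, then interpret it when rendering and validating. A minimal sketch (the field-definition format here is invented for illustration; to reuse Django forms you could walk the same schema and add `forms.CharField` / `forms.ChoiceField` instances to a dynamically built form class):

```python
import json

# A form definition you might persist in a JSON (or varchar) column.
form_schema = json.dumps([
    {"name": "comment", "type": "text"},
    {"name": "rating", "type": "radio", "choices": ["good", "bad"]},
])

def validate_response(schema_json, response):
    """Check a submitted response dict against the stored field definitions."""
    errors = {}
    for field in json.loads(schema_json):
        value = response.get(field["name"])
        if value is None:
            errors[field["name"]] = "missing"
        elif field["type"] == "radio" and value not in field["choices"]:
            errors[field["name"]] = "invalid choice"
    return errors

assert validate_response(form_schema, {"comment": "hi", "rating": "good"}) == {}
```

Responses can then be stored as JSON too, since their shape varies with the form that produced them.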

Can you store JSON fields on Redshift?

Does Redshift support JSON fields, like Postgresql's json data type? If so what do I do to use it?
You can store JSON in Amazon Redshift, within a normal text field.
There are functions available to extract data from JSON fields, but it is not an efficient way to store data since it doesn't leverage the full capabilities of Redshift's column-based architecture.
See: Amazon Redshift documentation - JSON Functions
UPDATE:
Redshift now supports columns of the SUPER data type, which allows storing JSON and querying it.
Here is a link to a video that further explains the new option:
https://www.youtube.com/watch?v=PR15TVZDgy4
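To illustrate what the JSON functions in the documentation above do with a text column, here is a client-side Python equivalent of extracting a value by key path from JSON stored as text (the helper name is my own; in Redshift itself you would do the extraction in SQL with the built-in JSON functions rather than in application code):

```python
import json

def extract_json_path(json_text, *path):
    """Walk a path of keys into a JSON string stored in a text column,
    returning '' when the text is not valid JSON or the path is absent."""
    try:
        value = json.loads(json_text)
        for key in path:
            value = value[key]
    except (ValueError, KeyError, TypeError):
        return ""
    return value if isinstance(value, str) else json.dumps(value)

row = '{"user": {"id": 42, "name": "alice"}}'
name = extract_json_path(row, "user", "name")
```

Doing this per-row in application code is exactly the inefficiency the answer describes: every query must fetch and parse the whole text value instead of reading one column.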

Is it possible to store json stuff in database as json type and doing CRUD operation other than PostgreSQL ..?

I am currently developing a project to CRUD JSON data (JSON schema and JSON form-structure data) in a database using CodeIgniter. I don't want to be database-specific, so I don't want to rely only on PostgreSQL. Is there an efficient way to manipulate JSON data in a database?
What would be the procedure?
I am a newbie in this, so please help.
Thanks in advance.
I assume you are talking about storing JSON data in PostgreSQL. You can use the JSON datatype and the JSON functions that are built into the latest version. Here are some links with detailed info about this -
http://wiki.postgresql.org/wiki/What's_new_in_PostgreSQL_9.2#JSON_datatype
http://www.postgresql.org/docs/9.3/static/functions-json.html
And here's a previously asked question about the same topic - How do I query using fields inside the new PostgreSQL JSON datatype?
The only database I know where this is possible without super effort is PostgreSQL version 9.3.
In SQL Server you could create a CLR function for working with JSON, or create and parse JSON objects manually.
Some useful links:
SQL Server SELECT to JSON function
JSON.SQL: A CLR-resident JSON serializer/deserializer for SQL Server
Consuming JSON Strings in SQL Server - see the resulting function(!!!)
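If you truly need to stay database-agnostic, the portable fallback is to keep the JSON as text in a plain column and do all serialising/parsing in application code, so no engine-specific JSON support is required. A minimal CRUD sketch (shown in Python with SQLite standing in for any RDBMS; in a CodeIgniter project the same pattern would use PHP's `json_encode`/`json_decode` with the query builder):

```python
import json
import sqlite3

# SQLite stands in here for any RDBMS with a plain text column;
# the same pattern works for MySQL, SQL Server, etc. via their drivers.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE forms (id INTEGER PRIMARY KEY, body TEXT)")

def create(doc):
    cur = conn.execute("INSERT INTO forms (body) VALUES (?)",
                       (json.dumps(doc),))
    return cur.lastrowid

def read(form_id):
    row = conn.execute("SELECT body FROM forms WHERE id = ?",
                       (form_id,)).fetchone()
    return json.loads(row[0]) if row else None

def update(form_id, doc):
    conn.execute("UPDATE forms SET body = ? WHERE id = ?",
                 (json.dumps(doc), form_id))

def delete(form_id):
    conn.execute("DELETE FROM forms WHERE id = ?", (form_id,))

fid = create({"title": "survey", "fields": ["q1"]})
update(fid, {"title": "survey", "fields": ["q1", "q2"]})
```

The trade-off is that you cannot filter on fields inside the JSON in SQL; any querying by content happens after loading, which is where the native JSON types above (or a document store, as the next answer suggests) win.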
SOLVED!! A NoSQL database is best for JSON data; I recommend MongoDB.
To work with it in a MySQL-like way, use the Robomongo GUI to manipulate MongoDB.