Does Redshift support JSON fields, like PostgreSQL's json data type? If so, what do I do to use it?
You can store JSON in Amazon Redshift, within a normal text field.
There are functions available to extract data from JSON fields, but it is not an efficient way to store data, since keeping whole documents in a text column doesn't leverage Redshift's column-based architecture.
See: Amazon Redshift documentation - JSON Functions
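For a feel of what those JSON functions do, here is a rough Python emulation of Redshift's JSON_EXTRACT_PATH_TEXT, which walks a key path into a JSON document stored as text (a simplified sketch; the real function has additional behavior such as a strict-mode flag, and the empty-string result for missing paths is an assumption to check against the docs):

```python
import json

def json_extract_path_text(json_str, *path):
    """Rough emulation of Redshift's JSON_EXTRACT_PATH_TEXT:
    walk the given keys into a JSON document stored as text."""
    obj = json.loads(json_str)
    for key in path:
        if not isinstance(obj, dict) or key not in obj:
            return ""  # assumed: empty string for a missing path
        obj = obj[key]
    return obj if isinstance(obj, str) else json.dumps(obj)

row = '{"user": {"name": "alice", "age": 30}}'
print(json_extract_path_text(row, "user", "name"))  # alice
```

In SQL you would call the function the same way, passing the VARCHAR column and the key path as arguments.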
UPDATE:
Redshift now supports a column type called SUPER, which allows storing JSON documents and querying inside them.
Here is a link to a video that explains the new option further:
https://www.youtube.com/watch?v=PR15TVZDgy4
Related
I want to download data from a REST API into a database. The data I want to save are typed objects, like Java objects. I chose Cassandra because it supports collection types (lists, maps), unlike standard SQL databases (MySQL, SQLite, ...), which makes it better suited to serializing Java objects.
First, I need to create the CQL tables from the JSON schema of the REST API. How can I generate a CQL table from a REST API's JSON schema?
I know openapi-generator can generate a MySQL schema from a JSON schema, but it doesn't support CQL at the moment, so I need to find an alternative solution.
I haven't used off-the-shelf packages extensively to manage Cassandra schemas, but there may be open-source projects or software like Hackolade that can do it for you.
https://cassandra.link/cassandra.toolkit/ managed by Anant (I don't have any affiliation) has an extensive list of resources you might be interested in. Cheers!
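If no tool fits, a small script can get you started: the sketch below maps a JSON-schema object to a CQL CREATE TABLE statement. The type mapping, the map<text, text> fallback for nested objects, and the primary-key choice are all assumptions you would adapt to your actual API:

```python
import json

# Assumed mapping from JSON-schema scalar types to CQL types
TYPE_MAP = {"string": "text", "integer": "int", "number": "double", "boolean": "boolean"}

def cql_type(prop):
    t = prop.get("type")
    if t == "array":
        return f"list<{cql_type(prop['items'])}>"
    if t == "object":
        return "map<text, text>"  # naive: flatten nested objects to a text map
    return TYPE_MAP.get(t, "text")

def schema_to_cql(table, schema, primary_key):
    cols = [f"{name} {cql_type(prop)}" for name, prop in schema["properties"].items()]
    cols.append(f"PRIMARY KEY ({primary_key})")
    return f"CREATE TABLE {table} (\n  " + ",\n  ".join(cols) + "\n);"

schema = json.loads('''{
  "type": "object",
  "properties": {
    "id": {"type": "integer"},
    "name": {"type": "string"},
    "tags": {"type": "array", "items": {"type": "string"}}
  }
}''')
print(schema_to_cql("users", schema, "id"))
```

For a real REST API you would feed in each schema from the OpenAPI document and review the generated DDL by hand, since partition/clustering key design in Cassandra depends on your query patterns, not on the schema alone.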
The databases that ABAP runs on (Oracle, MaxDB, et al.) are mostly RDBMSs. I have a JSON structure that cannot be normalised, so I want to store it as is; in other words, I want a MongoDB-like object store in ABAP.
What's the best way to achieve this? Is a data cluster an option? Perhaps the only option?
I don't think you can connect to databases other than the supported ones directly from ABAP. If you have a Netweaver Java stack, you can call a custom Java application that accesses MongoDB. You could also check whether SAP HANA offers something similar.
In ABAP you interact with RDBMS via ABAP Dictionary.
It supports data types like LCHR, STRING, and RAWSTRING; check the documentation for more details.
Data cluster is one option, but you can simply use a binary type DB field for storing the JSON data.
ABAP also offers transformations (the CALL TRANSFORMATION statement), which convert ABAP data to XML/JSON and vice versa.
There's a simple example on the following blog:
https://blogs.sap.com/2013/07/04/abap-news-for-release-740-abap-and-json/
Comments on the blog page contain more info.
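The storage pattern itself is language-agnostic: serialize the structure to JSON text, encode it to bytes for a binary (RAWSTRING-style) field, and reverse both steps on read. Sketched here in Python rather than ABAP, purely to illustrate the round trip:

```python
import json

record = {"customer": "ACME", "items": [{"sku": "X1", "qty": 2}]}

# Serialize to JSON text, then encode to bytes for a binary (RAWSTRING-like) field
blob = json.dumps(record).encode("utf-8")

# On read: decode the bytes and parse the JSON back into a structure
restored = json.loads(blob.decode("utf-8"))
assert restored == record
```

In ABAP, CALL TRANSFORMATION plays the role of json.dumps/json.loads here, and the blob would be written to the RAWSTRING field of a transparent table.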
The situation I have is this: I am developing a form-builder application where forms must be customisable per user. The forms are hosted and the responses are collected in a database. What is the correct way to model this in a MySQL-like database?
For example, assume two forms: one with a text field, and another with a radio button and a text field. Also, once that model is created, is there any way to use Django forms, or will I have to go some other way?
MySQL recently introduced JSON fields.
As of MySQL 5.7.8, MySQL supports a native JSON data type that enables efficient access to data in JSON (JavaScript Object Notation) documents. The JSON data type provides these advantages over storing JSON-format strings in a string column: automatic validation of JSON documents (invalid documents produce an error) and an optimized storage format that permits quick read access to document elements.
Even if you don't have the latest version of MySQL, it's still possible to save JSON data in a varchar field; this is quite a popular solution, supported by many third-party libraries that provide JSON support for Django.
A third-party library is needed because Django doesn't have a built-in JSONField. One has been added recently for PostgreSQL, but MySQL is still lagging behind.
An alternative that does not involve MySQL is to use Redis. Django has excellent support for Redis, and as you know, Redis hashes are very similar to Python dictionaries. ORM support requires third-party libraries, as with MySQL JSON fields. However, it's simpler to think of Redis as a Python dictionary that can be persisted across sessions and queried very fast. Last but not least, the hash is just the tip of the iceberg.
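The "JSON in a varchar column" approach looks like this in plain DB-API terms (sqlite3 stands in for MySQL here just so the snippet is self-contained; the third-party Django JSONField packages do essentially the same dump/load around the ORM):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE form_response (id INTEGER PRIMARY KEY, data TEXT)")

# Write: serialize the dynamic form payload to a JSON string
payload = {"name": "Alice", "newsletter": True}
conn.execute("INSERT INTO form_response (data) VALUES (?)", (json.dumps(payload),))

# Read: parse the string back into a dict
(raw,) = conn.execute("SELECT data FROM form_response").fetchone()
print(json.loads(raw)["name"])  # Alice
```

The trade-off is that the database only sees an opaque string, so filtering on fields inside the JSON has to happen in Python (or via MySQL's native JSON functions if you are on 5.7+).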
I have exported data from Firebase into one JSON file, and I want to create MySQL tables from it. Is there any way to do so?
Try MySQL Workbench. It appears to support JSON as an import format.
https://dev.mysql.com/doc/workbench/en/wb-admin-export-import.html
You could convert your JSON into CSV and import that into MySQL, as described here: Importing JSON into Mysql
However, I am not really sure how that relates to Django (which you mention in your question's title). It would help if you described in more detail what you want to do.
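A minimal JSON-to-CSV conversion with the standard library might look like the sketch below. It assumes the export is a dict of flat records keyed by push IDs, a common Firebase export shape; nested values would need flattening first, and the column names here are made up for illustration:

```python
import csv
import io
import json

export = json.loads('''{
  "-Kx1": {"name": "alice", "score": 10},
  "-Kx2": {"name": "bob", "score": 7}
}''')

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["key", "name", "score"])  # header row matching the MySQL table
for key, rec in export.items():
    writer.writerow([key, rec["name"], rec["score"]])

print(buf.getvalue())
```

The resulting CSV can then be loaded with MySQL's LOAD DATA INFILE or through Workbench's import wizard.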
You can store data with a very dynamic schema in Firebase, but MySQL is a traditional RDBMS: you have to analyze the structure of your dataset and decide which parts you want to convert to a relational format.
There's good news, however: there are several packages which support JSON fields on models: https://www.djangopackages.com/grids/g/json-fields/
There's one MySQL specific also: https://django-mysql.readthedocs.org/en/latest/model_fields/json_field.html
Such JSON data is queryable and updatable. So if your dataset has some varying schema parts, you can store those in JSON format.
Starting from MySQL 5.7, JSON data is even natively supported by the DB.
Is there a proper way to store generic JSON in MongoDB? With 'generic' I mean any JSON, including hashes with keys that are restricted in MongoDB documents.
For example, we want to store JSON schemas which use the key $ref, which is not allowed in a MongoDB document. This means that a JSON schema as such cannot be stored as a MongoDB document.
Is there a smart way around this? The only options I've come up with are to do back-and-forth deep key replacements or to store it as JSON text.
We're using Morphia, so the solution should be compatible with it.
The solutions you have already thought of are probably the best: store the schemas as JSON strings, then parse them back into objects on retrieval.
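If you do go the other route, the "deep key replacement" option is a short recursive rewrite: rename reserved keys before insert and reverse the mapping on read. The substitute character here (the fullwidth dollar sign, U+FF04) is an arbitrary choice for illustration; any token that cannot occur in your real keys works:

```python
def replace_keys(obj, frm, to):
    """Recursively rewrite dict keys starting with `frm`, so documents
    whose keys begin with '$' (like JSON-schema's $ref) can be stored
    in MongoDB, which restricts such field names."""
    if isinstance(obj, dict):
        return {k.replace(frm, to, 1) if k.startswith(frm) else k:
                replace_keys(v, frm, to) for k, v in obj.items()}
    if isinstance(obj, list):
        return [replace_keys(v, frm, to) for v in obj]
    return obj

schema = {"$ref": "#/defs/x", "properties": {"a": {"$ref": "#/defs/y"}}}
stored = replace_keys(schema, "$", "\uff04")          # encode before insert
assert replace_keys(stored, "\uff04", "$") == schema  # decode on read
```

With Morphia you would apply this on the map/document representation just before persisting and just after loading, which keeps the rest of the code unaware of the substitution.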