How to export data from a database to a JSON file?

In my Haskell application I'm using a database; for now it's PostgreSQL, but in the future it might also be MySQL or SQLite3. I'm looking for a way to export data from my database to a JSON file via my application, that is, without relying on the database's own export facilities. Is there a relatively easy way to do that? I can write it in a straightforward way myself, of course, if there are no other options. Your suggestions?
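For reference, the straightforward version I have in mind looks roughly like this (an untested sketch with postgresql-simple and aeson; the users table and its columns are made up, and mysql-simple / sqlite-simple expose a near-identical query API, so the database-specific part would stay small):

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.Aeson (ToJSON (..), encode, object, (.=))
import qualified Data.ByteString.Lazy as BL
import Data.Text (Text)
import Database.PostgreSQL.Simple
import Database.PostgreSQL.Simple.FromRow (FromRow (..), field)

-- One record per row of the (hypothetical) users table.
data User = User { userId :: Int, userName :: Text }

instance FromRow User where
  fromRow = User <$> field <*> field

instance ToJSON User where
  toJSON (User i n) = object ["id" .= i, "name" .= n]

main :: IO ()
main = do
  conn  <- connectPostgreSQL "host=localhost dbname=mydb"  -- adjust to your setup
  -- Fetch the rows, then let aeson serialise the whole list.
  users <- query_ conn "SELECT id, name FROM users" :: IO [User]
  BL.writeFile "users.json" (encode users)
  close conn
```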

Related

NiFi for database migration

Why would NiFi be a good fit for database migration if all it does is send the same data over and over again? (When I tried to extract data from a database and put it into a JSON file, I saw multiple entries of the same tuple.) Wouldn't that be a waste of computing resources?
If I just want to migrate the database once and occasionally update only the changed columns, is NiFi still a good tool to use?
It all depends on which database you want to migrate from, and into which environment. Is it a large enterprise Oracle DB you want to migrate into Hadoop? Look into Sqoop (https://sqoop.apache.org/). I would recommend Sqoop for one-time imports of large databases into Hadoop.
You can use NiFi to do an import as well, using processors such as ExecuteSQL, QueryDatabaseTable and GenerateTableFetch. They all work with JDBC connectors, so if your database supports JDBC, you could opt for this as well.
If you want to get incremental changes, you could use the QueryDatabaseTable processor and its Maximum-value Columns property. Matt Burgess has an article explaining how to put this in place at https://community.hortonworks.com/articles/51902/incremental-fetch-in-nifi-with-querydatabasetable.html.
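NiFi itself is configured in the UI rather than in code, but the idea behind Maximum-value Columns is easy to sketch: the processor keeps a high-water mark for the chosen column and only fetches rows beyond it on each run. A hypothetical Haskell sketch of that idea with postgresql-simple (table and column names made up):

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.Text (Text)
import Database.PostgreSQL.Simple

-- Fetch only rows whose id is greater than the last one we saw,
-- and return the new high-water mark along with the rows.
fetchNew :: Connection -> Int -> IO ([(Int, Text)], Int)
fetchNew conn lastSeen = do
  rows <- query conn
            "SELECT id, name FROM users WHERE id > ? ORDER BY id"
            (Only lastSeen)
  let newMax = if null rows then lastSeen else fst (last rows)
  pure (rows, newMax)
```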

How to migrate a MySQL database to Firestore

I'm looking for the best way to migrate my MySQL database to Firebase's new Cloud Firestore.
What are the steps? First of all, I'm trying to convert the tables and relations of my relational DB into a document logic.
I read about the Cloud Firestore REST API, because I have more experience with REST than with sockets, but I'm not sure if that's the right approach.
Is it a good idea to create a script starting from this sample and run it on Node.js?
Has anyone already done this?
Thanks
I couldn't find an existing tool for that. What I suggest is to use a programming language (Python is preferable) and write a script that turns all the data in your MySQL database into a JSON file, structured the way you want to store it in Firestore. Then read up on the Firestore API and use it to populate the documents. This should work.
You can convert your MySQL database to a CSV file, and then convert that CSV to JSON.
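The CSV-to-JSON half of that takes only a few lines in most languages. For instance, here is a rough, untested Haskell sketch with cassava and aeson (the file names are placeholders):

```haskell
import Data.Aeson (encode)
import qualified Data.ByteString.Lazy as BL
import Data.Csv (decodeByName)
import qualified Data.HashMap.Strict as HM
import qualified Data.Text.Encoding as TE
import qualified Data.Vector as V

main :: IO ()
main = do
  raw <- BL.readFile "users.csv"           -- placeholder file name
  case decodeByName raw of
    Left err -> putStrLn ("CSV parse error: " ++ err)
    Right (_header, rows) ->
      -- Each row comes back as a HashMap ByteString ByteString;
      -- re-key it as Text so aeson serialises it as a JSON object.
      let toObj = HM.fromList
                  . map (\(k, v) -> (TE.decodeUtf8 k, TE.decodeUtf8 v))
                  . HM.toList
      in BL.writeFile "users.json" (encode (V.map toObj rows))
```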

Is there a good way to transfer bulk data from MySQL to MongoDB?

I am using MongoDB for the first time here. Is there a good way to transfer bulk data from MySQL into MongoDB? I tried searching for one in different ways, but I did not find anything.
You would have to map all your MySQL tables to MongoDB documents yourself, but you can use an already developed tool.
You can try Mongify (http://mongify.com/).
It's a super simple way to transform your data from MySQL to MongoDB. It has a ton of support for changing your existing schema into a schema that works better with MongoDB.
Mongify will read your MySQL database and build a translation file for you; all you have to do is map how you want your data transformed.
It supports:
Updating internal IDs (to BSON ObjectID)
Updating referencing IDs
Typecasting values
Embedding Tables into other documents
Before filters (to change data manually before import)
and much much more...
There is also a short 5 min video on the homepage that shows you how easy it is.
Try it out and tell me what you think.
Please upvote if you find this answer helpful.

Grails with CSV (No DB)

I have been building a Grails application for quite a while with dummy data, using a MySQL server; this was eventually supposed to be connected to a Greenplum DB (a PostgreSQL cluster).
But this is no longer feasible due to firewall issues.
We are contemplating connecting Grails to a CSV file on a shared drive (which is constantly updated by the Greenplum DB; data is appended hourly only).
These CSV files are fairly large (3 MB, 30 MB and 60 MB); the last file has 550,000+ rows.
Quick questions:
Is this even feasible? Can a CSV file be treated as a database, and can Grails directly access this CSV file and run queries on it, similar to a DB?
Assuming this is feasible, how much rework will be required in the Grails code in the datasource, controller and index? (Currently we are connected to MySQL, and we filter data in the controller and index using SQL queries and AJAX calls with remoteFunction.)
Will the constant reading (CSV -> Grails) and writing (Greenplum -> CSV) corrupt the CSV file or bring up any other problems?
I know this is not a very robust method, but I really need to understand the feasibility of this idea. Can Grails function without any DB, with merely a CSV file on a shared drive accessible to multiple users?
The short answer is: no, this won't be a good solution.
No.
It would be nearly impossible, if possible at all, to rework this.
Concurrent access to a file like that in any environment is a recipe for disaster.
Grails is not suitable for a solution like this.
update:
Have you considered using the built-in H2 database, which can be packaged with the Grails application itself? This way you can distribute the database engine along with your Grails application within the WAR. You could even have it populate its database from the CSV you mention the first time it runs, or periodically, depending on your requirements.

How to convert data stored in XML files into a relational database (MySQL)?

I have a few XML files containing data for a research project which I need to run some statistics on. The amount of data is close to 100GB.
The structure is not so complex (it could be mapped to perhaps 10 tables in a relational model), and given the nature of the problem, this data will never be updated again; I only need it available in a place where it's easy to run queries on.
I've read about XML databases and the possibility of running XPath-style queries on them, but I have never used them and I'm not so comfortable with that. Having the data in a relational database would be my preferred choice.
So, I'm looking for a way to convert the data stored in XML into a relational database (think of a big .sql file similar to the one generated by mysqldump, but anything else would do).
The ultimate goal is to be able to run SQL queries for crunching the data.
After some research I'm almost convinced I have to write it on my own.
But I feel this is a common problem, and therefore there should be a tool which already does that.
So, do you know of any tool that would transform XML data into a relational database?
PS1:
My idea would be something like (it can work differently, but just to make sure you get my point):
Analyse the data structure (based on the XML themselves, or on a XSD)
Build the relational database (tables, keys) based on that structure
Generate SQL statements to create the database
Generate SQL statements to fill in the data
PS2:
I've seen some posts here in SO but still I couldn't find a solution.
Microsoft's "Xml Bulk Load" tool seems to do something in that direction, but I don't have a MS SQL Server.
Databases are not the only way to search data. I can highly recommend Apache Solr.
Strategy to implement search on XML files:
Keep your raw data as XML and search it using a Solr index.
Importing XML files of the right format into a MySQL database is easy:
https://dev.mysql.com/doc/refman/5.6/en/load-xml.html
This means you typically have to transform your XML data into that kind of format. How you do this depends on the complexity of the transformation, which programming languages you know, and whether you want to use XSLT (which is most probably a good idea).
From your previous answers it seems you know Python, so http://xmlsoft.org/XSLT/python.html may be the right thing for you to start with.
Take a look at StAX instead of XSD for analyzing and extracting the data. It's stream-based and can deal with huge XML files.
If you feel comfortable with Perl, I've had pretty good luck with the XML::Twig module for processing really big XML files.
Basically, all you need is to set up a few twig handlers and import your data into MySQL using DBI/DBD::mysql.
There is a pretty good example on xmltwig.org.
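For comparison, here is an untested sketch of the same extract-and-insert idea in Haskell with xml-conduit. The person/name/age element names are invented, and for 100 GB of XML you would switch to the streaming interface (Text.XML.Stream.Parse), since the cursor API below reads the whole document into memory:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import qualified Data.Text as T
import qualified Data.Text.IO as TIO
import Text.XML (def, readFile)
import Text.XML.Cursor
import Prelude hiding (readFile)

-- Turn each <person><name>...</name><age>...</age></person> element
-- into an INSERT statement.  Real code would escape the values or
-- emit parameterised inserts instead of splicing text.
main :: IO ()
main = do
  doc <- readFile def "people.xml"
  let cursor = fromDocument doc
      people = cursor $// element "person"
      toInsert c =
        let name = T.concat (c $// element "name" &// content)
            age  = T.concat (c $// element "age"  &// content)
        in "INSERT INTO person (name, age) VALUES ('"
           <> name <> "', " <> age <> ");"
  mapM_ (TIO.putStrLn . toInsert) people
```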
If you are comfortable with commercial products, you might want to have a look at Data Wizard for MySQL by the SQL Maestro Group.
This application is targeted especially at exporting and, of course, importing data from/to MySQL databases. This also includes XML import. You can download a 30-day trial to check whether this is what you are looking for.
I have to admit that I haven't used their MySQL product line yet, but I had a good user experience with their Firebird Maestro and SQLite Maestro products.