How to import the data from .MYD into MATLAB? - mysql

I just obtained a bunch of MySQL data stored in raw MySQL (MyISAM table) format in a .MYD file.
I now wish to start analysing those data. All I need to do is read the numbers into MATLAB and process them.
What is the easiest way of doing so? I am using Mac OS, by the way.

Creating a MySQL database and dropping the file into the data directory of a (not running at the time) MySQL server is certainly one way to get to the stage where you have the data in a form you can re-export.
I am not familiar with the macOS locations, but on Linux the data layout is:
/var/lib/mysql/databasename/*.MYI and *.MYD
I would be leery of trying to extract a MyISAM file using anything other than MySQL, frankly.
Maybe someone else knows better, but I don't :-)
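If it helps, here is a minimal sketch of the re-export step once the restored table is visible in a running MySQL server, assuming Python with the pymysql package and placeholder database/table names; MATLAB can then read the resulting CSV with readtable or csvread.

# Minimal sketch: dump the restored table to CSV so MATLAB can read it.
# Assumes the pymysql package; database and table names are placeholders.
import csv
import pymysql

conn = pymysql.connect(host="localhost", user="root",
                       password="secret", database="databasename")
try:
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM mytable")
        columns = [col[0] for col in cur.description]
        with open("mytable.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(columns)          # header row for readtable()
            writer.writerows(cur.fetchall())  # the actual data rows
finally:
    conn.close()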

Related

How to import from sql dump to MongoDB?

I am trying to import data from a MySQL dump .sql file into MongoDB, but I could not find any mechanism for RDBMS-to-NoSQL data migration.
I have tried converting the data into JSON and CSV, but it is not giving me the desired output in MongoDB.
I thought of trying Apache Sqoop, but it is mostly for moving data between SQL/NoSQL systems and Hadoop.
I could not understand how it would be possible to migrate data from MySQL to MongoDB.
Is there any approach apart from what I have tried so far?
Hoping to hear a better and faster solution for this type of migration.
I suggest you dump the MySQL data to a CSV file. You can also try other file formats, but make sure the format is friendly enough that the data can be imported into MongoDB easily; both MongoDB and MySQL support the CSV format very well.
You can use mysqldump or the SELECT ... INTO OUTFILE statement to dump the MySQL databases for backup. Using mysqldump may take a long time, so have a look at How can I optimize a mysqldump of a large database?.
Then use the mongoimport tool to import the data.
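A minimal sketch of that dump-then-import flow, assuming Python with pymysql installed and placeholder database, table and collection names (mysqldump or SELECT ... INTO OUTFILE would work just as well for the export step):

# Export a MySQL table to CSV, then hand the file to mongoimport.
# Assumes pymysql; database, table and collection names are placeholders.
import csv
import subprocess
import pymysql

conn = pymysql.connect(host="localhost", user="root",
                       password="secret", database="sourcedb")
with conn.cursor() as cur:
    cur.execute("SELECT * FROM orders")
    header = [col[0] for col in cur.description]
    with open("orders.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)               # --headerline below uses this row
        writer.writerows(cur.fetchall())
conn.close()

# Import the CSV into MongoDB, letting the header row name the fields.
subprocess.run(["mongoimport", "--db", "targetdb", "--collection", "orders",
                "--type", "csv", "--headerline", "--file", "orders.csv"],
               check=True)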
As far as I know, there are three ways to optimize this import:
mongoimport --numInsertionWorkers N starts several insertion workers; N can be the number of cores.
mongod --nojournal: most of the continuous disk usage comes from the journal, so disabling journaling while loading can be a good optimization.
Split up your file and start parallel import jobs (see the sketch below).
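A rough sketch of that split-and-parallelize idea, again assuming Python and placeholder names; it cuts the exported CSV into header-preserving chunks and starts one mongoimport per chunk:

# Split a large CSV into chunks and run one mongoimport per chunk in parallel.
# Assumes Python's standard library plus the mongoimport binary on PATH;
# file, database and collection names are placeholders.
import csv
import subprocess

CHUNK_ROWS = 500_000
chunk_paths = []

with open("orders.csv", newline="") as src:
    reader = csv.reader(src)
    header = next(reader)
    part, chunk = 0, []
    for row in reader:
        chunk.append(row)
        if len(chunk) == CHUNK_ROWS:
            path = f"orders_part{part}.csv"
            with open(path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)       # keep the header in every chunk
                writer.writerows(chunk)
            chunk_paths.append(path)
            part, chunk = part + 1, []
    if chunk:                                 # write the final partial chunk
        path = f"orders_part{part}.csv"
        with open(path, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(chunk)
        chunk_paths.append(path)

# One mongoimport per chunk, each with several insertion workers.
procs = [subprocess.Popen(["mongoimport", "--db", "targetdb",
                           "--collection", "orders", "--type", "csv",
                           "--headerline", "--numInsertionWorkers", "4",
                           "--file", path])
         for path in chunk_paths]
for proc in procs:
    proc.wait()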
Actually, in my opinion, importing and exporting the data is not the hard part. It sounds like your dataset is large, so if you don't design your document structure, your queries will still be slow; automatic migrations from a relational database to MongoDB are not recommended, because the resulting database performance may not be good.
So it's worth designing your data structure; you can check out Data models.
Hope this helps.
You can use Mongify, which helps you move/migrate data from SQL-based systems to MongoDB. It supports MySQL, PostgreSQL, SQLite, Oracle, SQL Server, and DB2.
It requires Ruby and RubyGems as prerequisites. Refer to this documentation to install and configure Mongify.

Which RDBMS for Tableau connections?

We are finally moving from Excel and .csv files to databases. Currently, most of my Tableau files are connected to large .csv files (.twbx).
Is there any performance differences between PostgreSQL and MySQL in Tableau? Which would you choose if you were starting from scratch?
Right now, I am using pandas to join files together and create a new .csv file based on the join. (For example, I take a 10-million-row file, drop duplicates and create a primary key, then join it on the same key with a 5-million-row file, and finally export the new 'consolidated' file to .csv and connect Tableau to it. Sometimes the joins are complicated, involving dates or times and several columns.)
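Roughly, the current workflow looks like this simplified sketch (file and column names are placeholders):

# Simplified sketch of the pandas join workflow described above;
# file names and key columns are placeholders.
import pandas as pd

big = pd.read_csv("ten_million_rows.csv")
big = big.drop_duplicates(subset=["order_id"])        # order_id acts as the primary key

small = pd.read_csv("five_million_rows.csv")

consolidated = big.merge(small, on="order_id", how="left")
consolidated.to_csv("consolidated.csv", index=False)  # Tableau connects to this file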
I assume I can create a view in a database and then connect to that view rather than creating a separate file, correct? Each of my files could instead be a separate table, which should save space and allow me to query by date rather than reading the whole file into memory with pandas.
Some of the people using the RDBMS would be completely new to databases in general (dashboards here are just Excel files, no normalization, formulas in the raw data sheet, etc.; it's a mess), so hopefully either choice has good enough documentation to lessen the learning curve (mainly inserting new data and selecting data, not actual database design).
Both will work fine with Tableau. In fact, Tableau's internal data engine is based on Postgres.
Between the two, I think Postgres is more suitable for a central data warehouse. MySQL doesn't support certain SQL features such as common table expressions and window functions.
Also, if you're already using pandas, Postgres has a built-in Python extension called PL/Python.
However, if you're looking to store a small amount of data and get to it really fast without using advanced SQL, MySQL would be a fine choice, but Postgres will give you a few more options moving forward.
As stated, either database will work and Tableau is basically agnostic to the type of database that you use. Check out https://www.tableau.com/products/techspecs for a full list of all native (inbuilt & optimized) connections that Tableau Server and Desktop offer. But, if your database isn't on that list you can always connect over ODBC.
Personally, I prefer postgres over mysql (I find it really easy to use psycopg2 to write to postgres from python), but mileage will vary.
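For what it's worth, a minimal sketch of that psycopg2 pattern, with placeholder connection details, table and example rows:

# Minimal sketch of writing to Postgres from Python with psycopg2;
# connection details, table and example rows are placeholders.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="warehouse",
                        user="tableau", password="secret")
with conn, conn.cursor() as cur:
    cur.execute("""CREATE TABLE IF NOT EXISTS consolidated (
                       order_id   integer PRIMARY KEY,
                       order_date date,
                       amount     numeric)""")
    cur.executemany(
        "INSERT INTO consolidated (order_id, order_date, amount) VALUES (%s, %s, %s)",
        [(1, "2017-01-01", 19.99), (2, "2017-01-02", 5.00)])
conn.close()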

Is there any tool to migrate data from MySQL (or mongodb) to Aerospike?

I would like to migrate data from MySQL (or MongoDB) to Aerospike; does anyone know if any tool exists to do that?
Aerospike provides something like a CSV loader:
https://github.com/aerospike/aerospike-loader
So you can play around with the mysqldump output, process the dumped file to create a CSV in the format accepted by aerospike-loader, and then load the data into Aerospike.
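A minimal sketch of the export step, assuming Python with pymysql and placeholder database/table names; the resulting CSV (together with a matching loader config describing its columns) can then be fed to aerospike-loader as its README describes:

# Export a MySQL table to CSV for aerospike-loader.
# Assumes pymysql; database, table and column names are placeholders.
import csv
import pymysql

conn = pymysql.connect(host="localhost", user="root",
                       password="secret", database="sourcedb")
with conn.cursor() as cur:
    cur.execute("SELECT id, name, amount FROM orders")
    with open("orders.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name", "amount"])  # columns referenced by the loader config
        writer.writerows(cur.fetchall())
conn.close()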
There is a simple Python script I have developed. I have been using it for my projects since we are moving all our databases from MySQL to MongoDB. It migrates around 100,000 (1 lakh) rows per minute. Migrate MySQL to MongoDB
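The linked script is not reproduced here, but the core of such a migration looks roughly like this sketch, assuming pymysql and pymongo with placeholder database, table and collection names:

# Rough sketch of a MySQL-to-MongoDB row copier; assumes pymysql and pymongo,
# and the names below are placeholders, not the linked script.
import pymysql
from pymongo import MongoClient

mysql_conn = pymysql.connect(host="localhost", user="root", password="secret",
                             database="sourcedb",
                             cursorclass=pymysql.cursors.DictCursor)
mongo = MongoClient("mongodb://localhost:27017")
collection = mongo["targetdb"]["orders"]

with mysql_conn.cursor() as cur:
    cur.execute("SELECT * FROM orders")
    while True:
        rows = cur.fetchmany(1000)           # copy in batches of 1000 rows
        if not rows:
            break
        collection.insert_many(rows)         # each row dict becomes a document

mysql_conn.close()
mongo.close()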

Importing csv into multiple mysql databases from rails app

I have a CSV file consisting of 78,000 records. I'm using smarter_csv (https://github.com/tilo/smarter_csv) to parse the CSV file. I want to import that into a MySQL database from my Rails app. I have the following two questions:
What would be the best approach to quickly import such a large data set into MySQL from my Rails app? Would using Resque or Sidekiq to create multiple workers be a good idea?
I need to insert this data into a given table which is present in multiple databases. In Rails, a model talks to only one database, so how can I scale the solution to talk to multiple MySQL databases from my model?
Thank You
One way would be to use the native interface of the database application itself for importing and exporting; it would be optimised for that specific purpose.
For MySQL, mysqlimport provides that interface. Note that the import can also be done as an SQL statement and that this executable provides a much saner interface for the underlying SQL command.
As far as implementation goes, if this is a frequent import exercise, a Sidekiq/Resque/cron job is the best approach.
[EDIT]
The SQL command referred to above is LOAD DATA INFILE, as the other answer points out.
Performance-wise, probably the best method is to use MySQL's LOAD DATA INFILE syntax and execute an import command on each database. This requires the data file to be local to each database instance.
As the other answer suggests, mysqlimport can be used to ease the import, as the LOAD DATA INFILE statement syntax is highly customisable and can deal with many data formats.
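The same statement can be issued from any client against each database. A minimal sketch in Python with pymysql and placeholder hosts, credentials and table names (from Rails, you would execute the same SQL through each connection instead):

# Run the same LOAD DATA LOCAL INFILE against several MySQL databases.
# Assumes pymysql; hosts, credentials and table names are placeholders.
import pymysql

DATABASES = [
    {"host": "db1.example.com", "database": "app_shard1"},
    {"host": "db2.example.com", "database": "app_shard2"},
]

LOAD_SQL = """
    LOAD DATA LOCAL INFILE 'records.csv'
    INTO TABLE records
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
"""

for cfg in DATABASES:
    conn = pymysql.connect(user="importer", password="secret",
                           local_infile=True, **cfg)  # LOCAL INFILE must be allowed
    try:
        with conn.cursor() as cur:
            cur.execute(LOAD_SQL)
        conn.commit()
    finally:
        conn.close()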

Copy data from Postgresql to MySQL

I have encountered a problem where I need to copy only the data from a PostgreSQL database to a MySQL database. I already have the MySQL database with empty tables. Using pgAdmin I made a backup (data only, without the database schema). I tried using the psql tool, but it keeps giving a segmentation fault which I couldn't fix at the moment. I am using Ubuntu. Any simple guidance on how to copy the data would be highly appreciated.
Use Postgres COPY and MySQL LOAD DATA INFILE.
psql will crash with an out-of-memory error if you try to display a few million rows, because it fetches all the data beforehand in order to determine the column widths for a prettier display. If you intend to use psql to fetch lots of data, disable this behaviour (for example by setting psql's FETCH_COUNT variable so rows are retrieved in batches).
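A minimal sketch of that COPY-out / LOAD-DATA-in approach, assuming Python with psycopg2 and pymysql and placeholder connection details; one such round trip is needed per table:

# Minimal sketch: COPY a table out of Postgres as CSV, then load it into MySQL.
# Assumes psycopg2 and pymysql; connection details and table names are
# placeholders, and the MySQL table is assumed to already exist.
import psycopg2
import pymysql

# 1. Export the table from Postgres as CSV using COPY.
pg = psycopg2.connect(host="localhost", dbname="sourcedb",
                      user="postgres", password="secret")
with pg, pg.cursor() as cur, open("mytable.csv", "w") as f:
    cur.copy_expert("COPY mytable TO STDOUT WITH (FORMAT csv)", f)
pg.close()

# 2. Load the CSV into the (already created) MySQL table.
my = pymysql.connect(host="localhost", user="root", password="secret",
                     database="targetdb", local_infile=True)
with my.cursor() as cur:
    cur.execute("""
        LOAD DATA LOCAL INFILE 'mytable.csv'
        INTO TABLE mytable
        FIELDS TERMINATED BY ',' ENCLOSED BY '"'
        LINES TERMINATED BY '\\n'
    """)
my.commit()
my.close()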
You could try:
http://www.lightbox.ca/pg2mysql.php
It looks like you might be trying to load data into MySQL with the Postgres client tool. Use the MySQL client tools to manipulate data in the MySQL server.
mysql client programs
How can you move data into MySQL if you have errors reading from psql?
Fix the error, then ask this question.