What is the fastest method to upload a CSV file into MySQL via Perl? The CSV file is about 20 GB.
Thanks
Consider using MySQL's LOAD DATA INFILE:
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
You could use Perl and DBI to execute a LOAD DATA INFILE statement if, for some reason, you couldn't use the mysql client or mysqlimport.
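If it helps, here is a minimal Perl/DBI sketch of that approach; the connection details, file path, and table name are placeholders, and the LOCAL keyword plus the mysql_local_infile option are only needed if the CSV lives on the client machine rather than on the database server:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection details; mysql_local_infile=1 lets the client send a local file.
my $dbh = DBI->connect(
    "DBI:mysql:database=mydb;host=localhost;mysql_local_infile=1",
    "myuser", "mypassword",
    { RaiseError => 1 }
);

# Stream the whole CSV into the table in a single statement.
$dbh->do(q{
    LOAD DATA LOCAL INFILE '/path/to/abc.csv'
    INTO TABLE abc
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
});

$dbh->disconnect;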
I am testing Google Cloud SQL and I have to test the statements LOAD DATA INFILE and SELECT ... INTO OUTFILE.
I understand that LOAD DATA LOCAL INFILE can be used instead of LOAD DATA INFILE, but how can it be used in Cloud SQL?
Can SELECT ... INTO OUTFILE be used as is?
Thank you Renzo for your answer, but when I try to execute SELECT ... INTO OUTFILE while connected to the Cloud SQL instance, I get the following error message:
ERROR 1045 (28000) at line 1: Access denied for user 'root'@'%'
I tried the LOAD DATA LOCAL INFILE statement from my computer while connected to the Cloud SQL instance, and it worked: I used a file located on my computer to import CSV data into a table successfully. But I am wondering if I can use a file present in a bucket to do the same...
Let me explain what I need to do:
We are migrating our website to GCP App Engine/Cloud SQL. We make extensive use of LOAD DATA INFILE, which picks up files from an "ftp" folder and loads them into the database, and of SELECT ... INTO OUTFILE, which exports data to CSV files in a folder of our website. So my concern is being able to use the same process on App Engine/Cloud SQL.
The difference is that the website instance and the database instance are separate on GCP. Should we use buckets on GCP to replace our folders, or should we create folders on the App Engine/Cloud SQL instance instead? What is the best solution in your opinion?
Thanks in advance
According to the page about Importing data into Cloud SQL,
you can also use the LOAD DATA LOCAL INFILE statement in the mysql client, which loads a local file to the database
You can follow the sample steps in that link to start the mysql client by connecting to your Cloud SQL instance through Cloud Shell. After connecting, you will be able to execute the import statement LOAD DATA LOCAL INFILE.
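As a rough sketch, assuming placeholder instance address, database, table, and file names (and noting that the mysql client generally has to be started with --local-infile=1 for the LOCAL variant to be allowed), the session could look like:

mysql --host=CLOUD_SQL_INSTANCE_IP --user=root --password --local-infile=1 mydatabase

LOAD DATA LOCAL INFILE '/home/user/data.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';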
The same steps can be applied for exporting, and the syntax SELECT ... INTO OUTFILE can be used as is. Together with other alternatives, this can also be found under Exporting data from Cloud SQL:
Exporting in CSV format is equivalent to running the following SQL statement:
SELECT <query> INTO OUTFILE ... CHARACTER SET 'utf8mb4'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
ESCAPED BY '\\' LINES TERMINATED BY '\n'
Once you have executed the operations, it is also advisable to check their status, especially if they are long-running or an error has occurred.
As the MySQL documentation says, LOAD DATA INFILE is about 20 times faster than bulk inserts (INSERT ... VALUES), so I am trying to replace INSERT ... VALUES with LOAD DATA INFILE in my application.
When I try LOAD DATA INFILE with ADO.NET, it ends in an error. It seems there is no way to load a file from a local machine on which MySQL is not running.
But that is not a problem for mysqlimport; the documentation says:
You can also load data files by using the mysqlimport utility; it operates by sending a LOAD DATA INFILE statement to the server. The --local option causes mysqlimport to read data files from the client host.
I am curious what protocol mysqlimport uses when it imports data into MySQL on a remote machine.
Does it parse the data file and construct SQL like INSERT ... VALUES, or does it have a special protocol to send the data file directly to MySQL?
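To make the scenario concrete, the kind of call I mean looks roughly like this (host, database, and file name are placeholders; mysqlimport derives the table name, here abc, from the file name):

mysqlimport --local --host=db.example.com --user=myuser -p --fields-terminated-by=',' mydatabase /path/to/abc.csv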
I have a PostgreSQL schema file, and I have an SQL dump from a MySQL database. I want to know how I can import my MySQL dump file into PostgreSQL, using the PostgreSQL schema file.
You can't directly restore a MySQL dump file into Postgres.
You can use the PostgreSQL wiki or dedicated migration software, such as:
PostgreSQL wiki
ESF Database Migration Toolkit
Can you install another MySQL, load the dump there, and export the data in CSV format? This should allow you to load the data into PostgreSQL, using the COPY command.
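A rough sketch of that round trip, with table and file names as placeholders (the MySQL account needs the FILE privilege to write the OUTFILE, and the path given to COPY must be readable by the PostgreSQL server; psql's \copy is a client-side alternative):

-- On the temporary MySQL instance: dump the table to CSV.
SELECT * FROM mytable
INTO OUTFILE '/tmp/mytable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- On PostgreSQL: load the CSV into the table created from your schema file.
COPY mytable FROM '/tmp/mytable.csv' WITH (FORMAT csv);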
I want to be able to use the LOAD DATA INFILE command in MySQL, but instead of loading the data from a local file I want to load it from a CSV file stored in S3.
I.e., if the file were in local storage, it would look like:
LOAD DATA INFILE 'C:\\abc.csv' INTO TABLE abc
But if it's in S3, I'm not sure how I could do something like this.
Is this possible?
NOTE: this is not an RDS machine, so this command does not seem to work:
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-template-copys3tords.html
The mysql CLI allows you to execute STDIN as a stream of SQL statements.
Using a combination of the aws s3 CLI and mkfifo, you can stream data out of S3.
Then it's a simple matter of connecting the streams with something that reformats the CSV into valid SQL.
mkfifo /tmp/mypipe
aws s3 cp s3://your/s3/object - > /tmp/mypipe &   # stream the object to stdout and into the pipe; background it so the reader below can start
python transform_csv_to_sql.py < /tmp/mypipe | mysql target_database
You might be able to remove the Python step and let MySQL parse the CSV itself if you tell it to load the data directly from your fifo:
mkfifo /tmp/mypipe
aws s3 cp s3://your/s3/object - > /tmp/mypipe &   # same streaming copy, run in the background
mysql target_database --execute "LOAD DATA INFILE '/tmp/mypipe' INTO TABLE your_table"   # your_table is a placeholder for the target table
Good luck!
Hi, I need to copy data from a CSV file into a MySQL table. The copy has to be done remotely; my CSV file is not present on the DB server. Any idea how to do it?
I can copy from a CSV file locally, but not from a different machine.
Thanks
You have many options. If you have phpMyAdmin or another SQL manager, that is the easier variant. Otherwise, you could copy the file to the database server and use the mysql command-line tools to import the data.
Take a look at LOAD DATA. By using the LOCAL option, you specify that it's a local file and not a file located on the DB server. The FIELDS TERMINATED BY and other options let you adjust to the format of your .csv file.
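For example, a statement along these lines (database, table, and file names are placeholders, and local_infile must be enabled on both client and server) loads a comma-separated file from the client machine:

LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip a header row, if the file has one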