I am trying to load a 75+ MB CSV file with 650k+ rows into a MySQL server on CentOS Linux. This is just one example; I've also tried the same with a 350 MB file of 1.75M rows.
I've tried both of the MySQL import methods below via the command line. Each time I import the same file into an empty table, the resulting row count is different (inconsistent).
Truncate table.
Import from the command line.
Check row count.
Repeat
Here are my results:
640,568
636,547
644,483
640,961
and on, and on, and on....
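For reference, the whole cycle can be scripted roughly like this (a bash sketch using the same placeholder credentials, database and table names as the commands below):

#!/bin/bash
# Rough sketch: repeat the truncate -> load -> count cycle and print the row count each pass.
for i in 1 2 3 4 5; do
    mysql -u root -ppassword DBNAME -e "TRUNCATE TABLE NAMEOFTABLE;"
    mysql -u root -ppassword DBNAME -e "LOAD DATA INFILE '/home/scripts/extracts/NAMEOFTABLE.CSV' INTO TABLE NAMEOFTABLE FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';"
    mysql -u root -ppassword DBNAME -e "SELECT COUNT(*) AS rows_after_load FROM NAMEOFTABLE;"
done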
I've tried these two different import methods:
mysql -u root -ppassword --execute="USE DBNAME;LOAD DATA INFILE '/home/scripts/extracts/NAMEOFTABLE.CSV' INTO TABLE NAMEOFTABLE FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';" --show-warnings
The output is empty: no warnings or errors (I have seen warnings from this setup before, so I know the flag works).
mysqlimport -u root -ppassword --fields-terminated-by=, --local DBNAME /home/scripts/extracts/NAMEOFTABLE.CSV
This will output:
DBNAME.NAMEOFTABLE: Records: 647435 Deleted: 0 Skipped: 0 Warnings: 647435
Which at first is exciting, but then I check the row count in the DB and it is only 640,480. :(
I've been at this for days. Thanks for your help.
Try this query to import the CSV from phpMyAdmin or any other MySQL client.
Before running the query, please make sure that a table matching the fields has already been created in the database.
LOAD DATA LOCAL INFILE '/home/scripts/extracts/NAMEOFTABLE.CSV' INTO TABLE `tbl_csvs`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(field1, field2, field3); -- replace with your comma-separated column names
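After the load finishes, it is also worth checking how many warnings were raised and what they actually say, since MySQL will often coerce or skip bad rows rather than fail the whole load. A quick check, run in the same client session as the LOAD DATA statement:

SHOW COUNT(*) WARNINGS;
SHOW WARNINGS LIMIT 20;
SELECT COUNT(*) FROM `tbl_csvs`;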
Related
I have a huge amount of data in MariaDB and I need to create dump files from queries. So far I've got something like this.
SELECT DISTINCT uretici INTO OUTFILE '/home/admin/web/example.com/public_html/public/manufacturers.sql' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' FROM data;
This gives me a file which can't be imported as SQL. I need an importable SQL file. What should I do about it?
You can use the following command
mysqldump -u [database_user] -p [database_name] | gzip > [filename_to_compress.sql.gz]
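To load that dump back in later, stream it through gunzip into the mysql client, for example:

gunzip < [filename_to_compress.sql.gz] | mysql -u [database_user] -p [database_name]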
I'm trying to bypass the --secure-file-priv option that MySQL has enabled in order to export a table into a .csv file.
I've been running SELECT * FROM final_table INTO OUTFILE 'test.csv' FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"' LINES TERMINATED BY '\r\n'; in the terminal, but it keeps returning: The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
Previously, I bypassed this error when importing files by running LOAD DATA LOCAL INFILE instead of LOAD DATA INFILE which had the same error and was wondering if there was a way to do it with exporting as well.
Thanks!
Answering my own question since I found a solution that might help out other people.
mysql -u root -p [database] -e '[SELECT * FROM TABLE_NAME]' > data_export.csv
got me exactly what I needed without disabling secure-file-priv!
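One thing to watch: in this mode the mysql client writes tab-separated columns, not commas. If you need a genuinely comma-separated file and the data itself contains no tabs or commas, a simple post-processing step is enough (a rough sketch):

mysql -u root -p [database] -e '[SELECT * FROM TABLE_NAME]' | tr '\t' ',' > data_export.csv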
I want to insert data from a CSV file, and I want to do it with this line in a bash script:
mysql -uusername -ppassword -e "LOAD DATA LOCAL INFILE 'CSVname.csv' INTO TABLE table_name FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"
But I'm not sure where in this string I can select which database to use; it only specifies the table.
I can get it to work if I enter the client manually, run "use dbname", and then the LOAD DATA statement.
Can anyone please help?
Thanks in advance!
A simple look at the manual would have helped you here; it did me.
All you need to do is add the database name in front of the table name like this:
mysql -uxxx -p yyy -e "LOAD DATA LOCAL INFILE 'CSVname.csv' INTO TABLE
databasename.table_name FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"
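Depending on your client and server configuration you may also need to allow LOCAL loads explicitly. If the statement is rejected with a message about local data loading being disabled, enabling it on the client (and local_infile on the server) is the usual fix, for example:

mysql -uxxx -p yyy --local-infile=1 -e "LOAD DATA LOCAL INFILE 'CSVname.csv' INTO TABLE
databasename.table_name FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"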
I'm trying to import a *.ods file using mysqlimport or LOAD DATA instead of the phpMyAdmin interface, because I need to automate the process.
This is my first attempt:
mysqlimport --ignore-lines=1 -u root -p DATABASE /home/luca/Scrivania/lettura.ods
mysqlimport can't load the file because there are two sheets; I need both, and I can't modify the file structure.
With phpMyAdmin I'm able to upload the file correctly: the content of the two sheets correctly fills the two tables. I know that when importing an *.ods file like this, phpMyAdmin uses the sheet name as the table name, but this is not the behavior of mysqlimport, which uses the file name instead.
So I've tried this:
mysql -u root -p -e "load data local infile '/home/luca/Scrivania/lettura.ods' into table TABLE_NAME" DATABASE
This returns no error, but the data in the table is totally inconsistent.
Any suggestions?
Thank You
MySQL doesn't know how to read spreadsheets. Your best bet is to use your spreadsheet program to export a CSV, then load that new CSV into the database. Specify comma-delimited loading in your query and try it:
LOAD DATA LOCAL INFILE '/home/luca/Scrivania/lettura.csv'
INTO TABLE TABLE_NAME
FIELDS TERMINATED BY ','
ENCLOSED BY '' ESCAPED BY '\\'
LINES STARTING BY ''
TERMINATED BY '\n';
The documentation for file loading contains tons of possible options, if you need them.
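Since the goal is automation, the spreadsheet-to-CSV step can be scripted as well. Assuming LibreOffice is installed on the machine, something like this converts the file from the command line (note that this route typically exports only the first sheet, so a two-sheet file may need to be split into two files first):

soffice --headless --convert-to csv --outdir /home/luca/Scrivania /home/luca/Scrivania/lettura.ods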
I am trying to build an automated CSV-to-MySQL import which runs every time the CSV file in a certain directory is updated.
Once the file is updated, the following line of code is executed in a bash script:
mysql -u root -p$MASTER_DB_PASSW < /usr/local/scripts/order.sql
The contents of order.sql are as follows:
use test;
truncate test.ORDER;
load data infile '/home/test/ORDER.csv' into table test.ORDER fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
IGNORE 1 LINES
(order_date);
For some reason, when I run order.sql manually in Workbench, it works perfectly... but when I run it on the server, the contents of the CSV file do not get loaded. Any tips/advice? Thanks in advance.
I solved it.
I had an error in my bash script. I found this by running:
bash -x scriptname
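For anyone else chasing the same kind of problem: besides bash -x, it helps to make the import step fail loudly rather than silently. A minimal sketch (not my original script; MASTER_DB_PASSW is assumed to be set earlier in it):

#!/bin/bash
set -euo pipefail   # abort on the first failing command or unset variable
# MASTER_DB_PASSW is assumed to be exported or defined earlier in the real script
mysql -u root -p"$MASTER_DB_PASSW" < /usr/local/scripts/order.sql \
    || { echo "order.sql import failed" >&2; exit 1; }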