phpMyAdmin tool slow in importing - MySQL

One of my customers reports that importing a 1 MB CSV file takes far longer than usual (more than 30 minutes), which is abnormal. When the format type "CSV using LOAD DATA" is selected, it throws an error.
SQL query
LOAD DATA INFILE '/tmp/phpzfQQR8' INTO TABLE `course1` FIELDS TERMINATED BY ';'
ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\r\n'
MySQL said:
1085 - The file '/tmp/phpzfQQR8' must be in the database directory or be readable by all
It actually works the other way, just abnormally slowly. Could someone help me with this?

Instead of the "CSV using LOAD DATA" format type, try plain "CSV".
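If you run the import yourself rather than through phpMyAdmin's uploader, another way around error 1085 is the LOCAL keyword, which makes the client read the file instead of the server, so the file no longer has to sit in the database directory or be world-readable. A minimal sketch, assuming the CSV is on the client machine, local_infile is enabled on both client and server, and the path is a placeholder:

LOAD DATA LOCAL INFILE '/path/to/course1.csv'  -- placeholder path on the client machine
INTO TABLE `course1`
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';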

Related

Different numbers of rows when importing csv

I ran into a problem that I don't understand. I have a .csv file that is about 10 GB in size, and I'm loading it into the database. When I use the InnoDB engine, 1,863,941 rows are added, but when I use the Aria engine, 1,765,972 rows are added.
I can't tell how many rows the .csv file has because it is so big.
The databases I use: MySQL 5.7.24 / MariaDB 10.3.12
The SQL command I use:
LOAD DATA LOCAL INFILE '$file'   -- $file and $table are PHP variables in the script that builds this query
INTO TABLE $table
CHARACTER SET UTF8
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
I am not getting any error code when running the SQL.
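One thing worth checking before comparing engines: LOAD DATA reports how many records it read, skipped and warned about, and rows that were silently dropped or truncated usually show up as warnings. A sketch of the checks, assuming the load is run from the mysql command-line client and your_table stands in for the real table name:

SHOW WARNINGS LIMIT 20;          -- run immediately after the LOAD DATA statement
SELECT COUNT(*) FROM your_table; -- compare against the "Records:" figure in the load summary

If the two engines report different warning counts, the difference in row counts may be coming from rows rejected at load time rather than from the engine itself.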

MySQL does not read all the data in a massive CSV file using LOAD DATA INFILE

I have a total of 6,077 rows of data. However, after running this command:
LOAD DATA INFILE '..admin staff.csv' INTO TABLE `adstaff` FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'
The total data read is over 1 million rows, but when I check my database it only contains 4,033 rows of real data, and every row after that is NULL.
I am utterly confused about where the other 2,000+ rows went, because they do not come up when I search manually. I hope someone can help me out, thank you.
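When LOAD DATA reads far more rows than the file has records and the extra rows are mostly NULL, one frequent cause is records being split in the middle, for example by embedded line breaks or quote characters inside fields that the ENCLOSED BY / ESCAPED BY settings don't account for. A quick way to see what happened is to look at the rows right where the real data stops; a sketch, assuming a hypothetical auto-increment column named id:

SELECT COUNT(*) FROM `adstaff`;                           -- how many rows actually landed in the table
SELECT * FROM `adstaff` ORDER BY id LIMIT 10 OFFSET 4030; -- inspect the boundary where real data turns into NULL fragments

If the rows after the boundary look like pieces of earlier records, the fix is usually in the FIELDS/LINES clauses rather than in the data.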

MySQL importing large csv

I have a large CSV file I am trying to import into MySQL (around 45 GB, around 150 million rows; most columns are small, but one holds variable-length text that can reach several KB). I am using LOAD DATA LOCAL INFILE to import it, but the server always times out my connection before it finishes. I have tried modifying the global connection timeout variables to fix this, but it already runs for some hours before timing out. Is there another way to import a database this large, or am I doing something wrong with this approach?
LOAD DATA LOCAL INFILE 'filename.csv' INTO TABLE table CHARACTER SET latin1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';
I am executing this command on Windows 10, with the MySQL command line. My MySQL version is 8.0.
The way I have handled this in the past is by writing a PHP script that reads the file, writes the top 50% of the rows into a new file, and then deletes those rows from the original. Then perform two LOAD DATA INFILE runs, one for the original file and one for the new one, as sketched below.
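A minimal sketch of the two-load step, assuming the split has already been done; filename_part1.csv, filename_part2.csv and my_table are placeholder names:

LOAD DATA LOCAL INFILE 'filename_part1.csv'   -- first half (placeholder name)
INTO TABLE my_table CHARACTER SET latin1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';

LOAD DATA LOCAL INFILE 'filename_part2.csv'   -- second half (placeholder name)
INTO TABLE my_table CHARACTER SET latin1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';

Each load then covers roughly half the data, so each connection has to stay open for a correspondingly shorter time.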

skip header of CSV file to be imported in MySQL Workbench

What I did is to type this query:
SELECT * FROM report;
then click Import Records from an External File in MySQL Workbench. It worked, and the data was imported into my database. The problem is that the header row is also imported.
What should I do to import everything except the header?
I've never really used MySQL Workbench, but the best way to import a .csv is with LOAD DATA INFILE.
Example:
LOAD DATA INFILE '{$file_path}' INTO TABLE {$table} FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES
Note that your LINES TERMINATED BY may be different based on the way your .csv is formatted. It could also be '\n' or, less commonly, '\r'.

different line break characters in CSVs

I use LOAD DATA LOCAL INFILE to upload csv files to MySQL.
If the csv was created by a mac, I include LINES TERMINATED BY '\r' in the query.
If the csv was created by MS Office, I include LINES TERMINATED BY '\n' in the query.
If the csv was created in Open Office, I omit LINES TERMINATED BY altogether.
Is there a way I can formulate my query so the csv will be uploaded regardless of how it was created?
Look up line-termination stripping procedures. There are a few resources out there that look at \r\n issues and how to handle them across different toolchains and character sets.
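One query-level workaround that covers both Unix ('\n') and Windows ('\r\n') files in a single statement is to terminate on '\n' and strip any trailing carriage return from the last column with a SET clause. A sketch with hypothetical table and column names (files from classic Mac tools that end lines with a bare '\r' would still need separate handling):

LOAD DATA LOCAL INFILE '/path/to/file.csv'      -- placeholder path
INTO TABLE my_table                             -- hypothetical table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(col_a, col_b, @raw_last)                       -- hypothetical columns; last field captured into a user variable
SET col_c = TRIM(TRAILING '\r' FROM @raw_last); -- drop the '\r' that a Windows-style file leaves behind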