Different numbers of rows when importing CSV - MySQL

I ran into a problem that I don't understand. I have a .csv file that is about 10 GB in size, and I'm loading it into the database. When I use the InnoDB engine, 1,863,941 rows are added, but when I use the Aria engine, only 1,765,972 rows are added.
The .csv file is so big that I can't easily check how many rows it actually contains.
The databases I use: MySQL 5.7.24 and MariaDB 10.3.12.
The SQL command I use (inside a PHP string):
LOAD DATA LOCAL INFILE "'.$file.'"
INTO TABLE '.$table.'
CHARACTER SET UTF8
FIELDS TERMINATED by \',\'
ENCLOSED BY \'"\'
LINES TERMINATED BY \'\r\n\''
I am not getting any error when running the LOAD DATA statement.
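One way to narrow this down (an addition here, not part of the original post): LOAD DATA prints an info line with Records, Skipped, and Warnings counts, and rows can be dropped silently on duplicate keys or malformed lines. A minimal diagnostic sketch, with a hypothetical path and table name:

LOAD DATA LOCAL INFILE '/path/to/data.csv'   -- hypothetical path
INTO TABLE my_table                          -- hypothetical table name
CHARACTER SET UTF8
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
-- Check the info line: Records: N  Deleted: 0  Skipped: N  Warnings: N
SHOW WARNINGS;                               -- explains skipped or truncated rows
SELECT COUNT(*) FROM my_table;               -- compare the counts between engines

A mismatch in the Skipped or Warnings counts between the two runs would point at rows being silently rejected (for example on duplicate keys) rather than at a problem with the file itself.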

Related

MySQL importing large CSV

I have a large CSV file that I am trying to import into MySQL (around 45 GB, around 150 million rows; most columns are small, but one holds variable-length text that can reach several KB per value). I am using LOAD DATA LOCAL INFILE to import it, but the server always times out my connection before it finishes. I have tried raising the global connection timeout variables to fix this, but it still runs for some hours before timing out. Is there another way to import a dataset this large, or am I doing something wrong with this approach?
LOAD DATA LOCAL INFILE 'filename.csv' INTO TABLE table CHARACTER SET latin1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';
I am executing this command on Windows 10, with the MySQL command line. My MySQL version is 8.0.
The way I have handled this in the past is to write a PHP script that reads the file, writes the top 50% of the rows into a new file, and then deletes those rows from the original. Then perform two LOAD DATA INFILE runs, one for the original file and one for the new one, as sketched below.
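For illustration, a hedged sketch of that split-and-load step, reusing the question's clauses. The part-file names and my_table are placeholders, and it assumes the file has already been split by the script:

-- Each chunk is a separate statement, so a timeout only costs one chunk,
-- not the whole 45 GB load.
LOAD DATA LOCAL INFILE 'filename_part1.csv' INTO TABLE my_table CHARACTER SET latin1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';

LOAD DATA LOCAL INFILE 'filename_part2.csv' INTO TABLE my_table CHARACTER SET latin1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';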

How do I update a MySQL database from a CSV file?

Hello, I am wondering whether there is a way to update a MySQL database from a CSV file. So far I have created a database in MySQL Workbench, which is connected to the MySQL server, and I upload a CSV file into the database, which updates automatically on a Webmin web server.
What I want to ask is how I can get a new CSV file loaded into the database automatically.
Create a database table with the same number of columns as the CSV file.
Then run the following query:
LOAD DATA INFILE 'c:/import.csv' INTO TABLE <table> FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n'
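As a concrete illustration of those two steps; the table and column names here are invented, so match them to the actual CSV:

-- Step 1: a table with one column per CSV field (names and types are assumptions)
CREATE TABLE import_target (
  name  VARCHAR(255),
  email VARCHAR(255),
  city  VARCHAR(255)
);

-- Step 2: load the file into it
LOAD DATA INFILE 'c:/import.csv'
INTO TABLE import_target
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';

For the "automatically" part, this LOAD DATA statement can be scheduled, for example with a cron job or MySQL's event scheduler, to re-import the file periodically.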

Table `generic_table_name` doesn't exist using 'LOAD DATA LOCAL INFILE' for MySQL

I'm under the impression that this command will create a table for you. Is this the wrong idea? I have this:
LOAD DATA LOCAL INFILE '/Users/michael/WebstormProjects/d3_graphs_with_node/public/DFData/ADWORDS_AD_DATA.csv'
INTO TABLE ADWORDS_AD_DATA
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(campaign_id,adgroup_id,ad_group,campaign,account,week,keyword,keyword_state,match_type,max_cpc,clicks,impressions,ctr,avg_cpc,avg_cpm,cost,avg_position,quality_score,first_page_cpc,first_position_cpc,converted_clicks,click_conversion_rate,cost_converted_click,cost_all_conv,total_conv_value,conversions,cost_conv,conv_rate)
I counted the columns and there are 28, and I listed exactly 28 columns after IGNORE 1 LINES. I looked at these SO posts and tried to follow the syntax outlined in the MySQL documentation here. Any ideas?
mysql-load-local-infile
import-csv-to-mysql-table
Okay, I manually created the table, set the column names, and it worked!
A side note: I had two versions of MySQL running, and that was also confusing me about why the table/database I created in phpMyAdmin wasn't showing up in the MySQL shell on the command line:
MAMP MySQL vs. system MySQL
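For anyone hitting the same error: LOAD DATA never creates the target table; it must exist beforehand. A sketch of the missing CREATE TABLE step, where every column type is a guess to be adjusted to the real data:

CREATE TABLE ADWORDS_AD_DATA (        -- types below are assumptions only
  campaign_id           BIGINT,
  adgroup_id            BIGINT,
  ad_group              VARCHAR(255),
  campaign              VARCHAR(255),
  account               VARCHAR(255),
  week                  VARCHAR(32),
  keyword               VARCHAR(255),
  keyword_state         VARCHAR(50),
  match_type            VARCHAR(50),
  max_cpc               DECIMAL(10,2),
  clicks                INT,
  impressions           INT,
  ctr                   DECIMAL(8,4),
  avg_cpc               DECIMAL(10,2),
  avg_cpm               DECIMAL(10,2),
  cost                  DECIMAL(12,2),
  avg_position          DECIMAL(5,2),
  quality_score         TINYINT,
  first_page_cpc        DECIMAL(10,2),
  first_position_cpc    DECIMAL(10,2),
  converted_clicks      INT,
  click_conversion_rate DECIMAL(8,4),
  cost_converted_click  DECIMAL(10,2),
  cost_all_conv         DECIMAL(12,2),
  total_conv_value      DECIMAL(12,2),
  conversions           DECIMAL(10,2),
  cost_conv             DECIMAL(10,2),
  conv_rate             DECIMAL(8,4)
);

Once a table with these 28 columns exists, the LOAD DATA statement from the question should run against it.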

phpMyAdmin slow when importing

One of my customers reports that importing a 1 MB CSV file takes more than the usual time (over 30 minutes), which is abnormal. And when using the import format "CSV using LOAD DATA", it throws an error.
SQL query
LOAD DATA INFILE '/tmp/phpzfQQR8' INTO TABLE `course1` FIELDS TERMINATED BY ';'
ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\r\n'
MySQL said:
1085 - The file '/tmp/phpzfQQR8' must be in the database directory or be readable by all
It actually works with the other format, just abnormally slowly. Could someone help me with this quickly?
In phpMyAdmin's import dialog, instead of the "CSV using LOAD DATA" format, try plain "CSV".
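A side note not from the thread: error 1085 is about server-side file access, because plain LOAD DATA INFILE makes the MySQL server itself read the file. Adding LOCAL makes the client stream the file instead, which can sidestep that check, assuming local_infile is enabled on both client and server:

-- Same statement as above, but the client reads /tmp/phpzfQQR8 and streams it
-- to the server, avoiding the "must be in the database directory or be
-- readable by all" requirement.
LOAD DATA LOCAL INFILE '/tmp/phpzfQQR8' INTO TABLE `course1`
FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';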

MySQL LOAD DATA LOCAL INFILE vs. SQL file

Every day we load around 6GB of CSV files into MySQL using:
LOAD DATA LOCAL INFILE 'file$i.csv' INTO TABLE tableName FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
We have six files that go through this process, so it takes some time. Since we generate these files ourselves, we're in control of the format they are output in.
Originally we chose CSV because the process was smaller and we needed the data to be easy to move around and read by a non-developer. Now, however, that's less of a concern, since the loading time has become so dramatic; we're now talking hours.
Is it quicker to output each row as an INSERT statement into a single file and execute that, or is CSV still quicker?
We're using the InnoDB storage engine.
If you use MyISAM tables, try ALTER TABLE table_name DISABLE KEYS; before loading the data and ALTER TABLE table_name ENABLE KEYS; after the import is done. This greatly reduces the time taken for huge data sets.
LOAD DATA is faster than a separate INSERT statement for each row.
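Since the question says the tables are InnoDB, where DISABLE KEYS has no effect, a commonly used alternative (an addition here, not from the thread) is to relax checks around the load; this sketch assumes the incoming data is already clean:

SET autocommit = 0;          -- wrap the load in one transaction
SET unique_checks = 0;       -- skip per-row unique checks (data must truly be unique)
SET foreign_key_checks = 0;  -- skip FK validation during the load

LOAD DATA LOCAL INFILE 'file1.csv' INTO TABLE tableName
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';

COMMIT;
SET unique_checks = 1;       -- restore normal integrity checking
SET foreign_key_checks = 1;
SET autocommit = 1;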