MySQL insert from a CSV file

I have a CSV file I want to load into a database, but I can't seem to get a row value into the INSERT statement:
for row in readCSV:
    cur.execute("INSERT INTO IPaddresses(Start) VALUES(row[0])")
I am using the pymysql module.

Not sure why you're not using LOAD DATA [LOCAL] INFILE; it would be much easier. Check out the command here: http://dev.mysql.com/doc/refman/5.0/en/load-data.html, and someone with a similar problem here: Python/MySQL - LOAD DATA LOCAL INFILE
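If you do go the LOAD DATA LOCAL INFILE route from pymysql, here is a minimal sketch (the file path, credentials, and single-column layout are assumptions; note that LOCAL must be allowed on both client and server):

import pymysql

# local_infile=True enables the client side of LOAD DATA LOCAL INFILE;
# the server must also allow it (the local_infile system variable)
conn = pymysql.connect(host="localhost", user="user", password="secret",
                       database="mydb", local_infile=True)
cur = conn.cursor()
cur.execute("""
    LOAD DATA LOCAL INFILE '/tmp/addresses.csv'
    INTO TABLE IPaddresses
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\\n'
    (Start)
""")
conn.commit()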

What exactly happens when you run the code? Errors? Unexpected values in your table?
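For the record, the immediate bug in the posted code is that row[0] sits inside the SQL string literal, so MySQL receives the literal text "row[0]" rather than the value. A minimal sketch of the parameterized fix, assuming a plain local connection and a file named addresses.csv (both made up):

import csv
import pymysql

conn = pymysql.connect(host="localhost", user="user", password="secret",
                       database="mydb")
cur = conn.cursor()
with open("addresses.csv", newline="") as f:
    for row in csv.reader(f):
        # pass the value as a query parameter and let the driver escape it
        cur.execute("INSERT INTO IPaddresses(Start) VALUES (%s)", (row[0],))
conn.commit()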

Related

sqlite load data infile Syntax Error

I'm working on importing a very large CSV file into SQLite. My understanding is that LOAD DATA INFILE is my best bet. I've created a table for it to reside in, and am attempting to execute the following query:
LOAD DATA LOCAL INFILE 'F:/Downloads/NielsonReport.csv'
INTO TABLE neilson;
IGNORE 1 LINES
but I get the following error:
Error while executing SQL query on database 'test': near "LOAD": syntax error
I seem to be getting an error along these lines regardless of what I'm trying to execute.
I feel like I'm missing something very basic and would appreciate any help resolving this problem. (I've been referencing this page for information so far.)
When you are using SQLite, it would be a good idea to reference the SQLite documentation instead.
Anyway, SQLite itself does not have a CSV import statement. But the sqlite3 command-line shell allows you to import CSV files with the .import command.
Use the .import command like this:
.import '/Users/haseeb/Desktop/names_data.txt' Names
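If you are driving this from Python rather than the shell, a minimal sketch using only the standard csv and sqlite3 modules (it assumes the neilson table already exists with three columns; adjust the placeholders to match yours):

import csv
import sqlite3

conn = sqlite3.connect("test.db")
with open("NielsonReport.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row (the IGNORE 1 LINES part)
    conn.executemany("INSERT INTO neilson VALUES (?, ?, ?)", reader)
conn.commit()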

import csv file with LOAD DATA LOCAL INFILE in symfony 1.4

I need to fill several tables from CSV files. I tried a loop that does an INSERT for each row, but a file with 65,000 records takes more than 20 minutes.
I want to use the MySQL command LOAD DATA LOCAL INFILE, but I received this message:
LOAD DATA LOCAL INFILE forbidden in C:\xampp\htdocs\myProject\apps\backend\modules\member\actions\actions.class.php on line 112
After a little research, I understand that one of PDO's security parameters (PDO::MYSQL_ATTR_LOCAL_INFILE) needs to be set to true.
In Symfony 2 you would change it in your app's config.yml, but I can't find the equivalent in symfony 1.4.
Let me try to understand the question (or questions?!).
If you need to optimize the INSERT queries, you should batch them into a single INSERT query (or a few), but definitely not one per row; for example, INSERT INTO member (name, email) VALUES ('a', 'a@example.com'), ('b', 'b@example.com'); inserts two rows in one round trip. Beyond that, INSERT in MySQL will always be slow for large amounts of data, and the speed also depends on indexing, the storage engine, and the schema structure of the DB.
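To illustrate the batching point, here is a sketch in Python with pymysql (the advice itself is not PHP-specific; the table, columns, and file are made up):

import csv
import pymysql

conn = pymysql.connect(host="localhost", user="user", password="secret",
                       database="mydb")
cur = conn.cursor()
with open("members.csv", newline="") as f:
    rows = list(csv.reader(f))
# executemany folds a simple INSERT ... VALUES into multi-row statements,
# so 65,000 records become a handful of queries instead of 65,000
cur.executemany("INSERT INTO member (name, email) VALUES (%s, %s)", rows)
conn.commit()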
About the second question, take a look here, maybe it will help.

Upload huge data in mysql database

I have to upload crores of rows (tens of millions) into a MySQL table. The data is in .csv format. I tried the LOAD DATA INFILE method, but it is still taking a very long time. Is there any other way to upload the data?
You can try MySQL's LOAD DATA statement:
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed. The file name must be given as a literal string.
There is a MySQL utility called mysqlimport (http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html); I suggest trying that.
EDIT: Actually it is essentially a command-line interface to LOAD DATA INFILE, used roughly like mysqlimport --local --fields-terminated-by=',' --ignore-lines=1 mydb /tmp/tablename.csv (the table name is taken from the file name).
You can find some useful tuning tips here:
Speed of INSERT Statements
Bulk Data Loading for MyISAM Tables
Bulk Data Loading for InnoDB Tables
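For instance, the InnoDB page suggests switching off redundant checks around a bulk load. A sketch of that pattern with pymysql (paths and names are illustrative; only do this if the data is already known to satisfy the constraints):

import pymysql

conn = pymysql.connect(host="localhost", user="user", password="secret",
                       database="mydb", local_infile=True)
cur = conn.cursor()
cur.execute("SET unique_checks = 0")
cur.execute("SET foreign_key_checks = 0")
cur.execute("LOAD DATA LOCAL INFILE '/tmp/huge.csv' INTO TABLE big_table "
            "FIELDS TERMINATED BY ','")
cur.execute("SET foreign_key_checks = 1")
cur.execute("SET unique_checks = 1")
conn.commit()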
Hope this helps

BigDump - UNEXPECTED: Can't set file pointer behind the end of file

While trying to start uploading a 3.9 GB SQL file via BigDump, I get this error:
UNEXPECTED: Can't set file pointer behind the end of file
The database dump was exported from phpMyAdmin, and the file is not corrupted. What is the problem? What are other ways to import such a big database?
BigDump uses an INSERT INTO table VALUES (...) kind of method.
This is a very slow way of inserting!
Use
LOAD DATA INFILE 'c:/filename.csv' INTO TABLE table1
instead. Note the use of forward slashes, even on Windows.
See: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
This is the fastest way possible to insert data into a MySQL table.
It will only work if the input file is on the same machine as the MySQL server, though; to load a file that sits on the client, use LOAD DATA LOCAL INFILE.
I got a similar error: I can't seek into the .sql file.
The reason for this error is that BigDump tries to set the file pointer to the end of the .sql file to find out its size (using the fseek() and ftell() functions). Since fseek() fails on files over 2 GB, you get this error. The solution is to split your SQL file into chunks of about 1.5 to 2 GB, as sketched below.
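A rough splitting sketch in Python (an illustration only: it cuts on line boundaries, so a statement spanning the cut, such as a very long multi-row INSERT, would still need manual attention):

CHUNK_BYTES = 1_500_000_000  # about 1.5 GB per piece

part, written = 0, 0
out = open("dump.part%d.sql" % part, "w")
with open("dump.sql") as f:
    for line in f:
        if written + len(line) > CHUNK_BYTES:
            out.close()
            part += 1
            written = 0
            out = open("dump.part%d.sql" % part, "w")
        out.write(line)
        written += len(line)
out.close()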

populating mysql database

I have a file with over a million lines of data; each line is a record.
I can go through the file, read each line, and do an INSERT, but this can take up to 2 hours. Is there a faster way, like uploading an SQL file?
Use LOAD DATA INFILE.
Alternatively, you can use find-and-replace in an editor to build an INSERT statement around each line (or, better, fold many lines into one multi-row INSERT).
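Mechanically, the find-and-replace idea is just wrapping each line in SQL; a toy sketch (it assumes a one-column table and values that need no escaping, which is exactly why LOAD DATA INFILE is the safer answer):

with open("records.txt") as src, open("records.sql", "w") as dst:
    dst.write("INSERT INTO mytable (col1) VALUES\n")
    rows = [line.rstrip("\n") for line in src]
    dst.write(",\n".join("('%s')" % row for row in rows))
    dst.write(";\n")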