Populating a MySQL database - mysql

I have a file with over a million lines of data; each line is a record.
I can go through the file, read each line, and do an INSERT, but this can take up to 2 hours. Is there a faster way, like uploading an SQL file?

Use LOAD DATA INFILE
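For example, assuming the records live in a tab-separated file at /tmp/records.txt and the target table is called mytable (both names are placeholders), the statement might look roughly like this:
LOAD DATA INFILE '/tmp/records.txt'
INTO TABLE mytable
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
-- use LOAD DATA LOCAL INFILE instead if the file sits on the client machine rather than the server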

You can use find-and-replace to build an INSERT statement around each line.
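As a rough sketch of that idea, if each line of the file looked like 1,John,2020-01-01, an editor find-and-replace could turn it into statements of this shape (the table and column names below are invented for illustration):
INSERT INTO mytable (id, name, created) VALUES (1, 'John', '2020-01-01');
INSERT INTO mytable (id, name, created) VALUES (2, 'Jane', '2020-01-02');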

Related

Erasing records from a text file after importing it into a MySQL database

I know how to import a text file into MySQL database by using the command
LOAD DATA LOCAL INFILE '/home/admin/Desktop/data.txt' INTO TABLE data
The above command will write the records of the file "data.txt" into the MySQL database table. My question is that I want to erase the records from the .txt file once they are stored in the database.
For example: if there are 10 records and at the current point in time 4 of them have been written into the database table, I want those 4 records to be erased from data.txt at the same time. (In a way, the text file acts as a "queue".) How can I accomplish this? Can it be done with Java code, or should a scripting language be used?
Automating this is not too difficult, but it is also not trivial. You'll need something (a program, a script, ...) that can:
Read the records from the original file,
Check whether they were inserted, and, if they were not, copy them to another file,
Rename or delete the original file, and rename the new file to replace the original one.
There might be better ways of achieving what you want to do, but that's not something I can comment on without knowing your goal.

Import a CSV file with LOAD DATA LOCAL INFILE in symfony 1.4

I need to fill several tables with CSV files. I tried to use a loop that does an INSERT for each row, but a file with 65,000 records takes me more than 20 minutes.
I want to use the MySQL command LOAD DATA LOCAL INFILE, but I received this message :
LOAD DATA LOCAL INFILE forbidden in C:\xampp\htdocs\myProject\apps\backend\modules\member\actions\actions.class.php on line 112
After a little research, I understand that one of the PDO security parameters (PDO::MYSQL_ATTR_LOCAL_INFILE) needs to be set to true.
In Symfony2 you would change it in your app's config.yml, but I can't find the equivalent in symfony 1.4.
Let me try to understand the question (or questions?!).
If you need to optimize the INSERT queries, you should probably batch them into a single INSERT query or a few of them, but definitely not one query per row. Besides, INSERT queries in MySQL will always be slow, especially for large amounts of inserted data; the speed also depends on the indexing, storage engine, and schema structure of the DB.
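For instance, instead of one INSERT per CSV row, rows can be grouped into one extended INSERT per batch; the table and column names here are only illustrative:
INSERT INTO member (first_name, last_name, email) VALUES
('Alice', 'Smith', 'alice@example.com'),
('Bob', 'Jones', 'bob@example.com'),
('Carol', 'Lee', 'carol@example.com');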
About the second question, take a look here, maybe it will help.

How do I insert into a MySQL DB from a file?

My table has around 190 fields, of which only 30 cannot be null, and around 7k rows are to be inserted.
I am using JDBC. Is there any way of loading all of these directly from a text file in a single statement instead of executing an INSERT statement 7k times?
If you want to do it with a single statement, you can use LOAD DATA INFILE. It is a very fast method of importing a file. However, the file has to be properly formatted.
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
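As a sketch of what a properly formatted load could look like for a wide table where the file only supplies some of the columns (the file path, table, and column names below are assumptions, not your actual schema):
LOAD DATA LOCAL INFILE '/path/to/data.txt'
INTO TABLE big_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(col1, col2, col3);
-- list only the columns present in the file; the remaining nullable columns keep their defaults or NULL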

Upload huge data into a MySQL database

I have to upload crores (tens of millions) of rows of data into a MySQL table. The data is in .csv format. I tried the LOAD DATA INFILE method, but it is also taking a very long time. Is there any other way to upload the data?
You can try the LOAD DATA statement of MySQL:
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed. The file name must be given as a literal string.
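For a .csv file, the statement might look roughly like the following; the file path, table name, and the quoting/header details are placeholders to adapt to your data:
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
-- IGNORE 1 LINES skips a header row; drop it if the file has none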
There is a MySQL utility called mysqlimport: http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html
I suggest trying that.
EDIT: Actually, it seems to be the same as LOAD DATA INFILE.
You can find some useful tuning tips here:
Speed of INSERT Statements
Bulk Data Loading for MyISAM Tables
Bulk Data Loading for InnoDB Tables
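To illustrate the kind of settings those pages discuss, a session is sometimes prepared like this before a large InnoDB load and restored afterwards (treat this as a sketch and check it against your own constraints):
SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;
-- ... run LOAD DATA INFILE or the bulk INSERTs here ...
COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;
SET autocommit = 1;
-- for MyISAM tables, ALTER TABLE t DISABLE KEYS / ENABLE KEYS around the load can help instead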
Hope this helps

Import a huge MySQL dump with a single INSERT

I need to import a huge MySQL dump (~500 MB) of a single table, but the problem is that the dump file has a single INSERT statement with multiple rows. When I try to import it into the database, it takes so long that I'm not sure whether it will ever finish or whether MySQL has just gone away. Thanks in advance.
Several things you can do:
Get a binary dump from the original and import that.
Get the original DB data files and copy them across.
Split the 500 MB file into several smaller ones by editing the single SQL INSERT statement into many SQL INSERT statements. All you need to do is close the VALUES list after, say, 100 rows of values, then put the 'INSERT INTO x' statement on the next line. Repeat; a schematic example is sketched below.
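To make that last option concrete, here is a before/after sketch; the table x and its columns are placeholders, and the batch size (2 rows here, ~100 in practice) is arbitrary:
-- original dump: one statement carrying every row
INSERT INTO x (id, name) VALUES
(1, 'row 1'),
(2, 'row 2'),
(3, 'row 3'),
(4, 'row 4');
-- split version: close the VALUES list every N rows and start a new statement
INSERT INTO x (id, name) VALUES
(1, 'row 1'),
(2, 'row 2');
INSERT INTO x (id, name) VALUES
(3, 'row 3'),
(4, 'row 4');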