How do I insert into a MySQL db from a file? - mysql

My table has around 190 fields, of which only 30 cannot be null, and around 7k rows are to be inserted.
I am using JDBC. Is there any way of loading all of these directly from a text file in a single INSERT instead of executing an INSERT statement 7k times?

If you want to do it with a single query, you can use LOAD DATA INFILE. It is a very fast method of importing a file; however, the file has to be properly formatted.
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
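A minimal sketch of such a statement, assuming a tab-delimited text file and hypothetical table/column names (nullable fields can be written as \N in the file):

LOAD DATA INFILE '/path/to/rows.txt'
INTO TABLE mytable
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(col1, col2, col3);  -- list every column present in the file, in order

From JDBC this can be issued like any other statement via Statement.execute(). If the file sits on the client machine rather than the server, use LOAD DATA LOCAL INFILE and enable Connector/J's allowLoadLocalInfile connection property.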

Related

Individual MySQL INSERT statements vs writing to local CSV first and then LOAD DATA

I'm trying to extract information from 50 million HTML files into a MySQL database. My question is: at what point in the process should I store the information in the MySQL database? For example, I'm considering these options:
Open each file and extract the information I need. Perform an INSERT after each file gets parsed.
Open each file and extract the information I need. Store the information into a CSV file as an intermediary. After all the files have been parsed into the CSV, perform a bulk upload using LOAD DATA INFILE.
I know that LOAD DATA INFILE is much faster than individual INSERT statements if I already have the information in a CSV. However, if I don't have the information already in a CSV, I don't know if it's faster to create the CSV first.
At the crux of the question: Is writing to a local CSV faster or about the same as a single INSERT statement?
I'm using PHP in case it matters. Thanks in advance!
The key is not to do one insert per entry, but to batch the entries in memory and then perform a batch insert.
See: https://dev.mysql.com/doc/refman/5.7/en/insert.html
INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas. Example:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
ORMs like SQLAlchemy or Hibernate are smart enough (depending on configuration) to automatically batch your inserts.
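As a sketch of the idea in plain SQL (table and column names hypothetical), wrapping the batched multi-row INSERTs in one transaction also avoids a commit per statement:

START TRANSACTION;
INSERT INTO pages (url, title) VALUES
('http://example.com/a', 'Page A'),
('http://example.com/b', 'Page B');
INSERT INTO pages (url, title) VALUES
('http://example.com/c', 'Page C'),
('http://example.com/d', 'Page D');
COMMIT;

Note that a single multi-row statement cannot exceed max_allowed_packet, so batches of a few hundred to a few thousand rows per INSERT are typical.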

How to replace a Column simultaneously with LOAD INFILE in MySQL

Suppose we have a table with a DECIMAL column holding values such as 128.98, 283.98, 21.20.
I want to import some CSV files into this table. However, in the columns of these files, I have values like 235,69 and 23,23, with a comma instead of a point as the decimal separator.
I know I can REPLACE that column after loading, but is there some way of doing that during LOAD INFILE itself?
One option is to do it in multiple steps; a SQL sketch follows below.
Load the data first into a raw staging table using the LOAD DATA INFILE command. This table can be identical to the main table; you can use the CREATE TABLE ... LIKE command to create it.
Process the data in the raw table (i.e. change the comma to a point where applicable).
Select the data from the raw table and insert it into the main table, either with row-by-row processing or a bulk insert.
This can all be done in a stored procedure (SP) or by a 3rd-party script written in Python, PHP, etc.
If you want to know more about SPs in MySQL, here is a useful link.
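A SQL sketch of those steps, with hypothetical table and column names (assuming the files use ';' as the field separator, as is common when ',' is the decimal separator):

CREATE TABLE prices_raw LIKE prices;
ALTER TABLE prices_raw MODIFY amount VARCHAR(20);  -- so '235,69' loads intact
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE prices_raw
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n';
INSERT INTO prices (amount)
SELECT REPLACE(amount, ',', '.') FROM prices_raw;

Alternatively, LOAD DATA INFILE's SET clause can apply the replacement during the load itself, by reading the field into a user variable and converting it:

LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE prices
FIELDS TERMINATED BY ';'
(@amount)
SET amount = REPLACE(@amount, ',', '.');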

import csv file with LOAD DATA LOCAL INFILE in symfony 1.4

I need to fill several tables from CSV files. I tried using a loop that does an INSERT for each row, but a file with 65,000 records takes me more than 20 minutes.
I want to use the MySQL command LOAD DATA LOCAL INFILE, but I received this message:
LOAD DATA LOCAL INFILE forbidden in C:\xampp\htdocs\myProject\apps\backend\modules\member\actions\actions.class.php on line 112
After a little research, I understand I need to set one of PDO's security parameters (PDO::MYSQL_ATTR_LOCAL_INFILE) to true.
In Symfony2 you would change it in your app's config.yml, but I can't find the equivalent in symfony 1.4.
Let me try to understand the question (or questions?!).
If you need to optimize the INSERT queries, you should batch them into a single INSERT query (or a few of them), but definitely not one per row. Beyond that, INSERT speed in MySQL depends on indexing, the storage engine, and the schema structure of the DB, and will always be comparatively slow for large amounts of data.
About the second question, take a look here, maybe it will help.
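Note also that, independent of the client-side PDO flag, the MySQL server itself must permit LOCAL loads. A quick check in SQL (needs sufficient privileges):

SHOW GLOBAL VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;  -- enable LOCAL loads if it is OFF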

How to load data into a MySQL table without spaces?

I have a text file to be imported in a MySQL table. The columns of the files are comma delimited. I set up an appropriate table and I used the command:
load data LOCAL INFILE 'myfile.txt' into table mytable FIELDS TERMINATED BY ',';
The problem is, there are several spaces in the text file, before and after the data on each column, and it seems that the spaces are all imported in the tables (and that is not what I want). Is there a way to load the file without the empty spaces (other than processing each row of the text file before importing in MySQL)?
As far as I understand, there's no way to do this dynamically during the actual load of the data file (I've looked, as well).
It seems the best way to handle this is to either use the SET clause with the TRIM function (SET column2 = TRIM(column2)) or run an UPDATE on the string columns after loading, using the TRIM() function.
You can also create a stored procedure using prepared statements to run the TRIM function on all columns in a specified table, immediately after loading it.
You would essentially pass in the table name as a variable, and the SP would use the information_schema database to determine which columns to update.
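A sketch of such a procedure (names are illustrative; it trims every CHAR/VARCHAR/TEXT column of the given table via a prepared UPDATE):

DELIMITER //
CREATE PROCEDURE trim_all_columns(IN p_table VARCHAR(64))
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE v_col VARCHAR(64);
  -- Find the string columns of the target table in the current schema.
  DECLARE cur CURSOR FOR
    SELECT column_name FROM information_schema.columns
    WHERE table_schema = DATABASE()
      AND table_name = p_table
      AND data_type IN ('char', 'varchar', 'text');
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
  OPEN cur;
  read_loop: LOOP
    FETCH cur INTO v_col;
    IF done THEN LEAVE read_loop; END IF;
    -- Build and run "UPDATE `table` SET `col` = TRIM(`col`)" for each column.
    SET @sql = CONCAT('UPDATE `', p_table, '` SET `', v_col, '` = TRIM(`', v_col, '`)');
    PREPARE stmt FROM @sql;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;
  END LOOP;
  CLOSE cur;
END //
DELIMITER ;

CALL trim_all_columns('mytable');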
If you can use .NET, CSVReader is a great option (http://www.codeproject.com/KB/database/CsvReader.aspx). You can read data from a CSV and specify delimiter, trimming options, etc. In your case, you could choose to trim left and right spaces from each value. You can then either save the result to a new text file and import it into the database, or loop through the CsvReader object and insert each row into the database directly. The performance of CsvReader is impressive. Hope this helps.

populating mysql database

I have a file with over a million lines of data; each line is a record.
I can go through the file, read each line, and do an INSERT, but this can take up to 2 hours. Is there a faster way, like uploading an SQL file?
Use LOAD DATA INFILE
Alternatively, you can use find-and-replace in a text editor to build a multi-row INSERT statement around the data.
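A minimal sketch of the LOAD DATA approach, with a hypothetical file path, table, and columns (each line of the file assumed to be comma-delimited):

LOAD DATA INFILE '/var/lib/mysql-files/records.txt'
INTO TABLE records
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(col1, col2, col3);

For loads this large it can also help to SET unique_checks=0 and foreign_key_checks=0 for the session beforehand and re-enable them afterwards.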