LOAD DATA LOCAL INFILE with a condition in MySQL 5.0

I have a question.
I have to load a file into a MySQL 5.0 database,
but the file contains two kinds of records, header and detail lines.
It looks like this:
H12345678900TYPE
L12334567TYPE
and the file is not delimited; it is fixed position.
I want to load only the lines that start with H and whose type is TYPE.
Is there any way to check this inside the LOAD DATA query?
I tried WHERE SUBSTR(@var1,1,1)='H'
but it gives a syntax error.
Any suggestions?
Thanks

It would be better to load the file into a temporary table first, and then, in a second step, import from the temporary table into your main table while applying the condition. LOAD DATA itself has no WHERE clause, which is why your attempt produces a syntax error.
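A minimal sketch of that two-step approach, assuming the record ID sits at positions 2-12 and the type at positions 13-16 (the staging and main_table names, the column names, and the SUBSTR offsets are all illustrative; adjust them to your actual layout):

CREATE TEMPORARY TABLE staging (line VARCHAR(255));

LOAD DATA LOCAL INFILE '/path/to/file.txt'
INTO TABLE staging
LINES TERMINATED BY '\n'
(line);

-- LOAD DATA has no WHERE clause, so filter while copying out of staging;
-- table/column names and SUBSTR offsets here are illustrative
INSERT INTO main_table (record_id, record_type)
SELECT SUBSTR(line, 2, 11), SUBSTR(line, 13, 4)
FROM staging
WHERE SUBSTR(line, 1, 1) = 'H'
  AND SUBSTR(line, 13, 4) = 'TYPE';

DROP TEMPORARY TABLE staging;

Since the fixed-width lines contain no tab characters, the default field terminator leaves each whole line in the single staging column.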

Related

Erasing records from text file after importing it to MySQL database

I know how to import a text file into MySQL database by using the command
LOAD DATA LOCAL INFILE '/home/admin/Desktop/data.txt' INTO TABLE data
The above command writes the records of the file "data.txt" into the MySQL database table. My question is that I want to erase the records from the .txt file once they are stored in the database.
For example: if there are 10 records and at the current point in time 4 of them have been written into the database table, I require that these 4 records get erased from data.txt at the same time. (In a way, the text file acts as a "queue".) How can I accomplish this? Could this be done in Java, or should a scripting language be used?
Automating this is not too difficult, but it is also not trivial. You'll need something (a program, a script, ...) that can
Read the records from the original file,
Check whether they were inserted and, if they were not, copy them to another file,
Rename or delete the original file, and rename the new file to replace the original one.
There might be better ways of achieving what you want to do, but that's not something I can comment on without knowing your goal.

Import HDFS data file into MySQL

I am trying to import a large HDFS file into a MySQL db. The data in the file is delimited by '^A' (Ctrl-A). How do I tell MySQL to separate each column by Ctrl-A? Also, is it possible for me to specify which fields I want to import?
See the documentation here:
http://dev.mysql.com/doc/refman/5.5/en/mysqlimport.html
You are looking for the --fields-terminated-by=string option. There is no option to import only certain fields, though you can use --columns=column_list to map columns in your data to fields in the table.
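mysqlimport is essentially a command-line wrapper around LOAD DATA INFILE, so the equivalent SQL makes both options concrete. A sketch, with a hypothetical table and three-field rows; Ctrl-A is byte 0x01, which can be written as the hex literal X'01':

LOAD DATA LOCAL INFILE '/path/to/hdfs_export.txt'
INTO TABLE my_table            -- hypothetical table name
FIELDS TERMINATED BY X'01'     -- Ctrl-A delimiter
(col1, @discard, col3);        -- col1/col3 are hypothetical column names

Every field in the file still has to appear in the column list, but fields you route into a user variable such as @discard never reach the table, which effectively skips them.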

PhpMyAdmin data import performance issues

Originally, my question was related to the fact that PhpMyAdmin's SQL section wasn't working properly. As suggested in the comments, I realized that the amount of input was impossible to handle. However, this didn't give me a working solution for dealing with files that have (in my case) 35 thousand record lines in CSV format:
...
20120509,126,1590.6,0
20120509,127,1590.7,1
20120509,129,1590.7,6
...
The Import option in PhpMyAdmin struggles just as the basic copy-paste input in the SQL section does. This time, same as previously, it runs for 5 minutes until the max execution time is reached, and then it stops. What is interesting, though, is that it adds about 6-7 thousand records into the table. So the input actually goes through, and almost successfully. I also tried halving the amount of data in the file; nothing changed, however.
There is clearly something wrong now. It is pretty annoying to have to manipulate the data in a PHP script when a simple data import does not work.
Change your php upload max size.
Do you know where your php.ini file is?
First of all, try putting this file into your web root:
phpinfo.php
( see http://php.net/manual/en/function.phpinfo.php )
containing:
<?php
phpinfo();
?>
Then navigate to http://www.yoursite.com/phpinfo.php
Look for "php.ini".
To upload large files you need to increase max_execution_time, post_max_size, and upload_max_filesize.
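For example, in php.ini (the values are only illustrative; pick limits that fit your files):

max_execution_time = 300
post_max_size = 64M
upload_max_filesize = 64M

Remember to restart the web server after editing php.ini so the new limits take effect.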
Also, do you know where your error.log file is? It would hopefully give you a clue as to what is going wrong.
EDIT:
Here is the query I use for the file import:
$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name` FIELDS TERMINATED BY ',' OPTIONALLY
ENCLOSED BY '\"' LINES TERMINATED BY '$nl'";
Here $file_name is the temporary filename from the PHP global variable $_FILES, $table_name is the table already prepared for the import, and $nl is a variable for the CSV line endings (defaulting to Windows line endings, but I have an option to select Linux line endings).
The other thing is that the table ($table_name) in my script is prepared in advance by first scanning the csv to determine column types. After it determines appropriate column types, it creates the MySQL table to receive the data.
I suggest you try creating the MySQL table definition first, to match what's in the file (data types, character lengths, etc). Then try the above query and see how fast it runs. I don't know how much of a factor the MySQL table definition is on speed.
Also, I have no indexes defined in the table until AFTER the data is loaded. Indexes slow down data loading.
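For the CSV sample above, that create-first, load, then index sequence might look like this (the table name, column names, and types are guesses from the sample rows):

CREATE TABLE prices (        -- hypothetical definition inferred from the sample
  trade_date INT,
  seq        INT,
  price      DECIMAL(10,1),
  flag       INT
);

LOAD DATA LOCAL INFILE '/path/to/data.csv' INTO TABLE prices
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';

-- add indexes only after the bulk load; maintaining them row by row slows the load down
ALTER TABLE prices ADD INDEX idx_trade_date (trade_date);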

BigDump - UNEXPECTED: Can't set file pointer behind the end of file

While trying to start uploading a 3.9 GB SQL file via BigDump, I get this error:
UNEXPECTED: Can't set file pointer behind the end of file
The dump of the database was exported from phpMyAdmin. The file is not corrupted. What is the problem? What are other ways to import such a big database?
BigDump uses an INSERT INTO table VALUES (...) kind of method.
This is a very slow way of inserting!
Use
LOAD DATA INFILE 'c:/filename.csv' INTO TABLE table1
Instead. Note the use of forward slashes even on Windows.
See: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
This is the fastest way possible to insert data into a MySQL table.
It will only work if the input file is on the same server as the MySQL server though.
I got a similar error: I can't seek in the .sql file.
The reason for this error is that BigDump tries to set the pointer at the end of the .sql file and then find out its size (using the fseek() and ftell() functions). As fseek() fails when you work with files over 2 GB, you get this error. The solution is to split your SQL file into chunks of 1.5-2 GB...

Can I import tab-separated files into MySQL without creating database tables first?

As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use CREATE TABLE statements to set up all the tables manually, I can then import the files into the waiting tables using LOAD DATA or mysqlimport.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (That then creates the necessary CREATE statements).
If you're willing to type the data types in the first row, why not type a proper CREATE TABLE statement?
Then you can export the Excel data as a .txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;
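A sketch of that approach for a tab-separated file (the table definition is hypothetical; match it to your actual columns). If the first row carries the type hints rather than data, skip it with IGNORE 1 LINES:

CREATE TABLE my_data (       -- hypothetical definition matching the file
  name    VARCHAR(100),
  amount  DECIMAL(10,2),
  created DATE
);

LOAD DATA LOCAL INFILE '/path/to/file.txt' INTO TABLE my_data
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;              -- skip the header row holding the type hints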