MySQL LOAD DATA INFILE query running successfully but no output

I am running an SQL statement to read in a CSV file (in this case only 1 column with a heading) and import it into my database, where further manipulation of the data will occur through subsequent SQL statements.
However, I seem to be unable to load the CSV file into my DB, whether directly in MySQL Workbench, through my PHP website locally, or through the PHP website on another Mac in my network.
The interesting thing is that the query appears to run successfully: I get no errors on any of the platforms or computers, but no rows are affected.
I have done a lot of digging trying to solve the problem. Here is my current SQL code, and I will then talk through what I have tried.
LOAD DATA INFILE '/Users/Josh/Desktop/testcsv.csv'
INTO TABLE joinTest
FIELDS
TERMINATED BY ','
ENCLOSED BY '"'
LINES
TERMINATED BY '\r\n'
IGNORE 1 LINES
(interestName);
So this is me trying it in MySQL Workbench. In PHP I have an uploader and a variable which stores the location of the tmp file. This works, as I have echo'd it out and it all looks fine.
I've tried running it as
LOAD DATA INFILE
But it still doesn't affect any rows (it runs successfully). I've also changed the LINES TERMINATED BY to just \n, but it still will not affect any rows.
I can't understand why it is not affecting any rows, as my CSV file is readable by all and should be in the correct format (created in Excel and saved in CSV format).
Does anyone know what the potential problem could be?
If any more info is required I will respond with it ASAP. Thanks.

Right, so I discovered that Mac uses different line endings from Unix & Windows. I opened the CSV in Sublime Text 3 and discovered there was an option to change the line endings in the View options.
I set this to Unix, saved the file, and the terminator of \n worked. Unfortunately Sublime Text doesn't show the line endings as visible characters, so finding this was purely by chance.
I hope this helps anyone else who runs into this issue: make sure the line endings of the CSV match the line endings you are specifying in your LOAD DATA query.
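For reference, here is the same statement with the three common terminators spelled out (a sketch based on the query above; swap in whichever commented-out LINES clause matches how the file was actually saved):

LOAD DATA INFILE '/Users/Josh/Desktop/testcsv.csv'
INTO TABLE joinTest
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'       -- Unix-style endings (what the file has after the Sublime fix)
-- LINES TERMINATED BY '\r\n'  -- Windows-style endings
-- LINES TERMINATED BY '\r'    -- classic Mac-style endings
IGNORE 1 LINES
(interestName);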

How to export your database after you make some queries in MySQL Workbench?

I'm new to MySQL Workbench, so here is the situation:
I have 2 tables, 'student' and 'classes'. A student can have multiple classes, with student ID as the connecting field (a one-to-many relationship). I wrote some queries that join the two tables, and I want to export the result of my queries rather than the two tables themselves (which I got from the Data Export wizard).
I've tried to export to a CSV file using SQL but came across error 1290.
select teacher.student.U_id, teacher.student.U_id, teacher.student.F_name, teacher.student.L_name,
teacher.classess.days, teacher.classess.mor, teacher.classess.aft
from teacher.student, teacher.classess
where teacher.student.U_id=teacher.classess.U_id
INTO OUTFILE 'C:\Users\Eddie Vu\Downloads'
FIELDS ENCLOSED BY '"'
TERMINATED BY ';'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n';
I expect the output to be stored in a CSV file.
Please help, and thank you in advance.
Your command, if it worked, would create the file on the server machine. But probably that directory doesn't exist, MySQL isn't allowed to write to it, or the server is configured not to write files at all: error 1290 typically means the server is running with --secure-file-priv, which restricts where INTO OUTFILE may write.
But I suppose you want to export to a file on the client machine.
Note the section "Export/Import" in the little toolbar above the result grid and the little button with a disk in front of a grid.
If you click that button a dialog opens that allows you to save the current result in the grid to a file.
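If you do want a server-side file, check where the server is allowed to write and point INTO OUTFILE at a full file name inside that directory. A sketch: the path below is only an example of a typical Windows default, so use whatever SHOW VARIABLES reports, note the forward slashes, and the target file must not already exist.

SHOW VARIABLES LIKE 'secure_file_priv';

SELECT teacher.student.U_id, teacher.student.F_name, teacher.student.L_name,
       teacher.classess.days, teacher.classess.mor, teacher.classess.aft
INTO OUTFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/result.csv'
FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
FROM teacher.student
JOIN teacher.classess ON teacher.student.U_id = teacher.classess.U_id;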

.csv import has random Chinese characters (mysql)

I am currently making a MySQL database. When I import data from a .csv file holding my company's records, about half of it gets imported normally while the other half gets changed to the same Chinese characters. At first I thought it was the HeidiSQL tool, but after doing a manual load in MySQL I still had Chinese characters in the data. Here is an example of the load statement (sorry if it doesn't format right):
LOAD DATA LOCAL INFILE 'c:/text.csv'
INTO TABLE test
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
The path above is not the actual path, but it is similar to it. The path is correct, as no errors were given when the statement was executed.
After I imported the data I used SELECT and UNION ALL to display my data (with another table because that is one of the major goals of this project). Below is that code snippet:
SELECT PartNum, Description, Quantity, Price, '','' FROM Test1
UNION ALL
SELECT * FROM Test2
The two empty string literals are there because the Test1 table has 2 fewer columns than the Test2 table. I was getting an error because Test1 had fewer columns, and I read that this was a workaround. Now to the issue. My first instinct, upon seeing the Chinese characters, was to try a manual load (as opposed to HeidiSQL). After this did not help, my next thought was to check the .csv file. However, when I inspected the table I saw that it was arranged alphabetically, and my .csv file is not arranged that way, so I have no start point or end point to go off of when inspecting the .csv file. As a MySQL noob, I thought it would be quicker to ask if anyone knows of anything to help me get these rows back to English. Examples of the Chinese characters are the following: 䍏䱌䕃呏剓㩍䌱㔲㌰㐱㠺䵃ㄵ㈱㌭㈭㌰㐱㠭啎, 䙌䵉千䕌䱁久何区吱〰䵓, etc.
Had the same problem. Using mysqlimport instead solved the issue for me.
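If mysqlimport is not an option, another thing worth checking is the character set MySQL assumes for the file: LOAD DATA accepts a CHARACTER SET clause. A sketch, assuming the CSV was actually saved in some encoding other than the session default (the right name, e.g. latin1, utf8mb4, or utf16le, depends on how the file was produced):

LOAD DATA LOCAL INFILE 'c:/text.csv'
INTO TABLE test
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';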

If the query result is big it creates multiple lines

I am exporting an Oracle table with 165 rows into a delimiter-separated text file,
and then importing it into a MySQL table with the LOAD DATA INFILE command.
Now the problem is that a few rows are too long, so they get split across separate lines in the text file,
which creates problems during the import.
My text file data is pipe (|) separated and enclosed with double quotes (").
And it's on a Windows server.
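A LOAD DATA statement for a pipe-delimited, quote-enclosed file on Windows would look roughly like this (a sketch only; the table name, column list, and file path are placeholders, not taken from the original post):

LOAD DATA INFILE 'C:/exports/oracle_dump.txt'
INTO TABLE target_table
FIELDS TERMINATED BY '|' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(col1, col2, col3);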
Any help will be much appreciated.
There is nothing wrong with the Oracle export;
it is a problem with Notepad.
I opened the file in a different editor and every row is on a single line.

PhpMyAdmin data import performance issues

Originally, my question was related to the fact that PhpMyAdmin's SQL section wasn't working properly. As suggested in the comments, I realized that the amount of input was impossible for it to handle. However, this didn't provide me with a valid solution for how to deal with files that have (in my case) 35 thousand record lines in the following format (CSV):
...
20120509,126,1590.6,0
20120509,127,1590.7,1
20120509,129,1590.7,6
...
The Import option in PhpMyAdmin struggles just as the basic copy-paste input in the SQL section does. This time, same as previously, it takes 5 minutes until the max execution time is reached and then it stops. What is interesting, though, is that it adds about 6-7 thousand records into the table. So the input actually goes through and almost succeeds. I also tried halving the amount of data in the file. Nothing changed, however.
There is clearly something wrong now. It is pretty annoying to have to play with the data in a PHP script when a simple data import is not working.
Change your php upload max size.
Do you know where your php.ini file is?
First of all, try putting this file into your web root:
phpinfo.php
( see http://php.net/manual/en/function.phpinfo.php )
containing:
<?php
phpinfo();
?>
Then navigate to http://www.yoursite.com/phpinfo.php
Look for "php.ini".
To upload large files you need to increase max_execution_time, post_max_size, and upload_max_filesize.
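For example, in php.ini (these are standard directive names; the values below are examples only, so tune them to the file sizes and runtime you actually need):

upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 300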
Also, do you know where your error.log file is? It would hopefully give you a clue as to what is going wrong.
EDIT:
Here is the query I use for the file import:
$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name` FIELDS TERMINATED BY ',' OPTIONALLY
ENCLOSED BY '\"' LINES TERMINATED BY '$nl'";
Where $file_name is the temporary filename from the PHP global variable $_FILES, $table_name is the table already prepared for the import, and $nl is a variable for the CSV line endings (defaulting to Windows line endings, with an option to select Linux line endings).
The other thing is that the table ($table_name) in my script is prepared in advance by first scanning the csv to determine column types. After it determines appropriate column types, it creates the MySQL table to receive the data.
I suggest you try creating the MySQL table definition first, to match what's in the file (data types, character lengths, etc). Then try the above query and see how fast it runs. I don't know how much of a factor the MySQL table definition is on speed.
Also, I have no indexes defined in the table until AFTER the data is loaded. Indexes slow down data loading.
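A sketch of that order of operations, using a hypothetical table and column names chosen to match the four-column sample above (not the poster's actual schema):

CREATE TABLE price_history (
  trade_date INT,
  series_id  INT,
  price      DECIMAL(10,2),
  flag       TINYINT
);

LOAD DATA LOCAL INFILE '/tmp/prices.csv'
INTO TABLE price_history
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- Indexes are added only after the bulk load; maintaining them during the load slows it down.
ALTER TABLE price_history ADD INDEX idx_trade_date (trade_date);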

Use MySQL INTO OUTFILE to generate Excel-readable UTF8 data

I have a problem similar to this question. That is, I need to export some UTF8 data from a MySQL database to MS Excel.
The gotchas Excel kindly provides:
Excel opens UTF8-formatted CSV files as ANSI, thus breaking them
Excel will open tab-separated UTF8 files correctly, but there is no support for line breaks (my data has line breaks, though in a worst-case scenario I might be able to lose these)
Excel will, apparently, open UTF-16LE (little endian) encoded CSVs OK. However, so far as I know, MySQL INTO OUTFILE does not accept a content-encoding argument and just defaults to the database encoding (UTF8).
My web app is PHP-driven, but unfortunately I cannot use a PHP Excel-file-making library since the database is pretty large. All my exports must be done through MySQL.
If anybody knows how to make MySQL jump through Excel's hoops on this one, that would be great.
Many thanks,
Jack
Edit: This answer describes a solution that works for Excel 2007: adding a 'BOM' to the file, which I may be able to do by serving the outputted file to the client via a PHP script that adds the BOM. Ideally I would like to find a solution that works in 2003 also.
For handling line breaks I suggest adding:
FIELDS ENCLOSED BY '"'
A more complete example from the MySQL docs:
SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
I recall running into this issue with Excel. The BOM fix does work for Excel 2007 and 2010. We also wanted to support 2003; however, our solution was to just write XLS files instead of CSV files (using Java). That doesn't sound like an option for you since you're exporting from MySQL. One idea that comes to mind is to convert your UTF8 output to UTF-16LE after your export. This page explains how to do it with Perl.
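Depending on the MySQL version, it may also be possible to skip the separate conversion step: SELECT ... INTO OUTFILE accepts a CHARACTER SET clause, and utf16le is available as a character set in MySQL 5.6 and later. A sketch (the path and columns are placeholders; verify your server supports both pieces before relying on this):

SELECT a, b
INTO OUTFILE '/tmp/excel_export.csv'
CHARACTER SET utf16le
FIELDS TERMINATED BY '\t' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
FROM test_table;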