I am currently making a MySQL database. When I import data from a .csv file holding my company's records, about half of it is imported normally while the other half is changed into the same Chinese characters. At first I thought it was the HeidiSQL tool, but after doing a manual load in MySQL I still had Chinese characters in the data. Here is an example of the load statement (sorry if it doesn't format right):
LOAD DATA LOCAL INFILE 'c:/text.csv'
INTO TABLE test
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
The path above is not the actual path, but it is similar. The path itself is correct, as no errors were given when the statement was executed.
After I imported the data I used SELECT and UNION ALL to display my data (with another table because that is one of the major goals of this project). Below is that code snippet:
SELECT PartNum, Description, Quantity, Price, '','' FROM Test1
UNION ALL
SELECT * FROM Test2
The two empty single-quote columns are there because the Test1 table has two fewer columns than the Test2 table. I was getting an error because of that column mismatch, and I read that this was a workaround. Now to the issue. My first instinct, upon seeing the Chinese characters, was to try a manual load (as opposed to HeidiSQL). After this did not help, my next thought was to check the .csv file. However, the table is arranged alphabetically and my .csv file is not, so I have no start point or end point to go off of when inspecting the .csv file. As a MySQL noob, I thought it would be quicker to ask whether anyone knows of anything to help me get these rows back to English. Examples of the Chinese characters are the following: 䍏䱌䕃呏剓㩍䌱㔲㌰㐱㠺䵃ㄵ㈱㌭㈭㌰㐱㠭啎, 䙌䵉千䕌䱁久何区吱〰䵓, etc.
Had the same problem. Using mysqlimport instead solved the issue for me.
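For what it's worth, a minimal mysqlimport invocation mirroring the options in the LOAD DATA statement above might look like the line below. The database name (mydb), user, and file path are placeholders, and mysqlimport derives the table name from the file name, so the file would need to be called test.csv to land in the test table. If the garbling turns out to be an encoding mismatch, --default-character-set can also be set explicitly:

mysqlimport --local --fields-terminated-by=',' --fields-enclosed-by='"' --lines-terminated-by='\n' --default-character-set=utf8mb4 -u root -p mydb c:/test.csv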
I have a lot of data records and I want to import them into a table in my database. I'm using phpMyAdmin. I typed
LOAD DATA INFILE 'C:/Users/Asus/Desktop/cobacobaa.csv' INTO TABLE akdhis_kelanjutanstudi
but the result came back like this:

I do not know why it says I have a duplicate entry "0" for the primary key; there is actually no duplicate entry in my data. Here is a part of my data:

Could you please help me solve this problem? Thanks in advance.
I would guess that your primary key is a number. The problem would then be that the value starts with a double quote. When converting a string to a number, MySQL converts only the leading numeric characters; with no such characters, the value is zero.
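You can see that conversion behaviour directly with a couple of quick, illustrative queries:

SELECT '"123"' + 0;   -- the leading character is not a digit, so this yields 0
SELECT '123abc' + 0;  -- the leading digits are used, so this yields 123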
The following might fix your problem:
LOAD DATA INFILE 'C:/Users/Asus/Desktop/cobacobaa.csv'
INTO TABLE akdhis_kelanjutanstudi
FIELDS TERMINATED BY ',' ENCLOSED BY '"';
I usually load data into a staging table, where all the values are strings, and then convert the staging table into the final version. This may use a bit more space in the database but I find that the debugging process goes much faster when something goes wrong.
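A rough sketch of that staging approach for this case (the staging table and the target column names are invented here, since the real columns of akdhis_kelanjutanstudi aren't shown in the question):

-- Staging table: every value is a string, so nothing is mangled on the way in
CREATE TABLE akdhis_staging (
    id_raw     VARCHAR(50),
    nama_raw   VARCHAR(255),
    status_raw VARCHAR(50)
);

LOAD DATA INFILE 'C:/Users/Asus/Desktop/cobacobaa.csv'
INTO TABLE akdhis_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"';

-- Convert and copy into the real table once the values look right
INSERT INTO akdhis_kelanjutanstudi (id, nama, status)
SELECT CAST(id_raw AS UNSIGNED), nama_raw, status_raw
FROM akdhis_staging;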
I am running an SQL statement to read in a CSV file (in this case only 1 column with a heading) and import it into my database, where further manipulation of the data will occur through subsequent SQL statements.

However, I seem to be unable to load the CSV file into my DB, either directly in MySQL Workbench or through my PHP website, both locally and on another Mac in my network.

The interesting thing is that the query appears to run successfully, as I get no errors on any of the platforms or computers, but no rows are affected.
I have done a lot of digging in trying to solve the problem. Here is my current SQL code and I will then talk through what I have tried.
LOAD DATA INFILE '/Users/Josh/Desktop/testcsv.csv'
INTO TABLE joinTest
FIELDS
TERMINATED BY ','
ENCLOSED BY '"'
LINES
TERMINATED BY '\r\n'
IGNORE 1 LINES
(interestName);
So this is me trying it in MySQL Workbench. In PHP I have an uploader and a variable which stores the location of the tmp file. This works, as I have echoed it out and it all looks fine.
I've tried running it as
LOAD DATA INFILE
But it still doesn't affect any rows (it runs successfully). I've also changed the LINES TERMINATED BY value to just \n, but it still will not affect any rows.
I can't understand why it is not affecting any rows, as my CSV file is readable by all and should be in the correct format (created in Excel and saved in CSV format).
Does anyone know what the potential problem could be?
If any more info is required I will respond with it ASAP. Thanks.
Right, so I discovered that Mac uses different line endings from Unix and Windows. I opened the CSV in Sublime Text 3 and discovered there was an option to change the line endings in the View options.

I set this to Unix, saved the file, and the terminator of \n worked. Unfortunately, Sublime Text doesn't show line endings as visible characters, so finding this was purely by chance.
I hope this helps anyone else who runs into this issue, make sure the line endings of the CSV match the line endings you are specifying in your LOAD DATA query.
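Alternatively, if resaving the file is not an option, you can match the terminator to the file instead: older Mac-style files use a bare carriage return, so a query along these lines may work (a sketch, assuming the file really does use \r line endings):

LOAD DATA INFILE '/Users/Josh/Desktop/testcsv.csv'
INTO TABLE joinTest
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r'
IGNORE 1 LINES
(interestName);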
Currently, I have a CSV file with data in it. I want to turn it into a SQL table so I can run SQL queries on it. I want the table to be within a web-based database that others in my organization can also access. What's the easiest way to go from a CSV file to this end result? I would appreciate insight on setting up the database and table, giving others access, and getting the data inside. Preferably PostgreSQL, but MySQL is fine too.
How you create the table depends on the number of columns you have. If you have only a few, then do it manually:
CREATE TABLE <table name> (<column name> <column type, e.g. INT or VARCHAR(100)>, <etc.>);
If you have many columns you can open the csv file in excel and get 'SQL Converter for Excel' which will build a create statement for you using your column headings (and autodetect variable types too).
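For example, a manual definition might look like this (the table and column names here are purely illustrative):

CREATE TABLE parts (
    PartNum     VARCHAR(20),
    Description VARCHAR(100),
    Quantity    INT,
    Price       DECIMAL(10,2)
);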
Loading data from a csv is also pretty straightforward:
LOAD DATA INFILE <filepath, e.g. 'C:/Users/<username>/Desktop/test.csv'>
INTO TABLE <table name>
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; (Only use this line if you have column names included in the csv).
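Filled in for the illustrative parts table above, the template would read as follows (the path is a placeholder):

LOAD DATA INFILE 'C:/Users/me/Desktop/test.csv'
INTO TABLE parts
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;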
As for a web-based solution: https://cloud.google.com/products/cloud-sql/
That's a relatively open-ended question. A couple of noteworthy pointers off the top of my head:
MySQL allows you to store your data in different formats, one of them being CSV. That's a very straightforward solution if you're happy with it and don't mind a few limitations (see http://dev.mysql.com/doc/refman/5.0/en/csv-storage-engine.html).
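As a rough illustration of that option (the table name is invented; note the CSV engine requires every column to be NOT NULL and supports no indexes):

CREATE TABLE records_csv (
    PartNum     VARCHAR(20)   NOT NULL,
    Description VARCHAR(100)  NOT NULL,
    Quantity    INT           NOT NULL,
    Price       DECIMAL(10,2) NOT NULL
) ENGINE = CSV;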
Otherwise you can import your data into a table with a full-featured engine (see other answer(s) for details).
If you're happy with PostgreSQL and look for fully web based solution, have a look at Heroku.
There are a great many ways to make your data available through web services without accessing the back-end data store directly. Have a look at REST and SOAP for instance.
HTH
Originally, my question was related to the fact that phpMyAdmin's SQL section wasn't working properly. As suggested in the comments, I realized that the amount of input was simply too much for it to handle. However, this didn't give me a valid solution for how to deal with files that contain (in my case) about 35 thousand record lines in CSV format:
...
20120509,126,1590.6,0
20120509,127,1590.7,1
20120509,129,1590.7,6
...
The Import option in phpMyAdmin struggles just as much as a basic copy-paste into the SQL section does. This time, as before, it runs for 5 minutes until the max execution time is reached and then stops. What is interesting, though, is that it adds about 6-7 thousand records to the table. So the input actually goes through and is almost successful. I also tried halving the amount of data in the file; nothing changed, however.
There is clearly something wrong here. It is pretty annoying to have to manipulate the data in a PHP script when a simple data import does not work.
Change your php upload max size.
Do you know where your php.ini file is?
First of all, try putting this file into your web root:
phpinfo.php
( see http://php.net/manual/en/function.phpinfo.php )
containing:
<?php
phpinfo();
?>
Then navigate to http://www.yoursite.com/phpinfo.php
Look for "php.ini".
To upload large files you will need to increase max_execution_time, post_max_size, and upload_max_filesize in that php.ini file.
Also, do you know where your error.log file is? It would hopefully give you a clue as to what is going wrong.
EDIT:
Here is the query I use for the file import:
$query = "LOAD DATA LOCAL INFILE '$file_name' INTO TABLE `$table_name` FIELDS TERMINATED BY ',' OPTIONALLY
ENCLOSED BY '\"' LINES TERMINATED BY '$nl'";
Where $file_name is the temporary filename from the PHP global variable $_FILES, $table_name is the table already prepared for the import, and $nl is a variable holding the csv line endings (it defaults to Windows line endings, but I have an option to select Linux line endings).
The other thing is that the table ($table_name) in my script is prepared in advance by first scanning the csv to determine column types. After it determines appropriate column types, it creates the MySQL table to receive the data.
I suggest you try creating the MySQL table definition first, to match what's in the file (data types, character lengths, etc.), then run the above query and see how fast it goes. I don't know how much of a factor the MySQL table definition is in the loading speed.
Also, I have no indexes defined in the table until AFTER the data is loaded. Indexes slow down data loading.
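In SQL terms the idea is simply to defer index creation until after the bulk load; a sketch with placeholder table, column, and file names:

-- Load into a table that has no secondary indexes yet
LOAD DATA LOCAL INFILE '/tmp/upload.csv'
INTO TABLE import_target
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- Add indexes only once the rows are in place
ALTER TABLE import_target ADD INDEX idx_created (created_at);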
I'm trying to import a csv file into a table in a MySQL database. The table has 141 columns with datatypes of INT, VARCHAR(20), TIMESTAMP, TIMESTAMP, and then a series of TINYTEXTs and VARCHAR(4)s.
I'm getting 7 data truncated errors on columns with datatype VARCHAR(4) for which the data does not exceed 4 characters.
I tried forcing it to continue with IGNORE, but then it went crazy and chopped data from other cells into pieces and scattered them throughout the table.
I'm using the command line on an MS 2008 R2 Server to run the SQL, and the csv file is located in the DB's directory.
P.S. I did read several of the other posts and google results related to mysql+import or mysql+data truncated (1265), but they didn't seem to cover this particular issue.
Thanks!
EDIT: I'm pretty sure it has something to do with the LOAD DATA function because if I INSERT one row at a time, it works just fine…
My sql for the import is:
LOAD DATA INFILE '2011-09.csv' IGNORE INTO TABLE `survey`.`2011-09` FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
During an INSERT, MySQL will tell you if you have too many columns. But apparently MySQL is not very kind: during a LOAD it does not tell you if you have too many columns.
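LOAD DATA will not raise a hard error for a column-count mismatch the way INSERT does, but it usually records warnings you can inspect immediately afterwards. A sketch based on the query above (the explicit column list is invented here, since the real 141 columns aren't shown, but naming the target columns also makes mismatches easier to spot):

LOAD DATA INFILE '2011-09.csv' IGNORE INTO TABLE `survey`.`2011-09`
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
(id, respondent, started_at, finished_at, q1, q2, q3);

-- Reports "Data truncated" (1265) and "row contained more data than there
-- were input columns" style warnings
SHOW WARNINGS;
SHOW COUNT(*) WARNINGS;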