Importing into MySQL with symbols - mysql

I have some per-country data (in CSV) taken from a third-party source, and I am having trouble importing it into MySQL.
For example, one column in my table is Country, with values such as 'Côte d'Ivoire'. The import in MySQL appears to split that single row of data into two, leaving a Country value of just 'C'; it never imports the text value 'Côte d'Ivoire' intact.
This is what I used for the import:
TRUNCATE TABLE source_DATA_TABLE;
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
I changed the delimiter to # through the regional settings on my PC, but the same problem persists.
Does anyone have a fix for this problem? I'm using MySQL Workbench (XAMPP/phpMyAdmin).

I tried something and it works. First of all, the file should be saved as UTF-8; then you can load it like this:
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
CHARACTER SET UTF8
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
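If the accented text still comes out wrong after loading with CHARACTER SET UTF8, it is worth confirming that the table itself stores text as UTF-8 as well. A minimal check, assuming the table name source_proc_pqr from the statements above (utf8mb4 is my choice here; plain utf8 also covers these characters):
SHOW CREATE TABLE source_proc_pqr;
-- If the table or the Country column turns out to be latin1, convert it
-- so values like 'Côte d'Ivoire' survive intact:
ALTER TABLE source_proc_pqr CONVERT TO CHARACTER SET utf8mb4;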

Related

How to Import CSV file into MySQL without having data truncated

I am trying to import a CSV file into MySQL via the command line with the following command:
USE database
LOAD DATA LOCAL INFILE 'C:/TEMP/filename.csv' INTO TABLE tablename
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
But when I do that, the data in some columns is truncated to a length of 10 characters. I have tried using the CHARACTER SET option (UTF8, LATIN1), but that did not help. I've also tried saving the initial CSV file in different ways.
An example of a data item in such a column would be "+00000000000#abc.de.fg;abdc=apcde,abcdefghijklmnop#qrs.tu.vw", which gets truncated to +000000000; sometimes, when there is no number, it is truncated to something like "abc#vgtw.c".
How do I get the data imported as-is, without anything being truncated?
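No answer is recorded for this one in the thread, but truncation to exactly 10 characters usually points at the destination column being declared too narrow (for example VARCHAR(10)) rather than at LOAD DATA itself. A sketch for checking and widening the column; tablename comes from the question, while contact_info and the new length of 255 are placeholders of mine:
-- Inspect the declared type and length of every column in the target table
SHOW COLUMNS FROM tablename;
-- Widen the column that holds the long values, then re-run the import
ALTER TABLE tablename MODIFY contact_info VARCHAR(255);
Running SHOW WARNINGS; immediately after the LOAD DATA statement will also report any values that were still cut off.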

Preserving character set when importing .CSV file to MySQL

I want to import an .xlsx file with ~60k rows into MySQL. Some columns contain Vietnamese characters. I managed to convert from .xlsx to .csv without messing up the characters, but I can't do the same when importing the .csv into MySQL.
I used LOAD DATA INFILE. It looks something like this:
LOAD DATA LOCAL INFILE 'c:/Projekt/Big Data/events.csv'
INTO TABLE database.table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
(Source: http://blog.habrador.com/2013/01/how-to-import-large-csv-file-into-mysql.html)
This method imports the data fine, but the Vietnamese characters come out totally messed up. I did change the table's collation to utf8_unicode_ci.
I also tested MySQL's traditional import method with smaller datasets and it preserves the characters perfectly. However, I cannot use it since my file's size exceeds the limit.
I would really appreciate it if someone could help me with this.
Try specifying the character set explicitly on import:
LOAD DATA LOCAL INFILE 'c:/Projekt/Big Data/events.csv'
INTO TABLE database.table
CHARACTER SET utf8
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
See the docs for more details about loading data from a file.
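If the characters are still wrong after adding CHARACTER SET utf8, the mismatch may be on the connection or table side rather than in the file. A couple of quick checks, as a sketch (database.table is just the placeholder name used in the question; substitute your real schema and table):
-- See which character sets the client, connection and server are using
SHOW VARIABLES LIKE 'character_set%';
-- Confirm the table and its text columns really are utf8/utf8mb4
SHOW CREATE TABLE database.table;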

load data infile format general col type to numeric

I am importing a .csv into a MySQL table using LOAD DATA INFILE and would like to find a way around columns containing formatting like "6.10111E+11" -- this should import as "610111447853" but comes through as 610111000000 instead. The table column is VARCHAR, as sometimes there are letters and those are necessary. Formatting the .csv column as numeric before saving it in the shared location does not seem to work. Can I specify some form of "SET" on that column upon import?
Here is my code:
$query = <<<eof
LOAD DATA LOCAL INFILE '/home/c/Documents/UTC/UTC.csv'
INTO TABLE UTC_import
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
eof;
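No answer is recorded in this thread, but two points may help. LOAD DATA does support the kind of per-column SET the question asks about: a field can be read into a user variable and transformed before it is stored. A sketch only; serial_no, col1 and col3 are placeholder names, not taken from UTC.csv:
LOAD DATA LOCAL INFILE '/home/c/Documents/UTC/UTC.csv'
INTO TABLE UTC_import
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(col1, @raw_serial, col3)                               -- read the problem field into a variable
SET serial_no = IF(@raw_serial LIKE '%E+%',
                   CAST(@raw_serial AS DECIMAL(20,0)),  -- expand scientific notation into plain digits
                   @raw_serial);                        -- leave values containing letters untouched
That said, if the .csv already contains the literal text "6.10111E+11" (which is what Excel tends to write for long numbers in General format), the missing digits were lost before MySQL ever read the file, so no SET expression can recover 610111447853; the column has to come out of Excel as text with full precision.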

mysql load data local infile

I'm trying to load data into a mysql table using LOAD DATA LOCAL INFILE using the code below.
Mysql:
LOAD DATA INFILE '/var/www/vhosts/domain.com/httpdocs/test1.csv' INTO TABLE temp_table FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (recloc,client_acc)
Edit: changed LOAD DATA LOCAL INFILE to LOAD DATA INFILE, removed SET id=null, added IGNORE 1 LINES
I'm getting no errors and no imported records. I believe the issue is related to the column names, but I'm having a hard time fully understanding what those names should be. Should they be the actual column names within the CSV, or the field names in the DB table? I would also like to have an auto-incremented primary key (id).
CSV:
recloc,client_acc
"NLGSX3","CORPORATE"
"7SC3BA","QUALITY ASSURANCE"
"3B9OHF","90717-6710"
Any suggestions as to what I may be doing wrong? Thanks!
Column names in the CSV are not necessary, so you should add the IGNORE 1 LINES clause to skip that header row.
The columns listed in your query, (recloc, client_acc), need to match the column names in the table, not the headers in the CSV.
The first column from the CSV will be inserted into recloc, the second into client_acc.
If you don't specify the AUTO_INCREMENT column in the statement but there is one in the table, it will be filled in automatically.
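For completeness, a sketch of the statement with those points applied; it is essentially the statement from the question, just spread over several lines, with id assumed to be the AUTO_INCREMENT primary key the asker wants to add:
LOAD DATA INFILE '/var/www/vhosts/domain.com/httpdocs/test1.csv'
INTO TABLE temp_table
FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(recloc, client_acc);   -- id is not listed, so AUTO_INCREMENT fills it in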
A short and simple approach for Excel-to-MySQL data import, which also works well for plain .txt files.
IN DETAIL:
Table name: t1
Fields: name VARCHAR, email VARCHAR
text.txt file (the first line of the file holds the table column names):
name,email
"n1","e1"
"n2","e2"
"n3","e3"
"n4","e4"
"n5","e5"
"n6","e6"
"n7","e7"
SQL query in WAMP:
LOAD DATA INFILE 'c:/wamp/www/touch/text.txt' INTO TABLE t1 FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES(name,email)
For this command to run successfully, the folders have to be created to match.
The real, physical file path is
C:\wamp\mysql\data\wamp\www\touch\text.txt
but in the statement we refer to it as c:/wamp/touch/text.txt

Import CSV to MySQL

I have created a database and a table. I have also created all the fields I will be needing. I have created 46 fields including one that is my ID for the row. The CSV doesn't contain the ID field, nor does it contain the headers for the columns. I am new to all of this but have been trying to figure this out. I'm not on here being lazy asking for the answer, but looking for directions.
I'm trying to figure out how to import the CSV so that the data starts going into the 2nd field, since I'm hoping the auto_increment will fill in the ID field, which is the first field I created.
I tried these instructions with no luck. Can anyone offer some insight?
The column names of your CSV file must match those of your table
Browse to your required .csv file
Select CSV using LOAD DATA options
Check box 'ON' for Replace table data with file
In Fields terminated by box, type ,
In Fields enclosed by box, "
In Fields escaped by box, \
In Lines terminated by box, auto
In Column names box, type column name separated by , like column1,column2,column3
Check box ON for Use LOCAL keyword.
Edit:
The CSV file is 32.4kb
The first row of my CSV is:
Test Advertiser,23906032166,119938,287898,,585639051,287898 - Engager - 300x250,88793551,Running,295046551,301624551,2/1/2010,8/2/2010,Active,,Guaranteed,Publisher test,Maintainer test,example-site.com,,All,All,,Interest: Dental; custom geo zones: City,300x250,-,CPM,$37.49 ,"4,415","3,246",3,0,$165.52 ,$121.69 ,"2,895",805,0,0,$30.18 ,$37.49 ,0,$0.00 ,IMPRESSIONBASED,NA,USD
You can have MySQL set values for certain columns during import. If your id field is set to auto increment, you can set it to null during import and MySQL will then assign incrementing values to it. Try putting something like this in the SQL tab in phpMyAdmin:
LOAD DATA INFILE 'path/to/file.csv' INTO TABLE your_table FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' SET id=null;
Please look at this page and see if it has what you are looking for. Should be all you need since you are dealing with just one table. MYSQL LOAD DATA INFILE
So for example you might do something like this:
LOAD DATA INFILE 'filepath' INTO TABLE tablename FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' (column2, column3, column4);
That should give you an idea. There are of course more options that can be added as seen in the above link.
Be sure to use LOAD DATA LOCAL INFILE if the import file is on your local machine rather than on the server. :)
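Pulling the pieces of this thread together for the file shown above: the sample row contains quoted values with embedded commas ("4,415", "3,246"), so ENCLOSED BY '"' is also needed, and listing the 45 data columns explicitly lets the auto_increment ID fill itself in (the CSV has no header row, so no IGNORE clause). A sketch only; column2 through column46 stand in for the real field names:
LOAD DATA LOCAL INFILE 'path/to/file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(column2, column3, column4, /* ... */ column46);   -- column1 (id) is omitted, so AUTO_INCREMENT assigns it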