How to import a CSV file into MySQL without having data truncated

I am trying to import a CSV file into MySQL via the command line with the following command:
USE database
LOAD DATA LOCAL INFILE 'C:/TEMP/filename.csv' INTO TABLE tablename
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
But when I do that, the data in some columns is truncated to a length of 10 characters. I have tried the CHARACTER SET option (UTF8, LATIN1), but that did not help. I have also tried saving the initial CSV file in different formats.
An example of a data item in such a column is "+00000000000#abc.de.fg;abdc=apcde,abcdefghijklmnop#qrs.tu.vw", which gets truncated to "+000000000"; when there is no leading number, it gets truncated to something like "abc#vgtw.c".
How do I import the data as is, without anything being truncated?
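Truncation to exactly 10 characters usually points at the column definition rather than at LOAD DATA itself; a minimal check, assuming the affected column was declared as something like VARCHAR(10) (tablename and columnname are placeholders from the question):
-- Inspect the current column definitions; a VARCHAR(10) would explain the cut-off:
SHOW CREATE TABLE tablename;
-- Widen the affected column before re-running the import:
ALTER TABLE tablename MODIFY columnname VARCHAR(255);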

Related

Trouble loading data from txt file into MySQL with LOAD DATA INFILE

I am having a hard time loading my data into MySQL from a text file. I have been trying to choose the correct delimiters, but my file repeats the column names with each value.
The data is structured like this
{"id":"15","name":"greg","age":"32"}
{"id":"16","name":"jim","age":"42"}
The SQL statement I am working on currently looks like this:
LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1 FIELDS
TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n'(id, name, age);
The results are being stored like this:
{"id":"16", "name","greg"}
I need to do away with the column names and store only the values. Any tips?
Writing a script, as suggested by @Shadow, will be easier to maintain, but you can check the section on JSON on this page: how to import json-text-xml and csv data into mysql
or
Import JSON to MySQL made easy with the MySQL Shell if you are using MySQL Shell 8.0.13 (GA).
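If the file stays in this JSON-lines shape, another option is to parse it during the load. A sketch, assuming MySQL 5.7+ for the JSON functions and that the lines contain no tab characters (so the default field terminator leaves each line intact):
LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1
LINES TERMINATED BY '\r\n'
-- Read the whole line into a user variable instead of splitting on commas:
(@json)
-- Pull each value out of the JSON object and strip the surrounding quotes:
SET id   = JSON_UNQUOTE(JSON_EXTRACT(@json, '$.id')),
    name = JSON_UNQUOTE(JSON_EXTRACT(@json, '$.name')),
    age  = JSON_UNQUOTE(JSON_EXTRACT(@json, '$.age'));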

How to load an amount using LOAD DATA LOCAL INFILE?

I'm unable to load amounts using LOAD DATA LOCAL INFILE.
I have a text file with a column called 'Amount'. The problem is that this field has values such as 20,200.00 and -61,066.11 in the text file. When I try to load these values into my MySQL database, only 20 and -61 get loaded.
What should I do to load the full amounts into my table?
I am using the DECIMAL(10,2) data type.
I am using the following query:
LOAD DATA LOCAL INFILE 'C:/Users/Dataction123/Desktop/nordic.txt' INTO TABLE spend_nordic LINES TERMINATED BY '\r\n';
Read the amount into a user variable and strip the thousands separator in the SET clause:
LOAD DATA LOCAL INFILE 'C:/Users/Dataction123/Desktop/nordic.txt' IGNORE
INTO TABLE spend_nordic
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r\n'
(int_col, @float_col)
-- ',' is a thousands separator in values like 20,200.00, so drop it entirely:
SET float_col = REPLACE(@float_col, ',', '');
Explanation:
MySQL casts the value to the column type as it reads it from the file, before applying the transformations described in the SET clause. If you instruct it to read the value into a user variable instead, it is handled as a string from the beginning. float_col is the name of the target column.
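The difference is easy to see in the client; a minimal check using a sample value from the question:
-- Casting straight from the string stops parsing at the comma:
SELECT CAST('20,200.00' AS DECIMAL(10,2));                    -- 20.00
-- Stripping the separator first keeps the full amount:
SELECT CAST(REPLACE('20,200.00', ',', '') AS DECIMAL(10,2));  -- 20200.00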

Importing into MySQL with symbols

I have some data (in CSV) per country, taken from a third-party source, and I am having some issues importing it into MySQL.
For example, one column in my table is Country, with the value 'Côte d'Ivoire'. The import into MySQL appears to divide this one row of data into two, with a Country value of 'C'; it is unable to import the text value 'Côte d'Ivoire'.
This is what I used for the import:
TRUNCATE TABLE source_DATA_TABLE;
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
I changed the delimiter to # through my regional settings on my PC, but the same problem persists.
Does anyone have a fix for this problem? I'm using MySQL Workbench (XAMPP/phpMyAdmin).
I tried something and it works. First of all, the file should be encoded as UTF-8; then you can load it like this:
LOAD DATA INFILE 'H://TESTDATA/2015/source_DATA.csv'
INTO TABLE source_proc_pqr
CHARACTER SET UTF8
FIELDS TERMINATED BY '\#'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
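If the file really is UTF-8 and rows still split, it is also worth confirming that the target table can store the accented characters; a hedged sketch, using the table name from the question:
-- Check which character set the table currently uses:
SHOW CREATE TABLE source_proc_pqr;
-- Convert it if needed; utf8mb4 covers accented characters such as 'ô':
ALTER TABLE source_proc_pqr CONVERT TO CHARACTER SET utf8mb4;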

LOAD DATA INFILE syntax in MySQL

I am using the LOAD DATA LOCAL INFILE command to upload a big CSV file into a MySQL database. The field is an integer in the MySQL table, so it will accept only integers.
If the CSV file contains numbers like the ones below, the SQL doesn't work. The format of my CSV file is:
ID
"322"
"445"
"211"
So it inserts 0 into the table, but if the CSV file contains no double quotes it works fine.
How can I tell the LOAD DATA INFILE command to ignore the double quotes (it is sometimes too difficult to remove them from a CSV file with about 1 million records)? My SQL is:
$insert_query = "LOAD DATA local INFILE '".$target_path."'
INTO table master_huts
IGNORE 1 LINES
(hids)
";
Try adding this:
FIELDS OPTIONALLY ENCLOSED BY '"'
before:
IGNORE 1 LINES
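Putting it together, the full statement would look something like this (the literal path stands in for the PHP $target_path variable):
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE master_huts
-- Quotes around values such as "322" are now stripped instead of breaking the cast:
FIELDS OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES
(hids);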

MySQL LOAD DATA LOCAL INFILE syntax issues with SET fields

I'm trying to use mysql's LOAD DATA LOCAL INFILE syntax to load a .csv file into an existing table. Here is one record from my .csv file (with headers):
PROD, PLANT,PORD, REVN,A_CPN, A_CREV,BRDI, DTE, LTME
100100128144,12T1,2070000,04,3DB18194ACAA,05_01,ALA13320004,20130807,171442
The issue is that I want 3 extra things done during import:
A RECORDID INT NOT NULL AUTO_INCREMENT PRIMARY KEY field should be incremented as each row gets inserted (this column and structure already exist within the MySQL table)
DTE and LTME should be concatenated, converted to the MySQL DATETIME format, and inserted into an existing MySQL column named TRANS_OCR
A CREATED TIMESTAMP field should be set to the current timestamp on row insertion (this column and structure also already exist within the MySQL table)
I'm trying to import this data into the MySQL table with the following command:
LOAD DATA LOCAL INFILE 'myfile.csv' INTO TABLE seriallog
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(FLEX_PN, FLEX_PLANT, FLEX_ORDID, FLEX_REV, CUST_PN, CUST_REV, SERIALID)
SET CREATED = CURRENT_TIMESTAMP;
I think I have the CREATED column set properly, but the others are causing MySQL warnings:
Warning: Out of range value for column 'FLEX_PN' at row 1
Warning: Row 1 was truncated; it contained more data than there were input columns
Can someone help me with the syntax? The LOAD DATA LOCAL INFILE statement is confusing to me...
Figured out the proper syntax to make this work (the key is reading DTE and LTME into user variables, so the column counts line up and the SET clause can transform them):
sql = """LOAD DATA LOCAL INFILE %s INTO TABLE seriallog_dev
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\\n'
IGNORE 1 LINES
(FLEX_PN, FLEX_PLANT, FLEX_ORDID, FLEX_REV, CUST_PN, CUST_REV, SERIALID, @DTE, @LTME)
SET RECORDID = NULL,
TRANS_OCR = STR_TO_DATE(CONCAT(@DTE, @LTME), "%%Y%%m%%d%%H%%i%%s"),
CREATED = CURRENT_TIMESTAMP;"""
params = (file,)
self.db.query( sql, params )
Mind you, this is done with Python's MySQLdb module.
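As a quick sanity check on the date conversion, the sample row's DTE and LTME values can be run through the same expression directly in the MySQL client (outside Python, the format string uses single percent signs):
SELECT STR_TO_DATE(CONCAT('20130807', '171442'), '%Y%m%d%H%i%s');
-- -> 2013-08-07 17:14:42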
CAVEAT
The only issue with this solution is that, for some reason, my bulk insert only inserts the first 217 rows of data from my file. My total file size is 19 KB, so I can't imagine it is too large for the MySQL buffers... so what gives?
more info
Also, I just tried this syntax directly in the mysql-server CLI and it works for all 255 records. So obviously it is some problem with Python, the Python MySQLdb module, or the MySQL connection that the MySQLdb module makes...
DONE
I JUST figured out the problem. It had nothing to do with the LOAD DATA LOCAL INFILE command, but rather with the method I was using to convert my original .dbf file into the .csv before attempting the import. For some reason the MySQL import was running on the .csv before the .dbf-to-.csv conversion had finished, resulting in a partial data set being found in the .csv file and imported... sorry to waste everyone's time!