I'm trying to import CSV data into a MySQL table using:
LOAD DATA INFILE "r'/Users/temp/random_file.csv'"
INTO TABLE random_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
But I keep getting this error:
LOAD DATA INFILE "/Users/temp/random_file.csv'"
^
SyntaxError: invalid syntax
And I really don't know why. All help is appreciated!
LOAD DATA INFILE "r'/Users/temp/random_file.csv'"
The file name must be given as a literal string. You seem to be using Python raw-string syntax (r'...'), which is not valid SQL. Change it to
LOAD DATA INFILE '/Users/temp/random_file.csv'
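The SyntaxError itself comes from Python, not MySQL, which suggests the statement is being built inside a Python script. A minimal sketch of assembling the statement as a plain Python string (table and path taken from the question; the actual driver call is left as a comment since your connection code isn't shown):

```python
# Build the LOAD DATA statement as ordinary Python text.
# The path is just characters inside the SQL; no raw-string
# prefix (r'...') belongs inside the statement itself.
path = "/Users/temp/random_file.csv"

stmt = (
    f"LOAD DATA INFILE '{path}' "
    "INTO TABLE random_table "
    "FIELDS TERMINATED BY ',' "
    "ENCLOSED BY '\"' "
    "LINES TERMINATED BY '\\n' "
    "IGNORE 1 ROWS"
)

print(stmt)
# The finished string would then go to whatever driver you use, e.g.:
# cursor.execute(stmt)
```

Note that `"\\n"` in the Python source produces a literal backslash-n in the SQL text, which is what MySQL expects in LINES TERMINATED BY.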
I encountered an error when trying to upload a CSV file to my database via the command-line interface.
The CSV file is structured as follows:
Id;price;shorttext;text
2020;24;foo;just a longtext for the product
2019;10;bar;"sometimes there are ; in the column"
2018;45;foobar;next longtext for the product
However, the character ';' can also occur inside the text of the last column, so that column is sometimes enclosed in "..." as shown above.
During the import I tried the following statement:
LOAD DATA INFILE 'path/to/file.csv'
INTO TABLE product
FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ';'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
But then I get ERROR 1406: Data too long for column 'text' at row 2. I think the ';' inside the text of the last column is causing the problem.
Does anyone know how to handle the optional '"' enclosure in the last column?
Thank you very much :-)
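As a sanity check on the file itself: the rule that OPTIONALLY ENCLOSED BY '"' is meant to apply (a delimiter inside a quoted field is data, not a separator) can be reproduced with Python's csv module, using the sample rows from the question. This is only a sketch to confirm the rows parse as expected, not your MySQL setup:

```python
import csv
import io

# Sample rows from the question: ';' is the field delimiter, and the
# last column of the second data row is enclosed in double quotes
# because it contains a ';' itself.
raw = (
    "Id;price;shorttext;text\n"
    "2020;24;foo;just a longtext for the product\n"
    '2019;10;bar;"sometimes there are ; in the column"\n'
    "2018;45;foobar;next longtext for the product\n"
)

# quotechar='"' keeps the quoted ';' inside the field, mirroring what
# OPTIONALLY ENCLOSED BY '"' should do in LOAD DATA INFILE.
rows = list(csv.reader(io.StringIO(raw), delimiter=";", quotechar='"'))

for row in rows[1:]:
    print(row)
```

If the rows parse cleanly here, it may also be worth matching the clause order the MySQL manual uses, i.e. FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"' (TERMINATED BY first).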
I'm completely novice to MySQL, and I'm trying to load some CSV sheets to the server using HeidiSQL -> tools -> Import CSV. But it keeps giving me this error:
Error 1148: The used command is not allowed with this MySQL version.
Is there a way to fix that, or maybe another way to load a CSV?
For a query-based CSV import you can use load data ... to load CSV files.
For more info, refer to the MySQL documentation for LOAD DATA.
Example:
To skip the first line of the CSV (the header row), use ignore 1 lines:
load data local infile 'D:std_table.csv' into table local.student
fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
ignore 1 lines;
On Windows-based systems use \r\n for line termination; on Linux use \n.
Side Note:
You could also try MySQL Workbench, available from the official MySQL site.
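Error 1148 usually means LOAD DATA LOCAL is disabled on the server or the client. A possible check and fix, assuming you have admin privileges (local_infile has security implications, so review them before enabling it):

```sql
-- Check whether LOCAL loading is enabled on the server
SHOW GLOBAL VARIABLES LIKE 'local_infile';

-- Enable it (requires the SUPER or SYSTEM_VARIABLES_ADMIN privilege)
SET GLOBAL local_infile = 1;
```

The client must also allow it; with the mysql command-line client that is the --local-infile=1 option.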
Try this one:
LOAD DATA INFILE 'c:/tmp/discounts.csv'
INTO TABLE discounts
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
I am importing a .csv into a MySQL table using LOAD DATA INFILE and would like a way around columns containing spreadsheet formatting like "6.10111E+11" -- this should import as "610111447853" but instead comes in as 610111000000. The table column is VARCHAR because sometimes there are letters, and those are necessary. Formatting the .csv column as numeric before saving it in the shared location does not seem to work. Can I specify some form of SET on that column during import?
here is my code:
$query = <<<eof
LOAD DATA LOCAL INFILE '/home/c/Documents/UTC/UTC.csv'
INTO TABLE UTC_import
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
eof;
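The digits are most likely already gone in the file: when the spreadsheet saves the cell as "6.10111E+11", only six significant digits survive, so nothing at import time can recover the rest. A quick Python check of what that text actually encodes:

```python
# The CSV cell contains the text "6.10111E+11": the spreadsheet rounded
# the value to six significant digits when it formatted the cell, so the
# trailing digits of 610111447853 were lost before MySQL ever saw the
# file.  Parsing the stored text back shows exactly what survives:
value = float("6.10111E+11")
print(value)        # 610111000000.0
print(int(value))   # 610111000000
```

If that is what you see, the fix is upstream: re-export the column from the spreadsheet as text (or full-precision numbers) rather than trying to repair it with a SET clause.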
I use the following command to read in a CSV:
LOAD DATA INFILE '/Users/Tyler/Desktop/players_escaped.txt'
INTO TABLE players
FIELDS TERMINATED BY ','
ENCLOSED BY '^'
LINES TERMINATED BY '\n';
The CSV looks like this:
^1^,^False^,^False^,^Ovie^,^Soko^,^6^,^8^,^210^,^^,^M^,^London^,^^,^^,^^,^^,^0^,^2009^,^^,^False^,^False^,^{299F909C-88D9-4D26-8ADC-3EC1A66168BB}^,^844^,^2013^,^^,^^,^0^,^^,^^,^2011-02-16 20:53:34.877000000^,^^,^2011-02-16 20:53:34.877000000^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^1^,^2^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^^
^2^,^False^,^False^,^Jordan^,^Swing^,^6^,^6^,^200^,^^,^M^,^Birmingham^,^AL^,^35218^,^^,^^,^0^,^2009^,^^,^False^,^False^,^{299F909C-88D9-4D26-8ADC-3EC1A66168BB}^,^844^,^2013^,^^,^^,^0^,^^,^^,^2011-02-16 20:53:34.877000000^,^^,^2011-02-16 20:53:34.877000000^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^1^,^2^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^^
I also tried \ as a delimiter and got the same results.
I'm only getting the odd numbered rows.
There are 250k records in the csv.
Thanks
Have you checked the line endings of the .txt file?
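Elaborating on that: if the file's terminators don't match LINES TERMINATED BY '\n' -- for example doubled terminators left over from an export -- every second "line" can come out empty, which would match importing only odd-numbered rows. A quick way to inspect the raw bytes (the sample below is a hypothetical stand-in for players_escaped.txt):

```python
# Count each style of line terminator in the raw bytes of a file.
# CRLF pairs are counted first so lone \n and lone \r are not
# double-counted.
def count_terminators(data: bytes):
    crlf = data.count(b"\r\n")
    lf_only = data.count(b"\n") - crlf
    cr_only = data.count(b"\r") - crlf
    return {"\\r\\n": crlf, "\\n": lf_only, "\\r": cr_only}

# Hypothetical two-row sample; for the real check, read the file with
# open("players_escaped.txt", "rb").read()
sample = b"^1^,^False^\r\n^2^,^False^\r\n"
print(count_terminators(sample))
```

If the counts show \r\n, either change the statement to LINES TERMINATED BY '\r\n' or normalize the file's line endings before loading.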