I use the script below to insert data into MySQL from a text file.
#!/bin/bash
mysql -utest -ptest test << EOF
LOAD DATA INFILE 'test.txt'
INTO TABLE content_delivery_process
FIELDS TERMINATED BY ',';
EOF
In my test file the fields are in this order:
cast , date , name , buy
I can insert the data, but I need the table laid out like this:
S.NO | date | name | buy | cast
You can specify the columns you want to import:
From the MySQL Manual:
MySQL LOAD DATA INFILE
The following example loads all columns of the persondata table:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata;
By default, when no column list is provided at the end of the LOAD
DATA INFILE statement, input lines are expected to contain a field for
each table column.
If you want to load only some of a table's columns, specify a column
list:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata (col1,col2,...);
You must also specify a column list if the order of the fields in the
input file differs from the order of the columns in the table.
Otherwise, MySQL cannot tell how to match input fields with table
columns.
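Applied to the file in the question, that might look like the sketch below (assuming the target table content_delivery_process has the columns S.NO, date, name, buy and cast, with S.NO generated by the server):
LOAD DATA INFILE 'test.txt'
INTO TABLE content_delivery_process
FIELDS TERMINATED BY ','
-- map the four input fields onto the matching table columns;
-- S.NO is omitted so the server can fill it in (e.g. via AUTO_INCREMENT)
(`cast`, `date`, name, buy);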
To import data delimited with a '|' symbol instead, you would add a FIELDS TERMINATED BY '|' clause (it goes after the table name, before the column list).
Hope this helps.
CREATE TABLE [YOUR TABLE] (`S.NO` INT AUTO_INCREMENT PRIMARY KEY, `date` DATETIME, name VARCHAR(50), buy VARCHAR(50), `cast` VARCHAR(50));
LOAD DATA LOCAL INFILE 'test.txt' IGNORE INTO TABLE [YOUR TABLE] FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' (`cast`, `date`, name, buy);
I have data saved in a CSV file exported from Excel. The data has some null values in its rows and columns. I want to load this data into my MySQL database, but the null values are causing problems when saving it. This is the query for creating the table:
create table student (
Std_ID int,
Roll_NO int,
First_Name varchar(10) NOT NULL,
Last_Name varchar(10),
Class int,
constraint test_student primary key (Std_ID)
);
...and it ran successfully. Now I want to load the data from the CSV into this table using the query:
load data infile 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\new.csv' into table student fields terminated by ',' lines terminated by '\n' ignore 1 lines;
...and it is giving me the error message:
ERROR 1366 (HY000): Incorrect integer value: '' for column 'XXX' at row X.
For reference, you can use this data. The same can be found below.
LOAD DATA INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\new.csv'
INTO TABLE student
FIELDS TERMINATED BY ','
LINES TERMINATED by '\n'
IGNORE 1 LINES
-- specify the columns; use variables for the columns where an incorrect value may occur
(Std_ID, @Roll_NO, First_Name, Last_Name, @Class)
-- preprocess: replace an empty string with NULL but keep any other value
SET Roll_NO = NULLIF(@Roll_NO, ''),
    Class = NULLIF(@Class, '');
If a column in the CSV is an empty string, NULL will be inserted into the corresponding column of the table.
Std_ID is not preprocessed because it is defined as the PRIMARY KEY and cannot be NULL.
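As a quick illustration of what NULLIF does here (a minimal check you can run on its own):
SELECT NULLIF('', '') AS empty_becomes_null,   -- returns NULL
       NULLIF('42', '') AS other_value_kept;   -- returns '42'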
UPDATE: The OP provided a sample of the source file. Viewing it in hex mode shows that it is a Windows-style text file, so the line terminator is '\r\n'. After adjusting the statement accordingly, the file is imported successfully.
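For reference, the adjusted statement would differ only in the line terminator (a sketch, keeping everything else from the statement above):
LOAD DATA INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 8.0\\Uploads\\new.csv'
INTO TABLE student
FIELDS TERMINATED BY ','
-- Windows-style line endings
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(Std_ID, @Roll_NO, First_Name, Last_Name, @Class)
SET Roll_NO = NULLIF(@Roll_NO, ''),
    Class = NULLIF(@Class, '');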
I would like to convert a string into the required date format in MySQL.
I have the lines below in a file:
30-06-2017,clarke
31-07-2018,warner
My table has 2 columns:
Column1 datatype: date
Column2 datatype: varchar(30)
I have executed the query below:
load data local infile 'test.txt' into table sample fields terminated by ',' set column1=str_to_date(@c1,'%d-%m-%Y');
Column1 data was not loaded and I got the warning below:
Data was truncated for column1 at row 1
May I know what is wrong with the SQL query I am using?
You have to include the columns (@c1, @c2) from the file. The following command works fine:
LOAD DATA LOCAL INFILE 'test.txt' INTO TABLE sample FIELDS TERMINATED BY ',' (@c1, @c2) SET column1=STR_TO_DATE(@c1,'%d-%m-%Y'), column2=@c2;
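As a quick sanity check of the format string against the sample data (run separately, not part of the load):
SELECT STR_TO_DATE('30-06-2017', '%d-%m-%Y');  -- returns 2017-06-30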
Hello I'm using LOAD DATA INFILE to populate a table in MySQL.
LOAD DATA INFILE 'test.txt'
INTO TABLE myTestTable
FIELDS TERMINATED BY '\t'
IGNORE 1 LINES;
Everything is working peachy except that there is a datetime column in my data that is formatted without any delimiter between the date and time sections. Like so:
SomeDateColumn
20050101081946
When I read this in, MySQL replaces all the dates with dummy values. Is there a way to have MySQL read this in correctly straight from a file?
Thanks!
You may call STR_TO_DATE when you run LOAD DATA, and convert the text date to a bona fide date on the fly:
LOAD DATA INFILE 'test.txt'
INTO TABLE myTestTable
FIELDS TERMINATED BY '\t'
IGNORE 1 LINES
(
col1, col2, @var1 -- list out all columns here; read the date column into a variable
)
SET SomeDateColumn = STR_TO_DATE(@var1, '%Y%m%d%H%i%s');
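To verify the format string against the sample value, you could run it on its own first:
SELECT STR_TO_DATE('20050101081946', '%Y%m%d%H%i%s');  -- returns 2005-01-01 08:19:46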
I'm importing a CSV file into a MySQL table with the following query:
"LOAD DATA INFILE 'myfielname.csv'
INTO table customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\r'
IGNORE 3 LINES
(sales,regional,accounts)
";
Is there any way to insert a string of characters before a field that is to be imported?
For example: the field 'sales' above holds account ID numbers, which are used in the application. I'd like to prepend a URL to the account number during import, so the final record in the table will be as follows:
The string I want to come before 'sales', but within the same record: http://www.url.com?id=
If a given sales ID were 1234, the final value in the table would be http://www.url.com?id=1234
Thanks in advance for your help.
Try something like this:
LOAD DATA LOCAL INFILE 'C:/test.csv'
INTO TABLE test.test1
FIELDS TERMINATED BY ';'
(@test1col,@test2col)
SET test1col=CONCAT('http://url.com?id=',@test1col), test2col=@test2col;
The test CSV has 2 columns. I created a test table like this:
CREATE TABLE `test1` (
`test1col` varchar(200) DEFAULT NULL,
`test2col` varchar(2000) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
You could try it immediately with your own data; just make sure you name the columns correctly!
Give it a try; it worked for me.
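Mapped onto the statement from the question, it might look like the sketch below (assuming the CSV fields line up with sales, regional and accounts in that order):
LOAD DATA INFILE 'myfielname.csv'
INTO TABLE customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r'
IGNORE 3 LINES
-- read sales into a variable so it can be rewritten before the insert
(@sales, regional, accounts)
SET sales = CONCAT('http://www.url.com?id=', @sales);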
Initially I uploaded around 100,000 rows using LOAD DATA INFILE. I'm using Ubuntu.
Example data:
ToneCode | Artist | MovieName | Language
1        | Mj     | Null      | English
3        | AB     | Null      | English
4        | CD     | Null      | English
5        | EF     | Null      | English
But now I have to update the column MovieName for ToneCode 1 up to row 100000; I have the data for the update in a .csv file.
Please suggest how to load the .csv file into the existing table that already contains data.
I think the fastest way to do this, using purely MySQL and no extra scripting, would be as follows:
1. CREATE a temporary table with two columns, ToneCode and MovieName, the same as in your target table;
2. load the data from your new CSV file into it using LOAD DATA INFILE;
3. UPDATE your target table using the INNER JOIN-like syntax that http://dev.mysql.com/doc/refman/5.1/en/update.html describes:
UPDATE items,month SET items.price=month.price WHERE items.id=month.id;
This would "join" the two tables items and month (using just the "comma syntax" for an INNER JOIN) on the id column, and update the items.price column with the value of the month.price column.
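Applied to the ToneCode/MovieName case, the three steps might look roughly like this (a sketch; the temporary table name, target table name and file path are placeholders, not from the question):
-- 1. temporary table matching the two columns in the update file
CREATE TEMPORARY TABLE movie_update (ToneCode INT PRIMARY KEY, MovieName VARCHAR(100));
-- 2. load the new CSV into it
LOAD DATA INFILE '/path/to/update.csv'
INTO TABLE movie_update
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(ToneCode, MovieName);
-- 3. join on ToneCode and copy the values across
UPDATE your_table, movie_update
SET your_table.MovieName = movie_update.MovieName
WHERE your_table.ToneCode = movie_update.ToneCode;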
I have found a solution, as you guys mentioned above.
Solution, by example:
CREATE TABLE A (Id INT PRIMARY KEY, Name VARCHAR(20), Artist VARCHAR(20), MovieName VARCHAR(20));
I added all my 100,000 rows using:
LOAD DATA INFILE '/Path/file.csv' INTO TABLE A FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(Id, Name, Artist); -- the MovieName value is NULL here
CREATE TEMPORARY TABLE TA (Id INT PRIMARY KEY, MovieName VARCHAR(20));
Then I uploaded the data to the temporary table TA:
LOAD DATA INFILE '/Path/file.csv' INTO TABLE TA FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(Id, MovieName);
Now, using a join as you said:
UPDATE TA, A SET A.MovieName = TA.MovieName WHERE A.Id = TA.Id;