Update specific column in MySQL from CSV - mysql

I am looking to import data from a CSV file into MySQL. The CSV is roughly 100 lines and looks as follows.
data1,num1
data2,num2
The MySQL database looks as follows.
aaa  | b | c | ... | ggg
-----|---|---|-----|----
data | * | * | ... | num
What I am looking to do is find data1 in the MySQL table and replace num with num1 on that line, and to do this for each line in the CSV.
My database knowledge is on the lower end so any help would be appreciated.
UPDATE:
Here is where I am at now. I cannot seem to get this to update more than the first line from the .csv file.
DROP TEMPORARY TABLE IF EXISTS tempTable;
CREATE TEMPORARY TABLE tempTable
(`data` varchar(20) NOT NULL,
`num` varchar(20) NOT NULL
);
LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE tempTable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n';
UPDATE DBTable t1
JOIN tempTable t2 ON t1.data = t2.data
SET t1.num = t2.num;
DROP TEMPORARY TABLE tempTable;
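One thing I have not ruled out is the line endings: if file.csv actually uses Unix '\n' line endings, the '\r\n' terminator never matches and only a single row gets loaded. A variant I could try (just a sketch, assuming Unix line endings):
LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE tempTable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';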

You could load the CSV file into a new table, then extract the data with a JOIN.
Create a temporary table; try to use the same data types as in your existing table.
CREATE TABLE IF NOT EXISTS `temp_table_csv` (
`data` char(20) NOT NULL,
`num` bigint(20) NOT NULL
);
Then load the CSV into this new temporary table; this assumes your CSV file has a header row.
LOAD DATA INFILE 'csvfile.csv'
INTO TABLE `temp_table_csv`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
Then make a JOIN and write the result into a file.
SELECT `sx`.`data`, `dx`.`num`
FROM `temp_table_csv` AS `sx` INNER JOIN `datatable` AS `dx`
ON `dx`.`data` = `sx`.`data`
INTO OUTFILE 'newcsv.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
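If, instead of writing the result to a file, you want to update the existing table in place, the same JOIN can drive an UPDATE (a sketch, assuming the target table is `datatable` and the column to overwrite is `num`):
UPDATE `datatable` AS `dx`
INNER JOIN `temp_table_csv` AS `sx` ON `dx`.`data` = `sx`.`data`
SET `dx`.`num` = `sx`.`num`;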
Finally, you can remove the temporary table created earlier.
DROP TABLE `temp_table_csv`;

Related

SQL Loader script to load data into multiple table not working records are getting discarded

Need help with the below SQL*Loader script. I have created the test script below to load the data into the 3 tables based on the location and the work_type columns. But data is not getting inserted correctly into the 3 tables. I am not able to understand how to do multiple comparisons on the data in the WHEN clause. Please help.
LOAD DATA
INFILE *
REPLACE
INTO TABLE xx_dumy_test_emp_table
WHEN (WORK_TYPE='EMP') AND (Work_Location = 'IND')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
Work_Type
,Employee_Role
,First_name
,Last_Name
,Work_Location
)
INTO TABLE xx_dumy_test_oversea_employess
WHEN (WORK_TYPE='EMP') AND (Work_Location <> 'IND')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
Work_Type POSITION(1:3)
,Employee_Role
,First_name
,Last_Name
,Work_Location
)
INTO TABLE xx_dumy_test_mgr_table
WHEN (WORK_TYPE='MGR') AND (Work_Location = 'IND')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(
Work_Type POSITION(1:3)
,Employee_Role
,First_name
,Last_Name
,Work_Location
)
--DATA OF THE SCRIPT
EMP;Test Ops1;Gautam;Hoshing;IND;
MGR;Test Ops2;Steve;Tyler;IND;
EMP;Test Ops3;Hyana;Motler;JPY;
Regards,
Gautam.

if exists update else insert csv data MySQL

I am populating a MySQL table with a csv file pulled from a third party source. Every day the csv is updated and I want to update rows in MySQL table if an occurrence of column a, b and c already exists, else insert the row. I used load data infile for the initial load but I want to update against a daily csv pull. I am familiar with INSERT...ON DUPLICATE, but not in the context of a csv import. Any advice on how to nest LOAD DATA LOCAL INFILE within INSERT...ON DUPLICATE a, b, c - or if that is even the best approach would be greatly appreciated.
LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
INTO TABLE db.tbl
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 lines;
Since you use LOAD DATA LOCAL INFILE, it is equivalent to specifying IGNORE: i.e. duplicate rows are skipped.
But if you specify REPLACE, input rows replace existing rows, i.e. rows that have the same value for a primary key or unique index as the input row.
So your update-import could be:
LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
REPLACE
INTO TABLE db.tbl
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 lines;
https://dev.mysql.com/doc/refman/5.6/en/load-data.html
If you need more complicated merge logic, you could import the CSV into a temp table and then issue INSERT ... SELECT ... ON DUPLICATE KEY UPDATE.
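A minimal sketch of that approach (assuming the CSV path from the question, that columns a, b, c are covered by a unique key on db.tbl, and a hypothetical value column d to refresh):
CREATE TEMPORARY TABLE tmp_csv LIKE db.tbl;

LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
INTO TABLE tmp_csv
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;

INSERT INTO db.tbl (a, b, c, d)
SELECT a, b, c, d FROM tmp_csv
ON DUPLICATE KEY UPDATE d = VALUES(d);

DROP TEMPORARY TABLE tmp_csv;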
I found that the best way to do this is to insert the file with the standard LOAD DATA LOCAL INFILE:
LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
INTO TABLE db.table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 lines;
And use the following to delete duplicates. Note that the below command is comparing db.table to itself by defining it as both a and b.
delete a.* from db.table a, db.table b
where a.id > b.id
and a.field1 = b.field1
and a.field2 = b.field2
and a.field3 = b.field3;
To use this method it is essential that the id field is an auto-increment primary key. The above command then deletes rows that contain duplication on field1 AND field2 AND field3. In this case it will delete the row with the higher of the two auto-increment ids; it works just as well with < instead of >, which deletes the lower id instead.

Adding a string to a field during import with LOAD DATA LOCAL INFILE

I'm importing a CSV file into a MySQL table with the following query:
"LOAD DATA INFILE 'myfielname.csv'
INTO table customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\r'
IGNORE 3 LINES
(sales,regional,accounts)
";
Is there any way to insert a string of characters before a field that is to be imported?
For example: the field 'sales' above refers to account id numbers, which are being used in the application. I'd like to prepend a URL to the account number during import so the final record in the table will be as follows:
String I want to come before 'sales', but within the same record: http://www.url.com?id=
If a given sales id was 1234 the final record in the table would be http://www.url.com?id=1234
Thanks in advance for your help.
Try something like this:
LOAD DATA LOCAL INFILE 'C:/test.csv'
INTO TABLE test.test1
FIELDS TERMINATED BY ';'
(@test1col, @test2col)
SET test1col = CONCAT('http://url.com?id=', @test1col), test2col = @test2col;
The test CSV has 2 columns. I created a test table like this:
CREATE TABLE `test1` (
`test1col` varchar(200) DEFAULT NULL,
`test2col` varchar(2000) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
You can try it immediately with your own table; just make sure you name the columns correctly!
Give it a try; it worked for me.

load data terminated by does not import

Table:
ID int not null primary key auto_increment
number int not null unique
my file:
111
222
333
LOAD DATA LOCAL INFILE 'file.txt' IGNORE INTO TABLE MyTable fields terminated by '\n' (number); - everything works fine. But if I have:
file:
111;222;333
LOAD DATA LOCAL INFILE 'file.txt' IGNORE INTO TABLE MyTable fields terminated by ';' (number); - it imports only 111 and stops. Why?
If you want to add a list of values separated by a ";" into a single table field, use this. It will basically treat each value as a separate record.
LOAD DATA LOCAL INFILE 'file.txt' IGNORE
INTO TABLE MyTable
LINES TERMINATED BY ';'
(number)
;
If you want to insert the 3 fields from the file into the first three fields in the table, remove (number). By including (number) you were specifying that you wanted to insert data only into the number field.
LOAD DATA LOCAL INFILE 'file.txt' IGNORE
INTO TABLE MyTable fields terminated by ';';
If you want to insert the three fields from the file into three specific fields in the table (not necessarily the first three) you need to list all three. For example if you wanted to insert them into the fields number, field2 and field3 the command would be:
LOAD DATA LOCAL INFILE 'file.txt' IGNORE
INTO TABLE MyTable fields terminated by ';' (number, field2, field3);

LOAD DATA INFILE, skipping first column in table

I am having trouble finding the correct syntax for loading a csv file into table, while skipping the first column which already exists in the table. My table columns looks like this:
ID COL1 COL2 COL3 LOG_DATE
and my csv looks like this :
dataForCol1,dataForCol2,dataForCol3
So I want to load the values in the csv into COL1 COL2 and COL3 , skipping ID. The closest I can get is with SQL like this:
LOAD DATA LOCAL INFILE 'test.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(ID,COL1,COL2,COL3,LOG_DATE)
SET ID=0, LOG_DATE=CURRENT_TIMESTAMP
Note that I don't know SQL too well and I am not sure if I am using the SET clause correctly either, but this statement will supply the LOG_DATE column with a timestamp, and it will auto-increment the ID column (ID is type int(11) and auto_increment). But the other data is off by one column, so dataForCol1 is missing and dataForCol2 is in COL1, etc.
The column list should include only the columns that are in the file:
LOAD DATA LOCAL INFILE 'test.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(COL1,COL2,COL3)
SET LOG_DATE=CURRENT_TIMESTAMP
Simply move your ID column to the last field position, like this:
e.g. COL1 COL2 COL3 ID LOG_DATE
Then use the same code without defining your fields or id:
LOAD DATA LOCAL INFILE 'test.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SET LOG_DATE=CURRENT_TIMESTAMP
Super easy and works great.
The only way I could solve this same problem was to create a temp table to load all the data into, then insert that data into the permanent table. I was using this in conjunction with a multi-file upload process:
# Check for attempted file/s upload
if (isset($_FILES['files']))
{
    # Create a temporary table to insert data into
    $sql = "CREATE TEMPORARY TABLE `Temp` (Region VARCHAR(60), First_Name VARCHAR(35), Last_Name VARCHAR(35), Title VARCHAR(60), Account_Name VARCHAR(60), Phone VARCHAR(21), Email VARCHAR(60));";
    $Result = mysql_query($sql, $MySQL_Read);

    # Loop through file/s to be uploaded
    foreach ($_FILES['files']['tmp_name'] as $key => $tmp_name)
    {
        # Upload file
        move_uploaded_file($tmp_name, $Path."{$_FILES['files']['name'][$key]}");
        echo $Path."{$_FILES['files']['name'][$key]}<br>";

        # LOAD DATA from CSV file into Temp
        $sql = "LOAD DATA INFILE '".$Path."{$_FILES['files']['name'][$key]}' INTO TABLE `Temp` FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\\\\' LINES TERMINATED BY '\\r' STARTING BY '' IGNORE 1 LINES;";
        $Result = mysql_query($sql, $MySQL_Write);
        echo($sql.'<br><br>');

        # Delete file
        unlink($Path."{$_FILES['files']['name'][$key]}");
    }
}

# Insert data from temp table into permanent table
$sql = "INSERT INTO `Leads` (`Region`, `First_Name`, `Last_Name`, `Title`, `Account_Name`, `Phone`, `Email`) SELECT `Region`, `First_Name`, `Last_Name`, `Title`, `Account_Name`, `Phone`, `Email` FROM `Temp`;";
$Result = mysql_query($sql, $MySQL_Write);

# Delete temporary table
$sql = "DROP TEMPORARY TABLE Temp;";
$Result = mysql_query($sql, $MySQL_Write);