if exists update else insert csv data MySQL - mysql

I am populating a MySQL table with a CSV file pulled from a third-party source. Every day the CSV is updated, and I want to update rows in the MySQL table if an occurrence of columns a, b and c already exists, or else insert the row. I used LOAD DATA INFILE for the initial load, but I want to update against a daily CSV pull. I am familiar with INSERT ... ON DUPLICATE KEY UPDATE, but not in the context of a CSV import. Any advice on how to combine LOAD DATA LOCAL INFILE with INSERT ... ON DUPLICATE KEY UPDATE on a, b, c - or whether that is even the best approach - would be greatly appreciated.
LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
INTO TABLE db.tbl
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 lines;

Since you use LOAD DATA LOCAL INFILE, it is equivalent to specifying IGNORE: duplicates are skipped.
But if you specify REPLACE, input rows replace existing rows - that is, rows that have the same value for a primary key or unique index as an existing row.
So your update-import could be
LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
REPLACE
INTO TABLE db.tbl
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 lines;
https://dev.mysql.com/doc/refman/5.6/en/load-data.html
If you need more complicated merge logic, you could import the CSV into a temporary table and then issue INSERT ... SELECT ... ON DUPLICATE KEY UPDATE, as sketched below.
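A minimal sketch of that staging-table route, assuming db.tbl already has a unique key over (a, b, c); the staging-table name and the extra payload column d are hypothetical:
CREATE TEMPORARY TABLE tbl_stage LIKE db.tbl;
LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
INTO TABLE tbl_stage
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
-- rows whose (a, b, c) already exist in db.tbl get d updated; new rows are inserted
INSERT INTO db.tbl (a, b, c, d)
SELECT a, b, c, d FROM tbl_stage
ON DUPLICATE KEY UPDATE d = VALUES(d);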

I found that the best way to do this is to load the file with the standard LOAD DATA LOCAL INFILE:
LOAD DATA LOCAL INFILE 'C:\\Users\\nick\\Desktop\\folder\\file.csv'
INTO TABLE db.table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 lines;
And then use the following to delete duplicates. Note that the command below compares db.table to itself by aliasing it as both a and b.
delete a.* from db.table a, db.table b
where a.id > b.id
and a.field1 = b.field1
and a.field2 = b.field2
and a.field3 = b.field3;
To use this method it is essential that the id field is an auto-increment primary key. The command above deletes rows that are duplicated on field1 AND field2 AND field3; of each duplicate pair it deletes the row with the higher auto-increment id, and it works just as well with < instead of >.
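Once the duplicates have been removed, another option (an assumption, not part of the original answer) is to add a composite unique key over the three columns, so that future loads can simply use REPLACE or ON DUPLICATE KEY UPDATE instead of the delete step:
ALTER TABLE db.table ADD UNIQUE KEY uq_field123 (field1, field2, field3);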

Related

ignore first two characters on a column while importing csv to mysql

I am trying to import a CSV file into a MySQL table, but I need to remove the first two characters of a particular column before importing it.
This is my statement:
string strLoadData = "LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets FIELDS terminated by ',' ENCLOSED BY '\"' lines terminated by '\n' IGNORE 1 LINES (SiteId,DateTime,Serial,DeviceId,AgentAID,VehicleRegistration,CarPark,SpaceNumber,GpsAddress,VehicleType,VehicleMake,VehicleModel,VehicleColour,IssueReasonCode,IssueReason,NoticeLocation,Points,Notes)";
The column IssueReasonCode has data like 'LU12', but I need to remove the first two characters so that it contains only the integer part, not the alphanumeric prefix.
I need to remove 'LU' from that column.
Is it possible to write something like LEFT(IssueReasonCode, ...) to do this? The column is varchar(45) and can't be changed now because of the large amount of data in it.
Thanks
LOAD DATA INFILE can apply a function to each column's data as it is read in (see the SET clause in the LOAD DATA documentation). In your case, if you want to remove the first two characters from the IssueReasonCode column, you could use:
RIGHT(IssueReasonCode, CHAR_LENGTH(IssueReasonCode) - 2)
to remove the first two characters. You specify such column mappings at the end of the LOAD DATA statement using SET. Your statement should look something like the following:
LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets
FIELDS terminated by ','
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(SiteId, DateTime, Serial, DeviceId, AgentAID, VehicleRegistration, CarPark, SpaceNumber,
GpsAddress, VehicleType, VehicleMake, VehicleModel, VehicleColour, IssueReasonCode,
IssueReason, NoticeLocation, Points, Notes)
SET IssueReasonCode = RIGHT(IssueReasonCode, CHAR_LENGTH(IssueReasonCode) - 2)
Referencing this and quoting this example, you can try the below to see if it works:
User variables in the SET clause can be used in several ways. The
following example uses the first input column directly for the value
of t1.column1, and assigns the second input column to a user variable
that is subjected to a division operation before being used for the
value of t1.column2:
LOAD DATA INFILE 'file.txt' INTO TABLE t1 (column1, @var1)
SET column2 = @var1/100;
string strLoadData = "LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets FIELDS terminated by ',' ENCLOSED BY '\"' lines terminated by '\n' IGNORE 1 LINES (SiteId,DateTime,Serial,DeviceId,AgentAID,VehicleRegistration,CarPark,SpaceNumber,GpsAddress,VehicleType,VehicleMake,VehicleModel,VehicleColour,@IRC,IssueReason,NoticeLocation,Points,Notes) SET IssueReasonCode = SUBSTR(@IRC, 3);";
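For readability, here is the same statement as plain SQL rather than a C# string (a sketch using the column names from the question; @IRC is just a placeholder user-variable name):
LOAD DATA LOCAL INFILE 'E:/park/Export.csv' INTO TABLE tickets
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(SiteId, DateTime, Serial, DeviceId, AgentAID, VehicleRegistration, CarPark, SpaceNumber,
GpsAddress, VehicleType, VehicleMake, VehicleModel, VehicleColour, @IRC,
IssueReason, NoticeLocation, Points, Notes)
SET IssueReasonCode = SUBSTR(@IRC, 3);  -- drop the leading 'LU'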

Is it possible to perform functions during LOAD DATA INFILE without creating a stored procedure?

I have two columns, q1 and q2, that I'd like to sum together and put in the destination column, q.
The way I do it now, I put the data in an intermediate table, then sum during loading, but I'm wondering if it's possible to do it during extraction instead?
Here's my script:
LOAD DATA INFILE 'C:\\temp\\foo.csv'
INTO TABLE foo_temp
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(q1, q2);
INSERT INTO foo (q) SELECT q1 + q2 AS q
FROM foo_temp;
Try:
LOAD DATA INFILE 'C:\\temp\\foo.csv'
INTO TABLE `foo`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@`q1`, @`q2`)
SET `q` = @`q1` + @`q2`;
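As a quick sanity check (a hypothetical two-row file, not from the original question), if C:\\temp\\foo.csv contains:
q1,q2
1,2
3,4
the statement above inserts two rows into foo with q = 3 and q = 7: the raw field values land in the @q1 and @q2 user variables, and the + converts them to numbers when computing q.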

Import CSV into MySQL - Offset by 1 Column

I am importing my data from a csv file like this:
LOAD DATA LOCAL INFILE "c:\\myfile.csv" INTO TABLE historic FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
This ALMOST works perfectly. My table has one column the CSV does not: the first one, an auto-incremented id column. Otherwise the CSV file and the MySQL table match perfectly. When I run the command above, it puts the first column from the CSV into the id field; I want it to actually go into the second column.
Is there a way I can modify the above statement to either specify the columns or just offset by one (skipping the first column on import)?
You can optionally name the columns to be populated by LOAD DATA, and simply omit your id column:
LOAD DATA LOCAL INFILE "c:\\myfile.csv" INTO TABLE historic
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
(column2, column3, column4, column5, ...)
You can load your data by specifying the order of the columns that you're going to use in your table:
LOAD DATA LOCAL INFILE '/tmp/test.txt' INTO TABLE historic
(name_field1, name_field2, name_field3...);
Or, if you prefer, you can load it first into a temporary table and use an INSERT ... SELECT statement to load it into your final table (it's slower).

load data terminated by does not import

Table:
ID int not null primary key auto_increment
number int not null unique
my file:
111
222
333
LOAD DATA LOCAL INFILE 'file.txt' IGNORE INTO TABLE MyTable fields terminated by '\n' (number); - everything works fine. But if I have:
file:
111;222;333
LOAD DATA LOCAL INFILE 'file.txt' IGNORE INTO TABLE MyTable fields terminated by ';' (number); - it imports only 111 and stops. Why?
If you want to load a list of values separated by ';' into a single table column, use this. It treats each value as a separate record (row):
LOAD DATA LOCAL INFILE 'file.txt' IGNORE
INTO TABLE MyTable
LINES TERMINATED BY ';'
(number)
;
If you want to insert the three fields from the file into the first three columns in the table, remove (number). By including (number) you told MySQL that each record supplies data only for the number column, so with FIELDS TERMINATED BY ';' the whole line 111;222;333 is read as one record and only its first field (111) is used; the remaining fields are discarded.
LOAD DATA LOCAL INFILE 'file.txt' IGNORE
INTO TABLE MyTable fields terminated by ';';
If you want to insert the three fields from the file into three specific fields in the table (not necessarily the first three) you need to list all three. For example if you wanted to insert them into the fields number, field2 and field3 the command would be:
LOAD DATA LOCAL INFILE 'file.txt' IGNORE
INTO TABLE MyTable fields terminated by ';' (number, field2, field3);

LOAD DATA INFILE id

I ran the following command:
LOAD DATA INFILE '/Users/Tyler/Desktop/players_20120318.txt' INTO TABLE players FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
On this data:
PlayerId,IsActive,IsVisible,FirstName,LastName,HeightFeet,HeightInches,Weight,Birthday,Gender,HometownCity,HometownState,HometownZip,HometownCountry,HighSchoolId,HighSchoolIdTemp,HighSchoolGradYear,CollegeYear,Redshirted,Transferred,CollegeId,CollegeIdTemp,CollegeGradYear,OtherAccountId,PreviousCollegeId,CurrentTeamId,LateralRecommendationReason,LateralRecommendationLink,CreationDate,CreatedBy,LastModifiedDate,LastModifiedBy,TwitterLink,FacebookLink,PersonalWebsite,PlayerImage,FirstNameNickName,NeulionID,OtherTeamID,OtherSportTypeID,SourceDataTypeID,PlayerTypeID,LoadID,SameNameTeammate,SameNameSchoolMate,SD_SportID,SD_PlayerID,ZeroNCAAStats,ModifiedByPythonGame,Missing2011,Transfer2011,RecruitingClass
21,True,True,John,Frost,6,1,185,,M,Decatur,AL,35603,,{A0AD8B45-47E1-4039-85DF-756301035073},7453,2009,JR,False,False,{299F909C-88D9-4D26-8ADC-3EC1A66168BB},844,2013,{EBA5A9E6-E03E-4AE5-B9B8-264339EE9259},,0,,,2011-02-16 20:53:34.877000000,,2012-03-08 01:43:37.593000000,{5EBB0160-E69A-4EA2-89D5-932DD4D58632},,,,,,,45759,1,1,5,,,,,,,,,,
1344,True,True,Zach,Alvord,6,0,173,,M,Alpharetta,GA,30022,,{379BF463-67A9-480E-8FFB-9B50AD494953},11597,2010,SO,False,False,{7208C8FB-6780-4379-BC25-5DC5064C85FD},36,2014,{CDACD2C7-7667-406C-9662-02B378B00032},,0,,,2011-02-16 20:53:34.970000000,,2012-03-07 23:28:17.343000000,{5EBB0160-E69A-4EA2-89D5-932DD4D58632},,,,,,,45710,1,1,5,,,,,,,,,,
And MySQL was taking that first column (PlayerId) and assigning it to the id column. It was also shifting everything over by one column (the first-name field was filled in with the last name).
Is this the expected behavior?
MySQL will not skip the id column just because it is auto_increment - as you saw, it maps the CSV fields to the table's columns in order. You need to specify the columns individually, as Bobby pointed out.
To avoid this problem, specify the columns you're loading data into and leave out the id field; note that the column list goes at the end, after the field and line terminators:
LOAD DATA INFILE '/Users/Tyler/Desktop/players_20120318.txt' INTO TABLE players FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' (col1, col2, col3...);