Import CSV to Update only one column in table - mysql

I have a table that looks like this:
products
--------
id, product, sku, department, quantity
There are approximately 800,000 entries in this table. I have received a new CSV file that updates all of the quantities of each product, for example:
productA, 12
productB, 71
productC, 92
So there are approximately 750,000 updates (50,000 products had no change in quantity).
My question is: how do I import this CSV to update only the quantity based on the product (which is unique), leaving the sku, department, and other fields alone? I know how to do this in PHP by looping through the CSV and executing an UPDATE for each line, but that seems inefficient.

You can use LOAD DATA INFILE to bulk load the 800,000 rows of data into a temporary table, then use multiple-table UPDATE syntax to join your existing table to the temporary table and update the quantity values.
For example:
CREATE TEMPORARY TABLE your_temp_table LIKE your_table;
LOAD DATA INFILE '/tmp/your_file.csv'
INTO TABLE your_temp_table
FIELDS TERMINATED BY ','
(id, product, sku, department, quantity);
UPDATE your_table
INNER JOIN your_temp_table on your_temp_table.id = your_table.id
SET your_table.quantity = your_temp_table.quantity;
DROP TEMPORARY TABLE your_temp_table;
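Since the CSV in the question only contains product and quantity, a narrower variant of the same approach might look like the sketch below (the staging-table name and file path are placeholders; keying the staging table on product speeds up the join):
CREATE TEMPORARY TABLE quantity_staging (
product VARCHAR(255) NOT NULL PRIMARY KEY,
quantity INT NOT NULL
);
LOAD DATA INFILE '/tmp/your_file.csv'
INTO TABLE quantity_staging
FIELDS TERMINATED BY ','
(product, quantity);
UPDATE products
INNER JOIN quantity_staging ON quantity_staging.product = products.product
SET products.quantity = quantity_staging.quantity;
DROP TEMPORARY TABLE quantity_staging;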

I would load the update data into a separate table UPDATE_TABLE and perform an update within MySQL using:
UPDATE PRODUCTS P SET P.QUANTITY=(
SELECT UPDATE_QUANTITY
FROM UPDATE_TABLE
WHERE UPDATE_PRODUCT=P.PRODUCT
)
I don't have a MySQL instance at hand right now, so I can't check the syntax perfectly; it might be that you need to add a LIMIT 0,1 to the inner SELECT.
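Note that with the bare correlated subquery above, any product that has no matching row in UPDATE_TABLE would get its QUANTITY set to NULL. A guarded variant (just a sketch) could be:
UPDATE PRODUCTS P
SET P.QUANTITY = (
SELECT UPDATE_QUANTITY
FROM UPDATE_TABLE
WHERE UPDATE_PRODUCT = P.PRODUCT
LIMIT 1
)
WHERE EXISTS (
SELECT 1
FROM UPDATE_TABLE
WHERE UPDATE_PRODUCT = P.PRODUCT
);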

The answer from @ike-walker is indeed correct, but also remember to double-check how your CSV data is formatted. For example, CSV files often have string fields enclosed in double quotes ("), and lines ending with \r\n if the file was prepared on Windows.
By default it is assumed that no enclosing character is used and that lines end with \n.
More info and examples here: https://mariadb.com/kb/en/importing-data-into-mariadb/
This can be fixed by using additional options for FIELDS and LINES:
CREATE TEMPORARY TABLE your_temp_table LIKE your_table;
LOAD DATA INFILE '/tmp/your_file.csv'
INTO TABLE your_temp_table
FIELDS
TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' -- new option
LINES TERMINATED BY '\r\n' -- new option
(id, product, sku, department, quantity);
UPDATE your_table
INNER JOIN your_temp_table on your_temp_table.id = your_table.id
SET your_table.quantity = your_temp_table.quantity;
DROP TEMPORARY TABLE your_temp_table;

Related

INNER JOIN 3 tables with reference INFILE

I have a csv with two columns, col1 is a barcode, col2 is stock quantity.
I have the 3 tables.
Table 1: product_option_value
Fields: product_option_value_id, product_option_id, product_id, option_id, option_value_id, quantity, subtract, price, price_prefix, points, points_prefix, weight, weight_prefix
Table 2: product_option_newvalue
Fields: product_id, product_option_value_id, sku, upc
I am trying to update the quantity field of the product_option_value table using the sku and quantity in my CSV file. The part I'm having trouble with is that I have to use product_option_value_id from the product_option_newvalue table to update the quantity field in product_option_value. How would I reference between the two?
Here is what I have. It does not work.
CREATE TABLE oc_product_import LIKE oc_product_option_value;
LOAD DATA INFILE '/var/lib/mysql-files/out.csv'
INTO TABLE oc_product_import
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(sku, quantity);
UPDATE oc_product_option_value AS R
INNER JOIN oc_product_import AS P
ON R.product_option_value_id = P.product_option_value_id
SET R.quantity = P.sku;
DROP TABLE oc_product_import;
Edit: my issue seems to be that the barcode is only stored in product_option_newvalue, and it can only be linked to quantity by referencing product_option_value_id in both tables in order to update the quantity in the product_option_value table.
EDIT 2: This is similar code that is working for me, but it does not involve the product_option_value_id reference issue I'm dealing with across the two tables, because the barcode is stored in the product table itself rather than in an additional table by reference.
DROP TABLE IF EXISTS oc_product_import;
CREATE TABLE oc_product_import LIKE oc_product;
LOAD DATA INFILE '/var/lib/mysql-files/out.csv'
INTO TABLE oc_product_import
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY ';'
(sku, quantity);
UPDATE oc_product AS R
INNER JOIN oc_product_import AS P
ON R.sku = P.sku
SET R.quantity = P.quantity;
DROP TABLE oc_product_import;
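A sketch that bridges the two tables through product_option_value_id might look like this (assuming the newvalue table carries the same oc_ prefix as the others, that the CSV barcode matches its sku column, and placeholder column types for the staging table):
DROP TABLE IF EXISTS oc_product_import;
CREATE TABLE oc_product_import (sku VARCHAR(64), quantity INT);
LOAD DATA INFILE '/var/lib/mysql-files/out.csv'
INTO TABLE oc_product_import
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(sku, quantity);
UPDATE oc_product_option_value AS v
INNER JOIN oc_product_option_newvalue AS n
ON n.product_option_value_id = v.product_option_value_id
INNER JOIN oc_product_import AS i
ON i.sku = n.sku
SET v.quantity = i.quantity;
DROP TABLE oc_product_import;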
I would recommend using MySQL Workbench. It isn't the best, but it definitely makes situations like these easier to manage.
After you get it set up, you can right-click on a table, select the table data import wizard, and manually add a CSV file that way.
If you get any errors with this let me know.
Also, if you are on a Mac you can simply run brew cask install mysqlworkbench in the terminal.

Uploading CSV into MySQL table with simultaneous JOIN

What I'm trying to do is upload a CSV into a table, while appending information from a third table to the target table using JOIN.
The CSV import.csv (with 1M rows) looks like this:
firstname | lastname
The target table "names" looks like this:
firstname | lastname | gender
And the table "gender" (with 700k rows) looks like this:
firstname | gender
So, my ideal query would look something like this:
LOAD DATA LOCAL INFILE "import.csv"
INTO TABLE names n
LEFT JOIN gender g ON(g.firstname=n.firstname)
Something along those lines, to combine the import with the join so the end result in names has the data from gender and the CSV.
However, I know that LOAD DATA LOCAL INFILE can't be combined with JOIN, and attempts to use INSERT plus JOIN for each line are too CPU intensive.
Any ideas?
You can use the SET clause of LOAD DATA INFILE to achieve your goal:
LOAD DATA LOCAL INFILE '/path/to/your/file.csv'
INTO TABLE names
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n' -- or '\r\n' if file has been prepared in Windows
IGNORE 1 LINES -- use this if your first line contains column headers
(@first, @last)
SET firstname = @first,
lastname = @last,
gender =
(
SELECT gender
FROM gender
WHERE firstname = @first
LIMIT 1
)
Make sure that:
you have an index on the firstname column in the gender table
you don't have any indexes on the names table before you load the data; add them after the load completes (see the sketch below)
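For example, a minimal sketch of those two points (the index names are placeholders):
-- speeds up the correlated lookup performed during the load
CREATE INDEX idx_gender_firstname ON gender (firstname);
-- add whatever indexes the names table needs only after the load has finished, e.g.
ALTER TABLE names ADD INDEX idx_names_firstname (firstname);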
MySQL's LOAD DATA INFILE syntax doesn't support JOIN.
CREATE TABLE temporary_table...
LOAD DATA INFILE "import.csv" INTO TABLE temporary_table FIELDS TERMINATED BY '|' ENCLOSED BY '"' LINES TERMINATED BY '\n';
INSERT INTO names (firstname, lastname, gender) SELECT t.firstname, t.lastname, g.gender FROM temporary_table t LEFT JOIN gender g ON g.firstname = t.firstname;
In my experience, the best way to load data into a database is to place it in a staging table first where all the columns are characters. Then, transform the data in the database to your final output.
Applying this to your code:
LOAD DATA LOCAL INFILE "import.csv"
INTO TABLE names_staging;
CREATE TABLE names as
select n.firstname, n.lastname, g.gender
from names_staging n LEFT JOIN
gender g
ON g.firstname = n.firstname;
This makes it possible to identify and fix problems from the data load. You can also easily add additional columns such as primary keys and insert dates into the final table.
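For instance, a sketch of that last point (the extra column names are assumptions):
-- carry a load timestamp into the final table, then add a surrogate key
CREATE TABLE names AS
SELECT n.firstname, n.lastname, g.gender, NOW() AS inserted_at
FROM names_staging n
LEFT JOIN gender g ON g.firstname = n.firstname;
ALTER TABLE names ADD COLUMN id INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;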

mysql insert update LOAD DATA LOCAL INFILE

I am using LOAD DATA LOCAL INFILE to load data into a temp table, mid. Then I use an UPDATE query to update matching rows in the products table. The only matching field in both is the model.
$q = "LOAD DATA LOCAL INFILE 'Mid.csv' INTO TABLE mid
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' IGNORE 1 LINES
(@col1,@col2,@col3,@col4,@col5,@col6) set model=@col1,price=@col3,stock=@col6 ";
mysql_query($q, $db);
mysql_query('UPDATE mid m, products p set p.products_price= m.price,p.products_quantity= m.stock where p.products_model= m.model');
It works and updates the products table. The issue I am having is that there are new records in the mid table which don't get inserted, since I am using an UPDATE statement.
I have looked at INSERT and ON DUPLICATE KEY UPDATE. I have seen loads of examples where it works on one table, but none where I have to match against another table.
Either I am searching for the wrong thing or there is another way to do this.
I would appreciate any help.
Regards,
naf
I'm not sure what the other columns in the product table are, but here's a basic approach that should work for you based on the 3 columns in your example, assuming the products_model column is unique in the products table:
insert into products (products_price,products_quantity,products_model)
select price, stock, model
from mid
on duplicate key update
products_price = values(products_price),
products_quantity = values(products_quantity)
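For ON DUPLICATE KEY UPDATE to kick in, products_model must be covered by a UNIQUE (or primary) key. If it isn't already, a sketch of adding one (the key name is a placeholder):
ALTER TABLE products ADD UNIQUE KEY uq_products_model (products_model);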

Import CSV Pulling One Column Field from Existing Table

I'm learning MySQL and PHP (running XAMPP and also using HeidiSQL), and I have a live project for work where I'm trying to use them instead of the gazillion spreadsheets the information currently lives in.
I want to import 1,000+ rows into a table (tbl_searches) where one of the columns is a string (contract_no). Information required by tbl_searches but not in the spreadsheet includes search_id (the PK, which is AUTO_INCREMENT) and contract_id. So the only field I am really missing is contract_id. I have a table (tbl_contracts) that contains contract_id and contract_no. So I think I can have the import use the string contract_no to look up the contract_id for that contract_no, but I don't know how.
[EDIT] I forgot to mention that I have successfully imported the info using HeidiSQL after I exported tbl_contracts to Excel and used the Excel VLOOKUP function, but that ended up yielding incorrect data somehow.
You can do it like this:
LOAD DATA LOCAL INFILE '/path/to/your/file.csv'
INTO TABLE table1
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n' -- or '\r\n' if the file has been prepared on Windows
(@field1, @contract_no, @field2, @field3,...)
SET column1 = @field1,
contract_id = (SELECT contract_id
FROM tbl_contracts
WHERE contract_no = @contract_no
LIMIT 1),
column2 = @field2,
column3 = @field3
...
Try something like this (I am assuming that you already have data in tbl_contracts):
<?php
$handle = fopen("data_for_table_searches.csv", "r");
while (($data = fgetcsv($handle, 0, ",")) !== FALSE) { // get CSV data from your file
$contract_id = query("SELECT contract_id FROM tbl_contracts WHERE contract_number = " . $data[<row for contract number>]); // whatever is the equivalent in heidi SQL, to get contract id
query("INSERT INTO tbl_searches values($contract_id, data[0], data[1], data[2],...)"); // whatever is the equivalent in heidi SQL, insert data, including contract id into tbl_searches
}
fclose($handle);
?>
Thanks for everyone's input. peterm's guidance helped me get the data imported. Rahul, I should have mentioned that I was not using PHP for this task, but rather just trying to get the data into the tables using HeidiSQL. user4035 asked for more detail and so that's here too.
I have three tables in the database.
tbl_status has two fields, status_ID (AUTO_INCREMENT) and status_name.
tbl_contracts has two columns, contract_ID (AUTO_INCREMENT) and contract_no (a string).
The last table (tbl_searches) will be the active(?) table in that this is where the users' actions will be recorded.
The first two of these tables were easily populated. tbl_status has 11 rows that will describe the status of the contract and these were just typed into an Excel spreadsheet and imported via CSV through HeidiSQL.
For the second table I had 1,000+ "contracts" to import, so I left the first column in Excel blank, put the contract string in the second column, and imported them the same way.
The third table has seven fields: search_id (AUTO_INCREMENT), contract_id, contract_no, status_id, notes, initials and search_date (I forgot about that one until just now).
I wanted to insert the spreadsheet that had the search information on it into tbl_searches. It has the contract_no, but not the contract_id. I needed to insert the rows and have the query grab the contract_id from tbl_contracts. It took me a bit to get it right without errors or unexpected results. (The following query omits search_date.)
LOAD DATA LOCAL INFILE '\\\\PATH\\PATH\\PATH\\PATH\\FILENAME.csv'
INTO TABLE `hoa_work`.`tbl_searches`
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' LINES TERMINATED BY '\r\n'
IGNORE 1 LINES -- because the first row of the CSV has column headers
(@search_id, @contract_id, @contract_no, @status_id, @notes, @initials)
SET
search_id = NULL, -- search_id is an AUTO_INCREMENT field
contract_id = (SELECT contract_id
FROM tbl_contracts
WHERE contract_no = @contract_no
LIMIT 1),
contract_no = @contract_no,
status_id = @status_id,
notes = @notes,
initials = @initials;
/* Affected rows: 1,011 Found rows: 0 Warnings: 0 Duration for 1 query: 0.406 sec. */
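One thing worth checking after a load like this: the correlated subquery returns NULL for any contract_no that has no match in tbl_contracts, so a quick query (a sketch) will surface those rows:
SELECT contract_no FROM tbl_searches WHERE contract_id IS NULL;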
I learned here that the @blah tokens are user variables. If I run the following query it will show me what the variable currently holds. Since I was inserting 1,000+ rows from the CSV file, it gave me the value from the last row that was inserted.
SELECT @contract_no;
If you have any suggested improvements on the way I ultimately wrote the query please do tell me.
-Matt

Mysql Load Data for existing column of a table

Initially I uploaded about 100,000 rows using LOAD DATA INFILE. I'm using Ubuntu.
Example data:
ToneCode | Artist | MovieName | Language
1        | Mj     | NULL      | English
3        | AB     | NULL      | English
4        | CD     | NULL      | English
5        | EF     | NULL      | English
But now I have to update the MovieName column, starting from ToneCode 1 through the 100,000th row; I have the data to update in a .csv file.
Please suggest how to upload the .csv file into the existing table that already has data.
I think the fastest way to do this, using purely MySQL and no extra scripting, would be as follows:
CREATE a temporary table with two columns, ToneCode and MovieName, the same as in your target table
load the data from your new CSV file into it using LOAD DATA INFILE
UPDATE your target table using the INNER JOIN-like syntax that http://dev.mysql.com/doc/refman/5.1/en/update.html describes:
UPDATE items,month SET items.price=month.price WHERE items.id=month.id;
this would “join” the two tables items and month (by using just the “comma-syntax” for an INNER JOIN) using the id column as the join criterion, and update the items.price column with the value of the month.price column.
I have found a solution, as you guys mentioned above.
Solution example:
create table A(Id int primary key, Name varchar(20), Artist varchar(20), MovieName varchar(20));
Add all my 100,000 rows using:
load data infile '/Path/file.csv' into table A fields terminated by ',' enclosed by '"'
lines terminated by '\n'
(Id, Name, Artist); -- MovieName is NULL at this point
create temporary table TA(Id int primary key, MovieName varchar(20));
Upload the data into the temporary table TA:
load data infile '/Path/file.csv' into table TA fields terminated by ',' enclosed by '"'
lines terminated by '\n'
(Id, MovieName);
Now update using a join, as you said:
update TA, A set A.MovieName = TA.MovieName where A.Id = TA.Id;