Initially, I uploaded about 100,000 rows using LOAD DATA INFILE. I'm using Ubuntu.
Example data:
ToneCode | Artist | MovieName | Language
1        | Mj     | NULL      | English
3        | AB     | NULL      | English
4        | CD     | NULL      | English
5        | EF     | NULL      | English
Now I have to update the MovieName column for every row from ToneCode 1 through 100000; the new values are in a .csv file.
Please suggest how to load that .csv file into the existing table, which already contains data.
I think the fastest way to do this, using purely MySQL and no extra scripting, would be as follows:
CREATE a temporary table with two columns, ToneCode and MovieName, defined the same as in your target table;
load the data from your new CSV file into it using LOAD DATA INFILE;
UPDATE your target table using the INNER JOIN-like syntax described at http://dev.mysql.com/doc/refman/5.1/en/update.html:
UPDATE items,month SET items.price=month.price WHERE items.id=month.id;
this would “join” the two tables items and month (using just the comma syntax for an INNER JOIN) with the id column as the join criterion, and update the items.price column with the value of the month.price column.
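Applied to your schema, those three steps might look like the sketch below (the file path, the column types, and the table name your_table are assumptions on my part):

-- 1. Temporary table for the new MovieName values (types assumed)
CREATE TEMPORARY TABLE movie_updates (
    ToneCode INT PRIMARY KEY,
    MovieName VARCHAR(100)
);

-- 2. Bulk-load the CSV into the temporary table
LOAD DATA INFILE '/path/to/updates.csv'
INTO TABLE movie_updates
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(ToneCode, MovieName);

-- 3. Multi-table UPDATE joining on ToneCode
UPDATE your_table, movie_updates
SET your_table.MovieName = movie_updates.MovieName
WHERE your_table.ToneCode = movie_updates.ToneCode;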
I have found a solution along the lines you guys mentioned above.
Solution (example):
CREATE TABLE A (Id INT PRIMARY KEY, Name VARCHAR(20), Artist VARCHAR(20), MovieName VARCHAR(20));

Add all my 100,000 rows using:

LOAD DATA INFILE '/Path/file.csv' INTO TABLE A
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(Id, Name, Artist);

Here MovieName is left NULL.

CREATE TEMPORARY TABLE TA (Id INT PRIMARY KEY, MovieName VARCHAR(20));

Upload the new data into the temporary table TA:

LOAD DATA INFILE '/Path/file.csv' INTO TABLE TA
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(Id, MovieName);

Now, using the join as you said:

UPDATE TA, A SET A.MovieName = TA.MovieName WHERE A.Id = TA.Id;
I use the script below to insert data into MySQL from a text file:
#!/bin/bash
# Load test.txt into MySQL (user "test", password "test", database "test")
mysql -utest -ptest test << EOF
LOAD DATA INFILE 'test.txt'
INTO TABLE content_delivery_process
FIELDS TERMINATED BY ',';
EOF
In my test file, the format is like this:
cast, date, name, buy
I can insert the data, but I need the table to look like this:
S.NO | date | name | buy | cast
You can specify the columns you want to import:
From the MySQL Manual (LOAD DATA INFILE):
The following example loads all columns of the persondata table:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata;
By default, when no column list is provided at the end of the LOAD
DATA INFILE statement, input lines are expected to contain a field for
each table column.
If you want to load only some of a table's columns, specify a column
list:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata (col1,col2,...);
You must also specify a column list if the order of the fields in the
input file differs from the order of the columns in the table.
Otherwise, MySQL cannot tell how to match input fields with table
columns.
You would also include a FIELDS TERMINATED BY '|' clause to import data delimited with a '|' symbol.
Hope this helps.
CREATE TABLE [YOUR TABLE] (`S.NO` INT AUTO_INCREMENT PRIMARY KEY, `date` DATETIME, name VARCHAR(50), buy VARCHAR(50), `cast` VARCHAR(50));
LOAD DATA LOCAL INFILE 'test.txt' IGNORE INTO TABLE [YOUR TABLE] FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' (`cast`, `date`, name, buy);
(Note that `cast` needs backticks because CAST is a reserved word in MySQL, and an AUTO_INCREMENT column must be defined as a key.)
What I'm trying to do is upload a CSV into a table, while appending information from a third table to the target table using JOIN.
The CSV import.csv (with 1M rows) looks like this:
firstname | lastname
The target table "names" looks like this:
firstname | lastname | gender
And the table "gender" (with 700k rows) looks like this:
firstname | gender
So, my ideal query would look something like this:
LOAD DATA LOCAL INFILE "import.csv"
INTO TABLE names n
LEFT JOIN gender g ON(g.firstname=n.firstname)
Something along those lines, to combine the import with the join so the end result in names has the data from gender and the CSV.
However, I know that LOAD DATA LOCAL INFILE can't be combined with JOIN, and attempts to use INSERT plus JOIN for each line are too CPU intensive.
Any ideas?
You can use the SET clause of LOAD DATA INFILE to achieve your goal:
LOAD DATA LOCAL INFILE '/path/to/your/file.csv'
INTO TABLE names
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n' -- or '\r\n' if the file has been prepared in Windows
IGNORE 1 LINES -- use this if your first line contains column headers
(@first, @last)
SET firstname = @first,
    lastname  = @last,
    gender    =
    (
        SELECT gender
        FROM gender
        WHERE firstname = @first
        LIMIT 1
    );
Make sure that:
you have an index on the firstname column in the gender table;
you don't have any indexes on the names table before you load the data; add them back after the load completes.
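For reference, the index mentioned above could be created like this (a minimal sketch; the index name is my own):

CREATE INDEX idx_gender_firstname ON gender (firstname);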
MySQL's LOAD DATA INFILE syntax doesn't support JOIN, so load into a temporary table first and join from there:

CREATE TABLE temporary_table ...

LOAD DATA INFILE "import.csv" INTO TABLE temporary_table FIELDS TERMINATED BY '|' ENCLOSED BY '"' LINES TERMINATED BY '\n';

INSERT INTO names (firstname, lastname, gender)
SELECT t.firstname, t.lastname, g.gender
FROM temporary_table t
LEFT JOIN gender g ON g.firstname = t.firstname;
In my experience, the best way to load data into a database is to place it in a staging table first, where all the columns are character types. Then transform the data within the database into your final output.
Applying this to your code:
LOAD DATA LOCAL INFILE "import.csv"
INTO TABLE names_staging;
CREATE TABLE names as
select n.firstname, n.lastname, g.gender
from names_staging n LEFT JOIN
gender g
ON g.firstname = n.firstname;
This makes it possible to identify and fix problems from the data load. You can also easily add additional columns such as primary keys and insert dates into the final table.
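For instance, the final table could pick up a synthetic key and a load timestamp along the way; a sketch (the column types and names here are illustrative):

CREATE TABLE names (
    id INT AUTO_INCREMENT PRIMARY KEY,
    firstname VARCHAR(50),
    lastname VARCHAR(50),
    gender VARCHAR(10),
    loaded_at DATETIME
);

INSERT INTO names (firstname, lastname, gender, loaded_at)
SELECT n.firstname, n.lastname, g.gender, NOW()
FROM names_staging n
LEFT JOIN gender g ON g.firstname = n.firstname;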
I am using LOAD DATA LOCAL INFILE to load data into a temp table, mid. Then I use an UPDATE query to update matching rows in the products table. The only field the two tables have in common is model.
$q = "LOAD DATA LOCAL INFILE 'Mid.csv' INTO TABLE mid
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' IGNORE 1 LINES
(#col1,#col2,#col3,#col4,#col5,#col6) set model=#col1,price=#col3,stock=#col6 ";
mysql_query($q, $db);
mysql_query('UPDATE mid m, products p set p.products_price= m.price,p.products_quantity= m.stock where p.products_model= m.model');
It works and updates the products table. The issue I am having is that there are new records in the mid table which don't get inserted, since I am using an UPDATE statement.
I have looked at INSERT and at ON DUPLICATE KEY UPDATE. I have seen loads of examples where it works on one table, but none where I have to match against another table.
Either I am searching for the wrong thing, or there is another way to do this.
I would appreciate any help.
Regards,
naf
I'm not sure what the other columns in the products table are, but here's a basic approach that should work for you, based on the three columns in your example and assuming the products_model column is unique in the products table:
insert into products (products_price,products_quantity,products_model)
select price, stock, model
from mid
on duplicate key update
products_price = values(products_price),
products_quantity = values(products_quantity);
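Note that ON DUPLICATE KEY UPDATE only fires when the insert would violate a unique key, so products_model must be covered by one; if it isn't yet, a key could be added like this (assuming the values really are unique):

ALTER TABLE products ADD UNIQUE KEY uk_products_model (products_model);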
I am currently using the code below to upload CSV files, received via FTP, into a MySQL database. It's working fine. The CSV has a column named STATUS with two values, A and D, but I want to insert only the rows where STATUS is A. How can I do that?
LOAD DATA LOCAL INFILE '/root/782012_10.csv'
INTO TABLE tbl_dndno
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
You could load the file into a temporary table, then insert from the temporary table into your main table with a query that selects the rows you want.
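A minimal sketch of that approach, assuming tbl_dndno has a status column corresponding to the CSV's STATUS field (the temporary table name here is my own):

CREATE TEMPORARY TABLE tmp_dndno LIKE tbl_dndno;

LOAD DATA LOCAL INFILE '/root/782012_10.csv'
INTO TABLE tmp_dndno
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- Keep only the rows flagged A
INSERT INTO tbl_dndno
SELECT * FROM tmp_dndno WHERE status = 'A';

DROP TEMPORARY TABLE tmp_dndno;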
I have a table that looks like this:
products
--------
id, product, sku, department, quantity
There are approximately 800,000 entries in this table. I have received a new CSV file that updates the quantity of each product, for example:
productA, 12
productB, 71
productC, 92
So there are approximately 750,000 updates (50,000 products had no change in quantity).
My question is: how do I import this CSV so that it updates only the quantity, matched on product (which is unique), while leaving sku, department, and the other fields alone? I know how to do this in PHP by looping through the CSV and executing an UPDATE for each line, but that seems inefficient.
You can use LOAD DATA INFILE to bulk load the 800,000 rows of data into a temporary table, then use multiple-table UPDATE syntax to join your existing table to the temporary table and update the quantity values.
For example:
CREATE TEMPORARY TABLE your_temp_table LIKE your_table;
LOAD DATA INFILE '/tmp/your_file.csv'
INTO TABLE your_temp_table
FIELDS TERMINATED BY ','
(id, product, sku, department, quantity);
UPDATE your_table
INNER JOIN your_temp_table on your_temp_table.id = your_table.id
SET your_table.quantity = your_temp_table.quantity;
DROP TEMPORARY TABLE your_temp_table;
I would load the update data into a separate table, UPDATE_TABLE, and perform the update within MySQL using:
UPDATE PRODUCTS P SET P.QUANTITY=(
SELECT UPDATE_QUANTITY
FROM UPDATE_TABLE
WHERE UPDATE_PRODUCT=P.PRODUCT
)
I don't have a MySQL instance at hand right now, so I can't check the syntax perfectly; it may be that you need to add a LIMIT 0,1 to the inner SELECT.
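One caveat worth noting: products with no matching row in UPDATE_TABLE would have their QUANTITY set to NULL by the statement above. A guarded variant (a sketch using the same table names) could be:

UPDATE PRODUCTS P SET P.QUANTITY=(
    SELECT UPDATE_QUANTITY
    FROM UPDATE_TABLE
    WHERE UPDATE_PRODUCT=P.PRODUCT
)
WHERE EXISTS (
    SELECT 1
    FROM UPDATE_TABLE
    WHERE UPDATE_PRODUCT=P.PRODUCT
);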
The answer from @ike-walker is indeed correct, but also remember to double-check how your CSV data is formatted. For example, CSV files often have string fields enclosed in double quotes ("), and lines ending in \r\n if the file was prepared on Windows.
By default, MySQL assumes that no enclosing character is used and that lines end with \n.
More info and examples are here: https://mariadb.com/kb/en/importing-data-into-mariadb/
This can be fixed by using additional options for FIELDS and LINES:
CREATE TEMPORARY TABLE your_temp_table LIKE your_table;
LOAD DATA INFILE '/tmp/your_file.csv'
INTO TABLE your_temp_table
FIELDS
TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' -- new option
LINES TERMINATED BY '\r\n' -- new option
(id, product, sku, department, quantity);
UPDATE your_table
INNER JOIN your_temp_table on your_temp_table.id = your_table.id
SET your_table.quantity = your_temp_table.quantity;
DROP TEMPORARY TABLE your_temp_table;