Need to update lots of unique rows in MySQL from a file

I am using PHP scripts with my MySQL database. I have an inventory of products that is pulled for a search page, and the search page works flawlessly. The one thing I cannot figure out is how to update the stock of each product in the database. I have a new file whose entries need to be matched against the product number, replacing the data in one column of the MySQL table. Much like this below:
MySQL table:
ProductNumber    ProductStock
12345678         1
New file:
12345678         5
Basically, I need each line of the new file matched to its product number in MySQL, and the product stock replaced with the new value. All of this in PHP.

You don't say how many rows you have in that database. Individual UPDATEs are usually pretty slow.
You can:
1. load your "new stocks" file into a temporary table using LOAD DATA INFILE
2. run an UPDATE with a JOIN to set the values in your products table
This will be a few orders of magnitude faster.
Update: here is the syntax to use in MySQL:
UPDATE products p JOIN temp_products t ON (p.id=t.id) SET p.stock = t.stock;
Note that you need special privileges for LOAD DATA INFILE.
You can also use LOAD DATA INFILE LOCAL (check the docs).
Or, you can parse the file with PHP, generate a single multi-valued INSERT into a temp table, and do the joined update. This will be a little slower than LOAD DATA INFILE, but much, much faster than issuing 27000 individual queries.
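A minimal sketch of the temp-table approach described above (the staging table schema, file path, and column names are assumptions; adjust them to your setup):

```sql
-- Staging table matching the stock file (illustrative schema)
CREATE TEMPORARY TABLE temp_products (
    id    INT UNSIGNED NOT NULL PRIMARY KEY,
    stock INT NOT NULL
);

-- Requires the FILE privilege; use LOAD DATA LOCAL INFILE instead
-- if the file lives on the client rather than the server
LOAD DATA INFILE '/path/to/new_stocks.txt'
INTO TABLE temp_products
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, stock);

-- One joined UPDATE instead of thousands of individual statements
UPDATE products p
JOIN temp_products t ON p.id = t.id
SET p.stock = t.stock;
```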

$sql = 'UPDATE table_products SET ProductStock=5 WHERE ProductNumber=12345678';
mysql_query($sql); // note: the old mysql_* extension is deprecated and removed in PHP 7+; use mysqli or PDO
So basically: use an UPDATE statement for the database.
How does the file look?

Well, you will have to open the file, split each line with explode(), and then use preg_match() as the requirements dictate.
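A minimal PHP sketch of that loop, assuming a file of "product,stock" lines, an open mysqli connection in $db, and the table/column names from the question (file name and format are assumptions):

```php
<?php
// Assumes $db is an open mysqli connection and new_stocks.txt
// holds lines like "12345678,5" (hypothetical format)
$stmt = $db->prepare(
    'UPDATE table_products SET ProductStock = ? WHERE ProductNumber = ?'
);

$lines = file('new_stocks.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($lines as $line) {
    list($product, $stock) = explode(',', $line);
    $stmt->bind_param('is', $stock, $product); // stock as int, product as string
    $stmt->execute();
}
$stmt->close();
```

For thousands of rows, the LOAD DATA INFILE plus joined-UPDATE approach from the first answer will be far faster than this row-by-row loop.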

You can use MySQL's LOAD DATA LOCAL INFILE to do this. Write a small import script and run it from cron daily. Is the stock file in CSV format?
An example script would be as follows
$dir = dirname(__FILE__) . DIRECTORY_SEPARATOR;
$stock_file = $dir . "files". DIRECTORY_SEPARATOR."stock.csv";
$stock_query = "
load data local infile \"{$stock_file}\"
replace into table products
fields terminated by ',' enclosed by '\"'
lines terminated by '\n'
ignore 1 lines
(ProductNumber,ProductStock);
";
$m->query($stock_query); // $m is assumed to be an open mysqli connection
Things to consider are what the fields are terminated by and enclosed by. Also, the ignore 1 lines clause is there to skip the header row, if the file has one.

Related

Load Data InFile SQL statement is not allowed to be executed when creating MySQL Events

I know that the LOAD DATA INFILE SQL statement is not allowed to be executed when creating MySQL events.
I am trying to find an alternative solution, but so far I can't.
https://dev.mysql.com/doc/mysql-reslimits-excerpt/5.6/en/stored-program-restrictions.html
Does anyone have any idea of another way to load an external file into tables on a scheduled basis?
Is there any reason you can't just LOAD DATA into a temporary table of the expected input format? Add a few extra columns, such as pre-processing flags, if that might help. That way it is a single load, without any other interaction from triggers and such. Then you could have a stored procedure run the query / fetch / process / insert, etc., based on the records already loaded into the table. If you skip some, so be it; the others process as normal. It is just something I have done in the past and might be an option for you.
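A hedged sketch of that pattern (all table, column, and procedure names are illustrative): the file is loaded into a staging table from outside MySQL, and the scheduled event only calls a stored procedure that processes whatever has landed there:

```sql
-- Staging table with a pre-processing flag (illustrative schema)
CREATE TABLE staging_customers (
    first_name VARCHAR(64),
    last_name  VARCHAR(64),
    processed  TINYINT NOT NULL DEFAULT 0
);

DELIMITER //
CREATE PROCEDURE process_staged_rows()
BEGIN
    -- Move unprocessed staging rows into the real table
    INSERT INTO customers (first_name, last_name)
    SELECT first_name, last_name
    FROM staging_customers
    WHERE processed = 0;

    UPDATE staging_customers SET processed = 1 WHERE processed = 0;
END//
DELIMITER ;

-- The event only calls the procedure (allowed); the LOAD DATA itself
-- must happen outside MySQL, e.g. via a cron-driven script.
-- Requires the event scheduler to be enabled.
CREATE EVENT process_staging
ON SCHEDULE EVERY 5 MINUTE
DO CALL process_staged_rows();
```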
Although I did not quite understand your question: if you are on a Linux-based system, you could create a simple bash script to load any external data into MySQL. Either dump it locally and load it, or pull it from the external source and load it the same way you would load any file.
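A sketch of such a script (credentials, paths, database, and table names are placeholders; this assumes local_infile is enabled on both client and server):

```sh
#!/bin/sh
# Hypothetical import script; adjust paths and credentials.
# Run it from cron on whatever schedule you need.
mysql --local-infile=1 -u importuser -p'secret' sales <<'SQL'
LOAD DATA LOCAL INFILE '/opt/feeds/customers.csv'
INTO TABLE customers
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
SQL
```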
For example, below I am trying to import data from a CSV file into my customers table, and I want to schedule the import every 5 minutes.
Hence I put the LOAD DATA INFILE inside a MySQL event. Code as below.
However, it returns an error: ERROR code 1314: LOAD DATA is not allowed in stored procedures.
Is there an alternative way to import CSV file data into a table on a schedule of every 5 minutes?
USE sales;
CREATE EVENT import_file_event
ON SCHEDULE EVERY 5 MINUTE
DO
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/Importfile.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(@first_name, @last_name, @gender, @email_address, @number_of_complaints)
SET first_name = NULLIF(@first_name,''),
last_name = NULLIF(@last_name,''),
gender = NULLIF(@gender,''),
email_address = NULLIF(@email_address,''),
number_of_complaints = NULLIF(@number_of_complaints,'');

Update table from CSV file, keyed on the first field?

I have a series of CSV files that I want to import into MySQL. To first populate the table I did the following:
mysql -u root -p apn -e "LOAD DATA LOCAL INFILE '/opt/cell.csv' INTO TABLE data FIELDS TERMINATED BY ',';"
Where the CSV contents as;
89xx,31xx,88xx,35xx,ACTIVATED,250.0MB,GPRS,96xx,0,0,2,false,DEFAULT
The one unique field is the first starting with '89xx' (which goes into column named 'iccid').
Now what I want to do is update the table, but I am clueless how to use the first entry in the CSV to update the rest of the row. It is the 4th field that will need updating over time, as that is the value that changes (data usage for a specific cellular device). I don't have much of a problem emptying the table before doing a whole new import, but I was thinking it is better practice to just update, since I will eventually need to update several times a day.
Since I have no practical skills in any language, or MySQL for that matter, would it be best to just insert into a temp table and update from that?
You can use the REPLACE keyword before INTO to update/replace your rows:
mysql -u root -p apn -e "LOAD DATA LOCAL INFILE '/opt/cell.csv' REPLACE INTO TABLE data FIELDS TERMINATED BY ',';"
To skip rows that duplicate an existing unique index instead, use the IGNORE keyword before INTO.
See the LOAD DATA manual.
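Note that REPLACE (and IGNORE) only take effect if the table has a PRIMARY KEY or UNIQUE index against which incoming rows can collide; here that would be the iccid column mentioned in the question. A sketch, assuming the column is not yet indexed:

```sql
-- REPLACE matches incoming rows against unique keys, so the
-- identifying column needs one (column name from the question)
ALTER TABLE data ADD UNIQUE KEY uniq_iccid (iccid);
```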

LOAD DATA INFILE not working on my server

I have a CSV with 400k records that I am trying to import into an existing database table, but it is not working. I get no errors; it simply says MySQL returned an empty result.
Here is the query I am executing.
LOAD DATA INFILE 'C:\\xampp\\htdocs\\projects\\csvTest\\newDataList.csv'
INTO TABLE peeps
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(first, last,address,city,state,zip,imb,email,startDate,endDate,jobName)
SET id = NULL
The funny thing is that I run this exact code on my localhost machine and it works, but on my server machine it does not.
start and end are reserved words, so you either need to enclose the column names in backticks or, better yet, rename those columns.
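For example, with the suspect identifiers from the question's query quoted in backticks (a sketch; backticks are only strictly needed for names that clash with reserved words):

```sql
LOAD DATA INFILE 'C:\\xampp\\htdocs\\projects\\csvTest\\newDataList.csv'
INTO TABLE peeps
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
-- backticks protect any column name that is (or resembles) a keyword
(`first`, `last`, address, city, state, zip, imb, email,
 `startDate`, `endDate`, jobName)
SET id = NULL;
```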

Regular transfer of .csv to database

I am working on a program that will regularly read the data from a .csv file and import it into my database. The CSV is a copy from a database on another server, so the table structure will be different when I upload it to the new one.
What I am unsure of is the best method to do this on a nightly basis, and hopefully automate the process. Any suggestions?
The database is MySQL on an Apache server.
Consider using a LOAD DATA INFILE query in a timed script (PHP, Python, or another language) to upload into a temp table:
LOAD DATA INFILE 'filename.csv'
INTO TABLE tempname
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
Then run an append query migrating the differently structured temp table data into the final table.
INSERT INTO finaltable (Col1, Col2, Col3, ...)
SELECT Col1, Col2, Col3, ... FROM tempname;
The best solution in my opinion is to create a PHP script that manipulates the CSV data to match the format of the tables in your database. After that, you can set up a cron job (Linux) or scheduled task (Windows) to run the script automatically at your desired time and recurrence. Hope this helps.
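For the cron side, an entry like this runs the import script nightly at 2am (the PHP binary path, script path, and log path are placeholders):

```
# m h dom mon dow  command   (hypothetical paths)
0 2 * * * /usr/bin/php /var/www/scripts/import_csv.php >> /var/log/csv_import.log 2>&1
```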

Trigger for LOAD DATA LOCAL INFILE

I am importing a CSV file into a database table using LOAD DATA LOCAL INFILE. The complete query, for the curious, is:
LOAD DATA LOCAL INFILE '".addslashes($current_file)."' REPLACE INTO TABLE $current_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '" . '"' . "' ESCAPED BY " . "'\\\\'" . " LINES TERMINATED BY '\n' IGNORE 1 LINES ($toVars) SET $setCols
Now, what that does in a nutshell is load CSV rows into table_A. What I want to do is create a memory table that logs every single row ever entered into table_A. I am going to achieve this using MySQL triggers. I am familiar with using triggers on plain inserts, but my question is: how do I make an AFTER trigger for a LOAD DATA LOCAL INFILE command?
The trigger should fire only once, when the LOAD DATA LOCAL INFILE query has completed everything; it shouldn't fire on each incremental insert.
Do you have a table that tracks these bulk uploads, like who ran them, how long they took, etc?
If so, you can use it as a sentinel and put the trigger there to do the bulk operations on the freshly loaded data in the other table.
Make sense?
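A sketch of that sentinel pattern (table, column, and trigger names are illustrative, and it assumes table_A has a timestamp column recording when each row was imported): the loading script inserts one row into an uploads log after its LOAD DATA finishes, and the trigger on that log table does the post-load work once per upload:

```sql
-- Tracks each bulk upload; one row is inserted by the loading
-- script after LOAD DATA completes (illustrative schema)
CREATE TABLE bulk_uploads (
    id        INT AUTO_INCREMENT PRIMARY KEY,
    loaded_at DATETIME NOT NULL,
    row_count INT
);

-- Fires once per completed upload, not once per loaded row
DELIMITER //
CREATE TRIGGER after_bulk_upload
AFTER INSERT ON bulk_uploads
FOR EACH ROW
BEGIN
    INSERT INTO table_A_log
    SELECT * FROM table_A
    WHERE imported_at >= NEW.loaded_at;  -- assumes a timestamp column on table_A
END//
DELIMITER ;
```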