MySQL - Select and update at the same time

I have this query:
SELECT * FROM outbox WHERE Status = 0;
Then I need to update the selected records so that Status equals 1, i.e. something like:
UPDATE outbox (the records selected by the SELECT query) SET Status = 1
Any help?

This is a much harder problem than it sounds. Yes, in the simplistic case where you are only thinking of one user and a few records, it seems easy. But databases are designed to be ACID-compliant, with multiple users and multiple concurrent transactions that can all be affecting the data at the same time. And there is no single statement in MySQL that does what you want (other databases support an OUTPUT clause, a RETURNING clause, or something similar).
One structure that will work in MySQL is to place the items in a temporary table, then do the update, then return them. The following shows the semantics using transactions:
start transaction;
create temporary table TempOutboxStatus0 as
select *
from outbox
where status = 0;
update outbox o
set status = 1
where status = 0;
select *
from TempOutboxStatus0;
commit;
For the update, I actually prefer:
where exists (select 1 from TempOutboxStatus0 t where t.outboxid = o.outboxid);
because its intention is clearer -- and the code is safer in case the conditions subtly change.
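For reference, a minimal sketch of the full UPDATE written that way, assuming (as in the snippet above) that outboxid is the key column of outbox:
-- a sketch only; outboxid is assumed to be the primary key of outbox
update outbox o
set status = 1
where exists (select 1 from TempOutboxStatus0 t where t.outboxid = o.outboxid);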
Note: you may want to use explicit table locks. Such considerations depend on the storage engine you are using.
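For instance, with MyISAM (which has no transactions) a table write lock can stand in for the transaction above; this is only a sketch, and with InnoDB the transactional form is usually preferable:
-- sketch only: temporary tables are exempt from LOCK TABLES restrictions
lock tables outbox write;
create temporary table TempOutboxStatus0 as
select * from outbox where status = 0;
update outbox set status = 1 where status = 0;
unlock tables;
select * from TempOutboxStatus0;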

BEGIN
    START TRANSACTION;

    SELECT *
    FROM outbox
    WHERE Status = 0
      AND Is_Expired = 0
      AND Service_ID = p_service_id
    ORDER BY Next_Try_Date ASC
    FOR UPDATE;

    UPDATE outbox
    SET Status = 1
    WHERE Status = 0
      AND Is_Expired = 0
      AND Service_ID = p_service_id;

    COMMIT;
END
Is this possible? It seems to work for me.

You can do something like this, where outbox is your table:
update outbox set Status = 1 where Status = 0

You can do it like below:
$sql = mysql_query("SELECT * FROM `outbox` WHERE `Status` = 0");
while ($result = mysql_fetch_array($sql))
{
    // replace `your_column_name` with your key column and use the value fetched for that row
    mysql_query("UPDATE `outbox` SET `Status` = 1 WHERE `your_column_name` = '" . $result['your_column_name'] . "'");
}

Related

MySQL: bulk updating in table

I'm using MySQL 5.6 and I have this issue.
I'm trying to improve my bulk update strategy for this case.
I have a table, called reserved_ids, provided by an external company, to assign unique IDs to its invoices. There is no other way to do this; I can't use auto_increment fields or simulated sequences.
I have this stored-procedure pseudocode to make the assignment:
START TRANSACTION;
OPEN invoice_cursor;
read_loop: LOOP
    FETCH invoice_cursor INTO internalID;
    IF done THEN
        LEAVE read_loop;
    END IF;

    SELECT MIN(SECUENCIAL)
    INTO v_secuencial
    FROM RESERVED_IDS
    WHERE COUNTRY_CODE = p_country_id AND INVOICE_TYPE = p_invoice_type;

    DELETE FROM RESERVED_IDS WHERE SECUENCIAL = v_secuencial;

    UPDATE MY_INVOICE SET RESERVED_ID = v_secuencial WHERE INVOICE_ID = internalID;
END LOOP read_loop;
CLOSE invoice_cursor;
COMMIT;
So it's take one, remove, assign; then take the next, remove, assign; and so on.
This works, but it's very, very slow.
I don't know of any approach that would make this assignment faster.
I'm looking for something like INSERT INTO ... SELECT, but with an UPDATE statement, to assign 1000 or 2000 IDs at once rather than one by one.
Any suggestion would be very helpful.
Thanks a lot.
EDIT 1: I have added the WHERE clause details requested by user @vmachan. In the UPDATE ... MY_INVOICE statement I don't filter by other criteria, because I have the direct, indexed invoice ID that I want to update. Thanks.
Finally, I have this solution. It's much faster than my initial approach.
The UPDATE query is:
SET @a = 0;
SET @b = 0;

UPDATE MY_INVOICE
INNER JOIN
(
    SELECT
        F.internal_id,
        I.secuencial AS RESERVED_ID,
        CONCAT_WS(/* format your final invoice ID */) AS FINAL_MY_INVOICE_NUMBER
    FROM
    (
        SELECT IF(@a, @a := @a + 1, @a := 1) AS current_row, internal_id
        FROM MY_INVOICE
        WHERE reserved_id IS NULL
        ORDER BY internal_id ASC
        LIMIT 2000
    ) F
    INNER JOIN
    (
        SELECT IF(@b, @b := @b + 1, @b := 1) AS current_row, secuencial
        FROM reserved_ids
        ORDER BY secuencial ASC
        LIMIT 2000
    ) I USING (current_row)
) TEMP ON MY_INVOICE.internal_id = TEMP.internal_id
SET MY_INVOICE.RESERVED_ID = TEMP.RESERVED_ID,
    MY_INVOICE.FINAL_MY_INVOICE_NUMBER = TEMP.FINAL_MY_INVOICE_NUMBER;
So, with the autogenerated and correlated sequential numbers @a and @b, we can join two different, unrelated tables like MY_INVOICE and RESERVED_IDS.
If you want to check this solution, please execute this tricky update following these steps:
Set @a and then execute the first inner select on its own: SELECT IF(@a, @a := @a + 1, ...
Set @b and then execute the second inner select on its own: SELECT IF(@b, @b := @b + 1, ...
Set @a and @b and execute the big select that builds the TEMP auxiliary table: SELECT F.internal_id, ...
Execute the UPDATE
Finally, remove the assigned IDs from the RESERVED_IDS table.
Assignment time dropped drastically. My initial solution was one by one; with this, you assign 2000 (or more) in a single (OK, slightly tricky) update.
Hope this helps.
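For the last step (removing the consumed IDs), a minimal sketch, assuming MY_INVOICE.RESERVED_ID now holds the values just taken from RESERVED_IDS:
-- sketch only: delete the reserved IDs that have just been assigned
DELETE RESERVED_IDS
FROM RESERVED_IDS
INNER JOIN MY_INVOICE ON MY_INVOICE.RESERVED_ID = RESERVED_IDS.SECUENCIAL;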

Using SELECT...FOR UPDATE, then updating multiple rows

I'm writing an application to:
select a small recordset from a table of subscribers (150k records);
update those rows to indicate that an email is in the process of being sent;
send email to the subscribers in the recordset;
update the rows again to indicate that the email has been sent.
The wrinkle is that the table is simultaneously being accessed by multiple clients to distribute the email workload, which is why the intermediate update (to mark rows as in process) is used -- to keep the different clients from selecting the same rows, which would result in multiple emails being sent to the same subscriber. I've applied some randomizing logic to reduce the likelihood of two clients working with the same data, but it still happens occasionally.
So now I am looking at using SELECT ... FOR UPDATE in order to lock the relevant rows (so another client won't select them). My question: is it better to write the UPDATE statement based on the IDs of the SELECT...FOR UPDATE statement, or to create a loop to update each row individually?
Here's what I've got so far:
DELIMITER $$
CREATE DEFINER=`mydef`@`%` PROCEDURE `sp_SubscribersToSend`(v_limit INTEGER)
BEGIN
START TRANSACTION;
SELECT _ID, email, date_entered, DATE_FORMAT(date_entered, '%b %e, %Y') AS 'date_entered_formatted'
FROM _subscribers
WHERE send_state = 'Send'
AND status = 'Confirmed'
LIMIT v_limit
FOR UPDATE;
[[UPDATE _subscribers SET send_state = 'Sending' WHERE _ID IN (...?)]]
[[OR]]
[[Loop through the resultset and update each row?]]
COMMIT;
END
Seems like a single UPDATE is going to be more efficient; what is the best way to turn the _ID column of the resultset into a comma-delimited list for the IN() clause? (I've been doing this client-side before this) -- or is there a better way altogether?
Instead of trying to create a comma-delimited list, just do an UPDATE with the same criteria as the SELECT:
START TRANSACTION;
SELECT _ID, email, date_entered, DATE_FORMAT(date_entered, '%b %e, %Y') AS 'date_entered_formatted'
FROM _subscribers
WHERE send_state = 'Send'
AND status = 'Confirmed'
ORDER BY <something>
LIMIT v_limit
FOR UPDATE;
UPDATE _subscribers
SET send_state = 'Sending'
WHERE send_state = 'Send'
AND status = 'Confirmed'
ORDER BY <something>
LIMIT v_limit;
COMMIT;
Note that the SELECT ... FOR UPDATE has to run before the UPDATE; once send_state is set to 'Sending', the same criteria would no longer match those rows.
The ORDER BY clause is necessary to ensure that both queries process the same rows; if you use LIMIT without ORDER BY, they could select a different subset of rows.
Thanks to Barmar, I took a different tack in the stored procedure:
SET @IDs := NULL;
UPDATE _subscribers
SET send_state = 'Sending'
WHERE send_state = 'Send'
AND status = 'Confirmed'
AND (SELECT @IDs := CONCAT_WS(',', _ID, @IDs))
LIMIT v_limit;
SELECT CONVERT(@IDs USING utf8);
As Barmar suggested, it does an UPDATE but also concatenates the IDs of the rows being updated into a variable. Just SELECT that variable, and it gives you a comma-delimited list that can be passed into a PREPARE statement. (I had to use CONVERT because SELECTing the variable was returning a binary/blob value). So...this does not use SELECT...FOR UPDATE as I was originally intending, but it does ensure that the different clients won't be working with the same rows.
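As a rough illustration (not from the original post), the captured list could be fed to a prepared statement along these lines:
-- sketch only: build and run a statement from the @IDs list captured above
SET @sql = CONCAT('SELECT _ID, email FROM _subscribers WHERE _ID IN (', @IDs, ')');
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;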

SELECT * FROM WHERE does not find all records

When I type:
select * from 'saleslog' where 'Status' = 'Pending';
or
select * from 'saleslog' where 'Status' = "Pending";
or
select * from saleslog where Status = 'Pending';
despite the fact that there are hundreds of rows with a "Pending" value in the Status column, I get only 3 records displayed. The same happens when I look for values other than "Pending".
If however, I type:
select * from saleslog where status like "%Pending%";
then most, if not all, records are displayed. There are absolutely no spaces or any other characters in front of or behind the Pending value.
I am wondering if the table "saleslog" needs to be repaired and, if so, how? I'm kind of new to SQL.
It's possible there are hidden characters in the field that you just can't see, such as a tab, carriage return or line feed. Have you tried doing an update on the field to try to correct it? Maybe try running the update query below and then run your SELECT query again:
UPDATE saleslog SET status = 'Pending' WHERE status LIKE '%Pending%'
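To confirm whether hidden characters really are the cause before changing any data, a quick diagnostic along these lines (a sketch, not part of the original answer) can help:
-- sketch only: show the raw bytes of values that match LIKE but not =
SELECT HEX(Status), COUNT(*)
FROM saleslog
WHERE Status LIKE '%Pending%' AND Status <> 'Pending'
GROUP BY HEX(Status);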
Try the following:
UPDATE `saleslog` SET `status` = TRIM(`status`);
SELECT * FROM `saleslog` WHERE `status` = 'Pending';
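Note that TRIM() only strips spaces by default; if the stray characters are tabs, carriage returns or line feeds (as suggested above), something like this sketch would be needed instead:
-- sketch only: strip tabs, carriage returns and line feeds as well as spaces
UPDATE `saleslog`
SET `status` = TRIM(REPLACE(REPLACE(REPLACE(`status`, '\t', ''), '\r', ''), '\n', ''));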

pure MySQL loop to update multiple rows

I want to update multiple rows based on a SELECT sql query.
I want to do it ALL IN AN SQL SHELL!
Here is my select:
SELECT @myid := id, @mytitle := title
FROM event
WHERE pid>0 GROUP BY title
ORDER BY start;
Then, I want to do an update with this pseudocode:
foreach($mytitle as $t)
BEGIN
UPDATE event
SET pid=$myid
WHERE title=$t;
END
But I don't know how to make a loop in SQL.
Maybe there's a way to do it in a single SQL query?
I DON'T WANT ANY PHP!!! ONLY SQL SHELL CODE!!!
I want to update every row's pid with the id of the first occurrence of an event. start is a timestamp.
I think this should do what you want, but if it doesn't (I'm not sure about joining a subquery in an UPDATE query) then you can use a temporary table instead.
UPDATE
event
JOIN (
SELECT
MIN(pid) AS minPID,
title
FROM
event
WHERE
pid > 0
GROUP BY
title
) AS findPIDsQuery ON event.title = findPIDsQuery.title
SET
event.pid = findPIDsQuery.minPID
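If the joined subquery turns out not to be accepted, the temporary-table fallback mentioned above could look roughly like this (a sketch, not part of the original answer):
-- sketch only: materialize the first pid per title, then join for the update
CREATE TEMPORARY TABLE firstPIDs AS
SELECT MIN(pid) AS minPID, title
FROM event
WHERE pid > 0
GROUP BY title;
UPDATE event
JOIN firstPIDs ON event.title = firstPIDs.title
SET event.pid = firstPIDs.minPID;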
Pure SQL doesn't really have "loops", per se: it's a set-based, declarative language. I believe the following update will do what you want (though your problem statement leaves much to be desired -- we know nothing about the underlying schema).
update event t
set pid = ( select min(id)
from event x
where x.title = t.title
and x.pid > 0
group by x.title
having count(*) > 1
)
Cheers!

Update with SELECT and group without GROUP BY

I have a table like this (MySQL 5.0.x, MyISAM):
response{id, title, status, ...} (status: 1 new, 3 multi)
I would like to update the status from new (status=1) to multi (status=3) for all the responses whose title appears at least 20 times.
I have this query, but it does not work:
UPDATE response SET status = 3 WHERE status = 1 AND title IN (
SELECT title FROM (
SELECT DISTINCT(r.title) FROM response r WHERE EXISTS (
SELECT 1 FROM response spam WHERE spam.title = r.title LIMIT 20, 1)
)
as u)
Please note:
I do the nested select to avoid the famous "You can't specify target table 'response' for update in FROM clause" error
I cannot use GROUP BY for performance reasons. The query cost with a solution using LIMIT is way better (but it is less readable).
EDIT:
It is possible to SELECT FROM an UPDATE target in MySQL. See the solution here.
The issue is with the data selected, which is totally wrong.
The only solution I found that works uses a GROUP BY:
UPDATE response SET status = 3
WHERE status = 1 AND title IN (SELECT title
FROM (SELECT title
FROM response
GROUP BY title
HAVING COUNT(1) >= 20)
as derived_response)
Thanks for your help! :)
MySQL doesn't like it when you try to UPDATE and SELECT from the same table in one query. It has to do with locking priorities, etc.
Here's how I would solve this problem:
SELECT CONCAT('UPDATE response SET status = 3 ',
'WHERE status = 1 AND title = ', QUOTE(title), ';') AS `sql`
FROM response
GROUP BY title
HAVING COUNT(*) >= 20;
This query produces a series of UPDATE statements, with the quoted titles that deserve to be updated embedded. Capture the result and run it as an SQL script.
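The output is a set of statements of this shape (the titles below are purely hypothetical):
UPDATE response SET status = 3 WHERE status = 1 AND title = 'some repeated title';
UPDATE response SET status = 3 WHERE status = 1 AND title = 'another repeated title';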
I understand that GROUP BY in MySQL often incurs a temporary table, and this can be costly. But is that a deal-breaker? How frequently do you need to run this query? Besides, any other solutions are likely to require a temporary table too.
I can think of one way to solve this problem without using GROUP BY:
CREATE TEMPORARY TABLE titlecount (c INTEGER, title VARCHAR(100) PRIMARY KEY);
INSERT INTO titlecount (c, title)
SELECT 1, title FROM response
ON DUPLICATE KEY UPDATE c = c+1;
UPDATE response JOIN titlecount USING (title)
SET response.status = 3
WHERE response.status = 1 AND titlecount.c >= 20;
But this also uses a temporary table, which is why you try to avoid using GROUP BY in the first place.
I would write something straightforward like the query below:
UPDATE `response`, (
SELECT title, count(title) as count from `response`
WHERE status = 1
GROUP BY title
) AS tmp
SET response.status = 3
WHERE status = 1 AND response.title = tmp.title AND count >= 20;
Is using GROUP BY really that slow? The solution you tried to implement looks like it queries the same table again and again, and it should be way slower than using GROUP BY even if it worked.
This is a funny peculiarity with MySQL - I can't think of a way to do it in a single statement (GROUP BY or no GROUP BY).
You could select the appropriate response rows into a temporary table first then do the update by selecting from that temp table.
You'll have to use a temporary table:
create temporary table r_update (title varchar(100));

insert into r_update
select title
from response
group by title
having count(*) < 20;

update response r
left outer join r_update ru
  on ru.title = r.title
set r.status = case when ru.title is null then 3 else 1 end;