Combining 2 Queries with OR operator - mysql

I'm trying to insert something into a table and also delete from it at the same time, so my query is like this:
$query = mysqli_query($connect, "SELECT * FROM inventory_item WHERE status = 'Unserviceable' OR DELETE * FROM inventory_item WHERE status = 'Available'")
    or die("Error: Could not fetch rows!");
$count = 0;
I wanted to insert data with Unserviceable status and at the same time delete data with Available status, but it's not working.
I'm not really familiar with queries and am just starting out.

This is not valid SQL syntax.
If you want to issue two queries, one to INSERT and one to DELETE, then you can send them as two separate calls to mysqli_query(). There is also an alternative function, mysqli_multi_query(), which allows multiple statements to be sent in a single call; see the PHP manual for details.
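A minimal sketch of the mysqli_multi_query() approach, assuming the $connect handle and table from the question; only the SELECT produces a result set, the DELETE does not:

$sql = "SELECT * FROM inventory_item WHERE status = 'Unserviceable';"
     . "DELETE FROM inventory_item WHERE status = 'Available'";
if (mysqli_multi_query($connect, $sql)) {
    do {
        // Only the SELECT returns rows; the DELETE yields no result set.
        if ($result = mysqli_store_result($connect)) {
            while ($row = mysqli_fetch_assoc($result)) {
                // ...work with each Unserviceable row here...
            }
            mysqli_free_result($result);
        }
    } while (mysqli_more_results($connect) && mysqli_next_result($connect));
}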
Finally, if you want the two separate queries to execute as a single unit (that is, if one of them fails then neither takes effect), then you should research the subject of database transactions, which allow you to execute multiple queries and commit or roll back the entire set as a unit.
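A minimal transaction sketch, assuming the tables are InnoDB (MyISAM ignores transactions) and using a hypothetical column list for the INSERT:

mysqli_begin_transaction($connect); // requires PHP 5.5+
$ok = mysqli_query($connect, "INSERT INTO inventory_item (status) VALUES ('Unserviceable')")
   && mysqli_query($connect, "DELETE FROM inventory_item WHERE status = 'Available'");
if ($ok) {
    mysqli_commit($connect);   // both changes become permanent together
} else {
    mysqli_rollback($connect); // neither change takes effect
    echo "Error: " . mysqli_error($connect);
}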

Related

How to execute multiple SQL queries at same time

I want to know if we can execute multiple queries at the same time in MySQL/SQL. Let me explain a scenario to elaborate my question further.
Let's assume we have to create and load two tables: create table tbl1(col, col, col, ...); insert into tbl1 (val, val, val, ...); and, as the other pair, create table tbl2(col, col, col, ...); insert into tbl2 (val, val, val, ...). Now, when I execute the statements the flow will be:
Create Table1
Insert Into Table1
Create Table2
Insert Into Table2
Is there any method we can use to reduce these 4 steps into a single step, similar to the functionality of threads that run in parallel?
You can use two different instances of SSMS, or maybe different tabs within SSMS.
Another solution is to run the 2 queries at the same time with a maintenance plan; see the linked article for more details.
You can chain multiple queries by separating them with ";"; see How to run multiple SQL queries? for further details.
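A minimal sketch of that chaining, with hypothetical table definitions; the client or driver must be configured to accept multi-statement batches:

create table tbl1 (id int, name varchar(50));
insert into tbl1 (id, name) values (1, 'first');
create table tbl2 (id int, name varchar(50));
insert into tbl2 (id, name) values (1, 'second');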
In your setup, 1. needs to be executed before 2. (and likewise 3. before 4.), because you cannot insert data into a table that does not exist yet. So running these 4 queries in parallel is not possible. However, running 1+2 and 3+4 in parallel is possible.

Limiting SQL Server 2008 Query execution

In MySQL, if I have multiple queries on one page and I end each query with a ";", this prevents the next query from being run accidentally; it ensures that only that query is run.
Is there a similar feature in SQL Server 2008? If I have multiple queries on one page, is there a character or operator that can be added at the end of a query to stop execution there, so only that query is run?
I understand that typically only the highlighted query will run, but I have seen the whole page get executed (almost always by some mishap), so I was looking for a way to protect against and prevent any mishaps.
EDIT: I am using SQL Server Management Studio (new query page)
thanks
There are various ways to do this. Some methods I've used before are:
Use of RETURN:
Select * From Table Where Foo = 'Bar'
Return
Delete Table Where Foo = 'Bar'
RETURN ends the batch at that point, so the first query runs but not the second.
Use of NOEXEC:
Set NoExec Off
Select * From Table Where Foo = 'Bar'
Set NoExec On
Delete Table Where Foo = 'Bar'
SET NOEXEC ON disables the execution of all SQL statements while it is active; statements are compiled but not run.
Some good old-fashioned commenting:
Select * From Table Where Foo = 'Bar'
--Delete Table Where Foo = 'Bar'
There are other things you can try (such as RAISERROR), but in general, the best prevention is to separate your scripts into different files when possible and to be mindful of what you're executing, either by way of highlighting or by reviewing the file before you press F5.
If I understand your question correctly, you could place a
RAISERROR('Oops', 18, 1)
between each query. This will halt execution after the first (highlighted) query is run.

UPDATE table one... and INSERT into table two

When I try to UPDATE one table and INSERT into another on the same user action, only the INSERT or the UPDATE completes, depending on which line comes last.
Is there a way to combine these two?
$sql = "INSERT INTO photos(user, gallery, filename, uploaddate)
VALUES ('$log_username','profile pictures','$db_file_name',now())";
$sql = "UPDATE users SET avatar='$db_file_name'
WHERE username='$log_username' LIMIT 1";
Have you thought about using a trigger on the photos table? You could set up a trigger to execute every time an insert occurs and have it update the users table.
Here's a link to check out:
http://www.mysqltutorial.org/create-the-first-trigger-in-mysql.aspx
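A minimal sketch of such a trigger, assuming the column names shown in the question (photos.user, photos.gallery, photos.filename and users.username, users.avatar):

DELIMITER $$
CREATE TRIGGER photos_after_insert
AFTER INSERT ON photos
FOR EACH ROW
BEGIN
    -- Only a profile-picture upload should change the avatar
    IF NEW.gallery = 'profile pictures' THEN
        UPDATE users SET avatar = NEW.filename WHERE username = NEW.user;
    END IF;
END$$
DELIMITER ;

With this in place, the application only runs the INSERT into photos and the users row is updated automatically.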
1) Execute the first query and then execute the second query.
2) Use a transaction if you want both queries to succeed or fail together; if either one fails, the transaction control mechanism will revert the changes.
Also, what error are you getting when you fire the queries one by one?
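A minimal sketch of option 2 with mysqli transactions, assuming the $connect handle and the variables from the question, and that both tables are InnoDB; prepared statements are used here so the values are not interpolated directly into the SQL:

mysqli_begin_transaction($connect);

$ins = mysqli_prepare($connect, "INSERT INTO photos (user, gallery, filename, uploaddate)
                                 VALUES (?, 'profile pictures', ?, now())");
mysqli_stmt_bind_param($ins, "ss", $log_username, $db_file_name);

$upd = mysqli_prepare($connect, "UPDATE users SET avatar = ? WHERE username = ? LIMIT 1");
mysqli_stmt_bind_param($upd, "ss", $db_file_name, $log_username);

if (mysqli_stmt_execute($ins) && mysqli_stmt_execute($upd)) {
    mysqli_commit($connect);   // both changes saved together
} else {
    mysqli_rollback($connect); // neither change takes effect
}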

Can I INSERT/UPDATE into two tables with one query?

Here is a chunk of the SQL I'm using for a Perl-based web application. I have a number of requests and each has a number of accessions, and each has a status. This chunk of code is there to update the table for every accession_analysis that shares all these fields for each accession in a request.
UPDATE accession_analysis
SET analysis_id = ? ,
reference_id = ? ,
status = ? ,
extra_parameters = ?
WHERE analysis_id = ?
AND reference_id = ?
AND status = ?
AND extra_parameters = ?
AND accession_id IN (
    SELECT accession_id
    FROM accessions
    WHERE request_id = ?
)
I have changed the tables so that there's a status table for accession_analysis; when I update, I now update both accession_analysis and accession_analysis_status, which has status, status_text, and the id of the accession_analysis, which is a NOT NULL auto_increment column.
I have no strong idea about how to modify this code to allow this. My first pass grabbed all the accessions and looped through them, then filtered for all the fields, then updated. I didn't like that because I had many connections with short SQL commands, which I understood to be bad, but I can't help but think the only way to really do this is to go back to the loop in Perl holding two simpler SQL statements.
Is there a way to do this in SQL that, with my relative SQL inexperience, I'm just not seeing?
The answer depends on which DBMS you're using. The easiest way is to create a trigger on one table that provides the logic of updating the other table. (For any DB newbies -- a trigger is procedural code attached to a table at the DBMS (not application) layer that runs in response to an insert, update or delete on the table.). A similar, slightly less desirable method is to put the logic in a stored procedure and execute that instead of the update statement you're now using.
If the DBMS you're using doesn't support either of these mechanisms, then there isn't a good way to do what you're after while guaranteeing transactional integrity. However if the problem you're solving can tolerate a timing difference in the two tables' updates (i.e. The data in one of the tables is only used at predetermined times, like reporting or some type of batched operation) you could write to one table (live) and create a separate process that runs when needed (later) to update the second table using data from the first table. The correctness of allowing data to be updated at different times becomes a large and immovable design assumption, however.
If this is mostly about connection speed, then one option you have is to write a stored procedure that handles the "double update or insert" transparently. See the manual for stored procedures:
http://dev.mysql.com/doc/refman/5.5/en/create-procedure.html
Otherwise, you probably cannot do it in one statement; see the MySQL INSERT syntax:
http://dev.mysql.com/doc/refman/5.5/en/insert.html
The UPDATE syntax allows for multi-table updates (not in combination with INSERT, though):
http://dev.mysql.com/doc/refman/5.5/en/update.html
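A minimal sketch of such a procedure, with hypothetical parameter and column names (the question does not show the exact schema of accession_analysis_status, and the id column of accession_analysis is assumed):

DELIMITER $$
CREATE PROCEDURE update_analysis_and_status(
    IN p_analysis_id INT,
    IN p_status VARCHAR(50),
    IN p_status_text TEXT
)
BEGIN
    START TRANSACTION;
    -- update the main row
    UPDATE accession_analysis
       SET status = p_status
     WHERE id = p_analysis_id;
    -- record the status change in the new status table
    INSERT INTO accession_analysis_status (accession_analysis_id, status, status_text)
    VALUES (p_analysis_id, p_status, p_status_text);
    COMMIT;
END$$
DELIMITER ;

The application then issues a single CALL update_analysis_and_status(?, ?, ?) per row instead of two separate statements.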
Each table needs its own INSERT / UPDATE in the query.
In fact, even if you create a view by JOINing multiple tables, when you INSERT into the view, you can only INSERT with fields belonging to one of the tables at a time.
The modifications made by the INSERT statement cannot affect more than one of the base tables referenced in the FROM clause of the view. For example, an INSERT into a multitable view must use a column_list that references only columns from one base table. For more information about updatable views, see CREATE VIEW.
Inserting data into multiple tables through an sql view (MySQL)
INSERT (SQL Server)
The same is true of UPDATE:
The modifications made by the UPDATE statement cannot affect more than one of the base tables referenced in the FROM clause of the view. For more information on updatable views, see CREATE VIEW.
However, you can have multiple INSERTs or UPDATEs per query or stored procedure.

Can I launch a trigger on select statement in mysql?

I am trying to run an INSERT statement on table X each time I SELECT any record from table Y. Is there any way I can accomplish that using MySQL only?
Something like triggers?
The short answer is no. Triggers are fired by INSERT, UPDATE, or DELETE.
Possible solution for this rather rare scenario:
First, write some stored procedures that do the SELECTs you want on table Y.
Then, restrict all users to use only these stored procedures and do not allow them to SELECT directly from table Y.
Then alter the stored procedures to also call a stored procedure that performs the action you want (the INSERT into table X, or whatever).
Nope, you can't trigger on SELECT. You'll have to create a stored procedure (or some other type of logging facility, like a log file) that you call alongside any query statement. It's easier if you create a wrapper that runs your query, calls the logging, and returns the query results.
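A minimal sketch of that wrapper idea as a stored procedure, with hypothetical table and column names:

DELIMITER $$
CREATE PROCEDURE select_from_y()
BEGIN
    -- log the read into X first, then return the rows from Y
    INSERT INTO X (event, logged_at) VALUES ('read from Y', NOW());
    SELECT * FROM Y;
END$$
DELIMITER ;

Clients then run CALL select_from_y(); instead of querying Y directly.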
If you're trying to use table X to log the order of SELECT queries on table Y (a fairly common query-logging setup), you can simply reverse the order of operations and run the INSERT query first, then run your SELECT query.
That way, you don't need to worry about linking the two statements with a TRIGGER: if your server crashes between the two statements then you already logged what you care about with your first statement, and whether the SELECT query runs or fails has no impact on the underlying database.
If you're not logging queries, perhaps you're trying to use table Y as a task queue -- the situation I was struggling with that led me to this thread -- and you want whichever session queries Y first to lock all other sessions out of the rows returned, so you can perform some operations on the results and insert the output into table X. In that case, simply add some logging capabilities to table Y.
For example, you could add an "owner" column to Y, then tack the WHERE part of your SELECT query onto an UPDATE statement, run it, and then modify your SELECT query to only show the results that were claimed by your UPDATE:
UPDATE Y SET owner = 'me' WHERE task = 'new' AND owner IS NULL;
SELECT foo FROM Y WHERE task = 'new' AND owner = 'me';
...do some work on foo, then...
INSERT INTO X (output) VALUES ('awesomeness');
Again, the key is to log first, then query.