Mysql2 UPDATE query is taking too much time to update row

I am updating a row in a MySQL database using the UPDATE statement (in an Express server using mysql2). If the data I am updating with is the same as the data already in the row, the query takes the usual amount of time. But if I update the table with different data, it takes much longer. Here is my code:
public update = async (obj: TUpCred): Promise<ResultSetHeader> => {
  const sql = 'UPDATE ?? SET ? WHERE ?';
  const values = [obj.table, obj.data, obj.where];
  const [data] = (await this.connection.query({
    sql,
    values,
  })) as TResultSetHeader;
  return data;
};
This query takes a long time: usually 4-6 seconds, but sometimes even 10-15 seconds. The same happens with INSERT queries, while other queries like SELECT execute in normal time.
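For reference, this is roughly what mysql2 expands those placeholders to; a sketch with hypothetical table and column names, using format(), which mysql2 re-exports from sqlstring and which applies the same expansion rules as query():
import mysql from 'mysql2';

// Hypothetical names, for illustration only.
const sql = mysql.format('UPDATE ?? SET ? WHERE ?', [
  'users',                   // ??  -> `users` (escaped identifier)
  { name: 'Jane', age: 30 }, // ?   -> `name` = 'Jane', `age` = 30
  { id: 42 },                // ?   -> `id` = 42
]);
console.log(sql);
// UPDATE `users` SET `name` = 'Jane', `age` = 30 WHERE `id` = 42
// Note: an object after WHERE is expanded with commas, not AND, so this
// placeholder style is only safe with a single-column condition.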

Related

Prisma (ORM) - get the index value of inserted row

I'm trying to use Prisma (ORM) to manage my MySQL database.
When I used MySQL directly I could run mysql_insert_id() after an insert command to get the auto_increment index values I had just inserted.
How can I achieve this in Prisma?
The return value of insert is the affected rows, not the indexes.
EDIT
If you use prisma.create() it does return the object with its new id.
But if you use prisma.createMany() it returns only the count of affected rows?!
Someone care to explain the design behind this?
You would need to use a raw query to execute the insert statement and read back the index values.
From the documentation:
Use $queryRaw to return actual records.
Use $executeRaw to return a count of affected rows
So you would need to use the queryRaw method.
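For example, a minimal sketch (assuming a MySQL datasource and a post table, and using an interactive transaction so that both statements run on the same pooled connection, which LAST_INSERT_ID() requires):
const newId = await prisma.$transaction(async (tx) => {
  // Run the INSERT as raw SQL...
  await tx.$executeRaw`INSERT INTO post (title, content) VALUES (${title}, ${content})`;
  // ...then read the auto_increment id generated on this same connection.
  const rows = await tx.$queryRaw<{ id: bigint }[]>`SELECT LAST_INSERT_ID() AS id`;
  return rows[0].id;
});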
You can use it like this:
const now = new Date();
const createdRecord = await this.prisma.post.create({
  data: {
    title: input.title!,
    content: input.content!,
    created_at: now,
    updated_at: now,
  },
});
// now you can access id of created record by createdRecord.id
const id = createdRecord.id
// ...whatever you want with id
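For contrast, a sketch of the createMany() case mentioned above (field names assumed):
const result = await prisma.post.createMany({
  data: [
    { title: 'first', content: '...' },
    { title: 'second', content: '...' },
  ],
});
// result is { count: 2 }; the generated ids are not returned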

Update a value only if it is bigger than the current one

I have a field that stores a numeric value from 0 to 7. It is a counter for steps to be completed in the application. Each time a step is completed, the counter is updated with the new value. The user can go back through the steps and then forward again: if he has completed step 3, he can go back to step 1 and then forward to step 3 again. What I want to avoid is that when the user returns through steps 1 and 2 the counter gets overwritten with 1 and 2; it should remain 3. I want to do this within the update query itself.
The query is the following:
try {
    $pdo->query("UPDATE ruolo SET wiz_step='$step' WHERE id_user='$utente'");
    $message = "Step aggiornato correttamente"; // "Step updated successfully"
}
catch (PDOException $e) {
    $status = '500';
    $data['step'] = $step;
    $message = 'Si è verificato un errore. Abbiamo aperto una segnalazione al team tecnico.'; // "An error occurred. We have opened a report with the technical team."
}
Is it possible to tell MySQL to update wiz_step only if $step is greater than the current value of wiz_step?
The table structure is just three int fields: id (primary, auto-increment), id_user, and wiz_step.
Note: I assume I am not open to MySQL injection since none of the values in the query come from user input; they are all set by the PHP logic.
As these are all values controlled by code it is quite simple to do. Also, change to using prepared queries to protect your code from SQL injection attacks:
try {
    $data = [':step' => $step, ':step1' => $step, ':uid' => $utente];
    $stmt = $pdo->prepare("UPDATE ruolo
                            SET wiz_step = :step
                            WHERE id_user = :uid
                              AND :step1 > wiz_step");
    $stmt->execute($data);
} catch (PDOException $e) {
    // handle the error as in the original code
}
Here's a slight variation on the answer from @RiggsFolly:
try {
    $data = ['step' => $step, 'uid' => $utente];
    $stmt = $pdo->prepare("UPDATE ruolo
                            SET wiz_step = GREATEST(:step, wiz_step)
                            WHERE id_user = :uid");
    $stmt->execute($data);
} catch (PDOException $e) {
    // handle the error as in the original code
}
See the GREATEST() function in the MySQL manual. It returns the greater value of its arguments. So if the parameter is greater, it will be used to update the column. If the existing value is greater, then no change will be made, because wiz_step = wiz_step is a no-op.
P.S.: It's not necessary to use the : character in the array keys when you pass parameters to a prepared query. It was needed in an early version of PDO long ago, but not anymore.

MySQL binding multiple parameters to a single query

I have a MySQL database, and I'm connecting to it from a .NET app using Dapper. I have the following code:
await connection.ExecuteAsync(
    "DELETE FROM my_data_table WHERE somedata IN (@data)",
    new { data = datalist.Select(a => a.dataitem1).ToArray() },
    trans);
When I do this with more than a single value, I get the following error:
MySqlConnector.MySqlException: 'Operand should contain 1 column(s)'
Is what I'm trying to do possible in MySQL / Dapper, or do I have to issue a query per row I wish to delete?
Your original code was almost fine. You just need to remove the parentheses around the parameter; Dapper recognizes that the parameter is a collection and inserts the parenthesized list for you. With the explicit parentheses the expansion becomes a nested row expression, which is what triggers the 'Operand should contain 1 column(s)' error:
await connection.ExecuteAsync(
    "DELETE FROM my_data_table WHERE somedata IN @data",
    new { data = datalist.Select(a => a.dataitem1).ToArray() },
    trans);

Updating Large Volume of data quickly

I have made an application in Node.js that calls an endpoint every minute and gets a JSON array that has about 100,000 elements. I need to upsert these elements into my database such that if an element doesn't exist I insert it with the "Point" column set to 0.
So far I have a cron job and a simple upsert query, but it's very slow:
var q = async.queue(function (data, done) {
    db.query('INSERT INTO stat(`user`, `user2`, `point`) VALUES '+data.values+' ON DUPLICATE KEY UPDATE point=point+10', function (err, result) {
        if (err) throw err;
        done(); // signal the queue that this task is finished
    });
}, 100000);

//Cron job here: every 1 minute execute the lines below
var values = '';
for (var v = 0; v < stats.length; v++) {
    values = '("JACK","' + stats[v] + '", 0)';
    q.push({values: values});
}
How can I do such a task in a very short amount of time? Is using MySQL the wrong decision? I'm open to any other architecture or solution. Note that I have to do this every minute.
I fixed this problem by using a bulk upsert (from the MySQL documentation)! I managed to upsert over 24k rows in less than 3 seconds. Basically, I built the whole query first and then ran it once:
INSERT INTO table (a,b,c) VALUES (1,2,3),(4,5,6)
ON DUPLICATE KEY UPDATE c=VALUES(a)+VALUES(b);
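Here is a minimal sketch of how such a bulk statement can be built in Node.js with mysql2 (table and column names taken from the question above; the connection settings are placeholders). mysql2's query() expands a nested-array placeholder into the (..),(..) row list and escapes each value:
import mysql from 'mysql2/promise';

async function bulkUpsert(stats: string[]): Promise<void> {
  // Placeholder connection settings.
  const pool = mysql.createPool({ host: 'localhost', user: 'root', database: 'mydb' });

  // One row per element: ("JACK", <stat>, 0), as in the question.
  const rows = stats.map((s) => ['JACK', s, 0]);

  // The nested array expands to (..),(..),(..): one statement for all rows.
  await pool.query(
    'INSERT INTO stat (`user`, `user2`, `point`) VALUES ? ' +
    'ON DUPLICATE KEY UPDATE point = point + 10',
    [rows],
  );
}
For the full 100,000 elements it may still be worth splitting the rows into chunks of a few thousand per statement to stay under max_allowed_packet.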

How to update a number of Rows in a Table using Linq

I am using the following code to update the UserSession column of the Activities table. The code returns the records whose ExpiryTimeStamp is less than the current date, then updates the UserSession column to 0 for the returned records.
Now I want that, if 100 records are returned, they are all updated at one time instead of using a ForEach. Is that possible in LINQ?
CacheDataDataContext db = new CacheDataDataContext();
var data = (from p in db.Activities
            where p.ExpiryTimeStamp < DateTime.Now
            select p).ToList();
data.ForEach(ta => ta.UserSession = "0");
db.SubmitChanges();
In short, no: Linq-2-sql does not do batch updates out of the box.
(I am not sure your ForEach will work as you wrote it; I do not think so, but the following is similar and will work.)
foreach (var x in data)
{
    x.UserSession = "0";
}
db.SubmitChanges();
BUT, even if you do it like this, Linq-2-sql will send an update statement for each record to the database. So with your example of 100 returned records you will get 100 individual updates sent to the database. If you really need a single statement, you would have to drop down to raw SQL (e.g. via DataContext.ExecuteCommand).