Loopback upsert with addition - MySQL

I'm trying to upsert a database table with loopback. The raw query is
insert into all_inventory (sku, qty, regal, fach, skuRegalFach)
values (?, 1, ?, ?, ?)
on duplicate key update
qty = qty + 1,
regal = values(regal),
fach = values(fach)
Is there any way to do this with loopback?
Currently I'm facing two problems.
I get:
ER_DUP_ENTRY: Duplicate entry '22323' for key
'all_inventory_SkuRegalFach_uindex'
Because loopback doesn't seem to be able to handle the key correctly.
And I have no idea how to tell loopback to add 1 to the qty field instead of just overriding it with the new value.
I have it working with a raw query right now:
let ds = Inventory.dataSource,
    values = [sku, regal, fach, sku + regal + fach],
    sql = `insert into all_inventory (sku, qty, regal, fach, skuRegalFach)
           values (?, 1, ?, ?, ?)
           on duplicate key update qty = qty + 1, regal = values(regal), fach = values(fach)`;
ds.connector.query(sql, values, (err, products) => {
  if (err) return console.error(err);
  cb(null, products);
});
Is there a way to do this with loopback's ORM?

I'd use a find filter: either findById or find with an appropriate filter.
If you get a result, then the record exists; modify qty and store it. If it doesn't exist, just create it.
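For illustration only (not part of the original answer), the find-then-modify flow can be sketched with a plain in-memory map standing in for the Inventory model; the field names come from the question:

```javascript
// Sketch of the find-then-update flow, using a Map to stand in for the
// Inventory model (sku, qty, regal, fach, skuRegalFach are from the question).
const store = new Map(); // key: skuRegalFach -> record

function upsertInventory(sku, regal, fach) {
  const key = sku + regal + fach;
  const existing = store.get(key);
  if (existing) {
    existing.qty += 1;          // add 1 instead of overwriting
    existing.regal = regal;
    existing.fach = fach;
  } else {
    store.set(key, { sku, qty: 1, regal, fach, skuRegalFach: key });
  }
  return store.get(key);
}

console.log(upsertInventory("22323", "A", "1").qty); // 1
console.log(upsertInventory("22323", "A", "1").qty); // 2
```

Note that against a real database this read-modify-write sequence is not atomic: two concurrent requests can both miss the find and both attempt a create, which is exactly the ER_DUP_ENTRY scenario the raw ON DUPLICATE KEY UPDATE query avoids, so the raw query remains the safer option under concurrency.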

Related

Avoid updating column if set to 1

I have this query:
INSERT INTO user_list (USER_ID,USERNAME,NAME,ACTIVITY,PRIVATE)
VALUES(?,?,?,1,?)
ON DUPLICATE KEY UPDATE USERNAME=?, NAME=?, ACTIVITY=ACTIVITY+1,PRIVATE=?
PRIVATE can be 0 or 1. I need to insert it according to a parameter, but if PRIVATE is already 1 on that row, it shouldn't be set as 0.
Basically, once it becomes 1, it can never go back to 0.
How can I do this in a single query?
You can set the value of the column PRIVATE:
PRIVATE = PRIVATE OR VALUES(PRIVATE)
which will retain the value 1 if this is the original value, or change it to the new value if the original value is 0:
INSERT INTO user_list (USER_ID, USERNAME, NAME, ACTIVITY, PRIVATE)
VALUES(?, ?, ?, 1, ?)
ON DUPLICATE KEY UPDATE USERNAME = VALUES(USERNAME),
NAME = VALUES(NAME),
ACTIVITY = ACTIVITY + 1,
PRIVATE = PRIVATE OR VALUES(PRIVATE)
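To see why this works: PRIVATE OR VALUES(PRIVATE) is a boolean OR between the stored value and the incoming one, so 1 is sticky. A quick JavaScript simulation (my illustration, not from the answer):

```javascript
// Simulates MySQL's PRIVATE = PRIVATE OR VALUES(PRIVATE) on duplicate key.
// `stored` is the current column value, `incoming` the value being inserted.
function mergePrivate(stored, incoming) {
  return (stored || incoming) ? 1 : 0; // OR yields 1 if either operand is 1
}

console.log(mergePrivate(0, 0)); // 0 -- stays public
console.log(mergePrivate(0, 1)); // 1 -- can be switched on
console.log(mergePrivate(1, 0)); // 1 -- once private, stays private
console.log(mergePrivate(1, 1)); // 1
```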
Hmmm . . . Why change the value at all?
INSERT INTO user_list (USER_ID, USERNAME, NAME, ACTIVITY, PRIVATE)
VALUES(?, ?, ?, 1, ?)
ON DUPLICATE KEY UPDATE USERNAME=?, NAME=?, ACTIVITY=ACTIVITY+1;
Also, are you aware of VALUES()?
INSERT INTO user_list (USER_ID, USERNAME, NAME, ACTIVITY, PRIVATE)
VALUES(?, ?, ?, 1, ?)
ON DUPLICATE KEY UPDATE USERNAME = VALUES(USERNAME),
NAME = VALUES(NAME),
ACTIVITY = ACTIVITY + 1;
This assigns the value that was passed in to those columns.
Use CASE to determine whether PRIVATE should remain as-is, and overwrite it with itself if so.
INSERT INTO user_list (id, name, privacy)
VALUES (1, 'aaa', '0')
ON DUPLICATE KEY UPDATE
  name = VALUES(name),
  privacy = CASE WHEN privacy = 0 THEN VALUES(privacy)
                 WHEN privacy = 1 THEN privacy
            END;

How to create a Nodejs MySQL execute query for values that might not exist

I'm inserting values into a MySQL database using the Node.js mysql2 library.
Here is an example of a prepared statement:
await conn.execute(
'INSERT INTO Friends (id, user, name, gender) VALUES (UUID(), ?, ?, ?)',
[ user, body.name, body.gender ]
);
How can I achieve the above if sometimes the body.gender value is not set? I want several attributes in the http request to be optional and insert all allowed values that have been sent in the http request into the database.
The above code gives an error if I leave body.gender out of the http request.
If some data is missing from the body, or the client doesn't send it, you have to put a null value into that column for the row. You can use this JavaScript feature to do so:
await conn.execute(
'INSERT INTO Friends (id, user, name, gender) VALUES (UUID(), ?, ?, ?)',
[ user || null, body.name || null, body.gender || null ]
);
With this approach, any field absent from the body is undefined, so null is placed into the query instead.
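One caveat worth adding (my note, not from the original answer): || substitutes null for every falsy value, not just undefined, so a legitimate 0 or empty string would also be turned into null. The nullish coalescing operator ?? only substitutes for null and undefined:

```javascript
const body = { name: "Sam" }; // gender was not sent in the request

// || coerces every falsy value to null
console.log(body.gender || null); // null (gender is undefined)
console.log(0 || null);           // null -- a real 0 is lost

// ?? substitutes only for null/undefined
console.log(body.gender ?? null); // null
console.log(0 ?? null);           // 0 -- preserved
```

So [ user ?? null, body.name ?? null, body.gender ?? null ] is the safer form when falsy values like 0 are valid data.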

MySQL 5.7 bulk insert with BLOB column

I am attempting to do a bulk insert into MySQL using
INSERT INTO TABLE (a, b, c) VALUES (?, ?, ?), (?, ?, ?)
I have the general log on, and see that this works splendidly for most cases. However, when the table has a BLOB column, it doesn't work as well.
I am trying to insert 20 records.
Without the BLOB, I see all 20 records inserted in a single query in the general log.
With the BLOB, I see only 2 records per query in the general log; it takes 10 queries in total.
Is this a problem with MySQL, the JDBC driver, or am I missing something else? I would prefer to use a BLOB, as my data is in protobufs.
Here is an example table...
CREATE TABLE my_table (
id CHAR(36) NOT NULL,
name VARCHAR(256) NOT NULL,
data BLOB NOT NULL,
PRIMARY KEY (id)
);
Then, create your batch inserts in code...
val ps = conn.prepareStatement(
  "INSERT INTO my_table(id, name, data) VALUES (?, ?, ?)")
records.grouped(1000).foreach { group =>
  group.foreach { r =>
    ps.setString(1, UUID.randomUUID.toString)
    ps.setString(2, r.name)
    ps.setBlob(3, new MariaDbBlob(r.data))
    ps.addBatch()
  }
  ps.executeBatch()
}
If you run this and inspect the general log, you will see...
"2018-10-12T18:37:55.714825Z 4 Query INSERT INTO my_table(id, name, fqdn, data) VALUES ('b4955537-2450-48c4-9953-e27f3a0fc583', '17-apply-test', _binary '
17-apply-test\"AAAA(?2Pending8?????,J$b4955537-2450-48c4-9953-e27f3a0fc583
1:2:3:4:5:6:7:8Rsystem'), ('480e470c-6d85-4bbc-b718-21d9e80ac7f7', '18-apply-test', _binary '
18-apply-test\"AAAA(?2Pending8?????,J$480e470c-6d85-4bbc-b718-21d9e80ac7f7
1:2:3:4:5:6:7:8Rsystem')
2018-10-12T18:37:55.715489Z 4 Query INSERT INTO my_table(id, name, data) VALUES ('7571a651-0e0b-4e78-bff0-1394070735ce', '19-apply-test', _binary '
19-apply-test\"AAAA(?2Pending8?????,J$7571a651-0e0b-4e78-bff0-1394070735ce
1:2:3:4:5:6:7:8Rsystem'), ('f77ebe28-73d2-4f6b-8fd5-284f0ec2c3f0', '20-apply-test', _binary '
20-apply-test\"AAAA(?2Pending8?????,J$f77ebe28-73d2-4f6b-8fd5-284f0ec2c3f0
As you can see, each INSERT INTO only has 2 records in it.
Now, if you remove the data field from the schema and re-run the insert, you will see the following output (for 10 records)...
"2018-10-12T19:04:24.406567Z 4 Query INSERT INTO my_table(id, name) VALUES ('d323d21e-25ac-40d4-8cff-7ad12f83b8c0', '1-apply-test'), ('f20e37f2-35a4-41e9-8458-de405a44f4d9', '2-apply-test'), ('498f4e96-4bf1-4d69-a6cb-f0e61575ebb4', '3-apply-test'), ('8bf7925d-8f01-494f-8f9f-c5b8c742beae', '4-apply-test'), ('5ea663e7-d9bc-4c9f-a9a2-edbedf3e5415', '5-apply-test'), ('48f535c8-44e6-4f10-9af9-1562081538e5', '6-apply-test'), ('fbf2661f-3a23-4317-ab1f-96978b39fffe', '7-apply-test'), ('3d781e25-3f30-48fd-b22b-91f0db8ba401', '8-apply-test'), ('55ffa950-c941-44dc-a233-ebecfd4413cf', '9-apply-test'), ('6edc6e25-6e70-42b9-8473-6ab68d065d44', '10-apply-test')"
All 10 records are in the same query
I tinkered until I found the fix...
val ps = conn.prepareStatement(
  "INSERT INTO my_table(id, name, data) VALUES (?, ?, ?)")
records.grouped(1000).foreach { group =>
  group.foreach { r =>
    ps.setString(1, UUID.randomUUID.toString)
    ps.setString(2, r.name)
    //ps.setBlob(3, new MariaDbBlob(r.data))
    ps.setBytes(3, r.data)
    ps.addBatch()
  }
  ps.executeBatch()
}
Using PreparedStatement.setBytes instead of MariaDbBlob did the trick; presumably the driver can only rewrite a batch into a single multi-row INSERT when the parameters are plain byte arrays rather than streamed Blobs.

MySQL: 2 insert queries

I am trying to execute 2 queries.
The first one should insert data (specifically a product), or update it in case the db already has a row with that title.
The second one should insert a new category for the product which was inserted/updated by the 1st query, and ignore the insert if the table already has that product with that category.
Here is my code :
conn = DatabaseConnection.getConnection();
stmt = conn.createStatement();
conn.setAutoCommit(false);
String updateSQL = "INSERT INTO product (title, price, `status`) " +
    "VALUES(?, ?, ?) " +
    "ON DUPLICATE KEY UPDATE price = ?, `status` = ?;";
PreparedStatement preparedStatement = conn.prepareStatement(updateSQL);
preparedStatement.setString(1, product.getTitle());
preparedStatement.setBigDecimal(2, product.getPrice());
preparedStatement.setInt(3, product.getStatus().ordinal());
preparedStatement.setBigDecimal(4, product.getPrice());
preparedStatement.setInt(5, product.getStatus().ordinal());
preparedStatement.executeUpdate();
updateSQL = "INSERT IGNORE INTO product_categories (product_id, category_id) " +
"VALUES (last_insert_id(), ?);";
preparedStatement = conn.prepareStatement(updateSQL);
preparedStatement.setLong(1, categoryId);
preparedStatement.executeUpdate();
conn.commit();
So, the problem is that I use last_insert_id(), which means that I will reference the wrong row in the 2nd query if the 1st query just updated the data.
So, I would like to know how could I synchronize these 2 queries.
Since last_insert_id() is only meaningful when the first query actually inserted a row, you'll have to fetch the generated key, as in the answers to this question.
Here's an example:
...
// the first statement must be prepared with Statement.RETURN_GENERATED_KEYS
// for getGeneratedKeys() to return anything
preparedStatement.executeUpdate(); // this is the first query

ResultSet rs = preparedStatement.getGeneratedKeys();
if (rs.next())
{
    long last_insert_id = rs.getLong(1);

    updateSQL = "INSERT IGNORE INTO product_categories (product_id, category_id) " +
                "VALUES (?, ?);";
    preparedStatement = conn.prepareStatement(updateSQL);
    preparedStatement.setLong(1, last_insert_id);
    preparedStatement.setLong(2, categoryId);
    preparedStatement.executeUpdate();
}
conn.commit();
If the first query didn't result in an INSERT, then there isn't enough information to add the product to product_categories, in which case this is skipped altogether. That assumes the product is already in the category. If you're not sure about that, and want to execute the second query regardless, you could query for the product id:
SELECT id FROM product WHERE title = ?
and then use that id instead of the last_insert_id variable, or, you could change the second query and use title as a key (although I'd stick with an id):
INSERT IGNORE INTO product_categories (product_id, category_id)
VALUES ((SELECT id FROM product WHERE title = ?), ?)
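The overall flow of the answer — use the generated key when the first statement actually inserted, and fall back to looking the product up by title when it only updated — can be sketched in-memory like this (a JavaScript stand-in for illustration, not JDBC):

```javascript
const products = new Map(); // title -> { id, price }
let nextId = 1;

// Mimics INSERT ... ON DUPLICATE KEY UPDATE: returns the generated id
// only when a new row was inserted, null when an existing row was updated.
function upsertProduct(title, price) {
  const existing = products.get(title);
  if (existing) {
    existing.price = price;
    return null; // no generated key on a pure update
  }
  const id = nextId++;
  products.set(title, { id, price });
  return id;
}

// Fallback used when the upsert updated instead of inserting
// (the SELECT id FROM product WHERE title = ? from the answer).
function productIdByTitle(title) {
  return products.get(title).id;
}

let id = upsertProduct("widget", 10);
if (id === null) id = productIdByTitle("widget");
console.log(id); // 1

id = upsertProduct("widget", 12);        // updates the existing row
if (id === null) id = productIdByTitle("widget");
console.log(id); // still 1
```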

mysql if select then insert

I would like to only insert or update a row, if the following SELECT returns a 0 or no rows.
SELECT (value = ? AND status = ? AND connected = ?)
FROM channels, data
WHERE data.channel_id = channels.channel_id AND channels.channel_name = ? AND sample_time < ?
ORDER BY sample_time DESC
LIMIT 1
Basically, it is a data archiver that only writes changes: it writes data for a given sample_time only if the data differs from what was written for the previous sample_time. This SELECT fetches the data for a given channel at the previous sample_time and compares it to the data that has arrived for the current sample_time. If it returns 0 (the data is different), the new data should be written; if no data has been written for this channel yet, it returns no rows, and the new data should also be written. The following is my query for writing the data:
INSERT INTO data (acquire_time, sample_time, channel_id, value, status, connected)
SELECT ?, ?, channels.channel_id, ?, ?, ?
FROM channels
WHERE channel_name = ?
ON DUPLICATE KEY UPDATE acquire_time = ?, value = ?, status = ?, connected = ?
New data for the current sample_time may overwrite previous data for the current sample_time using the ON DUPLICATE KEY UPDATE, just not if it is the same as the data stored for the previous sample_time.
The duplicate key is the combination of the channel_id and sample_time. There is also a unique index on the channel_id and acquire_time.
Thank you for your time.
After your clarifications below, I believe this will do what you want:
INSERT INTO data (acquire_time, sample_time, channel_id, value, status, connected)
SELECT ?, ?, channels.channel_id, ?, ?, ?
FROM channels
WHERE channel_name = ?
AND NOT EXISTS (
SELECT 1
FROM (
SELECT value, status, connected
FROM channels, data
WHERE data.channel_id = channels.channel_id AND channels.channel_name = ?
AND sample_time < ?
ORDER BY sample_time DESC
LIMIT 1
) a
WHERE a.value = ? and a.status = ? and a.connected = ?
)
ON DUPLICATE KEY UPDATE acquire_time = ?, value = ?, status = ?, connected = ?;
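To make the intent of the NOT EXISTS guard concrete — skip the write when the channel's most recent sample already holds the same value/status/connected — here is a small in-memory illustration (my sketch; it ignores the ON DUPLICATE KEY overwrite of the same sample_time):

```javascript
const lastSample = new Map(); // channel -> { value, status, connected }
const writes = [];

// Write a sample only if it differs from the channel's most recent sample,
// mirroring the NOT EXISTS subquery in the INSERT above.
function archive(channel, sampleTime, value, status, connected) {
  const prev = lastSample.get(channel);
  if (prev && prev.value === value && prev.status === status &&
      prev.connected === connected) {
    return false; // unchanged: skip the write
  }
  lastSample.set(channel, { value, status, connected });
  writes.push({ channel, sampleTime, value, status, connected });
  return true;
}

console.log(archive("ch1", 1, 3.2, 0, 1)); // true  -- first sample, written
console.log(archive("ch1", 2, 3.2, 0, 1)); // false -- unchanged, skipped
console.log(archive("ch1", 3, 3.5, 0, 1)); // true  -- value changed, written
```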