I'm making a database for my final year project at my university and I'm currently stuck.
I have three tables: product, customer and product_order.
product has an auto-incremented primary key product_ID,
and customer also has an auto-incremented primary key customer_ID.
The product_order table is where my problem lies (I know that MySQL doesn't support the INSERT ... SCOPE_IDENTITY() pattern, and LAST_INSERT_ID() doesn't work for me here).
It has two foreign keys, one from the product table and one from the customer table (both auto_increment ids), and the variable I saw online,
SELECT @last := LAST_INSERT_ID();
only returns the most recent id, not separate values for product_ID and customer_ID.
I have these insert statements:
insert into Product values(1000, 'Logitech Webcam C270 HD mic USB', 260, 5, 'Accessory');
insert into customer VALUES(2000, 'Rachel ', 'Mc Roy', 'Rach@gmail.com', 'female', '1985/06/05', 'wlovely8', '41, Cantubury Lane, San Franscique', 2938493);
insert into product_order values(@last_id_in_Customer, @last_id_in_Product, 'Logitech Webcam C270 HD mic USB', 260, 1);
I am inserting explicit ids here to set the range I want the ids to fall into; hence the 2000 and 1000. In the other inserts I used NULL so the auto_increment value is assigned automatically.
select * from product_order;
2001 2001 Logitech Webcam C270 HD mic USB 260 1
From this I only get to insert once, because the @last variable only holds a single value.
So my question is: is there a better way of doing this?
By the way, this is for a customer login webpage, so the customer shouldn't have to enter a unique primary key.
Yes, there is a better way of doing this. Assuming that when a customer registers they get an auto-incremented customer_id, store that customer_id in a session variable when the customer logs in. Then, whenever the customer selects a product, retrieve the customer_id from the session and the product_id from the product table, and insert both into the product_order table.
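On the SQL side, the same idea can be sketched with user variables: capture each LAST_INSERT_ID() into its own variable immediately after the INSERT that generated it, before any later insert overwrites it. A minimal sketch, reusing the column values from the inserts above:
-- Sketch only: NULL lets the auto_increment column assign the ids.
INSERT INTO customer VALUES (NULL, 'Rachel', 'Mc Roy', 'Rach@gmail.com', 'female', '1985/06/05', 'wlovely8', '41, Cantubury Lane, San Franscique', 2938493);
SET @last_customer_id = LAST_INSERT_ID();   -- id generated for this customer
INSERT INTO Product VALUES (NULL, 'Logitech Webcam C270 HD mic USB', 260, 5, 'Accessory');
SET @last_product_id = LAST_INSERT_ID();    -- id generated for this product
INSERT INTO product_order VALUES (@last_customer_id, @last_product_id, 'Logitech Webcam C270 HD mic USB', 260, 1);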
I have two tables: Shop and Product.
CREATE TABLE Shop (
  id INT AUTO_INCREMENT,
  shop_id INT,
  PRIMARY KEY (id)
);
CREATE TABLE Product (
  product_id INT AUTO_INCREMENT,
  p_name VARCHAR(100),
  p_price INT,
  shop_id INT,
  PRIMARY KEY (product_id),
  FOREIGN KEY (shop_id) REFERENCES Shop(id)
);
On the server I'm using Node with the mysql2 package for queries.
On the client side, I'm displaying all Products that are related to a specific Shop in a table.
The user can change Products, and when they press Save, requests are made that send the new data to the server, which stores it.
The user can either change existing Products or add new ones.
But I have concerns about how this will behave with a relatively big number of products per shop. Let's say there are 1000 of them.
Newly created rows are marked on the client with the flag saved_in_db=false; existing rows that were changed are marked with changed=true.
I considered a few approaches:
On the server, filter the array of records received from the client and INSERT into the DB only the newly created rows that are not stored yet. But to UPDATE the existing Products, I would need to build a bunch of UPDATE Products SET p_name = ? WHERE product_id = ? queries and execute them all at once.
Take all Products with the specified shop_id, DELETE them, and INSERT the new bulk of data, without separating already existing records from changed ones.
In the second approach I see two cons.
First, the client always sends the full set of data to the server, even for a small change.
Second, running out of ids in the DB: if there are 10 shops with 1000 Products each, and users frequently save their changes, then every save, even one that adds or changes a single record, will advance the auto-increment id by around 1000.
Is executing a bunch of UPDATE queries one after another really the only way to update a set of existing records in the DB?
You could use INSERT ... ON DUPLICATE KEY UPDATE:
INSERT INTO Products (product_id, p_name)
VALUES (123, 'newname1'), (456, 'newname2'), (789, 'newname3'), ...more...
ON DUPLICATE KEY UPDATE p_name = VALUES(p_name);
This does not change the primary key values, it only updates the columns you tell it to.
You must include the product ids in the inserted values, because that's how MySQL detects that you're inserting a row that already exists in the table.
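If the new rows from the client (the ones flagged saved_in_db=false) are sent in the same request, they can go into the same statement by passing NULL for the auto-increment key: rows with an existing product_id take the update path, and NULL ids are inserted as fresh rows. A sketch with made-up example values, using the columns from the schema above:
INSERT INTO Product (product_id, p_name, p_price, shop_id)
VALUES
  (123,  'renamed product',   250, 1),   -- existing row: updated in place
  (NULL, 'brand new product',  99, 1)    -- new row: gets a fresh product_id
ON DUPLICATE KEY UPDATE
  p_name  = VALUES(p_name),
  p_price = VALUES(p_price);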
I have a table which consists of columns for user, category and amount.
A user can buy an amount of products from each category. I want to store only the very last purchase.
User Category Amount
1 100 15
1 103 25
Imagine that this user has just bought 30 pieces from category 100 or from category 110, i.e. from an existing category or from a new one. This can be handled using the following pseudo code:
SELECT amount FROM table WHERE user=1 AND category=100
if row exists
UPDATE table SET amount=30 WHERE user=1 AND category=100
else
INSERT INTO table (user, category, amount) VALUES(1, 100, 30)
The other way to do it is to always delete the old value (ignoring the error when it does not exist) and always insert a new one.
DELETE FROM table WHERE user=1 AND category=100
INSERT INTO table VALUES(1, 100, 30)
Which of these patterns is preferred from a performance point of view?
Does it matter which PK and FK exist?
MySQL supports REPLACE, so there is no need for DELETE + INSERT or SELECT + UPDATE. Note that REPLACE requires a unique key or primary key on your table as the reference:
REPLACE
INTO yourtable (user, category, amount)
VALUES (1, 100, 30);
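Keep in mind that REPLACE works by deleting the existing row and inserting a new one, which can matter for foreign keys pointing at that row. An alternative that updates in place, again assuming a unique key over (user, category), is INSERT ... ON DUPLICATE KEY UPDATE:
INSERT INTO yourtable (user, category, amount)
VALUES (1, 100, 30)
ON DUPLICATE KEY UPDATE amount = VALUES(amount);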
I am using MySQL, and I want to implement a query.
I have 5 tables, and in MySQL they look like this:
Table1- Site:
Site_ID
domain_name
site_name
Table2- Locations:
site_id (Same as from Site)
Table3- Users:
user_id (AI primary key)
site_id
Table4- Users_Roles:
role_id(AI Primary key)
site_id
Table5- Users_Addresss:
user_address_id(AI Primary Key)
user_id (Same as from Users)
site_id
With one single query, I want to insert into all of these tables. My database is normalized.
I am not able to think of the query that would do the operation.
I will be using this query in a PHP file and triggering it with AJAX.
First you need to insert a record into the Site table:
INSERT INTO Site (domain_name, site_name) VALUES ('www.google.com', 'Test site');
Then assign the last insert id, i.e. the Site_ID, to a variable like below:
SET @site_id = LAST_INSERT_ID();  -- this is the Site_ID
INSERT INTO Users (site_id) VALUES (@site_id);
Now do the same for all the tables.
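For example, continuing the same pattern for the remaining tables (only the key columns listed in the question are shown; the real tables will have more columns):
-- Capture the generated user_id right after the Users insert, before any
-- other AUTO_INCREMENT insert overwrites LAST_INSERT_ID().
SET @user_id = LAST_INSERT_ID();
INSERT INTO Locations (site_id) VALUES (@site_id);
INSERT INTO Users_Roles (site_id) VALUES (@site_id);
INSERT INTO Users_Addresss (user_id, site_id) VALUES (@user_id, @site_id);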
Thanks
I have a stocks table (for products/stocks of a retail store) and a serials table (barcodes issued for each stock).
Basically, when new stocks are introduced to the database, the system issues a serial number for each stock, based on the auto-increment primary key value of the serials table.
The problem is that they both depend on each other.
I'll explain:
STOCKS TABLE
stock_id int(11)
product_name varchar(50)
serial int(30) <--- relies on the serials generated by system, stored in the SERIALS TABLE
SERIALS
sn_id int(11)
stock_id int(11) <-- relies on the new stocks inserted in the stocks table
serial int(30) <---- serial NO generated for specific stock.
Each STOCKS row inserted needs to store the serial number generated for it,
and each SERIALS row generated must record the stock_id (the primary key) of the stock being inserted.
This basically means 3 SQL statements per new stock:
get the next auto-increment value of the serials table (used to generate the serial properly)
insert the stock into the stocks table with its serial
get the insert id of that stock and insert it into the serials table
This works, but I'm wondering if there's a better approach. So far here's what I have running:
create a serial_lock file in the home directory (this prevents other scripts from issuing new serial numbers to other stocks, avoiding conflicts on concurrent runs)
GENERATE the required serial numbers by getting the next auto_increment value of the serials table and store them in a variable for now, e.g.
$assigned_serials_array[$index] = $prefix . $index; // results in BN-0001 ("BN-" is the prefix, the rest is the padded auto-increment value, incremented per loop)
INSERT INTO stocks each stock, and get the insert id
INSERT INTO serials a record of the serial being issued to that specific stock
after the loop is done, delete the lock file
PS.
My original version actually does an INSERT into the serials table first, and then an UPDATE on that serials row after the stock_id is generated. I wasn't comfortable with that because of the extra SQL statement being issued, although it is the safest way and I wouldn't need to worry about lock files and conflicts.
Any thoughts?
EDIT:
I decided to change my method.
Each SERIAL generated belongs to exactly one STOCK (stock_id), so I decided to forget about the strictly incremental sequencing of serial numbers (0001, 0002, 0003)
and to just use the stock_id of the specific stock being issued an SN.
So:
get the next insert id and generate the SN based on that,
INSERT the STOCK with the generated SN,
INSERT the SERIAL record, referencing the stock_id to that same next insert id as well.
Done!
I just really wanted a perfectly sequenced SN.
Do not create lock files - this is just wrong.
Instead, DO use transactions. This example is in Perl:
my $dbh = DBI->connect("dbi:mysql...", "user", "password");
$dbh->begin_work(); # start new transaction
$dbh->do("INSERT INTO serials ..."); # generate new serial
my $new_serial = $dbh->{mysql_insertid};
$dbh->do("INSERT INTO stocks (..., serialno) VALUES (..., $new_serial)");
# do some more work like inserting into other tables
$dbh->commit(); # finally, commit the transaction
Note that you need to use the InnoDB engine for transactions to work.
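The same flow can be sketched in plain SQL with user variables (a sketch only, with column lists trimmed to those shown in the question; the serials row is linked back to its stock at the end, as in the original approach):
START TRANSACTION;
-- Create the serials row first so its auto-increment id (the basis of the serial number) is available.
INSERT INTO serials (serial) VALUES (0);          -- placeholder serial for now
SET @sn_id = LAST_INSERT_ID();
-- Insert the stock, recording the serial that was issued to it.
INSERT INTO stocks (product_name, serial) VALUES ('Example product', @sn_id);
SET @stock_id = LAST_INSERT_ID();
-- Point the serials row back at the stock it was issued for.
UPDATE serials SET stock_id = @stock_id, serial = @sn_id WHERE sn_id = @sn_id;
COMMIT;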
I have a chat user entry in a MySQL table -
id (primary key, auto-increment) - the chat user id
user_id - the user's id as it pertains to our product
room_id - the id of the room you're in
If a chat operator enters the room, leaves, and then comes back, then - as it is now - this will create two entries for that room (INSERT INTO chatuser ...).
I know the INSERT IF NOT EXISTS syntax; however, I want the existence check not on the primary key, but on WHERE user_id = x AND room_id = y (which will be the same if they re-enter a room they have been in before).
Something like INSERT INTO chatuser SET user_id=4, room_id=2 IF ENTRY DOESN'T EXIST WITH THOSE VALUES ALREADY ;)
Thank you
If you have a unique index on (user_id,room_id), you can do INSERT ... ON DUPLICATE KEY UPDATE (or INSERT IGNORE if you don't need to update anything when the record already exists):
INSERT INTO table_1 (user_id, room_id, enter_date)
VALUES (1,1, NOW())
ON DUPLICATE KEY UPDATE
enter_date = NOW()
-- or
INSERT IGNORE INTO table_1 (user_id, room_id)
VALUES (1,1)
I think what we have here is a design issue. You have 3 tables: a user table, a room table, and a table that links the two. The last one is what we're working with here. Now, if you want to log every time a user enters a room, the table design you have is ideal. However, it seems that is not what you are doing; it seems you want an entry for just the last visit. In that case your table should not have an auto-incrementing primary key; instead, userID and roomID should both be parts of a multi-column primary key.
Modify your table to that set-up, then run your insert statement with an ON DUPLICATE KEY UPDATE clause.
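A minimal sketch of the key change and the resulting insert, using the chatuser table and columns from the question (the unique-key variant from the first answer, with a made-up index name):
-- Make (user_id, room_id) unique so re-entering a room can't create a second row.
ALTER TABLE chatuser ADD UNIQUE KEY uq_user_room (user_id, room_id);
-- Re-entering the room is now ignored instead of creating a duplicate entry:
INSERT IGNORE INTO chatuser (user_id, room_id) VALUES (4, 2);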