How can I create a query in MySQL that compares a random number with the previously generated random numbers and, if it already exists, generates another random number?
You can try something like this to generate a random number that does not collide with any previously stored one:
SELECT r.mycol
FROM (SELECT FLOOR(RAND() * 100000) AS mycol) AS r
WHERE r.mycol NOT IN (SELECT PreviousRand FROM yourtable WHERE PreviousRand IS NOT NULL)
LIMIT 1;
Two MySQL gotchas to note: CAST(... AS INT) is not valid syntax (use FLOOR or CAST ... AS UNSIGNED), and a column alias cannot be referenced in the WHERE clause at the same level, hence the derived table. If the generated number collides with an existing one, the query returns no row, so simply run it again. I am assuming you are storing the previously issued values in the table.
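If you want the retry to happen server-side, a stored function can loop until it finds an unused value. This is only a sketch; yourtable and PreviousRand are the assumed names from above:

```sql
DELIMITER $$

-- Hypothetical helper: keeps drawing until the value is unused.
CREATE FUNCTION next_unique_rand() RETURNS INT
READS SQL DATA
BEGIN
  DECLARE v INT;
  REPEAT
    SET v = FLOOR(RAND() * 100000);
  UNTIL NOT EXISTS (SELECT 1 FROM yourtable WHERE PreviousRand = v)
  END REPEAT;
  RETURN v;
END $$

DELIMITER ;
```

Then SELECT next_unique_rand(); returns a fresh value you can insert.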
I am designing a script to take some data and average it out. I have a MySQL database with two tables. Table 1 contains continuous data from a sensor, stored as [id, data, timestamp]. Table 2 has the format [id, hasAverage, timestamp].
What I want:
(1) is to every time take 6 values from table 1 and average them
(2) put the average into hasAverage in table 2
For part (1) I have made this SQL query, which does the job:
SELECT AVG(data) FROM (SELECT data FROM Table1 ORDER BY id DESC LIMIT 0, 6) items;
When executed, this outputs the average.
For Part (2), how can I put output from table 1 to table 2?
I am using MySQL in XAMPP.
You could do an INSERT INTO <table> ... SELECT ..., called an insert-select query.
Example:
INSERT INTO table2 (average)
SELECT AVG(data) FROM (SELECT data FROM Table1 ORDER BY id DESC LIMIT 0, 6) items;
Simply replace average with whatever column name stores the value in your table2.
I have a table with about 50K rows. I need to multiply this data roughly a hundredfold to have at least 5M rows for performance testing. It already took me several long minutes to import 50K rows from a CSV file, so I don't want to create a 5M-record file and then import it into SQL.
Is there a way to duplicate the existing rows over and over again to create 5M records? I don't mind if the rows are identical; they should just have a different id in the Primary (Auto Increment) column.
I'm currently doing this on XAMPP with phpMyAdmin.
INSERT INTO my_table (y, z) SELECT y, z FROM my_table;
where the auto-incrementing id column is simply omitted so MySQL assigns fresh values.
Each run doubles the table, so REPEAT a (remarkably small) number of times: seven runs take 50K rows to 6.4M.
Option 1 : Use union
insert into your_table (col1,col2)
select col1,col2 from your_table
union all
select col1,col2 from your_table
union all
select col1,col2 from your_table
union all
select col1,col2 from your_table
continued...
Option 2 : Use dummy table with 10 records and do cross join
Create a dummy table with 10 rows
insert into your_table (col1,col2)
select col1,col2 from your_table, dummy_table
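The dummy table for option 2 can be sketched like this (the name and the single column are placeholders); the comma join pairs every source row with each of the 10 dummy rows, so the SELECT above yields ten copies of the table:

```sql
CREATE TABLE dummy_table (n INT);
INSERT INTO dummy_table VALUES (1),(2),(3),(4),(5),(6),(7),(8),(9),(10);
```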
Note that each INSERT ... SELECT below reads from the very table it inserts into, so every iteration doubles the row count; starting from ~50K rows, seven iterations already take you past 5M (to ~6.4M). You can create a procedure and use a loop:
DELIMITER $$
CREATE PROCEDURE populate()
BEGIN
DECLARE counter INT DEFAULT 1;
WHILE counter <= 7 DO
INSERT INTO mytable (colA, colB) SELECT colA, colB FROM mytable;
SET counter = counter + 1;
END WHILE;
END $$
DELIMITER ;
Then you can call the procedure using
call populate();
I have an insert that uses a GROUP_CONCAT. In certain scenarios, the insert fails with Row XX was cut by GROUP_CONCAT. I understand why it fails but I'm looking for a way to have it not error out since the insert column is already smaller than the group_concat_max_len. I don't want to increase group_concat_max_len.
drop table if exists a;
create table a (x varchar(10), c int);
drop table if exists b;
create table b (x varchar(10));
insert into b values ('abcdefgh');
insert into b values ('ijklmnop');
-- contrived example to show that insert column size varchar(10) < 15
set session group_concat_max_len = 15;
insert into a select group_concat(x separator ', '), count(*) from b;
This insert produces the error Row 2 was cut by GROUP_CONCAT().
I'll try to provide a few clarifications -
The data in table b is unknown, so there is no way to know in advance that group_concat_max_len would need to be, say, greater than 18.
I do know the insert column size.
Why group_concat up to 4 GB of data when you only want the first x characters?
When the concatenated string is longer than 10 characters, it should insert only the first 10 characters.
Thanks.
Your example GROUP_CONCAT is probably cooking up this value:
abcdefgh, ijklmnop
That is 18 characters long, including the separator.
Can you try something like this?
set session group_concat_max_len = 4096;
insert into a
select left(group_concat(x separator ', '),10),
count(*)
from b;
This will trim the GROUP_CONCAT result for you.
You can temporarily set the group_concat_max_len if you need to, then set it back.
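Saving and restoring the session value can be sketched like this:

```sql
SET @old_len := @@session.group_concat_max_len;
SET SESSION group_concat_max_len = 4096;
-- ... run the INSERT ... SELECT from above here ...
SET SESSION group_concat_max_len = @old_len;
```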
I don't know MySQL very well, nor whether there is a good reason to do this in the first place, but you could keep a running total of the concatenated length and limit the GROUP_CONCAT() to the rows where that total stays under a chosen maximum. You will still need to set group_concat_max_len high enough to handle the longest single value (or use CASE logic to substring the values so they stay under the maximum length you desire).
Something like this:
SELECT SUBSTRING(GROUP_CONCAT(col1 SEPARATOR ', '), 1, 10)
FROM (SELECT *
      FROM (SELECT col1,
                   @lentot := COALESCE(@lentot, 0) + CHAR_LENGTH(col1) AS lentot
            FROM Table1
           ) sub
      WHERE lentot < 25
     ) sub2
Demo: SQL Fiddle
I don't know if it's SQL Fiddle being quirky or if there's a problem with the logic, but sometimes when running I get no output. One likely cause is that the running-total user variable keeps its value between runs, so it should be reset before each execution. It doesn't seem like it should require 2 subqueries, but the filtering didn't work as expected unless it was nested like that. Not big on MySQL, so it could definitely be me missing something.
Actually, a better way can be to use DISTINCT. I had to add two new fields to an existing stored procedure, with their values obtained through a LEFT JOIN. Because the join could produce NULLs, a single concatenated value was duplicated, in some cases more than 100 times, and GROUP_CONCAT exceeded the maximum length (in my case 16384). Using GROUP_CONCAT(DISTINCT ...) removed the duplicates and kept the result under the limit.
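With placeholder table and column names, deduplicating the joined values before concatenating keeps the result short:

```sql
SELECT m.group_id,
       GROUP_CONCAT(DISTINCT j.new_field SEPARATOR ', ') AS vals
FROM main_table m
LEFT JOIN joined_table j ON j.group_id = m.group_id
GROUP BY m.group_id;
```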
MySQL/SQLite
I want to insert a randomly generated number (of 9 positions) into multiple rows BUT they need to be the same for all rows matched in the query.
update products set tag_seed=( SELECT ABS(RANDOM() % 999999999) ) where [...];
This only partially works: each row gets a different random number, but I need them all to be the same.
This is logical, since a new random value is generated for every row the UPDATE touches. The easiest solution is to generate the random integer once, store it in a user variable, and use that variable in your query:
SET @rand := FLOOR(RAND() * 1000000000);
update products set tag_seed=@rand where [...];
I'm pretty stuck on a Mysql query.
I have a table with three columns;
user_id | person_id | score.
The table is going to be used to store top 5 highscores for each person.
I need a query that checks whether there are fewer than five rows for a specific person.
If there are fewer, insert a new row; but if there are already five rows, I have to replace the lowest score with the new one.
It is for a webservice written in PHP and the data about the new score is posted to the method as params.
Been stuck for some hours now — is it even possible to make this happen in one query ?
You can use a stored procedure in MySQL. I don't know the real names of your tables, but if you look closely you will see how it works:
DELIMITER $$
DROP PROCEDURE IF EXISTS test $$
CREATE PROCEDURE test( IN p_person_id INT, IN p_score INT )
BEGIN
DECLARE cnt INT;
SET cnt = (SELECT COUNT(*) FROM table_needed_for_insert WHERE person_id = p_person_id);
IF cnt < 5 THEN
INSERT INTO table_needed_for_insert (person_id, score) VALUES (p_person_id, p_score);
ELSE
-- ORDER BY ... LIMIT picks the lowest row; a subquery on the
-- same table would raise MySQL's "can't reopen table" error
UPDATE table_needed_for_insert
SET score = p_score
WHERE person_id = p_person_id
ORDER BY score ASC
LIMIT 1;
END IF;
END $$
DELIMITER ;
And to execute it: CALL test(1, 500); where the arguments are the person id and the new score.
And from PHP you can call it directly, like:
$result = mysql_query("CALL test(" . (int)$person_id . ", " . (int)$score . ")");
And here you can check a tutorial on mysql stored procedures:
http://www.mysqltutorial.org/mysql-stored-procedure-tutorial.aspx
It might be possible if you have a unique key which identifies the lowest score. Then you could use the
INSERT ... ON DUPLICATE KEY construct. But you would have to install a trigger which keeps explicit track of the lowest score.
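For reference, the construct itself looks like this; a sketch that keeps only the single best score per (user_id, person_id), assuming a unique key on that pair (the full top-5 behaviour would still need the trigger mentioned above):

```sql
-- Assumes: UNIQUE KEY (user_id, person_id) on top_scores
INSERT INTO top_scores (user_id, person_id, score)
VALUES (1, 2, 300)
ON DUPLICATE KEY UPDATE score = GREATEST(score, VALUES(score));
```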
I would propose this scenario (I have not tried it; it is just an idea): as I understand it, you only need 5 ids, so you can run subqueries like these
SELECT MAX(id) AS last_id FROM table
SELECT MIN(score), id AS lowest_id FROM table
then
REPLACE INTO table (id, ...) VALUES ( LEAST(last_id + 1, lowest_id), ... )
There may be mistakes, and perhaps only one subquery is possible, but I hope you get the main idea.
The simplest way, imo, is to insert the data first:
INSERT INTO top_scores (user_id, person_id, score) VALUES (1, 2, 3)
then delete the rows that fall outside the top 5; note the ORDER BY score DESC, so the five highest scores are the ones kept:
DELETE top_scores FROM top_scores
INNER JOIN
(SELECT * FROM top_scores WHERE person_id = 2 ORDER BY score DESC LIMIT 5, 1000000) AS inappropriate_rows
USING (user_id, person_id, score)