Using sum for two conditions with same attribute - mysql

When I create a table to count the users between 18 and 40, I need two conditions on age in the expression, like sum((age>=18 and age<=40) and (Gender='M')), but the return for this one is always 0. The whole query and output are below (MySQL 5.30):
create table AgeUser
select
Id as 'Id',
sum((age<18) and (Gender='M')) as 'MaleUsersUnder18',
sum((age>=18 and age<=40) and (Gender='M')) as 'MaleUsers18To40',
sum((age>40) and (Gender='M')) as 'MaleUsersOver40',
sum((age<18) and (Gender='F')) as 'FemaleUsersUnder18',
sum((age>=18 and age<=40) and (Gender='F')) as 'FemaleUsers18To40',
sum((age>40) and (Gender='F')) as 'FemaleUsersOver40'
from User group by Id;
id MUUnder18 MU18To40 MUOver40 FUUnder18 FU18To40 FUOver40
72 2137 0 1316 645 0 123
79 2613 0 1616 1064 0 676
82 592 0 363 203 0 554
Example of the User table (the Id means the Service Station ID):
Id userid Name age gender UserType
72 12 L 18 M customer
How do I fix the query?

Best to sum something explicit, like a CASE expression:
sum(case when age<18 and Gender='M' then 1 else 0 end) as 'MaleUsersUnder18',
etc..
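Applied to the whole query, this approach looks roughly like the following (a sketch that keeps the original table and column names):
create table AgeUser
select
Id as 'Id',
sum(case when age<18 and Gender='M' then 1 else 0 end) as 'MaleUsersUnder18',
sum(case when age between 18 and 40 and Gender='M' then 1 else 0 end) as 'MaleUsers18To40',
sum(case when age>40 and Gender='M' then 1 else 0 end) as 'MaleUsersOver40',
sum(case when age<18 and Gender='F' then 1 else 0 end) as 'FemaleUsersUnder18',
sum(case when age between 18 and 40 and Gender='F' then 1 else 0 end) as 'FemaleUsers18To40',
sum(case when age>40 and Gender='F' then 1 else 0 end) as 'FemaleUsersOver40'
from User group by Id;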

MySQL locate first and last event per day

I have a string of events being logged on a 5 minute basis throughout the day in a MySQL DB. I need to identify the first event (where logid > 0) of the day as well as the last (where logid = 0), but I'm struggling to find a simple SQL solution.
A 0 will be stored in the logid field in every row starting at midnight until the first event is triggered, at which point it will change to a number > 0. Then various events will be triggered logging a number > 0 for the remainder of the day, at which point the field will once again be logged as 0 until midnight, when the process starts over again.
Is there a quick and simple way to pull the rows identifying the time when the events start, and another result showing when the events end?
CREATE TABLE logs(
id INT AUTO_INCREMENT,
date DATETIME,
logid INT,
PRIMARY KEY (id)
) ENGINE=INNODB;
This is the test data:
id date logid
1 2018-11-12 01:05:00 0
2 2018-11-12 01:10:00 0
3 2018-11-12 01:15:00 0
4 2018-11-12 01:20:00 0
5 2018-11-12 01:05:00 0
…
84 2018-11-12 06:35:00 0
85 2018-11-12 06:35:00 1
86 2018-11-12 06:40:00 1
87 2018-11-12 06:45:00 1
88 2018-11-12 06:50:00 1
…
164 2018-11-12 15:20:00 1
165 2018-11-12 15:25:00 0
166 2018-11-12 15:30:00 0
167 2018-11-12 15:35:00 0
Desired Result set:
85 2018-11-12 06:35:00 1
165 2018-11-12 15:25:00 0
I'm not concerned about logid up until the first instance where it is greater than 0. But I need to identify the first instance where logid > 0, and then the next chronological instance where logid = 0 again.
My primary attempt was to group and order on the date and logid (edit: failed attempt removed for clarity)
Here's my latest attempt
(SELECT *
FROM logs
WHERE logid>0
GROUP BY date
ORDER BY date
limit 1
)UNION ALL(
SELECT *
FROM logs
WHERE logid>0
GROUP BY date
ORDER BY date DESC
limit 1)
Getting closer, but not quite there. This gives me the correct first row where logid = 1, but it gives me the last row where logid = 1 (id 164) rather than the following row where logid = 0 (id=165).
Is it possible to select the penultimate row of a set if I change limit 1 to 2?
Any other pointers to keep me moving forward?
This didn't seem to be a problem for anyone else, but I thought I would post the answer I came up with in case anyone runs into a similar situation in the future.
SET @v1 := (SELECT date
FROM logs
WHERE logid > 0
GROUP BY date
ORDER BY date
limit 1);
(SELECT *
FROM logs
WHERE date>=@v1 and logid>0
GROUP BY date
ORDER BY date
limit 1
) UNION ALL (
SELECT *
FROM logs
WHERE date>@v1 and logid=0
GROUP BY date
ORDER BY date
limit 1
)
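For reference, the same idea can be folded into a single statement by replacing the session variable with a derived table; a sketch against the logs table from the question:
(SELECT l.*
FROM logs AS l
JOIN (SELECT MIN(date) AS start_time FROM logs WHERE logid > 0) AS s
ON l.date >= s.start_time
WHERE l.logid > 0 -- first event of the day
ORDER BY l.date
LIMIT 1)
UNION ALL
(SELECT l.*
FROM logs AS l
JOIN (SELECT MIN(date) AS start_time FROM logs WHERE logid > 0) AS s
ON l.date > s.start_time
WHERE l.logid = 0 -- first row back at 0 after the events
ORDER BY l.date
LIMIT 1);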

mysql pick up a random entry selected from certain criteria

I want to select one user from the table who never had a note different from 0,
then update the record,
then insert a new row.
user note
12 1
23 0
88 0
45 0
12 0
23 0
12 0
88 2
Select a user, excluding user 12 and user 88, because they already have a note somewhere.
Something like:
SELECT * FROM table WHERE note=0 ORDER BY rand() LIMIT 1
The problem is that I have many duplicate users, so I don't know how to exclude them...
Let's say that I randomly choose user 23.
The table should become:
user note
12 1
23 0
88 0
45 0
12 0
23 X <--- mark the randomly chosen user
12 0
88 2
23 0 <--- add a new line
In the next random pick, only user 45 will be available, because every other user has a note != 0 somewhere.
For this last step, do I have to run two queries (UPDATE then INSERT), or can I do it with just one query?
You can avoid subqueries for improved performance and go like this:
SELECT *
FROM YourTable
GROUP BY user
HAVING SUM(note)=0 ORDER BY rand() LIMIT 1;
Here is an SQL Fiddle DEMO.
You can use nested select :
SELECT * FROM table WHERE note=0 and user not in (select user from table where note>0) ORDER BY rand() LIMIT 1
However, you should really use a primary or unique index on users.
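As for the last part of the question: MySQL cannot UPDATE and INSERT in a single statement here, but the two can be wrapped in a transaction so they behave as one unit. A sketch, assuming the table is called user_notes (the question never names it) and user 23 was the one picked:
START TRANSACTION;
-- mark one of the chosen user's note=0 rows ('X' stands for whatever marker value you use)
UPDATE user_notes SET note = 'X' WHERE user = 23 AND note = 0 LIMIT 1;
-- add the new line for the same user
INSERT INTO user_notes (user, note) VALUES (23, 0);
COMMIT;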

count occurrences over each day in mysql

I'm having some problems figuring this out, and I can't come up with an answer at all. This is my problem.
I have a MySQL table like the following:
cust_id,date_removed,station_removed,date_arrived,station_arrived
6,"2010-02-02 13:57:00",56,"2010-02-02 13:58:00",77
6,"2010-02-02 15:12:00",66,"2010-02-02 15:12:00",56
30,"2010-02-05 11:36:00",32,"2010-02-05 11:37:00",14
30,"2010-02-05 11:37:00",14,"2010-02-05 11:37:00",20
30,"2010-02-05 12:41:00",85,"2010-02-05 12:43:00",85
30,"2010-02-05 12:44:00",85,"2010-02-05 12:46:00",85
30,"2010-02-06 13:15:00",8,"2010-02-06 13:17:00",20
30,"2010-02-06 13:18:00",23,"2010-02-06 13:19:00",23
30,"2010-02-06 13:20:00",32,"2010-02-06 13:21:00",39
30,"2010-02-06 13:21:00",11,"2010-02-06 13:21:00",23
30,"2010-02-06 13:21:00",76,"2010-02-06 13:22:00",32
which the corresponding datatypes in each field is the following:
cust_id: varchar()
date_removed: datetime
station_removed: int
date_arrived: datetime
station_arrived: int
Next, I was asked to make a query that counts how often each station was used on each day, to get a table like this one:
station 2010-02-02 2010-02-05 2010-02-06
56 2 0 0
66 1 0 0
32 0 1 2
14 0 2 0
85 0 2 0
8 0 0 1
23 0 0 2
11 0 0 1
76 0 0 1
77 1 0 0
20 0 1 1
39 0 0 1
where the columns are the days and the rows are the stations. I'm not a very good MySQL user either.
Could somebody help me with this one?
Thank you in advance.
Use this query:
select stations.name as station,
(select count(*) from table where date(date_arrived)='2010-02-02' and (station_removed=stations.name or station_arrived=stations.name)) as '2010-02-02',
(select count(*) from table where date(date_arrived)='2010-02-05' and (station_removed=stations.name or station_arrived=stations.name)) as '2010-02-05',
(select count(*) from table where date(date_arrived)='2010-02-06' and (station_removed=stations.name or station_arrived=stations.name)) as '2010-02-06'
from
(select station_removed as name from table
union
select station_arrived from table) stations;
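An alternative sketch that scans the data only once, using conditional aggregation instead of three correlated subqueries (the unnamed table is called usage_log here for illustration):
select s.station,
sum(date(t.date_arrived)='2010-02-02') as '2010-02-02',
sum(date(t.date_arrived)='2010-02-05') as '2010-02-05',
sum(date(t.date_arrived)='2010-02-06') as '2010-02-06'
from
(select station_removed as station from usage_log
union
select station_arrived from usage_log) s
join usage_log t on t.station_removed=s.station or t.station_arrived=s.station
group by s.station;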

Mysql find max column value based on array list

Hi, I have the following MySQL table:
id item_name user_id wishlist item_url id_category cost
30 kiko 76 1 70 10
31 test1 76 1 70 20
32 test12 76 1 68,67 30
How can I get the max cost item? That is, I need the max cost item based on the category id.
My attempt of using SELECT MAX(cost),id FROM item_tbl WHERE (FIND_IN_SET('68','70,68,67')) does not return the correct output, though it does give a max(cost) of 30. Thanks
This might give you the desired output.
SELECT ID_CATEGORY, MAX(COST) FROM ITEM_TBL
GROUP BY ID_CATEGORY;
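If the goal is the single most expensive item whose comma-separated id_category list contains a given category (which is what the FIND_IN_SET attempt seems to be aiming at), a sketch using the question's column names would be:
SELECT id, item_name, id_category, cost
FROM item_tbl
WHERE FIND_IN_SET('68', id_category)
ORDER BY cost DESC
LIMIT 1;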

MySQL SUM outputs wrong value in a query consisting of multiple joins

I'm getting information from these tables using the following query; however, the defenderhit and defenderdamage SUM values are getting multiplied by the number of rows matched by the first join.
Table 'battles':
battle_id city_id attacker defender battle_time
1 07 6 0 1342918014
Table 'battlehits':
battle_id family_id user_id hits damage
1 0 0 1000 50000
1 6 15 108 3816
1 6 2 81 2046
1 6 1 852 1344
MySQL Query:
SELECT b.battle_id, b.city_id, b.attacker, b.defender, b.battle_time,
SUM(COALESCE(bh1.damage,0)) AS attackerdamage, SUM(COALESCE(bh2.damage,0)) AS defenderdamage,
SUM(COALESCE(bh1.hits,0)) AS attackerhit, SUM(COALESCE(bh2.hits,0)) AS defenderhit
FROM battles AS b
LEFT JOIN battlehits AS bh1 ON b.attacker = bh1.family_id
LEFT JOIN battlehits AS bh2 ON b.defender = bh2.family_id
WHERE b.battle_id=1
GROUP BY b.battle_id LIMIT 1
Result of this query is as following:
battle_id city_id attacker defender battle_time attackerdamage defenderdamage attackerhit defenderhit
1 07 6 0 1342918014 7206 150000 1041 3000
As you can see in the table data, defenderhit and defenderdamage SUM values are supposed to be 1000 and 50000, but they're multiplied by 3.
What am I doing wrong here? What's the problem?
Thanks in advance.
You are getting three rows, before the group by/sum. You have one row for each of the three attacker rows from battlehits. Each of these is paired with the same defender row from battlehits, causing the defender data to be tripled. To see this, remove the group by and limit clauses and take out the sum()s. You are effectively creating the cross product of all defenders X all attackers, and then summing.
This shows the three rows with duplicated defender data. This is a consequence of doing a join on a one to many to many relationship, instead of a one to one to one.
SELECT b.battle_id, b.city_id, b.attacker, b.defender, b.battle_time,
COALESCE(bh1.damage,0) AS attackerdamage, COALESCE(bh2.damage,0) AS defenderdamage,
COALESCE(bh1.hits,0) AS attackerhit, COALESCE(bh2.hits,0) AS defenderhit
FROM battles AS b
LEFT JOIN battlehits AS bh1 ON b.attacker = bh1.family_id
LEFT JOIN battlehits AS bh2 ON b.defender = bh2.family_id
WHERE b.battle_id=1;
Output:
battle_id city_id attacker defender battle_time attackerdamage defenderdamage attackerhit defenderhit
1 7 6 0 1342918014 3816 50000 108 1000
1 7 6 0 1342918014 2046 50000 81 1000
1 7 6 0 1342918014 1344 50000 852 1000
You need to split this into separate queries. One for the attacker sums and another for the defender sums.
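A sketch of that split, aggregating each side in its own derived table first so each LEFT JOIN matches at most one pre-summed row (same table and column names as above):
SELECT b.battle_id, b.city_id, b.attacker, b.defender, b.battle_time,
COALESCE(att.damage,0) AS attackerdamage, COALESCE(def.damage,0) AS defenderdamage,
COALESCE(att.hits,0) AS attackerhit, COALESCE(def.hits,0) AS defenderhit
FROM battles AS b
LEFT JOIN (SELECT family_id, SUM(damage) AS damage, SUM(hits) AS hits
FROM battlehits WHERE battle_id=1 GROUP BY family_id) AS att ON b.attacker = att.family_id
LEFT JOIN (SELECT family_id, SUM(damage) AS damage, SUM(hits) AS hits
FROM battlehits WHERE battle_id=1 GROUP BY family_id) AS def ON b.defender = def.family_id
WHERE b.battle_id=1;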
You can emulate a SUM of distinct rows by using
(SUM(t.field_to_sum) / COUNT(t.primary_key) * COUNT(DISTINCT t.primary_key))
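Applied to the defender columns of the original query, that would look something like this (assuming battlehits has a unique id column, which the sample data doesn't show):
SUM(COALESCE(bh2.damage,0)) / COUNT(bh2.id) * COUNT(DISTINCT bh2.id) AS defenderdamage,
SUM(COALESCE(bh2.hits,0)) / COUNT(bh2.id) * COUNT(DISTINCT bh2.id) AS defenderhit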