I have a query that selects the top 5 UIDs with the most logins.
I want to show the results like this:
"A name" connected 457 times, failed 124 times
My current query only gets the 457 total; I don't know how to select the failed logins in the same query.
I have a field in my database named "passed". The value is 0 if the login failed, 1 if it succeeded.
Current query:
SELECT uid, COUNT(uid) AS cnt
FROM logins
GROUP BY uid
ORDER BY cnt DESC
LIMIT 5
Database structure:
CREATE TABLE IF NOT EXISTS `logins` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`uid` int(11) NOT NULL,
`username` varchar(255) NOT NULL,
`ip` varchar(15) NOT NULL,
`time` int(10) NOT NULL,
`passed` int(1) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=1544 ;
Could you please help me work this out?
Thanks in advance.
SELECT uid, COUNT(uid) AS cnt, COUNT( IF( passed = 0, 1, NULL ) ) AS failed
FROM logins
GROUP BY uid
ORDER BY cnt DESC
LIMIT 5
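COUNT() ignores NULLs, so COUNT(IF(passed = 0, 1, NULL)) counts only the failed attempts. An equivalent form (a minimal sketch of the same idea, not part of the original answer) sums the boolean conditions directly and reports successes as well:
SELECT uid,
       COUNT(*) AS cnt,            -- every login attempt
       SUM(passed = 0) AS failed,  -- passed = 0 evaluates to 1 for failed rows, 0 otherwise
       SUM(passed = 1) AS succeeded
FROM logins
GROUP BY uid
ORDER BY cnt DESC
LIMIT 5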
You can also use a correlated inner select to achieve this; the l2.uid = logins.uid condition is what makes it count failures per user rather than across the whole table:
SELECT uid, COUNT(uid) AS cnt,
       (SELECT COUNT(*) FROM logins l2 WHERE l2.passed = 0 AND l2.uid = logins.uid) AS failed ...
I have a one-to-many table relationship:
one user has multiple events
one event has multiple event_attributes
Now I group by userId and want to know, for each user, how many distinct values there are for each event attribute.
I am using group_concat like this:
group_concat(
concat(event_event_attribute.event_attr_id,
count( distinct event_event_attribute.value)
) group by event_attr_id)
)
group by userId
So here, I first group by userId, then group_concat the event attributes; in the end I hope to have:
(attr1, 10),(attr2, 30)....
all in one row.
But this does not work at all
Any suggestions?
To be more specific, this is the DB schema I am using:
CREATE TABLE `user` (
`id` int(11) NOT NULL,
`name` varchar(45) DEFAULT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `id_UNIQUE` (`id`)
);
CREATE TABLE `event` (
`id` int(11) NOT NULL,
`name` varchar(45) DEFAULT NULL,
`user_id` int(11) DEFAULT NULL,
PRIMARY KEY (`id`)
);
CREATE TABLE `event_attr` (
`id` int(11) NOT NULL,
`att_name` varchar(45) DEFAULT NULL,
`event_id` varchar(45) DEFAULT NULL,
PRIMARY KEY (`id`)
);
INSERT INTO `user` VALUES (1,'user1'),(2,'user2'),(3,'user3');
INSERT INTO `event` VALUES (1,'event1',1),(2,'event2',1),(3,'event3',1),(4,'event4',2),(5,'event5',2),(6,'event6',3);
INSERT INTO `event_attr` VALUES (1,'att1','1'),(2,'att2','1'),(3,'att3','1'),(4,'att1','2'),(5,'att2',NULL);
Now if I run:
select u.id, group_concat(e.name)
from user u
join event e on u.id=e.user_id
group by u.id
I will get:
1 event1,event2,event3
2 event4,event5
3 event6
That is fine. But going one step further, I need to know the count for each event_attr for each user, such as:
1 event_att1:3;event_att2:2
2 event_att3:1
But I can't get that to work. Can I use just one query to get the expected result above?
It should be the inverse: concatenate the aggregated values, not aggregate the concat.
SELECT CONCAT(GROUP_CONCAT(event_event_attribute.event_attr_id),
              ' - ',
              COUNT(DISTINCT event_event_attribute.value))
FROM event_event_attribute
GROUP BY userid
Otherwise, you may need a subquery to obtain the count grouped by event_attr_id:
SELECT user_id,
       GROUP_CONCAT(CONCAT(event_attr_id, ',', count_value))
FROM (
    SELECT user_id,
           event_event_attribute.event_attr_id,
           COUNT(DISTINCT event_event_attribute.value) AS count_value
    FROM event_event_attribute
    GROUP BY user_id, event_attr_id
) t
GROUP BY user_id
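Applied to the schema actually posted in the question (a hedged sketch: the pseudo-code above uses an event_event_attribute table that is not in the posted schema, so this version joins user, event and event_attr and counts attribute rows per user and attribute name):
SELECT u.id,
       GROUP_CONCAT(CONCAT(t.att_name, ':', t.cnt) SEPARATOR ';') AS attr_counts
FROM (
    -- count attribute rows per user and per attribute name;
    -- event_attr.event_id is a varchar in the posted schema, so this join relies on implicit conversion
    SELECT e.user_id, a.att_name, COUNT(*) AS cnt
    FROM event e
    JOIN event_attr a ON a.event_id = e.id
    GROUP BY e.user_id, a.att_name
) t
JOIN user u ON u.id = t.user_id
GROUP BY u.id;
With the sample data this returns a row like 1 | att1:2;att2:1;att3:1, which is the shape asked for above.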
This is my table:
CREATE TABLE IF NOT EXISTS `calls` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`date` timestamp NOT NULL,
`type` int(11) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB
and there is some data in it.
Now I use this query to get the type values for calls on the same day:
SELECT user.name_first,
       calls.date AS days,
       (SELECT GROUP_CONCAT(type) FROM calls WHERE DAY(date) = DAY(days))
FROM calls
JOIN user ON user.user_id = calls.user_id
WHERE calls.id IN (SELECT MAX(id) FROM calls GROUP BY DAY(calls.date))
As you can see, I alias calls.date as days and try to get the types for that date with:
(SELECT GROUP_CONCAT(type) FROM calls WHERE DAY(date) = DAY(days) )
It works on MySQL.
But when I try it on SQLite, it says:
no such column: days (code 1), while compiling
I read that SQLite supports aliases.
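One difference between the two engines: SQLite resolves column references inside the subquery against tables, not against aliases defined in the outer SELECT list, so the fix is to reference the underlying column instead of days. SQLite also has no DAY() function; strftime('%d', ...) does the same job. A hedged sketch (the c alias on the outer calls table and the types alias are my additions):
SELECT user.name_first,
       c.date AS days,
       (SELECT GROUP_CONCAT(type)
        FROM calls
        -- %d is day-of-month, matching the original DAY() behaviour
        WHERE strftime('%d', calls.date) = strftime('%d', c.date)) AS types
FROM calls AS c
JOIN user ON user.user_id = c.user_id
WHERE c.id IN (SELECT MAX(id) FROM calls GROUP BY strftime('%d', calls.date));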
I have a chat application with an API that returns the list of users the current user has talked to. But MySQL takes a long time to return that list once the table reaches 100,000 rows of data.
This is my messages table:
CREATE TABLE IF NOT EXISTS `messages` (
`_id` int(11) NOT NULL AUTO_INCREMENT,
`fromid` int(11) NOT NULL,
`toid` int(11) NOT NULL,
`message` text NOT NULL,
`attachments` text NOT NULL,
`status` tinyint(1) NOT NULL DEFAULT '0',
`date` datetime NOT NULL,
`delete` varchar(50) NOT NULL,
`uuid_read` varchar(250) NOT NULL,
PRIMARY KEY (`_id`),
KEY `fromid` (`fromid`,`toid`,`status`,`delete`,`uuid_read`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=118561 ;
And this is my users table (simplified):
CREATE TABLE IF NOT EXISTS `users` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`login` varchar(50) DEFAULT '',
`sex` tinyint(1) DEFAULT '0',
`status` varchar(255) DEFAULT '',
`avatar` varchar(30) DEFAULT '0',
`last_active` datetime DEFAULT NULL,
`active` tinyint(1) DEFAULT '1',
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=15523 ;
And here is my query (for the user with id 1930):
select SQL_CALC_FOUND_ROWS `u_id`, `id`, `login`, `sex`, `birthdate`, `avatar`, `online_status`,
       SUM(`count`) as `count`, SUM(`nr_count`) as `nr_count`, `date`, `last_mesg`
from
(
  (select `m`.`fromid` as `u_id`, `u`.`id`, `u`.`login`, `u`.`sex`, `u`.`birthdate`, `u`.`avatar`,
          `u`.`last_active` as online_status,
          COUNT(`m`.`_id`) as `count`,
          (COUNT(`m`.`_id`) - SUM(`m`.`status`)) as `nr_count`,
          `tm`.`date` as `date`, `tm`.`message` as `last_mesg`
   from `messages` as m
   inner join `messages` as tm
     on `tm`.`_id` = (select MAX(`_id`) from `messages` as `tmz` where `tmz`.`fromid` = `m`.`fromid`)
   left join `users` as u on `u`.`id` = `m`.`fromid`
   where `m`.`toid` = 1930 and `m`.`delete` not like '%1930;%'
   group by `u`.`id`)
  UNION
  (select `m`.`toid` as `u_id`, `u`.`id`, `u`.`login`, `u`.`sex`, `u`.`birthdate`, `u`.`avatar`,
          `u`.`last_active` as online_status,
          COUNT(`m`.`_id`) as `count`,
          0 as `nr_count`,
          `tm`.`date` as `date`, `tm`.`message` as `last_mesg`
   from `messages` as m
   inner join `messages` as tm
     on `tm`.`_id` = (select MAX(`_id`) from `messages` as `tmz` where `tmz`.`toid` = `m`.`toid`)
   left join `users` as u on `u`.`id` = `m`.`toid`
   where `m`.`fromid` = 1930 and `m`.`delete` not like '%1930;%'
   group by `u`.`id`)
  order by `date` desc
) as `f`
group by `u_id`
order by `date` desc
limit 0, 10
Please help me optimize this query.
What I need:
Who the user talked to (name, sex, etc.)
What was the last message (from me or to me)
Count of messages (all)
Count of unread messages (only to me)
The query works well, but takes too long.
You have some design problems in your query and database.
You should avoid keywords as column names, such as that delete column or the count alias;
You should avoid selecting columns that are neither in the GROUP BY nor wrapped in an aggregate function; although MySQL allows this, it is not standard and you have no control over which row's data gets selected;
Your NOT LIKE condition may misbehave, because '%1930;%' also matches '11930;' and 11930 is not 1930;
You should avoid LIKE patterns that both start and end with the % wildcard, because they make the text matching slower;
You should design a better way to represent a message deletion, probably a proper flag and/or another table to store any important data related to the action (a sketch follows this list);
Try to limit your result before the join conditions (with a derived table) so less data has to be processed.
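For the deletion point above, a hypothetical per-user deletion table (just a sketch of the idea, not something from the original answer) could replace the delimited delete string:
CREATE TABLE message_deletions (
  message_id INT NOT NULL,       -- references messages._id
  user_id    INT NOT NULL,       -- the user who deleted the message on their side
  deleted_at DATETIME NOT NULL,
  PRIMARY KEY (message_id, user_id)
);
A NOT EXISTS check against this table would then replace the NOT LIKE '%1930;%' filter.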
I tried to rewrite your query as best I understood it. I ran my version against a messages table with ~200,000 rows and no indexes and it finished in 0.15 seconds. Still, you should create the right indexes to keep it performing well as the amount of data increases.
SELECT SQL_CALC_FOUND_ROWS
u.id,
u.login,
u.sex,
u.birthdate,
u.avatar,
u.last_active AS online_status,
g._count,
CASE WHEN m.toid = 1930
THEN g.nr_count
ELSE 0
END AS nr_count,
m.`date`,
m.message AS last_mesg
FROM
(
SELECT
MAX(_id) AS _id,
COUNT(*) AS _count,
COUNT(*) - SUM(m.status) AS nr_count
FROM messages m
WHERE 1=1
AND m.`delete` NOT LIKE '%1930;%'
AND
(0=1
OR m.fromid = 1930
OR m.toid = 1930
)
GROUP BY
CASE WHEN m.fromid = 1930
THEN m.toid
ELSE m.fromid
END
ORDER BY MAX(`date`) DESC
LIMIT 0, 10
) g
INNER JOIN messages AS m ON 1=1
AND m._id = g._id
LEFT JOIN users AS u ON 0=1
OR (m.fromid <> 1930 AND u.id = m.fromid)
OR (m.toid <> 1930 AND u.id = m.toid)
ORDER BY m.`date` DESC
;
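As a starting point for the indexes mentioned above (my suggestion, not part of the original answer), composite indexes covering both filter directions plus the date column could look like this:
ALTER TABLE messages
  ADD INDEX idx_messages_to (toid, fromid, `date`),
  ADD INDEX idx_messages_from (fromid, toid, `date`);
The index names and exact column order are assumptions; check them against EXPLAIN output on the real data.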
I am using MySQL as the database and I have a table like the one below.
CREATE TABLE IF NOT EXISTS `logins` (
`id` int(255) NOT NULL AUTO_INCREMENT,
`userid` varchar(255) NOT NULL,
`date` varchar(255) NOT NULL,
`status` varchar(255) NOT NULL,
KEY `id` (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=346 ;
I want to sort the MySQL results with ORDER BY. The problem is that with this SQL I only get the first record's date for each user, which is an older date. I want the newest date, i.e. each user's last login date.
SELECT * FROM `logins` WHERE `status`='valid' GROUP BY `userid` ORDER BY `date` DESC
Any suggestions?
To do this you use a subquery to get the latest record for each userid, and then join that back to the logins table to get the rest of the details:
SELECT logins.*
FROM logins
INNER JOIN
(
SELECT userid, MAX(`date`) AS max_date
FROM `logins`
WHERE `status` = 'valid'
GROUP BY `userid`
) sub0
ON logins.userid = sub0.userid
AND logins.`date` = sub0.max_date
WHERE `status` = 'valid'
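A caveat about the schema itself (my note, not from the answer above): date is declared as varchar(255), so MAX(`date`) compares strings. That still returns the latest login as long as the stored format sorts chronologically (for example 'YYYY-MM-DD HH:MM:SS'); otherwise converting the column is the safer fix, for instance:
ALTER TABLE logins
  MODIFY `date` datetime NOT NULL;  -- only safe if every existing value parses as a datetime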
You almost had it. Assuming id and userId don't change from one login to another, asking for the MAX date should give you the expected result:
SELECT id, userId, MAX(`date`) AS lastDate, 'valid'
FROM `logins`
WHERE `status`='valid'
GROUP BY `userid`
ORDER BY `lastDate` DESC
Please note that you would need a JOIN if there were data in the table that change between logins.
I have a table like this:
CREATE TABLE vhist (
  id int(10) unsigned NOT NULL auto_increment,
  userId varchar(45) NOT NULL,
  mktCode int(10) unsigned NOT NULL,
  insertDate datetime NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
A user can have more than one record.
I need an SQL statement which will keep the most recent 50 records and delete anything after that limit.
I need that in a single SQL statement.
I tried this, but it failed:
delete from vhist v where v.id not in
(select v.id from vhist v where
v.userId=12 order by insertDate desc
limit 50)
but it failed: MySQL said that IN cannot be used with a LIMIT.
Any help?
You need to wrap the limited query in a derived table, like this:
DELETE FROM vhist
WHERE userId = 12
  AND id NOT IN (
    SELECT id FROM (
      SELECT id FROM vhist WHERE userId = 12 ORDER BY insertDate DESC LIMIT 50
    ) AS foo
  );
The extra SELECT ... FROM ( ... ) AS foo layer is what gets around MySQL's restriction on LIMIT inside IN (and on selecting from the same table you are deleting from), and the userId = 12 condition on the DELETE itself keeps rows belonging to other users untouched.
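A quick way to sanity-check which rows will be kept before running the DELETE (my addition, not part of the original answer) is to run the keep-list on its own:
SELECT id, insertDate
FROM vhist
WHERE userId = 12
ORDER BY insertDate DESC
LIMIT 50;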