I have this small sample from a database:
id_doc    fecIn       fecAlt
04564494  2019-12-22  2020-01-22
04564498  2019-12-22  2020-02-06
04512870  2020-01-03  2020-01-05
04566760  2020-01-07  2020-02-12
04500207  2020-02-29  2020-03-20
04502614  2020-03-19  2020-05-27
04503127  2020-03-31  2020-06-02
I want to count months between 'fecIn' and 'fecAlt',
without taking into account the days in the dates.
So I tried:
SELECT
d.id_doc,
d.fecIn,
d.fecAlt,
DATE_FORMAT(d.fecIn,'%Y-%m') as MonthIn,
DATE_FORMAT(d.fecAlt,'%Y-%m') as MonthAlt
FROM test.docs d
So far I only get the year and month. I'm stuck here. Is there a way to count months from the way I get 'MonthIn' and 'MonthAlt'?
The Result I want is the last column:
id_doc    fecIn       fecAlt      MonthIn  MonthAlt  Result Expected
04564494  2019-12-22  2020-01-22  2019-12  2020-01   2
04564498  2019-12-22  2020-02-06  2019-12  2020-02   3
04512870  2020-01-03  2020-01-05  2020-01  2020-01   1
04566760  2020-01-07  2020-02-12  2020-01  2020-02   2
04500207  2020-02-29  2020-03-20  2020-02  2020-03   2
04502614  2020-03-19  2020-05-27  2020-03  2020-05   3
04503127  2020-03-31  2020-06-02  2020-03  2020-06   4
I tried with TIMESTAMPDIFF, but not with the result that I want. Some help please.
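One way to get that last column (a sketch, not tested against your data): since the expected count includes both the start and the end month (2020-01 to 2020-01 counts as 1), you can truncate both dates to the first day of their month, take TIMESTAMPDIFF in months, and add 1. Table and column names are the ones from your query.
SELECT
    d.id_doc,
    d.fecIn,
    d.fecAlt,
    DATE_FORMAT(d.fecIn,'%Y-%m') as MonthIn,
    DATE_FORMAT(d.fecAlt,'%Y-%m') as MonthAlt,
    -- months between the first days of each month, plus 1 so that
    -- 2019-12 -> 2020-01 gives 2 and 2020-01 -> 2020-01 gives 1
    TIMESTAMPDIFF(MONTH,
                  DATE_FORMAT(d.fecIn,'%Y-%m-01'),
                  DATE_FORMAT(d.fecAlt,'%Y-%m-01')) + 1 as ResultExpected
FROM test.docs d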
Here I have a query which returns aggregated values based on 5-minute intervals of a timestamp.
Here is the query
SELECT DATE_FORMAT(`recived_on`,'%Y-%m-%d %H:00') +
       INTERVAL (MINUTE(`recived_on`) - MINUTE(`recived_on`) MOD 5) MINUTE AS receivedOn,
       SUM(quantity) AS Quantity
FROM tablename
WHERE DATE(`recived_on`) = '20210129'
GROUP BY DATE_FORMAT(`recived_on`,'%Y-%m-%d %H:00') +
         INTERVAL (MINUTE(`recived_on`) - MINUTE(`recived_on`) MOD 5) MINUTE;
This query is returning values like below.
2021-01-29 00:05:00 1
2021-01-29 00:15:00 1
2021-01-29 00:45:00 1
2021-01-29 01:05:00 1
2021-01-29 03:00:00 1
2021-01-29 04:45:00 1
2021-01-29 06:15:00 2
2021-01-29 06:40:00 1
For example, between 00:05:00 and 00:15:00 there were no records, so nothing is shown for those slots. But I need a row for every 5 minutes even when the value is zero. If there were no records, it should return 0, like below:
2021-01-29 00:05:00 1
2021-01-29 00:10:00 0
2021-01-29 00:15:00 1
2021-01-29 00:20:00 0
2021-01-29 00:25:00 0
2021-01-29 00:30:00 0
2021-01-29 00:35:00 0
2021-01-29 00:40:00 0
2021-01-29 00:45:00 0
Any help would be greatly appreciated.
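A sketch of one common way to fill the gaps, assuming MySQL 8.0 or later (it relies on a recursive CTE): generate every 5-minute slot of the day, then LEFT JOIN the aggregated rows onto that series and turn missing sums into 0. Table and column names are the ones from the query above; the day is hard-coded to 2021-01-29 for the example.
WITH RECURSIVE slots AS (
    -- every 5-minute slot of the example day
    SELECT TIMESTAMP('2021-01-29 00:00:00') AS slot
    UNION ALL
    SELECT slot + INTERVAL 5 MINUTE FROM slots
    WHERE slot < TIMESTAMP('2021-01-29 23:55:00')
),
agg AS (
    -- same 5-minute bucketing as the original query
    SELECT DATE_FORMAT(`recived_on`,'%Y-%m-%d %H:00') +
           INTERVAL (MINUTE(`recived_on`) - MINUTE(`recived_on`) MOD 5) MINUTE AS slot,
           SUM(quantity) AS Quantity
    FROM tablename
    WHERE DATE(`recived_on`) = '20210129'
    GROUP BY slot
)
SELECT s.slot AS receivedOn,
       COALESCE(a.Quantity, 0) AS Quantity
FROM slots s
LEFT JOIN agg a ON a.slot = s.slot
ORDER BY s.slot;
On older MySQL versions you would need a numbers or calendar helper table instead of the recursive CTE.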
I can't find a solution for how to combine data from three datasets. I have one static dataset and two matrix tables which I want to connect in one report. Every table has the same ID which I can use to connect them (and the same number of rows), but I don't know how I could do this. Is it possible to connect several datasets?
table1:
N ID St From To
1 541 7727549 08:30:00 14:00:00
2 631 7727575 07:00:00 15:00:00
3 668 7727552 09:00:00 17:00:00
4 679 18:00:00 00:00:00
5 721 17:00:00 00:00:00
table2:
ID P1 P2 P3 P4
541 12:00:00 - 12:10:00
631 08:45:00 - 08:55:00 11:30:00 - 11:40:00 13:00:00 - 13:15:00
668 12:05:00 - 12:15:00 13:45:00 - 13:55:00 14:55:00 - 15:10:00
679 21:15:00 - 21:30:00
721 20:40:00 - 20:50:00 21:50:00 - 22:05:00
table3:
ID W1 W2 W3
541 11:28:58 - 11:39:13
631 08:46:54 - 08:58:43 11:07:04 - 11:17:05
668 11:26:11 - 11:41:44
679
721 11:07:19 - 11:17:06
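If the three tables come from the same database, one option is to avoid relating separate report datasets altogether and instead join them on ID in a single query, then use that one dataset in the report. This is only a sketch; the table and column names below are guessed from the samples.
SELECT t1.N, t1.ID, t1.St, t1.`From`, t1.`To`,
       t2.P1, t2.P2, t2.P3, t2.P4,
       t3.W1, t3.W2, t3.W3
FROM table1 t1
LEFT JOIN table2 t2 ON t2.ID = t1.ID
LEFT JOIN table3 t3 ON t3.ID = t1.ID
ORDER BY t1.N;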
I'm trying to group my data by 7-day intervals.
For example, I have the data below.
count startDate finish_date
1247 2017-03-09 08:43:18 2017-03-09 16:05:34
1681 2017-03-10 08:30:13 2017-03-10 16:31:55
1464 2017-03-11 08:36:50 2017-03-11 16:42:03
1343 2017-03-12 08:26:57 2017-03-12 16:39:58
1333 2017-03-13 08:35:34 2017-03-13 16:26:18
1215 2017-03-14 08:36:58 2017-03-14 16:13:20
1817 2017-03-16 08:24:49 2017-03-16 17:18:19
1675 2017-03-17 08:22:30 2017-03-17 16:36:58
1546 2017-03-18 08:33:52 2017-03-18 16:51:52
1443 2017-03-20 08:11:00 2017-03-20 16:26:38
1481 2017-03-21 08:26:04 2017-03-21 16:57:30
1574 2017-03-23 08:19:07 2017-03-23 16:12:46
1270 2017-03-24 08:25:25 2017-03-24 16:37:59
1765 2017-03-25 08:22:58 2017-03-25 16:44:24
1200 2017-03-26 08:37:47 2017-03-26 14:59:51
1479 2017-03-27 08:17:50 2017-03-27 15:18:32
And I want to group them by 7-day intervals. I tried this:
select count(*), min(locationDate) as startDate, max(locationDate) as finish_date
from location
where tagCode = 24901
  and xLocation >= 278 and xLocation <= 354
  and yLocation >= 239 and yLocation <= 426
  and locationDate >= DATE_SUB('2017-03-01 00:00:01', INTERVAL 7 day)
  and locationDate <= '2017-03-27 23:59:59'
group by DATEDIFF(locationDate, '2017-03-01 00:00:01') div 7
And the result looks like this:
count startDate finish_date
8283 2017-03-09 08:43:18 2017-03-14 16:13:20
7962 2017-03-16 08:24:49 2017-03-21 16:57:30
7291 2017-03-23 08:19:07 2017-03-27 15:22:05
The problem is that the second week should start from 2017-03-15 and the third week from 2017-03-22, but because there is no data on those days the weeks don't start there. How can I fix it?
As I asked you in my comment, I think the result you wrote would be good with the input you provided, but it wouldn't be with a different input (like having 2017-03-15 but not 2017-03-16).
A solution could be to write the query kind of like this
select sum(count) as count, min(locationDate), max(locationDate)
from (
    select t1.locationDate,
           t1.count,
           date_sub(t1.locationDate, interval (datediff(t1.locationDate, t2.min_date) % 7) day) week_start
    from location t1
    cross join
         (select min(locationDate) as min_date from location) t2
    where t1.tagCode = 24901 and
          t1.xLocation between 278 and 354 and
          t1.yLocation between 239 and 426 and
          t1.locationDate >= DATE_SUB('2017-03-01 00:00:01',INTERVAL 7 day) and
          t1.locationDate <= '2017-03-27 23:59:59'
) t3
group by week_start
I tested a simplified version of this on a simplified version of your input, there might be typos...
Edit
To display both interval starting date and ending date, try with this
select sum(count) as count, week_start, week_end
from (
    select t1.count,
           date_sub(t1.locationDate, interval (datediff(t1.locationDate, t2.min_date) % 7) day) week_start,
           date_sub(t1.locationDate, interval (datediff(t1.locationDate, t2.min_date) % 7) - 6 day) week_end
    from location t1
    cross join
         (select min(locationDate) as min_date from location) t2
    where t1.tagCode = 24901 and
          t1.xLocation between 278 and 354 and
          t1.yLocation between 239 and 426 and
          t1.locationDate >= DATE_SUB('2017-03-01 00:00:01',INTERVAL 7 day) and
          t1.locationDate <= '2017-03-27 23:59:59'
) t3
group by week_start, week_end
I just use GROUP BY DATE_FORMAT:
SELECT someTimeStamp,SUM(amount) AS Total FROM sometable WHERE 1 GROUP BY DATE_FORMAT(someTimeStamp,"%Y%v")
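One caveat with that format string: in MySQL, %v (week 01..53, Monday as the first day) is meant to be paired with %x (the year of that week) rather than %Y, otherwise dates in the first days of January can land in the wrong bucket. The same idea with the matching pair:
SELECT someTimeStamp,SUM(amount) AS Total FROM sometable WHERE 1 GROUP BY DATE_FORMAT(someTimeStamp,"%x%v")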
I think you can do that: you need to change the result of your query from this:
1 1247 2017-03-09 08:43:18 2017-03-09 16:05:34
2 1681 2017-03-10 08:30:13 2017-03-10 16:31:55
3 1464 2017-03-11 08:36:50 2017-03-11 16:42:03
4 1343 2017-03-12 08:26:57 2017-03-12 16:39:58
5 1333 2017-03-13 08:35:34 2017-03-13 16:26:18
6 1215 2017-03-14 08:36:58 2017-03-14 16:13:20
7 1817 2017-03-16 08:24:49 2017-03-16 17:18:19
8 1675 2017-03-17 08:22:30 2017-03-17 16:36:58
9 1546 2017-03-18 08:33:52 2017-03-18 16:51:52
10 1443 2017-03-20 08:11:00 2017-03-20 16:26:38
11 1481 2017-03-21 08:26:04 2017-03-21 16:57:30
12 1574 2017-03-23 08:19:07 2017-03-23 16:12:46
13 1270 2017-03-24 08:25:25 2017-03-24 16:37:59
14 1765 2017-03-25 08:22:58 2017-03-25 16:44:24
15 1200 2017-03-26 08:37:47 2017-03-26 14:59:51
16 1479 2017-03-27 08:17:50 2017-03-27 15:18:32
to this, using the logic of computing the number of days between each row's max date and the min date of the first row:
-- max date of the row minus the min date of the first row
select FLOOR(datediff('2017-03-12 16:05:34', '2017-03-09 08:43:18')/7);
select FLOOR(datediff('2017-03-16 17:18:19', '2017-03-09 08:43:18')/7);
-- what is important is that you always compute the max date minus the min date of the first row, which in your example is '2017-03-09 08:43:18'
select FLOOR(datediff(max_date, '2017-03-09 08:43:18')/7);
#  rec_sum  min_date             max_date             day_diff
1 1247 2017-03-09 08:43:18 2017-03-09 16:05:34 0
2 1681 2017-03-10 08:30:13 2017-03-10 16:31:55 0
3 1464 2017-03-11 08:36:50 2017-03-11 16:42:03 0
4 1343 2017-03-12 08:26:57 2017-03-12 16:39:58 0
5 1333 2017-03-13 08:35:34 2017-03-13 16:26:18 0
6 1215 2017-03-14 08:36:58 2017-03-14 16:13:20 0
7 1817 2017-03-16 08:24:49 2017-03-16 17:18:19 1
8 1675 2017-03-17 08:22:30 2017-03-17 16:36:58 1
9 1546 2017-03-18 08:33:52 2017-03-18 16:51:52 1
10 1443 2017-03-20 08:11:00 2017-03-20 16:26:38 1
11 1481 2017-03-21 08:26:04 2017-03-21 16:57:30 1
12 1574 2017-03-23 08:19:07 2017-03-23 16:12:46 2
13 1270 2017-03-24 08:25:25 2017-03-24 16:37:59 2
14 1765 2017-03-25 08:22:58 2017-03-25 16:44:24 2
15 1200 2017-03-26 08:37:47 2017-03-26 14:59:51 2
16 1479 2017-03-27 08:17:50 2017-03-27 15:18:32 2
-- now you can group the new result by the new field, the result of the division.
select
sum(rec_sum) ,
min(min_date),
max(max_date)
from (query result in the previous list)
group by day_diff
I know it's a little bit hard, but I think you can do it; the hard part is computing day_diff.
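For reference, here is a runnable sketch of that approach against the original location table (the anchor date '2017-03-09 08:43:18' is the min date of the first row, as described above; the inner query reproduces the per-day rows, assuming they came from grouping by day):
select sum(rec_sum)  as count,
       min(min_date) as startDate,
       max(max_date) as finish_date
from (
    select count(*)          as rec_sum,
           min(locationDate) as min_date,
           max(locationDate) as max_date,
           -- whole weeks elapsed since the min date of the first row
           FLOOR(datediff(max(locationDate), '2017-03-09 08:43:18') / 7) as day_diff
    from location
    where tagCode = 24901
      and xLocation between 278 and 354
      and yLocation between 239 and 426
      and locationDate >= DATE_SUB('2017-03-01 00:00:01', INTERVAL 7 day)
      and locationDate <= '2017-03-27 23:59:59'
    group by DATE(locationDate)
) t
group by day_diff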
My goal is to generate a report that will provide me with data to submit to a payroll company.
Our pay periods are the 1st through the 15th and then the 16th through the end of the month and I can easily generate a report that shows how many hours an employee works between those given dates.
My problem is with overtime. Overtime is calculated based on an employee's work in a given week. Our week start and end dates are Sunday through Saturday. Here is a specific example of the challenge.
The pay period that ends on 2016-01-31 is part of weeks 3, 4, 5 and 6. It has seven days in weeks 4 and 5, and only one day in weeks 3 and 6. I would like to pay (on this particular pay period) any overtime the employee worked in weeks 3, 4 and 5. I will pay them for week 6 overtime in the next pay period once that week is complete.
I need a little help, specifically a direction to start looking. Would this be a situation where I would embed a SELECT within a SELECT? Since I am operating on two different spins on the same data source ... I am a little baffled.
Anyone have any experience here? Thoughts?
Report for Pay Period Ending 2016-01-31
Employee Hours OT Hr
----------------------- ------- -------
Joe Employee 95.00 1.00
- Week 3 (1 Day) 7.00 1.00
- Week 4 (7 Days) 40.00 0.00
- Week 5 (7 Days) 40.00 0.00
- Week 6 (1 Day) 8.00 0.00
This sample report shows that an employee worked 95 hours over 16 days during the pay period ending 2016-01-31. The one hour of overtime and 7 hours of regular time in week 3 suggest they worked extra hours in the previous pay period that propelled their week 3 total hours over 40.
Business Rule: Overtime is paid for an employee's hours that exceed 40 during a week. Weeks are defined as Sunday through Saturday. Payments are made on Pay Dates, which are defined as the 15th and last day of each month. If a Pay Period ends in the middle of a week, employees are paid for overtime hours on both sides of the pay date as defined above.
Sample Data as Requested
Employee Start End Week PayPeriod Duration
John Employee 2016-01-02 09:23:42 2016-01-02 15:13:43 1 1/15/2016 5.83
John Employee 2016-01-04 09:42:30 2016-01-04 17:58:19 2 1/15/2016 8.26
John Employee 2016-01-05 09:46:04 2016-01-05 13:30:03 2 1/15/2016 3.73
John Employee 2016-01-05 14:03:02 2016-01-05 18:06:34 2 1/15/2016 4.06
John Employee 2016-01-06 10:30:43 2016-01-06 17:14:18 2 1/15/2016 6.73
John Employee 2016-01-07 10:05:22 2016-01-07 13:43:59 2 1/15/2016 3.64
John Employee 2016-01-07 14:14:20 2016-01-07 18:05:50 2 1/15/2016 3.86
John Employee 2016-01-08 09:55:59 2016-01-08 17:47:58 2 1/15/2016 7.87
John Employee 2016-01-11 10:28:22 2016-01-11 17:54:04 3 1/15/2016 7.43
John Employee 2016-01-12 09:33:30 2016-01-12 10:08:43 3 1/15/2016 0.59
John Employee 2016-01-12 10:39:59 2016-01-12 18:29:24 3 1/15/2016 7.82
John Employee 2016-01-13 10:41:16 2016-01-13 13:39:29 3 1/15/2016 2.97
John Employee 2016-01-13 13:39:29 2016-01-13 15:05:05 3 1/15/2016 1.43
John Employee 2016-01-13 15:05:06 2016-01-13 17:25:30 3 1/15/2016 2.34
John Employee 2016-01-14 10:32:28 2016-01-14 14:01:33 3 1/15/2016 3.48
John Employee 2016-01-14 14:20:47 2016-01-14 18:07:42 3 1/15/2016 3.78
John Employee 2016-01-15 09:40:31 2016-01-15 17:19:34 3 1/15/2016 7.65
John Employee 2016-01-16 09:40:31 2016-01-16 17:19:34 3 1/31/2016 7.65
John Employee 2016-01-18 10:01:39 2016-01-18 15:40:43 4 1/31/2016 5.65
John Employee 2016-01-18 15:53:38 2016-01-18 18:38:27 4 1/31/2016 2.75
John Employee 2016-01-19 10:43:24 2016-01-19 18:13:04 4 1/31/2016 7.49
John Employee 2016-01-20 10:38:38 2016-01-20 14:16:09 4 1/31/2016 3.63
John Employee 2016-01-20 14:16:09 2016-01-20 17:55:07 4 1/31/2016 3.65
John Employee 2016-01-21 10:39:31 2016-01-21 18:56:42 4 1/31/2016 8.29
John Employee 2016-01-22 10:57:55 2016-01-22 15:44:03 4 1/31/2016 4.77
John Employee 2016-01-22 15:57:54 2016-01-22 18:11:28 4 1/31/2016 2.23
John Employee 2016-01-25 10:08:57 2016-01-25 19:14:21 5 1/31/2016 9.09
John Employee 2016-01-26 10:45:35 2016-01-26 14:17:13 5 1/31/2016 3.53
John Employee 2016-01-26 14:40:51 2016-01-26 18:31:56 5 1/31/2016 3.85
John Employee 2016-01-27 09:53:33 2016-01-27 18:05:40 5 1/31/2016 8.20
John Employee 2016-01-28 10:36:57 2016-01-28 16:28:16 5 1/31/2016 5.86
John Employee 2016-01-28 16:43:20 2016-01-28 19:42:17 5 1/31/2016 2.98
John Employee 2016-01-31 10:00:40 2016-01-31 16:27:46 6 1/31/2016 6.45
John Employee 2016-02-01 10:45:42 2016-02-01 14:04:03 6 2/15/2016 3.31
John Employee 2016-02-01 14:15:06 2016-02-01 17:45:05 6 2/15/2016 3.50
John Employee 2016-02-01 17:45:05 2016-02-01 19:01:34 6 2/15/2016 1.27
John Employee 2016-02-02 11:03:49 2016-02-02 17:40:21 6 2/15/2016 6.61
John Employee 2016-02-03 11:08:06 2016-02-03 17:15:38 6 2/15/2016 6.13
John Employee 2016-02-04 11:20:59 2016-02-04 17:27:15 6 2/15/2016 6.10
John Employee 2016-02-04 17:27:15 2016-02-04 20:19:34 6 2/15/2016 2.87
John Employee 2016-02-05 10:47:57 2016-02-05 17:53:54 6 2/15/2016 7.10
John Employee 2016-02-08 10:51:45 2016-02-08 15:15:28 7 2/15/2016 4.40
John Employee 2016-02-08 15:34:52 2016-02-08 17:30:54 7 2/15/2016 1.93
John Employee 2016-02-09 11:01:09 2016-02-09 13:11:02 7 2/15/2016 2.16
John Employee 2016-02-09 13:11:02 2016-02-09 17:38:03 7 2/15/2016 4.45
John Employee 2016-02-09 17:38:03 2016-02-09 18:34:20 7 2/15/2016 0.94
John Employee 2016-02-10 10:43:39 2016-02-10 11:25:38 7 2/15/2016 0.70
John Employee 2016-02-10 11:25:38 2016-02-10 17:58:11 7 2/15/2016 6.54
John Employee 2016-02-11 10:16:30 2016-02-11 14:06:35 7 2/15/2016 3.83
John Employee 2016-02-11 14:30:17 2016-02-11 17:25:23 7 2/15/2016 2.92
John Employee 2016-02-12 10:46:50 2016-02-12 17:46:38 7 2/15/2016 7.00
I need to pay 5.15 hours of overtime in PayPeriod 2016-01-31 because he did not go over 40 until after the 2016-01-15 pay period closed.
If the employee had crossed over 40 hours before the 2016-01-15 pay period closed, I would have needed to pay overtime on both the 2016-01-15 and the 2016-01-31 pay periods for week three hours.
Here's my thought on overtime calculation:
Start by checking the start date of the period: which day of the week is it? If it is not Sunday, that means you may need to account for hours worked on earlier dates. For how many days? You'll always start from the Sunday of that week when calculating the overtime for that week. Some useful MySQL functions here are DAYOFWEEK() and DATE_ADD().
Then continue until the end date of the period is reached or passed, calculating the overtime for each week (Sunday-Saturday). If a week's Saturday falls past the end date of the payroll period, just skip that week in the overtime calculation.
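As a starting point, here is a sketch of that weekly bucketing (the table name time_entries and the Employee/Start/Duration columns are assumptions based on the sample data; DAYOFWEEK() returns 1 for Sunday, so subtracting DAYOFWEEK(Start) - 1 days gives the Sunday that opens each week):
SELECT Employee,
       -- Sunday that starts the week containing this entry (DAYOFWEEK: 1 = Sunday ... 7 = Saturday)
       DATE_SUB(DATE(Start), INTERVAL DAYOFWEEK(Start) - 1 DAY) AS week_start,
       SUM(Duration) AS week_hours,
       GREATEST(SUM(Duration) - 40, 0) AS ot_hours
FROM time_entries
GROUP BY Employee, week_start
ORDER BY Employee, week_start;
From there, the per-pay-period report would join these weekly totals back to the pay period boundaries and skip any week whose Saturday falls after the period's end date, as described above.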
I am not sure why my numbers are drastically off from each other.
A query with no max id:
SELECT id, DATE_FORMAT(t_stamp, '%Y-%m-%d %H:00:00') as date, COUNT(*) as count
FROM test_ips
WHERE id > 0
AND viewip != ""
GROUP BY HOUR(t_stamp)
ORDER BY t_stamp ASC;
I get:
1 2012-07-18 19:00:00 1313
106 2012-07-18 20:00:00 1567
107 2012-07-19 09:00:00 847
225 2012-07-19 10:00:00 5095
421 2012-07-19 11:00:00 205
423 2012-07-19 12:00:00 900
461 2012-07-19 13:00:00 619
490 2012-07-20 15:00:00 729
575 2012-07-20 16:00:00 1682
1060 2012-07-20 17:00:00 2063
2260 2012-07-20 18:00:00 1417
5859 2012-07-20 21:00:00 1303
7060 2012-07-20 22:00:00 1340
8280 2012-07-20 23:00:00 1211
9149 2012-07-21 00:00:00 1675
10418 2012-07-21 01:00:00 721
11127 2012-07-21 02:00:00 825
But if I add a max id:
AND id <= 8279
I get:
1 2012-07-18 19:00:00 1313
106 2012-07-18 20:00:00 1201
107 2012-07-19 09:00:00 118
225 2012-07-19 10:00:00 196
421 2012-07-19 11:00:00 2
423 2012-07-19 12:00:00 38
461 2012-07-19 13:00:00 20
490 2012-07-20 15:00:00 85
575 2012-07-20 16:00:00 483
1060 2012-07-20 17:00:00 1200
2260 2012-07-20 18:00:00 1200
5859 2012-07-20 21:00:00 1201
7060 2012-07-20 22:00:00 1220
The numbers are WAY off from each other. Something is goofy.
EDIT: Here is my table structure:
id t_stamp bID viewip unique
1 2012-07-18 19:22:20 5 192.168.1.1 1
2 2012-07-18 19:22:21 1 192.168.1.1 1
3 2012-07-18 19:22:22 5 192.168.1.1 0
4 2012-07-18 19:22:22 3 192.168.1.1 1
You are not grouping by ID and I think you intend to.
Try:
SELECT id, DATE_FORMAT(t_stamp, '%Y-%m-%d %H:00:00') as date, COUNT(*) as count
FROM test_ips
WHERE id > 0
AND viewip != ""
GROUP BY id, DATE_FORMAT(t_stamp, '%Y-%m-%d %H:00:00')
ORDER BY t_stamp;
Your query is not consistent.
In your select statement you are displaying the full date.
But you are grouping your data by the hour. So your count statement is taking the count of all the data for each hour of the day.
As an example take your first result:
1 2012-07-18 19:00:00 1313
The count of 1313 contains the records for all of your dates (7/18, 7/19, 7/20, 7/21, 7/22, etc) that have an hour of 19:00.
But the way you have your query setup, it looks like it should be the count of all records for 2012-07-18 19:00:00.
So when you add AND id <= 8279, the dates of 7/21 and some of 7/20 are no longer being counted, so your count values are now lower.
I'm guessing you are meaning to group by the date and hour and not just the hour.
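For example, a sketch of that grouping, with id dropped from the select since it isn't aggregated:
SELECT DATE_FORMAT(t_stamp, '%Y-%m-%d %H:00:00') as date, COUNT(*) as count
FROM test_ips
WHERE id > 0
AND viewip != ""
GROUP BY DATE_FORMAT(t_stamp, '%Y-%m-%d %H:00:00')
ORDER BY date ASC;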