Find the business day number in a pay period given a date, using MySQL

I want to aggregate data using a derived value that I am struggling to calculate: the "business day number in the pay period". A fuller description follows:
In any given month there are two pay periods. The first period commences on the first business day of the month and concludes at COB on the last business day leading up to and including the 15th (e.g. it may conclude on the 14th because the 15th is a Saturday). The second period commences on the business day after the previous period concludes, and ends at COB on the last business day of the month. How can I transform a datetime column into an integer representing which day of the pay period a row belongs to, so that data can be aggregated by "business day number in the pay period"?
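For example, checking against a 2020 calendar: the first period of January 2020 runs from Wednesday 2020-01-01 through Wednesday 2020-01-15 and contains eleven business days (Jan 1, 2, 3, 6, 7, 8, 9, 10, 13, 14, 15), so a row dated 2020-01-08 should map to business day number 6, and a row dated 2020-01-16 starts the second period over again at 1.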
If possible, I would prefer not to require intermediate tables or user-defined functions; ideally this would use arithmetic only. Essentially, a query like:
SELECT (<arithmetic on datetime col here>) as businessDayNumber, count(someCol)
FROM someTbl
GROUP BY businessDayNumber;
Here is sample data illustrating the outcome that I desire:
CREATE TABLE sampleData (
dataId INT AUTO_INCREMENT PRIMARY KEY,
dataDt DATE NOT NULL,
someValue INT NOT NULL
);
INSERT INTO sampleData (dataDt, someValue) VALUES ('2020-01-01', 51),
('2020-01-01', 62),
('2020-01-01', 23),
('2020-01-01', 54),
('2020-01-02', 61),
('2020-01-02', 35),
('2020-01-02', 47),
('2020-01-02', 69),
('2020-01-02', 32),
('2020-01-02', 83),
('2020-01-02', 13),
('2020-01-03', 51),
('2020-01-03', 62),
('2020-01-03', 23),
('2020-01-03', 54),
('2020-01-03', 61),
('2020-01-03', 35),
('2020-01-06', 54),
('2020-01-06', 61),
('2020-01-06', 35),
('2020-01-06', 47),
('2020-01-06', 69),
('2020-01-06', 32),
('2020-01-06', 83),
('2020-01-06', 13),
('2020-01-07', 51),
('2020-01-07', 62),
('2020-01-07', 23),
('2020-01-07', 54),
('2020-01-07', 61),
('2020-01-07', 35),
('2020-01-07', 47),
('2020-01-07', 69),
('2020-01-07', 32),
('2020-01-08', 51),
('2020-01-08', 62),
('2020-01-08', 23),
('2020-01-08', 54),
('2020-01-08', 61),
('2020-01-08', 35),
('2020-01-08', 47),
('2020-01-08', 69),
('2020-01-08', 32),
('2020-01-08', 83),
('2020-01-08', 13),
('2020-01-09', 35),
('2020-01-09', 47),
('2020-01-09', 69),
('2020-01-09', 32),
('2020-01-09', 83),
('2020-01-09', 13),
('2020-01-09', 54),
('2020-01-09', 61),
('2020-01-09', 35),
('2020-01-09', 47),
('2020-01-10', 69),
('2020-01-10', 32),
('2020-01-10', 83),
('2020-01-10', 13),
('2020-01-10', 51),
('2020-01-10', 62),
('2020-01-13', 83),
('2020-01-13', 13),
('2020-01-13', 54),
('2020-01-13', 61),
('2020-01-13', 35),
('2020-01-13', 47),
('2020-01-14', 69),
('2020-01-14', 32),
('2020-01-14', 83),
('2020-01-14', 13),
('2020-01-14', 51),
('2020-01-14', 62),
('2020-01-14', 23),
('2020-01-14', 54),
('2020-01-15', 61),
('2020-01-15', 35),
('2020-01-15', 47),
('2020-01-15', 69),
('2020-01-15', 32),
('2020-01-16', 51),
('2020-01-16', 62),
('2020-01-16', 23),
('2020-01-16', 54),
('2020-01-16', 61),
('2020-01-16', 35),
('2020-01-16', 47),
('2020-01-16', 69),
('2020-01-16', 32),
('2020-01-16', 83),
('2020-01-16', 13),
('2020-01-16', 51),
('2020-01-16', 62),
('2020-01-17', 23),
('2020-01-17', 54),
('2020-01-17', 61),
('2020-01-17', 35),
('2020-01-17', 47),
('2020-01-17', 69),
('2020-01-17', 32),
('2020-01-17', 83),
('2020-01-17', 13),
('2020-01-17', 54),
('2020-01-20', 47),
('2020-01-20', 69),
('2020-01-20', 32),
('2020-01-20', 83),
('2020-01-20', 13),
('2020-01-20', 51),
('2020-01-20', 62),
('2020-01-20', 23),
('2020-01-20', 54),
('2020-01-20', 61),
('2020-01-20', 35),
('2020-01-20', 47),
('2020-01-20', 69),
('2020-01-20', 32),
('2020-01-21', 83),
('2020-01-21', 13),
('2020-01-21', 54),
('2020-01-21', 61),
('2020-01-21', 35),
('2020-01-21', 47),
('2020-01-21', 69),
('2020-01-21', 32),
('2020-01-21', 83),
('2020-01-21', 13),
('2020-01-21', 51),
('2020-01-21', 62),
('2020-01-21', 23),
('2020-01-21', 54),
('2020-01-21', 61),
('2020-01-21', 35),
('2020-01-21', 47),
('2020-01-21', 69),
('2020-01-21', 32),
('2020-01-21', 83),
('2020-01-21', 13),
('2020-01-22', 54),
('2020-01-22', 61),
('2020-01-22', 35),
('2020-01-22', 47),
('2020-01-22', 69),
('2020-01-22', 32),
('2020-01-22', 83),
('2020-01-23', 13),
('2020-01-23', 51),
('2020-01-23', 62),
('2020-01-23', 23),
('2020-01-23', 54),
('2020-01-23', 61),
('2020-01-24', 35),
('2020-01-24', 47),
('2020-01-24', 69),
('2020-01-24', 32),
('2020-01-25', 35),
('2020-01-25', 47),
('2020-01-25', 69),
('2020-01-27', 35),
('2020-01-27', 47),
('2020-01-27', 69),
('2020-01-27', 32),
('2020-01-27', 83),
('2020-01-27', 13),
('2020-01-27', 51),
('2020-01-27', 62),
('2020-01-28', 23),
('2020-01-28', 54),
('2020-01-28', 61),
('2020-01-28', 35),
('2020-01-28', 47),
('2020-01-28', 69),
('2020-01-28', 32),
('2020-01-29', 69),
('2020-01-29', 32),
('2020-01-29', 83),
('2020-01-29', 13),
('2020-01-29', 51),
('2020-01-29', 62),
('2020-01-29', 23),
('2020-01-30', 54),
('2020-01-30', 61),
('2020-01-30', 35),
('2020-01-30', 47),
('2020-01-30', 69),
('2020-01-30', 32),
('2020-01-31', 35),
('2020-01-31', 47),
('2020-01-31', 69),
('2020-01-31', 32),
('2020-01-31', 83),
('2020-01-31', 13),
('2020-01-31', 54),
('2020-01-31', 61),
('2020-02-02', 47),
('2020-02-03', 54),
('2020-02-03', 61),
('2020-02-04', 35),
('2020-02-04', 51),
('2020-02-04', 62),
('2020-02-04', 23),
('2020-02-04', 54),
('2020-02-06', 61),
('2020-02-06', 35),
('2020-02-06', 47),
('2020-02-06', 69),
('2020-02-07', 23),
('2020-02-07', 54),
('2020-02-07', 61),
('2020-02-07', 35),
('2020-02-07', 47),
('2020-02-08', 23),
('2020-02-08', 54),
('2020-02-08', 61),
('2020-02-08', 35),
('2020-02-08', 47),
('2020-02-08', 69),
('2020-02-08', 35),
('2020-02-08', 47),
('2020-02-08', 69),
('2020-02-08', 32),
('2020-02-09', 83),
('2020-02-09', 13),
('2020-02-09', 54),
('2020-02-09', 61),
('2020-02-09', 35),
('2020-02-09', 47),
('2020-02-09', 69),
('2020-02-09', 32),
('2020-02-09', 83),
('2020-02-09', 13),
('2020-02-09', 51),
('2020-02-09', 62),
('2020-02-09', 23),
('2020-02-09', 54),
('2020-02-10', 61),
('2020-02-10', 35),
('2020-02-10', 47),
('2020-02-10', 69),
('2020-02-10', 32),
('2020-02-10', 51),
('2020-02-11', 62),
('2020-02-11', 23),
('2020-02-11', 54),
('2020-02-11', 32),
('2020-02-11', 83),
('2020-02-12', 13),
('2020-02-12', 51),
('2020-02-13', 62),
('2020-02-13', 23),
('2020-02-13', 54),
('2020-02-13', 61),
('2020-02-13', 35),
('2020-02-13', 47),
('2020-02-14', 69),
('2020-02-14', 32),
('2020-02-14', 83),
('2020-02-14', 13),
('2020-02-14', 54),
('2020-02-14', 61),
('2020-02-14', 35),
('2020-02-14', 47),
('2020-02-15', 69),
('2020-02-15', 32),
('2020-02-15', 83),
('2020-02-15', 13),
('2020-02-15', 51),
('2020-02-16', 62),
('2020-02-16', 23),
('2020-02-16', 54),
('2020-02-16', 61),
('2020-02-16', 61),
('2020-02-16', 35),
('2020-02-16', 47),
('2020-02-16', 69),
('2020-02-16', 32),
('2020-02-16', 83),
('2020-02-16', 13),
('2020-02-16', 51),
('2020-02-16', 62),
('2020-02-17', 23),
('2020-02-18', 35),
('2020-02-18', 47),
('2020-02-18', 69),
('2020-02-18', 32),
('2020-02-18', 83),
('2020-02-18', 13),
('2020-02-18', 51),
('2020-02-18', 62),
('2020-02-18', 23),
('2020-02-18', 54),
('2020-02-18', 61),
('2020-02-18', 35),
('2020-02-18', 47),
('2020-02-18', 69),
('2020-02-18', 32),
('2020-02-19', 51),
('2020-02-19', 62),
('2020-02-19', 23),
('2020-02-19', 54),
('2020-02-19', 61),
('2020-02-19', 35),
('2020-02-20', 47),
('2020-02-20', 69),
('2020-02-20', 32),
('2020-02-20', 83),
('2020-02-20', 13),
('2020-02-20', 51),
('2020-02-20', 62),
('2020-02-20', 23),
('2020-02-20', 54),
('2020-02-20', 61),
('2020-02-20', 35),
('2020-02-20', 47),
('2020-02-20', 69),
('2020-02-20', 32),
('2020-02-21', 83),
('2020-02-21', 13),
('2020-02-21', 54),
('2020-02-21', 61),
('2020-02-21', 35),
('2020-02-21', 47),
('2020-02-21', 69),
('2020-02-21', 32),
('2020-02-21', 83),
('2020-02-21', 13),
('2020-02-21', 51),
('2020-02-21', 62),
('2020-02-21', 23),
('2020-02-21', 54),
('2020-02-21', 61),
('2020-02-21', 35),
('2020-02-21', 47),
('2020-02-21', 69),
('2020-02-21', 32),
('2020-02-21', 83),
('2020-02-21', 13),
('2020-02-22', 54),
('2020-02-22', 61),
('2020-02-22', 35),
('2020-02-22', 47),
('2020-02-22', 69),
('2020-02-22', 32),
('2020-02-22', 83),
('2020-02-23', 13),
('2020-02-23', 51),
('2020-02-23', 62),
('2020-02-23', 23),
('2020-02-23', 54),
('2020-02-23', 61),
('2020-02-24', 35),
('2020-02-24', 47),
('2020-02-24', 69),
('2020-02-24', 32),
('2020-02-25', 35),
('2020-02-25', 47),
('2020-02-25', 69),
('2020-02-25', 32),
('2020-02-25', 83),
('2020-02-25', 13),
('2020-02-25', 51),
('2020-02-25', 62),
('2020-02-25', 23),
('2020-02-25', 54),
('2020-02-25', 61),
('2020-02-26', 35),
('2020-02-26', 47),
('2020-02-26', 69),
('2020-02-26', 32),
('2020-02-26', 83),
('2020-02-26', 13),
('2020-02-26', 54),
('2020-02-26', 61),
('2020-02-27', 35),
('2020-02-27', 47),
('2020-02-27', 69),
('2020-02-27', 32),
('2020-02-27', 83),
('2020-02-27', 13),
('2020-02-27', 51),
('2020-02-27', 62),
('2020-02-28', 69),
('2020-02-28', 32),
('2020-02-29', 69),
('2020-02-29', 32),
('2020-02-29', 83);
The same sample data is also available in a SQL Fiddle.

WITH cte AS (
    SELECT someValue,
           DENSE_RANK() OVER (PARTITION BY DATE_FORMAT(dataDt, '%Y%m'),
                                           DAY(dataDt) <= 15
                              ORDER BY dataDt) AS businessDayNumber
    FROM sampleData
    WHERE DAYOFWEEK(dataDt) BETWEEN 2 AND 6
)
SELECT businessDayNumber, COUNT(someValue)
FROM cte
GROUP BY businessDayNumber;
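A note on why this works: DENSE_RANK() gives every row sharing the same date the same number, and the number advances by one per distinct business day within each (month, half-of-month) partition; DAYOFWEEK(...) BETWEEN 2 AND 6 keeps Monday through Friday only, so public holidays would still need a calendar table. If you would rather keep the two pay periods distinguishable in the aggregate instead of folding their day numbers together, the partition flag can simply be exposed as a column; a minimal variation, assuming MySQL 8.0+:
WITH cte AS (
    SELECT someValue,
           DAY(dataDt) <= 15 AS firstPeriod,  -- same flag as used in the PARTITION BY
           DENSE_RANK() OVER (PARTITION BY DATE_FORMAT(dataDt, '%Y%m'),
                                           DAY(dataDt) <= 15
                              ORDER BY dataDt) AS businessDayNumber
    FROM sampleData
    WHERE DAYOFWEEK(dataDt) BETWEEN 2 AND 6   -- Monday..Friday only
)
SELECT firstPeriod, businessDayNumber, COUNT(someValue)
FROM cte
GROUP BY firstPeriod, businessDayNumber;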

Related

Split into multi rows based on target table

I have two tables that I want to join to show quantities with details. The tables are joined on ITM and DIA, and the total QTY is equal in both tables for each ITM/DIA combination.
I want to split the table2 quantity across the table1 rows and populate the table2 data along with the table1 data.
I have provided data below for your reference, "table1" and "table2", and you can see my expected result in table "tableResult".
CREATE TABLE table1
(`ITM` varchar(5), `DIA` varchar(4), `LOC` varchar(4), `ID` varchar(3), `QTY` int)
;
INSERT INTO table1
(`ITM`, `DIA`, `LOC`, `ID`, `QTY`)
VALUES
('Item1', 'DIA1', 'LOC1', 'ID1', 3),
('Item1', 'DIA1', 'LOC2', 'ID2', 4),
('Item1', 'DIA1', 'LOC2', 'ID2', 6),
('Item1', 'DIA2', 'LOC2', 'ID2', 6),
('Item1', 'DIA2', 'LOC3', 'ID3', 18),
('Item1', 'DIA2', 'LOC4', 'ID4', 90),
('Item1', 'DIA2', 'LOC4', 'ID5', 23),
('Item1', 'DIA3', 'LOC5', 'ID6', 50),
('Item1', 'DIA3', 'LOC6', 'ID7', 20),
('Item2', 'DIA1', 'LOC4', 'ID8', 44),
('Item2', 'DIA2', 'LOC5', 'ID8', 21),
('Item2', 'DIA3', 'LOC6', 'ID9', 20)
;
CREATE TABLE table2
(`ITM` varchar(5), `DIA` varchar(4), `NTA` varchar(5), `QTY` int)
;
INSERT INTO table2
(`ITM`, `DIA`, `NTA`, `QTY`)
VALUES
('Item1', 'DIA1', 'NTA1', 10),
('Item1', 'DIA1', 'NTA2', 3),
('Item1', 'DIA2', 'NTA3', 30),
('Item1', 'DIA2', 'NTA4', 7),
('Item1', 'DIA2', 'NTA5', 100),
('Item1', 'DIA3', 'NTA6', 70),
('Item2', 'DIA1', 'NTA7', 22),
('Item2', 'DIA1', 'NTA8', 20),
('Item2', 'DIA2', 'NTA9', 6),
('Item2', 'DIA2', 'NTA10', 15),
('Item2', 'DIA3', 'NTA11', 8),
('Item2', 'DIA3', 'NTA11', 12)
;
CREATE TABLE tableResult
(`ITM` varchar(5), `DIA` varchar(4), `LOC` varchar(4), `ID` varchar(3), `QTY` int, `NTA` varchar(5), `NewQTY` int)
;
INSERT INTO tableResult
(`ITM`, `DIA`, `LOC`, `ID`, `QTY`, `NTA`, `NewQTY`)
VALUES
('Item1', 'DIA1', 'LOC1', 'ID1', 3, 'NTA1', 3),
('Item1', 'DIA1', 'LOC2', 'ID2', 4, 'NTA1', 4),
('Item1', 'DIA1', 'LOC2', 'ID2', 6, 'NTA1', 3),
('Item1', 'DIA1', 'LOC2', 'ID2', 6, 'NTA2', 3),
('Item1', 'DIA2', 'LOC2', 'ID2', 6, 'NTA3', 6),
('Item1', 'DIA2', 'LOC3', 'ID3', 18, 'NTA3', 18),
('Item1', 'DIA2', 'LOC4', 'ID4', 90, 'NTA3', 6),
('Item1', 'DIA2', 'LOC4', 'ID4', 90, 'NTA4', 7),
('Item1', 'DIA2', 'LOC4', 'ID4', 90, 'NTA5', 77),
('Item1', 'DIA2', 'LOC4', 'ID5', 23, 'NTA5', 23),
('Item1', 'DIA3', 'LOC5', 'ID6', 50, 'NTA6', 50),
('Item1', 'DIA3', 'LOC6', 'ID7', 20, 'NTA6', 20),
('Item2', 'DIA1', 'LOC4', 'ID8', 44, 'NTA7', 22),
('Item2', 'DIA1', 'LOC4', 'ID8', 44, 'NTA8', 20),
('Item2', 'DIA2', 'LOC5', 'ID8', 21, 'NTA9', 6),
('Item2', 'DIA2', 'LOC5', 'ID8', 21, 'NTA10', 15),
('Item2', 'DIA3', 'LOC6', 'ID9', 20, 'NTA11', 8),
('Item2', 'DIA3', 'LOC6', 'ID9', 20, 'NTA11', 12)
;
Could you please share your solution for this? I would greatly appreciate your ideas.
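This is the classic quantity-allocation problem: give every row in both tables a cumulative-quantity interval per (ITM, DIA), join the two sets where the intervals overlap, and the allocated quantity is the length of the overlap. Below is a sketch of that approach, assuming MySQL 8.0+ for window functions and that quantities are consumed in ID/LOC order on one side and NTA order on the other; with duplicate rows such as the two LOC2/ID2 entries, a unique ordering column (e.g. an AUTO_INCREMENT id) would make the running sums fully deterministic.
WITH a AS (
    SELECT t1.ITM, t1.DIA, t1.LOC, t1.ID, t1.QTY,
           SUM(t1.QTY) OVER (PARTITION BY t1.ITM, t1.DIA ORDER BY t1.ID, t1.LOC
                             ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) - t1.QTY AS lo,
           SUM(t1.QTY) OVER (PARTITION BY t1.ITM, t1.DIA ORDER BY t1.ID, t1.LOC
                             ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS hi
    FROM table1 t1
),
b AS (
    SELECT t2.ITM, t2.DIA, t2.NTA, t2.QTY,
           SUM(t2.QTY) OVER (PARTITION BY t2.ITM, t2.DIA ORDER BY t2.NTA
                             ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) - t2.QTY AS lo,
           SUM(t2.QTY) OVER (PARTITION BY t2.ITM, t2.DIA ORDER BY t2.NTA
                             ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS hi
    FROM table2 t2
)
-- The ROWS frame forces a row-by-row running sum even when the ORDER BY has ties.
SELECT a.ITM, a.DIA, a.LOC, a.ID, a.QTY, b.NTA,
       LEAST(a.hi, b.hi) - GREATEST(a.lo, b.lo) AS NewQTY   -- length of the interval overlap
FROM a
JOIN b
  ON a.ITM = b.ITM AND a.DIA = b.DIA
 AND a.lo < b.hi AND b.lo < a.hi                            -- intervals actually overlap
ORDER BY a.ITM, a.DIA, a.ID, b.NTA;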

How to set all 50 states to a country code as 'US'

Okay, so I'm using the Google Charts API to create a map that displays sales based on location. Some codes in my chart are countries, so France appears in my CSV file as FR. However, the API ONLY does countries, so the data in my file that are states, such as NC, CA, NY etc., need to be stored as US. Would a CASE statement for each state be the best way to go?
These are the states I need to be set equal to 'US':
StateID
-------
AL
CA
HI
NY
etc...
Try this SQL:
SELECT
CASE WHEN
CUST_STATE_CD in ('AL', 'AK', 'AZ', 'AR', 'CA', 'CO', 'CT', 'DE', 'DC', 'FL', 'GA', 'HI', 'ID', 'IL', 'IN', 'IA', 'KS', 'KY', 'LA', 'ME', 'MD', 'MA', 'MI', 'MN', 'MS', 'MO', 'MT', 'NE', 'NV', 'NH', 'NJ', 'NM', 'NY', 'NC', 'ND', 'OH', 'OK', 'OR', 'PA', 'PR', 'RI', 'SC', 'SD', 'TN', 'TX', 'UT', 'VT', 'VA', 'WA', 'WV', 'WI', 'WY')
THEN
'US'
ELSE
CUST_STATE_CD
END as state,
count(CUST_NM) as totalCust
FROM
sales_filev1
GROUP BY
CASE WHEN
CUST_STATE_CD in ('AL', 'AK', 'AZ', 'AR', 'CA', 'CO', 'CT', 'DE', 'DC', 'FL', 'GA', 'HI', 'ID', 'IL', 'IN', 'IA', 'KS', 'KY', 'LA', 'ME', 'MD', 'MA', 'MI', 'MN', 'MS', 'MO', 'MT', 'NE', 'NV', 'NH', 'NJ', 'NM', 'NY', 'NC', 'ND', 'OH', 'OK', 'OR', 'PA', 'PR', 'RI', 'SC', 'SD', 'TN', 'TX', 'UT', 'VT', 'VA', 'WA', 'WV', 'WI', 'WY')
THEN
'US'
ELSE
CUST_STATE_CD
END
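Note that the question's own sample includes AL, which the list above originally omitted. Also, since MySQL allows GROUP BY to reference a column alias from the SELECT list (unlike some stricter SQL dialects), the CASE expression does not need to be repeated; a shorter equivalent of the same query:
SELECT
    CASE WHEN CUST_STATE_CD IN ('AL', 'AK', 'AZ', 'AR', 'CA', 'CO', 'CT', 'DE', 'DC', 'FL', 'GA', 'HI', 'ID', 'IL', 'IN', 'IA', 'KS', 'KY', 'LA', 'ME', 'MD', 'MA', 'MI', 'MN', 'MS', 'MO', 'MT', 'NE', 'NV', 'NH', 'NJ', 'NM', 'NY', 'NC', 'ND', 'OH', 'OK', 'OR', 'PA', 'PR', 'RI', 'SC', 'SD', 'TN', 'TX', 'UT', 'VT', 'VA', 'WA', 'WV', 'WI', 'WY')
         THEN 'US'
         ELSE CUST_STATE_CD
    END AS state,
    COUNT(CUST_NM) AS totalCust
FROM sales_filev1
GROUP BY state;   -- MySQL resolves the select alias here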

Top 20 Group Ranking Query - Optimization

I am creating a reporting structure where I need to output the top 20 days of aggregate stats for each unique Company - Region. I have completed this task but feel that my code is overly complicated and I am requesting help optimizing it.
I have 2 tables involved in this process. The first lists all the possible Company - Region - Group - Subgroups. The second has hourly stats by the Group - Subgroup.
SQL Fiddle link: http://sqlfiddle.com/#!9/29a7b/1
NOTE: I'm currently getting a SELECT command denied to user '<user>'@'<ip>' for table 'table_stats' error on my SQL Fiddle; I would appreciate help resolving this as well.
table_companies declaration and dummy data:
CREATE TABLE `table_companies` (
`pk_id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`company` varchar(45) NOT NULL,
`region` varchar(45) NOT NULL,
`group` varchar(45) NOT NULL,
`subgroup` varchar(45) NOT NULL,
PRIMARY KEY (`pk_id`),
UNIQUE KEY `pk_id_id_UNIQUE` (`pk_id`)
);
INSERT INTO table_companies
(`pk_id`, `company`, `region`, `group`, `subgroup`)
VALUES
(1, 'company1', 'region1', 'group1', 'subgroup1'),
(2, 'company1', 'region1', 'group1', 'subgroup2'),
(3, 'company1', 'region2', 'group2', 'subgroup3'),
(4, 'company1', 'region3', 'group3', 'subgroup4'),
(5, 'company2', 'region1', 'group4', 'subgroup5'),
(6, 'company2', 'region3', 'group5', 'subgroup6'),
(7, 'company2', 'region3', 'group6', 'subgroup7'),
(8, 'company2', 'region4', 'group7', 'subgroup8'),
(9, 'company2', 'region5', 'group8', 'subgroup9'),
(10, 'company3', 'region6', 'group9', 'subgroup10'),
(11, 'company3', 'region7', 'group10', 'subgroup11'),
(12, 'company3', 'region8', 'group11', 'subgroup12'),
(13, 'company4', 'region9', 'group12', 'subgroup13'),
(14, 'company4', 'region10', 'group13', 'subgroup14'),
(15, 'company5', 'region11', 'group14', 'subgroup15'),
(16, 'company5', 'region12', 'group15', 'subgroup16')
;
table_stats declaration:
Simplified to only contain a couple of the hours per day for only 1 group - subgroup.
CREATE TABLE `table_stats` (
`pk_id` int(10) unsigned NOT NULL,
`date_time` datetime NOT NULL,
`group` varchar(45) NOT NULL,
`subgroup` varchar(45) NOT NULL,
`stat` int(10) unsigned NOT NULL,
PRIMARY KEY (`pk_id`),
UNIQUE KEY `pk_id_UNIQUE` (`pk_id`),
UNIQUE KEY `om_unique` (`date_time`,`group`,`subgroup`)
);
INSERT INTO table_stats
(`pk_id`, `date_time`, `group`, `subgroup`, `stat`)
VALUES
(1, '2015-12-01 06:00:00', 'group9', 'subgroup10', 14),
(2, '2015-12-01 12:00:00', 'group9', 'subgroup10', 14),
(3, '2015-12-02 06:00:00', 'group9', 'subgroup10', 2),
(4, '2015-12-02 12:00:00', 'group9', 'subgroup10', 51),
(5, '2015-12-03 06:00:00', 'group9', 'subgroup10', 30),
(6, '2015-12-03 12:00:00', 'group9', 'subgroup10', 6),
(7, '2015-12-04 06:00:00', 'group9', 'subgroup10', 9),
(8, '2015-12-04 12:00:00', 'group9', 'subgroup10', 77),
(9, '2015-12-05 06:00:00', 'group9', 'subgroup10', 70),
(10, '2015-12-05 12:00:00', 'group9', 'subgroup10', 7),
(11, '2015-12-06 06:00:00', 'group9', 'subgroup10', 38),
(12, '2015-12-06 12:00:00', 'group9', 'subgroup10', 5),
(13, '2015-12-07 06:00:00', 'group9', 'subgroup10', 86),
(14, '2015-12-07 12:00:00', 'group9', 'subgroup10', 73),
(15, '2015-12-08 06:00:00', 'group9', 'subgroup10', 45),
(16, '2015-12-08 12:00:00', 'group9', 'subgroup10', 14),
(17, '2015-12-09 06:00:00', 'group9', 'subgroup10', 66),
(18, '2015-12-09 12:00:00', 'group9', 'subgroup10', 38),
(19, '2015-12-10 06:00:00', 'group9', 'subgroup10', 12),
(20, '2015-12-10 12:00:00', 'group9', 'subgroup10', 77),
(21, '2015-12-11 06:00:00', 'group9', 'subgroup10', 21),
(22, '2015-12-11 12:00:00', 'group9', 'subgroup10', 18),
(23, '2015-12-12 06:00:00', 'group9', 'subgroup10', 28),
(24, '2015-12-12 12:00:00', 'group9', 'subgroup10', 74),
(25, '2015-12-13 06:00:00', 'group9', 'subgroup10', 20),
(26, '2015-12-13 12:00:00', 'group9', 'subgroup10', 37),
(27, '2015-12-14 06:00:00', 'group9', 'subgroup10', 66),
(28, '2015-12-14 12:00:00', 'group9', 'subgroup10', 59),
(29, '2015-12-15 06:00:00', 'group9', 'subgroup10', 26),
(30, '2015-12-15 12:00:00', 'group9', 'subgroup10', 0),
(31, '2015-12-16 06:00:00', 'group9', 'subgroup10', 77),
(32, '2015-12-16 12:00:00', 'group9', 'subgroup10', 31),
(33, '2015-12-17 06:00:00', 'group9', 'subgroup10', 59),
(34, '2015-12-17 12:00:00', 'group9', 'subgroup10', 71),
(35, '2015-12-18 06:00:00', 'group9', 'subgroup10', 7),
(36, '2015-12-18 12:00:00', 'group9', 'subgroup10', 73),
(37, '2015-12-19 06:00:00', 'group9', 'subgroup10', 72),
(38, '2015-12-19 12:00:00', 'group9', 'subgroup10', 28),
(39, '2015-12-20 06:00:00', 'group9', 'subgroup10', 50),
(40, '2015-12-20 12:00:00', 'group9', 'subgroup10', 11),
(41, '2015-12-21 06:00:00', 'group9', 'subgroup10', 71),
(42, '2015-12-21 12:00:00', 'group9', 'subgroup10', 4),
(43, '2015-12-22 06:00:00', 'group9', 'subgroup10', 78),
(44, '2015-12-22 12:00:00', 'group9', 'subgroup10', 69),
(45, '2015-12-23 06:00:00', 'group9', 'subgroup10', 83),
(46, '2015-12-23 12:00:00', 'group9', 'subgroup10', 55),
(47, '2015-12-24 06:00:00', 'group9', 'subgroup10', 71),
(48, '2015-12-24 12:00:00', 'group9', 'subgroup10', 20),
(49, '2015-12-25 06:00:00', 'group9', 'subgroup10', 90),
(50, '2015-12-25 12:00:00', 'group9', 'subgroup10', 26),
(51, '2015-12-26 06:00:00', 'group9', 'subgroup10', 1),
(52, '2015-12-26 12:00:00', 'group9', 'subgroup10', 73),
(53, '2015-12-27 06:00:00', 'group9', 'subgroup10', 4),
(54, '2015-12-27 12:00:00', 'group9', 'subgroup10', 18),
(55, '2015-12-28 06:00:00', 'group9', 'subgroup10', 4),
(56, '2015-12-28 12:00:00', 'group9', 'subgroup10', 30),
(57, '2015-12-29 06:00:00', 'group9', 'subgroup10', 56),
(58, '2015-12-29 12:00:00', 'group9', 'subgroup10', 53),
(59, '2015-12-30 06:00:00', 'group9', 'subgroup10', 33),
(60, '2015-12-31 12:00:00', 'group9', 'subgroup10', 8)
;
Query to optimize:
SELECT * FROM
(
SELECT t3.company,t3.region,t3.day, t3.day_stat,COUNT(*) as rank
FROM
(
SELECT t2.company,t2.region,DAY(t1.date_time) as day,SUM(t1.stat) as day_stat
FROM schema1.table_stats t1
INNER JOIN table_companies t2
ON t1.group=t2.group AND t1.subgroup=t2.subgroup
WHERE
MONTH(t1.date_time)=12 AND
YEAR(t1.date_time)=2015
group by t2.company,t2.region,DAY(t1.date_time)
ORDER BY t2.company,t2.region,day_stat DESC
) t3
JOIN
(
SELECT t2.company,t2.region,DAY(t1.date_time) as day,SUM(t1.stat) as day_stat
FROM schema1.table_stats t1
INNER JOIN table_companies t2
ON t1.group=t2.group AND t1.subgroup=t2.subgroup
WHERE
MONTH(t1.date_time)=12 AND
YEAR(t1.date_time)=2015
group by t2.company,t2.region,DAY(t1.date_time)
ORDER BY t2.company,t2.region,day_stat DESC
) t4
ON
t4.day_stat >= t3.day_stat AND
t4.company = t3.company AND
t4.region = t3.region
GROUP BY t3.company,t3.region,t3.day_stat
ORDER BY t3.company,t3.region,rank
) t5
WHERE t5.rank<=20
;
Summary of the query: the two deepest subqueries join the tables, grouping and aggregating the stat by company, region and day; this is also where the month and year are restricted. That result is then joined to a duplicate of itself to generate the rank. The last SELECT limits the results to the top 20 for each company - region.
Expected result:
Apologies for presenting as a SQL declaration
INSERT INTO results
(`company`, `region`, `day`, `day_stat`, `rank`)
VALUES
('company3', 'region6', 7, 159, 1),
('company3', 'region6', 22, 147, 2),
('company3', 'region6', 23, 138, 3),
('company3', 'region6', 17, 130, 4),
('company3', 'region6', 14, 125, 5),
('company3', 'region6', 25, 116, 6),
('company3', 'region6', 29, 109, 7),
('company3', 'region6', 16, 108, 8),
('company3', 'region6', 9, 104, 9),
('company3', 'region6', 12, 102, 10),
('company3', 'region6', 19, 100, 11),
('company3', 'region6', 24, 91, 12),
('company3', 'region6', 10, 89, 13),
('company3', 'region6', 4, 86, 14),
('company3', 'region6', 18, 80, 15),
('company3', 'region6', 5, 77, 16),
('company3', 'region6', 21, 75, 17),
('company3', 'region6', 26, 74, 18),
('company3', 'region6', 20, 61, 19),
('company3', 'region6', 8, 59, 20)
;
tl;dr: Apologies for the long post. Asking to optimize http://sqlfiddle.com/#!9/29a7b/1.
The modifications I've made:
Completely modified your query
Added a composite index in table_companies table on group,subgroup
Added a composite index in table_stats table on group, subgroup
Modified Query:
SELECT
C.company,
C.region,
DAY(S.date_time) day,
SUM(S.stat) day_stat
FROM table_companies C
INNER JOIN table_stats S
ON C.`group` = S.`group` AND C.subgroup = S.subgroup
WHERE MONTH(S.date_time) = 12 AND YEAR(S.date_time) = 2015
GROUP BY C.company, C.region, DAY(S.date_time)
ORDER BY day_stat DESC
LIMIT 20;
WORKING DEMO
There's no rank column in the result set. Since the results are sorted by day_stat in descending order, you can implicitly treat the position of a row in the result set as its rank. Nevertheless, if you really need the rank column, here is a working demo of it.
Composite index(table_companies):
ALTER TABLE `table_companies` ADD INDEX `idx_table_compnaies_group_subgroup` (
`group`,
`subgroup`
);
Composite index(table_stats):
ALTER TABLE `table_stats` ADD INDEX `idx_table_stats_group_subgroup` (
`group`,
`subgroup`
);
Explain Result:
id | select_type | table | type | possible_keys                      | key                                | key_len | ref                                | rows | Extra
 1 | SIMPLE      | S     | ALL  | idx_table_compnaies_group_subgroup | NULL                               | NULL    | NULL                               | 60   | Using where; Using temporary; Using filesort
 1 | SIMPLE      | C     | ref  | idx_table_companies_group_subgroup | idx_table_companies_group_subgroup | 57      | schema1.S.group,schema1.S.subgroup | 1    | Using index condition
The good news is that MySQL can use these indexes (they appear under possible_keys). It does show ALL as the access type for table_stats, but all I can say is that this is a small data set; you cannot judge performance based on a small set of data.
More:
I assume you have primary keys on those tables. If you don't have any, create them.
EDIT:
SELECT
    C.company,
    C.region,
    tt.day,
    tt.total AS day_stat,
    tt.rank
FROM table_companies C
INNER JOIN
(
    SELECT
        t.*,
        IF(t.businessUnit = @sameBusinessUnit, @rn := @rn + 1, @rn := 1) AS rank,
        @sameBusinessUnit := t.businessUnit
    FROM
    (
        SELECT
            S1.`group`,
            S1.subgroup,
            CONCAT(S1.`group`, S1.subgroup) AS businessUnit,
            DAY(S1.date_time) AS day,
            SUM(S1.stat) AS total
        FROM table_stats S1
        GROUP BY S1.`group`, S1.subgroup, DAY(S1.date_time)
        ORDER BY total DESC
    ) AS t
    CROSS JOIN (SELECT @rn := 1, @sameBusinessUnit := '') var
) AS tt
    ON C.`group` = tt.`group` AND C.subgroup = tt.subgroup
WHERE tt.rank <= 20
ORDER BY tt.`group`, tt.`subgroup`, tt.rank;
WORKING DEMO(Version 2.0)
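For what it's worth, on MySQL 8.0+ the user-variable trick can be replaced by a window function, which does not depend on evaluation order. A sketch that ranks per company - region as the original question asked; the date filter is written as a range so an index on date_time could be used:
SELECT company, region, day, day_stat, `rank`
FROM (
    SELECT C.company,
           C.region,
           DAY(S.date_time) AS day,
           SUM(S.stat) AS day_stat,
           ROW_NUMBER() OVER (PARTITION BY C.company, C.region
                              ORDER BY SUM(S.stat) DESC) AS `rank`
    FROM table_stats S
    INNER JOIN table_companies C
        ON C.`group` = S.`group` AND C.subgroup = S.subgroup
    WHERE S.date_time >= '2015-12-01' AND S.date_time < '2016-01-01'
    GROUP BY C.company, C.region, DAY(S.date_time)
) ranked
WHERE `rank` <= 20
ORDER BY company, region, `rank`;
-- `rank` is backticked because RANK is a reserved word in MySQL 8.0+.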
Just include one index on (`group`, `subgroup`) so the join becomes more efficient:
CREATE TABLE table_companies
(`pk_id` int, `company` varchar(8),
`region` varchar(8), `group` varchar(7), `subgroup` varchar(10),
PRIMARY KEY (`pk_id`),
UNIQUE KEY `pk_id_id_UNIQUE` (`pk_id`),
INDEX idx_group (`group`, `subgroup`)
)
;

SQL query: How to validate prerequisites for finished courses and how to find time conflicts between courses?

I am trying to write a query that advises a student which courses to register for. The query should select suitable courses by validating 1) the courses they have finished, and what is left for them to take, 2) that the prerequisite courses have been completed, and 3) that there are no time conflicts, in order to recommend the best courses for them.
I created the tables below and tried to join them, but the join operation is not working. What is the correct syntax? How do I check courses that have no prerequisite? Some prerequisites are 'senior level' or 'junior level'; do those need a separate table?
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your
MYSQL server version for the right syntax to use near 'studyplan sp on (t.std_id=sp.std_is)
left outer join prerequsit p on (p.preid = c.' at line 3
select c.*
from std t
inner join schedule22 c studyplan sp
on (t.std_id=sp.std_id)
left outer join prerequsit p
on (p.preid=c.courseid)
inner join schedule22 c
on (c.courseid=p.courseid)
where t.std=1 AND
sp.complated='No' AND
sp.passed='No' AND
p.preid=courseid;
Student

std_id  username  pass  fname  email
1       hjh       154   jdf    example@live.com
Studyplan
(`courseid`, `code`, `prerequisite`, `std_id`, `completed`, `passed`) VALUES
(2, 'UNS 100', 'No Prerequisite', 1, 'Y', 'Y'),
(3, 'ENG 100', 'No Prerequisite', 1, 'Y', 'Y'),
(5, 'MTT 101', 'MTG 100', 1, 'Y', 'Y'),
(6, 'MTT 202', 'MTT 101', 1, 'Y', 'N'),
(7, 'STT 100', 'No Prerequisite', 1, 'N', 'N'),
(8, 'MTT 102', 'MTT 101', 1, 'N', 'N'),
(9, 'ENG 200', 'english1', 1, 'N', 'N'),
(10, 'OE1', 3, 'NULL', 1, 'N', 'N'),
(11, 'ENG 201', 'ENG 200', 1, 'N', 'N'),
(12, 'CSC 302', 'MTT 202', 1, 'N', 'N'),
(13, 'STT 201', 'STT 100', 1, 'N', 'N'),
(15, 'CSC 201', 'MTT 101 or MTT 102', 1, 'N', 'N'),
(16, 'CSC 202', 'CSC 201', 1, 'N', 'N'),
(17, 'PSY 201', 'ENG 100 + UNS 100', 1, 'N', 'N'),
(18, 'NSC 201', 'No Prerequisite', 1, 'N', 'N'),
(19, 'CSC 307', 'CSC 201', 1, 'N', 'N'),
(20, 'CSC 301', 'CSC 202', 1, 'N', 'N'),
(21, 'ITE 390', 'Junior Level', 1, 'N', 'N'),
(22, 'CSC 305', 'Junior Level', 1, 'Y', 'Y'),
(23, 'ITE 305', 'Junior Level', 1, 'Y', 'Y'),
(24, 'ITE 414', 'junior Level', 1, 'Y', 'Y'),
(25, 'CSC 308', 'CSC 301', 1, 'N', 'N'),
(26, 'ITE 402', 'CSC 305', 1, 'N', 'N'),
(27, 'CSC 311', 'CSC 201', 1, 'N', 'N'),
(28, 'ITE 422', 'CSC 305', 1, 'N', 'N'),
(29, 'CIS 401', 'CSC 302', 1, 'N', 'N'),
(30, 'ITE 409', 'Senior Level', 1, 'N', 'N'),
(31, 'CIS 401', 'CSC 302', 1, 'N', 'N'),
(32, 'CSC 401', 'ITE 305', 1, 'N', 'N'),
(33, 'ITE 409', 'Null', 1, 'N', 'N'),
(34, 'ITE 408', 'CSC 305', 1, 'N', 'N')
Schedule
(`semester`, `courseid`, `coursecode`, `section`, `date`, `time`, ``, `sch_id`) VALUES
('fall', 9, 'ENG 100', 51, 'MoWe', '1:45PM-3:15PM', 'staff', 1),
('fall', 16, 'CSC202', 51, 'Mo-We', '1:45PM-3:15PM', 'staff', 1),
('fall', 26, 'ITE402', 51, 'Tu', '10:30-12pm', 'staff', 1),
('fall', 6, 'MTT 202', 51, 'Su-Tu', '12:00-2:00PM', 'staff', 1),
('fall', 8, 'MTT 102', 51, 'SuTu', '12:00-2:00PM', 'staff', 1),
('fall', 12, 'CSC 302', 51, 'Mo-We', '10:00-12:00PM', 'staff', 1),
('fall', 15, 'CSC 201', 52, 'Mo-We', '10:00-12:00PM', 'staff', 1),
('fall', 21, 'ITE 390', 51, 'Su-Tu', '12:00-2:00PM', 'staff', 1),
('fall', 5, 'MTT 101', 51, 'Su', '4:00PM-7:00PM', 'staff', 1),
('fall', 28, 'ITE 422', 51, 'Su-Tu', '12:00-2:00PM', 'staff', 1);
prerequsit
(`courseid`, `preid`) VALUES
(5, 1),
(6, 2),
(8, 3),
(9, 4),
(11, 5),
(12, 6),
(13, 7),
(14, 8),
(15, 9),
(16, 10),
(17, 11),
(18, 12),
(19, 13),
(20, 14),
(21, 21),
(22, 22),
(23, 23),
(24, 24),
(25, 20),
(26, 22),
(27, 25),
(28, 22),
(29, 12),
(30, 30),
(32, 23),
(34, 22),
(35, 12),
(36, 22),
(37, 3)
Your query contains schedule22 c twice in the from clause. That's an error. There may be more.
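To illustrate the fix, here is a minimal sketch of how the joins could be laid out, assuming each studyplan row links a student to a course and each prerequsit row links a course to its prerequisite (table and column names taken from the question; note the question's data stores 'Y'/'N' in a column named completed, while the original query tests complated = 'No', so the sketch assumes the former):
SELECT c.*
FROM std t
INNER JOIN studyplan sp
    ON t.std_id = sp.std_id          -- join each table under exactly one alias
INNER JOIN schedule22 c
    ON c.courseid = sp.courseid
LEFT OUTER JOIN prerequsit p
    ON p.courseid = c.courseid       -- NULL when a course has no prerequisite row
WHERE t.std_id = 1
  AND sp.completed = 'N'             -- not yet completed
  AND sp.passed = 'N';
The free-text prerequisite values ('No Prerequisite', 'Junior Level', 'Senior Level') would still need their own handling rule, ideally via a separate lookup table, and time-conflict checking would compare the date/time columns of candidate sections against the student's already-registered sections.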

Group results by day and month (php timestamp) showing total revenue per day

Using MySQL, how can I group together by day and month, showing the total revenue?
E.g. (not based on below data)
day month revenue
1 01 10.97
2 01 3.57
3 01 0
etc.
Here's an example of my data:
CREATE TABLE IF NOT EXISTS `sales` (
`id` bigint(255) NOT NULL AUTO_INCREMENT,
`timestamp` int(12) NOT NULL,
`product` int(5) NOT NULL,
`publisher` int(5) NOT NULL,
`market` int(5) NOT NULL,
`revenue` float NOT NULL,
`Units` int(5) NOT NULL,
`Downloads` int(11) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=138 ;
--
-- Dumping data for table `sales`
--
INSERT INTO `sales` (`id`, `timestamp`, `revenue`) VALUES
(1, 1394150400, 3.65),
(2, 1394064000, 0),
(4, 1393977600, 0),
(5, 1393891200, 7.42),
(6, 1393804800, 0),
(7, 1393718400, 0),
(8, 1393632000, 0),
(9, 1393545600, 0),
(10, 1393459200, 0),
(11, 1393372800, 0),
(12, 1393286400, 3.65),
(13, 1393200000, 3.65),
(14, 1393177032, 0),
(15, 1393090632, 3.65),
(16, 1393004232, 0),
(17, 1392917832, 0),
(18, 1392831432, 0),
(19, 1392745032, 0),
(20, 1392658632, 0),
(21, 1392572232, 0),
(24, 1391881032, 0),
(23, 1392485832, 0),
(25, 1392336000, 0),
(26, 1392249600, 0),
(27, 1392163200, 0),
(28, 1392076800, 0),
(29, 1391990400, 3.81),
(30, 1391904000, 0),
(31, 1391817600, 0),
(32, 1391731200, 3.65),
(33, 1391644800, 3.58),
(34, 1391558400, 3.58),
(35, 1391472000, 0),
(36, 1391385600, 0),
(37, 1391299200, 0),
(38, 1391212800, 7.23),
(39, 1391126400, 0),
(40, 1391040000, 0),
(41, 1390953600, 3.81),
(42, 1390867200, 4.52),
(43, 1390780800, 0),
(44, 1390694400, 3.65),
(45, 1390608000, 3.81),
(46, 1390585032, 0),
(47, 1390435200, 0),
(48, 1390348800, 3.58),
(49, 1390262400, 0),
(50, 1390176000, 0),
(51, 1390089600, 0),
(52, 1390003200, 0),
(53, 1389916800, 3.58),
(54, 1389893832, 0),
(55, 1389744000, 0),
(56, 1389657600, 0),
(57, 1389571200, 0),
(58, 1389484800, 0),
(59, 1389398400, 3.65),
(60, 1389312000, 3.18),
(61, 1389225600, 0),
(62, 1389139200, 0),
(63, 1389052800, 0),
(64, 1389052800, 0),
(65, 1388966400, 3.65),
(66, 1388880000, 4.05),
(67, 1388793600, 0),
(68, 1388707200, 3.65),
(69, 1388620800, 0),
(70, 1388534400, 0),
(71, 1394236800, 0),
(72, 1394236800, 2.51),
(73, 1394236800, 0),
(74, 1394150400, 5.02),
(75, 1394150400, 2.76),
(76, 1394064000, 7.5),
(77, 1394064000, 8.28),
(78, 1393977600, 0),
(79, 1393977600, 0),
(80, 1393891200, 7.5),
(81, 1393891200, 2.36),
(82, 1393804800, 0),
(83, 1393804800, 0),
(84, 1393718400, 2.76),
(85, 1393718400, 0),
(86, 1393632000, 0),
(87, 1393545600, 0),
(88, 1393545600, 2.76),
(89, 1393459200, 2.51),
(90, 1393459200, 2.51),
(91, 1393433613, 2.51),
(92, 1393433613, 0),
(93, 1393286400, 2.54),
(94, 1393286400, 2.76),
(95, 1393200000, 2.52),
(96, 1393200000, 5.51),
(97, 1394323200, 0),
(98, 1394323200, 5.01),
(99, 1394323200, 5.52),
(100, 1394409600, 0),
(101, 1394409600, 2.05),
(102, 1394409600, 5.27),
(103, 1393113600, 5.08),
(104, 1393027200, 5.09),
(105, 1392854400, 5.32),
(106, 1392854400, 7.63),
(107, 1392940800, 0),
(108, 1392595200, 0),
(109, 1392508800, 7.64),
(110, 1392422400, 0),
(111, 1392336000, 2.58),
(112, 1392163200, 5.57),
(113, 1391990400, 0),
(114, 1391817600, 0),
(115, 1391731200, 15.99),
(116, 1391472000, 10.66),
(117, 1391385600, 2.54),
(118, 1391299200, 2.54),
(119, 1391212800, 5.34),
(120, 1391040000, 0),
(121, 1390953600, 2.55),
(122, 1390780800, 10.9),
(123, 1390608000, 12.72),
(124, 1390435200, 7.64),
(125, 1390262400, 2.55),
(126, 1390089600, 9.92),
(127, 1389916800, 2.55),
(128, 1389744000, 2.55),
(129, 1389571200, 5.1),
(130, 1389398400, 2.55),
(131, 1389225600, 5.1),
(132, 1389052800, 7.65),
(133, 1388880000, 5.1),
(134, 1388793600, 9.99),
(135, 1388620800, 0),
(136, 1394582400, 4.14),
(137, 1394582400, 2.76);
SELECT DATE_FORMAT(FROM_UNIXTIME(`timestamp`), '%d') AS day,
       DATE_FORMAT(FROM_UNIXTIME(`timestamp`), '%m') AS month,
       SUM(`revenue`) AS revenue
FROM sales
GROUP BY day, month
ORDER BY month, day;
Check the FROM_UNIXTIME() function in the MySQL documentation.
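One caveat worth noting: grouping by day and month alone folds the same calendar day from different years together. If the data ever spans more than one year, grouping by the full calendar date avoids that; a small variant of the query above:
SELECT DATE(FROM_UNIXTIME(`timestamp`)) AS sale_date,   -- full date, so years stay separate
       SUM(`revenue`) AS revenue
FROM sales
GROUP BY sale_date
ORDER BY sale_date;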