I have the following query:
SELECT
au.country as country_code,
COALESCE(SUM(uwm.amount), 0) as amountInbound
FROM user_wallet_movement uwm
LEFT JOIN user_wallet uw ON uwm.wallet_id = uw.id
LEFT JOIN app_user au ON uw.user_id = au.id
WHERE
status = 'execute'
and direction = 'inbound'
and mov_date > '2020-07-01'
and au.country IN ('AD', 'AC', 'AE', 'AF', 'AG', 'AI', 'AL', 'AM', 'AN', 'AO', 'AQ', 'AR', 'AS', 'AT', 'AU', 'AW', 'AZ', 'BA', 'BB', 'BD', 'BE', 'BF', 'BG', 'BH', 'BI', 'BJ', 'BM', 'BN', 'BO', 'BR', 'BS', 'BT', 'BV', 'BW', 'BY', 'BZ', 'CA', 'CC', 'CD', 'CF', 'CG', 'CH', 'CI', 'CK', 'CL', 'CM', 'CN', 'CO', 'CR', 'CU', 'CV', 'CX', 'CY', 'CZ', 'DE', 'DJ', 'DK', 'DM', 'DO', 'DZ', 'EC', 'EE', 'EG', 'EH', 'ER', 'ES', 'ET', 'FI', 'FJ', 'FK', 'FM', 'FO', 'FR', 'GA', 'GB', 'GD', 'GE', 'GF', 'GH', 'GI', 'GL', 'GM', 'GN', 'GP', 'GQ', 'GR', 'GT', 'GU', 'GW', 'GY', 'HK', 'HM', 'HN', 'HR', 'HT', 'HU', 'ID', 'IE', 'IL', 'IN', 'IO', 'IQ', 'IR', 'IS', 'IT', 'JM', 'JO', 'JP', 'KE', 'KG', 'KH', 'KI', 'KM', 'KN', 'KP', 'KR', 'KW', 'KY', 'KZ', 'LA', 'LB', 'LC', 'LI', 'LK', 'LR', 'LS', 'LT', 'LU', 'LV', 'LY', 'MA', 'MC', 'MC', 'MD', 'ME', 'MG', 'MH', 'MK', 'ML', 'MM', 'MN', 'MP', 'MQ', 'MR', 'MS', 'MT', 'MU', 'MV', 'MW', 'MX', 'MY', 'MZ', 'NA', 'NC', 'NE', 'NF', 'NG', 'NI', 'NL', 'NO', 'NP', 'NR', 'NU', 'NZ', 'OM', 'PA', 'PE', 'PF', 'PG', 'PH', 'PK', 'PL', 'PM', 'PN', 'PR', 'PT', 'PW', 'PY', 'QA', 'RE', 'RO', 'RS', 'RU', 'RW', 'SA', 'SB', 'SC', 'SD', 'SE', 'SG', 'SH', 'SI', 'SJ', 'SK', 'SL', 'SM', 'SN', 'SO', 'SR', 'ST', 'SV', 'SY', 'SZ', 'TC', 'TD', 'TF', 'TG', 'TH', 'TJ', 'TK', 'TL', 'TM', 'TN', 'TO', 'TR', 'TT', 'TV', 'TZ', 'UA', 'UG', 'UM', 'US', 'UY', 'UZ', 'VA', 'VC', 'VE', 'VG', 'VI', 'VN', 'VU', 'WF', 'WS', 'xA', 'xE', 'xF', 'XK', 'xN', 'xO', 'xS', 'YE', 'YT', 'ZA', 'ZM', 'ZW')
GROUP BY country_code
ORDER BY country_code
which should return the amount of money spent in each country in the list.
My output is:
AE 0.35365110000000016
AR 1.0367374499999995
AT 0.11195171000000001
AU 1.7345992
BE 1.9242438800000006
BG 5.043282479999997
CA 0.5906319000000001
CH 0.5082077999999999
CO 0.14248785
CR 0.036722840000000014
CU 0.11325390999999999
CY 0.18752883999999997
CZ 0.11454307999999999
DE 8.057752660000036
DO 0.8858295500000001
EE 0.7410690900000001
ES 31.125371000000023
FR 0.4851664099999999
GB 1.44115391
HR 0.023154
HU 1.0131190899999998
IE 0.3229343799999997
IN 0.026833529999999984
IT 2199.1061043693944
KE 0.21115987
KR 0.161765
LU 0.20279967
MC 0.2127708600000001
MT 0.028277630000000005
MX 0.45381685
NL 0.1408655
PE 0.00108554
QA 1.8347713
RO 7.0233499800000105
RS 0.25260947000000006
RU 0.16577983
SE 3.4979126399999947
SH 1.1328741000000002
SI 0.00178069
SK 0.04637177
SZ 0.3603625199999996
US 2.41114205
VE 0.53491791
So, as you can see, there are countries in the list that do not appear in the output because the amount is null.
How can I include them in the output with the value 0?
Thank you
EDIT:
Not all countries in the list are in the table; I would also like the output to include countries that are in the list but not in the table.
The problem is that if there is no data in the wallet tables for a given user, the join returns nothing for that user. Instead, start from app_user and use LEFT JOINs, with the movement filters moved into the ON clause (conditions on uwm in the WHERE clause would turn the outer join back into an inner join):
SELECT
au.country as country_code,
COALESCE(SUM(uwm.amount), 0) as amountInbound
FROM
app_user au
LEFT JOIN user_wallet uw ON uw.user_id = au.id
LEFT JOIN user_wallet_movement uwm ON uwm.wallet_id = uw.id
    AND uwm.status = 'execute'
    AND uwm.direction = 'inbound'
    AND uwm.mov_date > '2020-07-01'
WHERE
au.country IN ('AD', 'AC', 'AE', 'AF', 'AG', 'AI', 'AL', 'AM', 'AN', 'AO', 'AQ', 'AR', 'AS', 'AT', 'AU', 'AW', 'AZ', 'BA', 'BB', 'BD', 'BE', 'BF', 'BG', 'BH', 'BI', 'BJ', 'BM', 'BN', 'BO', 'BR', 'BS', 'BT', 'BV', 'BW', 'BY', 'BZ', 'CA', 'CC', 'CD', 'CF', 'CG', 'CH', 'CI', 'CK', 'CL', 'CM', 'CN', 'CO', 'CR', 'CU', 'CV', 'CX', 'CY', 'CZ', 'DE', 'DJ', 'DK', 'DM', 'DO', 'DZ', 'EC', 'EE', 'EG', 'EH', 'ER', 'ES', 'ET', 'FI', 'FJ', 'FK', 'FM', 'FO', 'FR', 'GA', 'GB', 'GD', 'GE', 'GF', 'GH', 'GI', 'GL', 'GM', 'GN', 'GP', 'GQ', 'GR', 'GT', 'GU', 'GW', 'GY', 'HK', 'HM', 'HN', 'HR', 'HT', 'HU', 'ID', 'IE', 'IL', 'IN', 'IO', 'IQ', 'IR', 'IS', 'IT', 'JM', 'JO', 'JP', 'KE', 'KG', 'KH', 'KI', 'KM', 'KN', 'KP', 'KR', 'KW', 'KY', 'KZ', 'LA', 'LB', 'LC', 'LI', 'LK', 'LR', 'LS', 'LT', 'LU', 'LV', 'LY', 'MA', 'MC', 'MC', 'MD', 'ME', 'MG', 'MH', 'MK', 'ML', 'MM', 'MN', 'MP', 'MQ', 'MR', 'MS', 'MT', 'MU', 'MV', 'MW', 'MX', 'MY', 'MZ', 'NA', 'NC', 'NE', 'NF', 'NG', 'NI', 'NL', 'NO', 'NP', 'NR', 'NU', 'NZ', 'OM', 'PA', 'PE', 'PF', 'PG', 'PH', 'PK', 'PL', 'PM', 'PN', 'PR', 'PT', 'PW', 'PY', 'QA', 'RE', 'RO', 'RS', 'RU', 'RW', 'SA', 'SB', 'SC', 'SD', 'SE', 'SG', 'SH', 'SI', 'SJ', 'SK', 'SL', 'SM', 'SN', 'SO', 'SR', 'ST', 'SV', 'SY', 'SZ', 'TC', 'TD', 'TF', 'TG', 'TH', 'TJ', 'TK', 'TL', 'TM', 'TN', 'TO', 'TR', 'TT', 'TV', 'TZ', 'UA', 'UG', 'UM', 'US', 'UY', 'UZ', 'VA', 'VC', 'VE', 'VG', 'VI', 'VN', 'VU', 'WF', 'WS', 'xA', 'xE', 'xF', 'XK', 'xN', 'xO', 'xS', 'YE', 'YT', 'ZA', 'ZM', 'ZW')
GROUP BY country_code
ORDER BY country_code
Probably your joins are limiting the rows in your output; try using outer joins:
SELECT
au.country as country_code,
COALESCE(SUM(uwm.amount), 0) as amountInbound
FROM user_wallet_movement uwm
LEFT OUTER JOIN user_wallet uw ON uwm.wallet_id = uw.id
LEFT OUTER JOIN app_user au ON uw.user_id = au.id
WHERE
status = 'execute'
and direction = 'inbound'
and mov_date > '2020-07-01'
and au.country IN (...)
GROUP BY country_code
ORDER BY country_code
If your data has all the countries, then a simple fix is to use conditional aggregation:
SELECT au.country as country_code,
SUM(CASE WHEN status = 'execute' and direction = 'inbound' and mov_date > '2020-07-01' THEN uwm.amount ELSE 0 END) as amountInbound
FROM user_wallet_movement uwm JOIN
user_wallet uw
ON uwm.wallet_id = uw.id JOIN
app_user au
ON uw.user_id = au.id
WHERE au.country IN ('AD', 'AC', 'AE', 'AF', 'AG', 'AI', 'AL', 'AM', 'AN', 'AO', 'AQ', 'AR', 'AS', 'AT', 'AU', 'AW', 'AZ', 'BA', 'BB', 'BD', 'BE', 'BF', 'BG', 'BH', 'BI', 'BJ', 'BM', 'BN', 'BO', 'BR', 'BS', 'BT', 'BV', 'BW', 'BY', 'BZ', 'CA', 'CC', 'CD', 'CF', 'CG', 'CH', 'CI', 'CK', 'CL', 'CM', 'CN', 'CO', 'CR', 'CU', 'CV', 'CX', 'CY', 'CZ', 'DE', 'DJ', 'DK', 'DM', 'DO', 'DZ', 'EC', 'EE', 'EG', 'EH', 'ER', 'ES', 'ET', 'FI', 'FJ', 'FK', 'FM', 'FO', 'FR', 'GA', 'GB', 'GD', 'GE', 'GF', 'GH', 'GI', 'GL', 'GM', 'GN', 'GP', 'GQ', 'GR', 'GT', 'GU', 'GW', 'GY', 'HK', 'HM', 'HN', 'HR', 'HT', 'HU', 'ID', 'IE', 'IL', 'IN', 'IO', 'IQ', 'IR', 'IS', 'IT', 'JM', 'JO', 'JP', 'KE', 'KG', 'KH', 'KI', 'KM', 'KN', 'KP', 'KR', 'KW', 'KY', 'KZ', 'LA', 'LB', 'LC', 'LI', 'LK', 'LR', 'LS', 'LT', 'LU', 'LV', 'LY', 'MA', 'MC', 'MC', 'MD', 'ME', 'MG', 'MH', 'MK', 'ML', 'MM', 'MN', 'MP', 'MQ', 'MR', 'MS', 'MT', 'MU', 'MV', 'MW', 'MX', 'MY', 'MZ', 'NA', 'NC', 'NE', 'NF', 'NG', 'NI', 'NL', 'NO', 'NP', 'NR', 'NU', 'NZ', 'OM', 'PA', 'PE', 'PF', 'PG', 'PH', 'PK', 'PL', 'PM', 'PN', 'PR', 'PT', 'PW', 'PY', 'QA', 'RE', 'RO', 'RS', 'RU', 'RW', 'SA', 'SB', 'SC', 'SD', 'SE', 'SG', 'SH', 'SI', 'SJ', 'SK', 'SL', 'SM', 'SN', 'SO', 'SR', 'ST', 'SV', 'SY', 'SZ', 'TC', 'TD', 'TF', 'TG', 'TH', 'TJ', 'TK', 'TL', 'TM', 'TN', 'TO', 'TR', 'TT', 'TV', 'TZ', 'UA', 'UG', 'UM', 'US', 'UY', 'UZ', 'VA', 'VC', 'VE', 'VG', 'VI', 'VN', 'VU', 'WF', 'WS', 'xA', 'xE', 'xF', 'XK', 'xN', 'xO', 'xS', 'YE', 'YT', 'ZA', 'ZM', 'ZW')
GROUP BY country_code
ORDER BY country_code;
Otherwise, you will need to use a LEFT JOIN. For that purpose, it is better to start with a countries table of some sort. Do you have such a table?
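If no such table exists, a derived table built from the IN list can stand in for it. Below is a sketch (abbreviated to three countries; in practice you would generate one UNION ALL branch per country). Note the movement filters sit in the ON clause so the LEFT JOINs are not collapsed back into inner joins, which is what produces the 0 rows for countries with no data:

```sql
SELECT c.code AS country_code,
       COALESCE(SUM(uwm.amount), 0) AS amountInbound
FROM (SELECT 'AD' AS code UNION ALL SELECT 'AE' UNION ALL SELECT 'AF'
      -- ...one UNION ALL branch per country in the list
     ) c
LEFT JOIN app_user au ON au.country = c.code
LEFT JOIN user_wallet uw ON uw.user_id = au.id
LEFT JOIN user_wallet_movement uwm
       ON uwm.wallet_id = uw.id
      AND uwm.status = 'execute'
      AND uwm.direction = 'inbound'
      AND uwm.mov_date > '2020-07-01'
GROUP BY c.code
ORDER BY c.code;
```

Because the country list is the driving table, this also returns a 0 row for countries that have no users at all, which covers the EDIT in the question.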
If you would like all the countries from table A to show up even though table B has fewer countries, then your join should be LEFT instead of INNER, and the FROM clause should start from table A (here, app_user).
That way, rows with no amount will be NULL where the country doesn't exist in table B; the NULL can be replaced using COALESCE().
COALESCE returns the first non-null value; if both are non-null, it returns the first one. Note that filters on the movement table belong in the ON clause rather than the WHERE clause, or the LEFT JOIN behaves like an inner join:
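For example:

```sql
SELECT COALESCE(NULL, 0);  -- NULL replaced: returns 0
SELECT COALESCE(1, 2);     -- both non-null: returns the first, 1
```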
SELECT
au.country as country_code,
COALESCE(SUM(uwm.amount), 0) as amountInbound
FROM app_user au
LEFT JOIN user_wallet uw
ON uw.user_id = au.id
LEFT JOIN user_wallet_movement uwm
ON uwm.wallet_id = uw.id
AND uwm.status = 'execute'
AND uwm.direction = 'inbound'
AND uwm.mov_date > '2020-07-01'
WHERE
au.country IN ('ALL THE COUNTRIES')
GROUP BY
country_code
ORDER BY
country_code
Hi, I have the following MySQL data:
INSERT INTO `monthly` (`id`, `year`, `stat_id`, `cat_id`, `January`, `February`, `March`, `April`, `May`, `June`, `July`, `August`, `September`, `October`, `November`, `December`) VALUES
(1, '2017', '12', '25', '1', '3', '1', '1', '3', '4', '4', '2', '4', '', '', ''),
and I would like it to be converted to this:
INSERT INTO `monthlydata` (`id`, `year`, `monthName`, `stat_id`, `cat_id`, `data`) VALUES
(1, '2017', 'January', '12', '25', '1'),
(2, '2017', 'February', '12', '25', '3'),
(3, '2017', 'March', '12', '25', '1'),
(4, '2017', 'April', '12', '25', '1'),
(5, '2017', 'May', '12', '25', '3'),
(6, '2017', 'June', '12', '25', '4'),
(7, '2017', 'July', '12', '25', '4'),
(8, '2017', 'August', '12', '25', '2'),
(9, '2017', 'September', '12', '25', '4'),
(10, '2017', 'October', '12', '25', ''),
(11, '2017', 'November', '12', '25', ''),
(12, '2017', 'December', '12', '25', ''),
Is there an easier way to do this using MySQL/PHP?
You need to UNPIVOT your data. MySQL doesn't have a built-in function for that, so you'll need to combine one SELECT per month with UNION ALL:
INSERT INTO `monthlydata` (`id`, `year`, `monthName`, `stat_id`, `cat_id`, `data`)
SELECT id, `year`, 'January', stat_id, cat_id, `January`
FROM monthly
UNION ALL
SELECT id, `year`, 'February', stat_id, cat_id, `February`
FROM monthly
UNION ALL
SELECT id, `year`, 'March', stat_id, cat_id, `March`
FROM monthly
.....
The ID column here might cause issues, depending on how you have defined it. If it is auto-generated, you can remove it from the INSERT column list and let it be generated. Since you'll otherwise have rows for all months with the same ID, you need to handle that scenario.
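Assuming `monthlydata.id` is AUTO_INCREMENT, a sketch that sidesteps the duplicate-ID problem by omitting the column entirely (column names taken from the question):

```sql
INSERT INTO `monthlydata` (`year`, `monthName`, `stat_id`, `cat_id`, `data`)
SELECT `year`, 'January', stat_id, cat_id, `January` FROM monthly
UNION ALL
SELECT `year`, 'February', stat_id, cat_id, `February` FROM monthly
-- ...and so on for the remaining months
```

Each inserted row then gets a fresh auto-generated id.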
Question: For a given day t, I need to create an on-time delivery (OTD) metric for 61 days for seller_id "123".
The OTD is the rate at which packages were delivered on-time
for packages created in the 30 days preceding t.
Not only do we want to know today's OTD, but we want to know the OTD for
every date in the past 60 days.
CREATE TABLE `packages` (
`id` int NOT NULL,
`created_at` timestamp NULL DEFAULT NULL,
`seller_id` int DEFAULT NULL,
`promise_date` date DEFAULT NULL,
`delivered_at` timestamp NULL DEFAULT NULL,
PRIMARY KEY (`id`)
);
INSERT INTO `packages` (`id`, `created_at`, `seller_id`, `promise_date`, `delivered_at`) VALUES
('1', '2020-04-15 06:26:28', '246', '2020-04-20', '2020-04-20 07:47:07'),
('2', '2020-05-02 06:19:13', '123', '2020-05-06', '2020-05-07 07:01:26'),
('3', '2020-07-05 08:47:17', '789', '2020-07-08', '2020-07-09 08:22:08'),
('4', '2020-03-25 08:12:35', '234', '2020-03-30', '2020-03-30 08:49:50'),
('5', '2020-07-01 07:51:33', '789', '2020-07-06', '2020-07-06 08:34:48'),
('6', '2020-03-01 08:11:31', '123', '2020-03-04', '2020-03-04 06:00:14'),
('7', '2020-05-20 07:38:14', '246', '2020-05-25', '2020-05-25 08:13:51'),
('8', '2020-04-14 07:30:19', '123', '2020-04-17', '2020-04-17 07:46:55'),
('9', '2020-02-10 08:53:00', '234', '2020-02-13', '2020-02-14 06:45:57'),
('10', '2020-01-21 07:14:56', '246', '2020-01-24', '2020-01-25 08:03:04'),
('11', '2020-07-03 08:05:11', '123', '2020-07-08', '2020-07-09 08:30:56'),
('12', '2020-05-09 06:18:31', '789', '2020-05-13', '2020-05-13 08:38:55'),
('13', '2020-02-13 08:11:10', '123', '2020-02-18', '2020-02-18 07:48:52'),
('14', '2020-04-28 08:25:28', '789', '2020-05-01', '2020-05-02 06:06:32'),
('15', '2020-06-02 07:28:52', '234', '2020-06-05', '2020-06-06 07:29:43'),
('16', '2020-05-04 08:39:33', '123', '2020-05-07', '2020-05-08 06:33:14'),
('17', '2020-07-26 08:18:30', '789', '2020-07-29', '2020-07-30 07:28:53'),
('18', '2020-02-25 08:37:42', '234', '2020-02-28', '2020-02-29 06:05:23'),
('19', '2020-02-03 06:55:39', '234', '2020-02-06', '2020-02-07 07:18:28'),
('20', '2020-03-07 08:20:44', '246', '2020-03-11', '2020-03-11 08:11:45'),
('21', '2020-03-11 07:19:47', '789', '2020-03-16', '2020-03-16 06:55:46'),
('22', '2020-06-24 08:18:56', '789', '2020-06-29', '2020-06-29 08:47:59'),
('23', '2020-02-25 07:24:19', '123', '2020-02-28', '2020-02-28 06:54:57'),
('24', '2020-07-12 07:51:52', '789', '2020-07-15', '2020-07-16 07:36:21'),
('25', '2020-01-26 07:44:59', '234', '2020-01-29', '2020-01-29 08:52:24'),
('26', '2020-02-07 06:09:24', '246', '2020-02-12', '2020-02-13 08:16:37'),
('27', '2020-03-11 08:34:57', '123', '2020-03-16', '2020-03-17 08:33:47'),
('28', '2020-02-24 08:15:41', '789', '2020-02-27', '2020-02-27 06:19:59'),
('29', '2020-02-02 06:45:36', '123', '2020-02-05', '2020-02-06 06:22:25'),
('30', '2020-02-10 06:51:48', '123', '2020-02-13', '2020-02-13 06:45:07'),
('31', '2020-03-27 07:11:58', '789', '2020-04-01', '2020-04-02 08:55:56'),
('32', '2020-05-31 07:10:05', '246', '2020-06-03', '2020-06-03 08:56:47'),
('33', '2020-06-28 06:14:19', '789', '2020-07-01', '2020-07-02 06:35:18'),
('34', '2020-07-08 08:30:12', '789', '2020-07-13', '2020-07-14 06:06:09'),
('35', '2020-05-13 08:13:34', '123', '2020-05-18', '2020-05-18 08:24:42'),
('36', '2020-04-19 08:13:38', '246', '2020-04-22', '2020-04-22 07:32:14'),
('37', '2020-03-02 06:57:32', '234', '2020-03-05', '2020-03-05 07:16:05'),
('38', '2020-05-22 08:49:51', '246', '2020-05-27', '2020-05-27 06:47:41'),
('39', '2020-02-27 08:18:26', '123', '2020-03-03', '2020-03-03 06:32:56'),
('40', '2020-02-17 07:10:24', '246', '2020-02-20', '2020-02-21 06:06:26'),
('41', '2020-06-25 08:29:32', '234', '2020-06-30', '2020-06-30 07:37:07'),
('42', '2020-03-02 08:07:57', '234', '2020-03-05', '2020-03-05 08:41:13'),
('43', '2020-06-18 06:44:38', '234', '2020-06-23', '2020-06-23 06:11:26'),
('44', '2020-07-15 08:22:49', '246', '2020-07-20', '2020-07-20 08:34:28'),
('45', '2020-07-07 07:54:10', '123', '2020-07-10', '2020-07-10 07:50:24'),
('46', '2020-07-17 07:43:08', '123', '2020-07-22', '2020-07-22 06:33:22'),
('47', '2020-04-01 08:24:20', '234', '2020-04-06', '2020-04-06 06:12:55'),
('48', '2020-05-14 08:49:10', '123', '2020-05-19', '2020-05-20 06:53:50'),
('49', '2020-06-11 08:20:35', '246', '2020-06-16', '2020-06-16 06:21:10'),
('50', '2020-06-24 06:39:29', '789', '2020-06-29', '2020-06-30 06:03:48'),
('51', '2020-02-29 06:43:01', '246', '2020-03-04', '2020-03-05 07:57:51'),
('52', '2020-07-17 08:23:46', '246', '2020-07-22', '2020-07-22 07:49:01'),
('53', '2020-03-27 07:45:10', '123', '2020-04-01', '2020-04-02 06:43:34'),
('54', '2020-04-28 06:39:55', '246', '2020-05-01', '2020-05-01 08:59:08'),
('55', '2020-05-21 07:16:03', '789', '2020-05-26', '2020-05-26 06:29:53'),
('56', '2020-02-10 08:22:04', '246', '2020-02-13', '2020-02-14 06:24:17'),
('57', '2020-02-02 08:04:26', '234', '2020-02-05', '2020-02-05 08:59:43'),
('58', '2020-03-02 08:21:53', '246', '2020-03-05', '2020-03-06 08:45:36'),
('59', '2020-02-19 08:37:15', '123', '2020-02-24', '2020-02-24 08:24:44'),
('60', '2020-06-16 08:51:24', '234', '2020-06-19', '2020-06-20 08:25:14'),
('61', '2020-07-11 07:37:15', '234', '2020-07-15', '2020-07-15 07:03:13'),
('62', '2020-06-15 07:56:39', '123', '2020-06-18', '2020-06-19 07:11:16'),
('63', '2020-03-06 07:21:52', '123', '2020-03-11', '2020-03-11 07:46:48'),
('64', '2020-06-03 06:43:50', '789', '2020-06-08', '2020-06-09 07:40:17'),
('65', '2020-01-20 06:28:47', '234', '2020-01-23', '2020-01-24 08:34:05'),
('66', '2020-04-02 08:04:41', '123', '2020-04-07', '2020-04-08 08:56:45'),
('67', '2020-03-04 06:05:57', '789', '2020-03-09', '2020-03-10 06:26:56'),
('68', '2020-07-04 06:47:46', '246', '2020-07-08', '2020-07-09 06:53:02'),
('69', '2020-02-25 06:47:09', '246', '2020-02-28', '2020-02-28 07:55:25'),
('70', '2020-02-04 07:17:28', '123', '2020-02-07', '2020-02-07 08:07:54'),
('71', '2020-06-15 07:18:16', '789', '2020-06-18', '2020-06-19 06:02:08'),
('72', '2020-07-09 06:32:34', '234', '2020-07-14', '2020-07-14 08:15:02'),
('73', '2020-05-21 06:12:52', '789', '2020-05-26', '2020-05-27 07:39:20'),
('74', '2020-05-24 06:38:49', '789', '2020-05-27', '2020-05-27 06:51:35'),
('75', '2020-02-27 06:31:02', '123', '2020-03-03', '2020-03-03 08:56:26'),
('76', '2020-07-02 08:55:00', '123', '2020-07-07', '2020-07-07 07:42:16'),
('77', '2020-06-30 06:52:27', '246', '2020-07-03', '2020-07-03 07:43:20'),
('78', '2020-04-25 08:08:14', '246', '2020-04-29', '2020-04-29 07:21:23'),
('79', '2020-06-24 08:34:43', '234', '2020-06-29', '2020-06-30 06:43:59'),
('80', '2020-05-13 08:59:11', '246', '2020-05-18', '2020-05-18 07:19:06'),
('81', '2020-02-21 07:14:16', '789', '2020-02-26', '2020-02-27 07:10:39'),
('82', '2020-06-04 08:43:13', '789', '2020-06-09', '2020-06-09 07:24:28'),
('83', '2020-07-04 07:14:42', '234', '2020-07-08', '2020-07-09 07:45:59'),
('84', '2020-05-24 08:17:00', '246', '2020-05-27', '2020-05-27 06:31:15'),
('85', '2020-03-07 07:43:27', '123', '2020-03-11', '2020-03-12 08:39:45');
create table dates(
fulldate date);
INSERT INTO `dates` (`fulldate`) VALUES
('2020-08-01'),
('2020-07-31'),
('2020-07-30'),
('2020-07-29'),
('2020-07-28'),
('2020-07-27'),
('2020-07-26'),
('2020-07-25'),
('2020-07-24'),
('2020-07-23'),
('2020-07-22'),
('2020-07-21'),
('2020-07-20'),
('2020-07-19'),
('2020-07-18'),
('2020-07-17'),
('2020-07-16'),
('2020-07-15'),
('2020-07-14'),
('2020-07-13'),
('2020-07-12'),
('2020-07-11'),
('2020-07-10'),
('2020-07-09'),
('2020-07-08'),
('2020-07-07'),
('2020-07-06'),
('2020-07-05'),
('2020-07-04'),
('2020-07-03'),
('2020-07-02'),
('2020-07-01'),
('2020-06-30'),
('2020-06-29'),
('2020-06-28'),
('2020-06-27'),
('2020-06-26'),
('2020-06-25'),
('2020-06-24'),
('2020-06-23'),
('2020-06-22'),
('2020-06-21'),
('2020-06-20'),
('2020-06-19'),
('2020-06-18'),
('2020-06-17'),
('2020-06-16'),
('2020-06-15'),
('2020-06-14'),
('2020-06-13'),
('2020-06-12'),
('2020-06-11'),
('2020-06-10'),
('2020-06-09'),
('2020-06-08'),
('2020-06-07'),
('2020-06-06'),
('2020-06-05'),
('2020-06-04'),
('2020-06-03'),
('2020-06-02');
Desired Output:
date | on_time_delivery_ratio
2020-07-31 | 0.75
2020-07-30 | 0.69
2020-07-29 | 0.68
2020-07-28 | 0.80
2020-07-27 | 0.79
2020-07-26 | 0.78
2020-07-25 | 0.69
2020-07-24 | 0.72
What I have done:
I have been able to create the metric, but only for the current date.
SELECT curdate(),sum(case when promise_date=date(delivered_at) then 1 else 0 end)/count(*)*100
"On-Time Delivery Rate (%)"
from packages p
where p.seller_id=123 and date(created_at) between DATE_SUB(curdate(),interval 30 day) and
DATE_SUB(curdate(),interval 1 day);
Where I need help:
Instead of using curdate(), I need the dates from the dates table as the first column of the desired output and the metric for those dates.
SQL Fiddle:
http://sqlfiddle.com/#!9/b665ca1/2
Fiddle: http://sqlfiddle.com/#!9/b665ca1/7 (based on http://sqlfiddle.com/#!9/b665ca1/2 )
SELECT
delivered_at,
sum(case when promise_date=date(delivered_at) then 1 else 0 end)/count(*) "On-Time Delivery Rate (%)"
from packages p
where p.seller_id=123
and date(delivered_at) between DATE_SUB(curdate(),interval 30 day) and DATE_SUB(curdate(),interval 1 day)
group by delivered_at;
output:
delivered_at On-Time Delivery Rate (%)
2020-07-07T07:42:16Z 1
2020-07-09T08:30:56Z 0
2020-07-10T07:50:24Z 1
2020-07-22T06:33:22Z 1
EDIT: to select the last 61 days
SELECT d.fulldate,
       sum(case when promise_date = date(delivered_at) then 1 else 0 end)/count(*)*100
       "On-Time Delivery Rate (%)"
FROM dates d
CROSS JOIN packages p
WHERE p.seller_id = 123
  AND date(created_at) BETWEEN DATE_SUB(d.fulldate, INTERVAL 30 DAY) AND DATE_SUB(d.fulldate, INTERVAL 1 DAY)
  AND d.fulldate >= DATE_SUB(curdate(), INTERVAL 61 DAY)
GROUP BY d.fulldate
ORDER BY d.fulldate;
I changed curdate() from your query to the date value from the dates table.
SELECT * FROM dates supplies all the dates, at least for the last 61 days.
You want to get the OTD percentage for all the dates in the dates table, with the metric for each respective day.
The percentage is calculated as: ((number of on-time deliveries in the 30 days before the date) / (total number of deliveries in the 30 days before the date)) * 100.
For example, for the date 2020-06-02 there were 3 deliveries in total between 2020-05-02 and 2020-06-01, of which 1 was on time, so the OTD percentage is 33.3.
Based on what I have understood, here is a solution you may be looking for.
Please check the SQL fiddle:
SELECT d.fulldate,
       sum(case when promise_date = date(delivered_at)
                 and date(created_at) between DATE_SUB(d.fulldate, interval 30 day)
                                          and DATE_SUB(d.fulldate, interval 1 day)
                then 1 else 0 end)
       / sum(case when date(created_at) between DATE_SUB(d.fulldate, interval 30 day)
                                            and DATE_SUB(d.fulldate, interval 1 day)
                  then 1 else 0 end) * 100
       "On-Time Delivery Rate (%)"
FROM packages p, dates d
WHERE p.seller_id = 123
GROUP BY d.fulldate;
Output (partial; for the full output, check the Fiddle given above the query):
fulldate On-Time Delivery Rate (%)
2020-06-02 33.3333
2020-06-03 33.3333
2020-06-04 50
2020-06-05 50
2020-06-06 50
2020-06-07 50
I have a table below:
SELECT * FROM reports;
# id, date, o_type, quantity, vendor
'1', '2020-04-05', '2511', '200', 'apple'
'2', '2020-04-05', '5120', '350', 'apple'
'3', '2020-04-05', '2520', '150', 'apple'
'4', '2020-04-05', '5114', '400', 'apple'
'5', '2020-04-05', 'HG851', '200', 'google'
'6', '2020-04-05', 'HG851A', '400', 'google'
'7', '2020-04-05', 'MA5620G', '9000', 'google'
'8', '2020-04-05', 'OT550', '7000', 'google'
'9', '2020-04-05', 'OT925', '2000', 'google'
'10', '2020-04-05', 'OT928', '2000', 'google'
'11', '2020-04-06', '2520', '150', 'apple'
'12', '2020-04-06', 'HG851', '200', 'google'
'13', '2020-04-06', 'HG851', '200', 'google'
'14', '2020-04-06', 'HG851A', '400', 'google'
'15', '2020-04-07', '2511', '200', 'apple'
'16', '2020-04-07', '5120', '350', 'apple'
'17', '2020-04-07', '2520', '150', 'apple'
'18', '2020-04-07', '5114', '400', 'apple'
'19', '2020-04-07', 'G-440G-A', '200', 'NOKIA'
'20', '2020-04-07', '1240GA', '400', 'NOKIA'
'21', '2020-04-07', '1440GP', '9000', 'NOKIA'
'22', '2020-04-07', 'B-0404G-B', '7000', 'NOKIA'
'23', '2020-04-07', 'B2404GP', '2000', 'NOKIA'
'24', '2020-04-07', 'G-881G-A', '2000', 'NOKIA'
'25', '2020-04-08', 'G-881G-B', '150', 'NOKIA'
'26', '2020-04-08', 'HG851', '200', 'google'
'27', '2020-04-08', 'HG851A', '400', 'google'
I have the below query, as per my project requirement:
SELECT Date(a.date), a.vendor, a.o_type, a.quantity, b.total FROM reports a
INNER JOIN (
SELECT vendor, date, SUM(quantity) as total
FROM reports WHERE date >= '2020-04-06' AND date <= '2020-04-08'
GROUP BY vendor, date) b ON a.date = b.date AND a.vendor = b.vendor
# Date(a.date), vendor, o_type, quantity, total
'2020-04-06', 'apple', '2520', '150', '150'
'2020-04-06', 'google', 'HG851', '200', '800'
'2020-04-06', 'google', 'HG851', '200', '800'
'2020-04-06', 'google', 'HG851A', '400', '800'
'2020-04-07', 'apple', '2511', '200', '1100'
'2020-04-07', 'apple', '5120', '350', '1100'
'2020-04-07', 'apple', '2520', '150', '1100'
'2020-04-07', 'apple', '5114', '400', '1100'
'2020-04-07', 'NOKIA', 'G-440G-A', '200', '20600'
'2020-04-07', 'NOKIA', '1240GA', '400', '20600'
'2020-04-07', 'NOKIA', '1440GP', '9000', '20600'
'2020-04-07', 'NOKIA', 'B-0404G-B', '7000', '20600'
'2020-04-07', 'NOKIA', 'B2404GP', '2000', '20600'
'2020-04-07', 'NOKIA', 'G-881G-A', '2000', '20600'
'2020-04-08', 'NOKIA', 'G-881G-B', '150', '150'
'2020-04-08', 'google', 'HG851', '200', '600'
'2020-04-08', 'google', 'HG851A', '400', '600'
I have to add an extra column DIFFERENCE to the above INNER JOIN query. How do I calculate the difference between the current date's total and the previous date's total for the vendor column?
Example1:
2020-04-06 ---> apple ---> total(150)
2020-04-07 ---> apple ---> total(1100) Here difference equals to -950 (150-1100)
Example2:
2020-04-07 ---> apple ---> total(1100)
2020-04-08 ---> apple ---> total(0) Here difference equals to -1100 (0-1100)
Example3:
2020-04-07 ---> NOKIA ---> total(20600)
2020-04-08 ---> apple ---> total(150) Here difference equals to -20450 (150-20600)
Please guide me on how to proceed further, or if any other details are required from my end, kindly let me know.
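One possible approach, assuming MySQL 8+ (for the LAG window function) and that the comparison is intended per vendor, as in Example1: aggregate per vendor and date in a derived table, then subtract the previous date's total with LAG. The alias names `b`, `total`, and `difference` are illustrative; the first row per vendor has no previous total, so its difference is NULL rather than the question's 0 convention:

```sql
SELECT b.`date`,
       b.vendor,
       b.total,
       b.total - LAG(b.total) OVER (PARTITION BY b.vendor
                                    ORDER BY b.`date`) AS difference
FROM (SELECT vendor, `date`, SUM(quantity) AS total
      FROM reports
      WHERE `date` >= '2020-04-06' AND `date` <= '2020-04-08'
      GROUP BY vendor, `date`) b
ORDER BY b.vendor, b.`date`;
```

To attach the difference to every detail row, this derived table can be joined back to reports on vendor and date, just as the existing subquery `b` is.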
The following table is for practice only. I will use the code on a much larger table.
SELECT *
FROM price_practice;
gives
id company dt price
'16', 'Amex', '2015-07-01', '5.00'
'17', 'Amex', '2015-07-02', '5.10'
'18', 'Amex', '2015-07-03', '5.00'
'19', 'Amex', '2015-07-06', '5.88'
'20', 'Amex', '2015-07-07', '4.21'
'21', 'Citi', '2015-07-01', '1.00'
'22', 'Citi', '2015-07-02', '1.10'
'23', 'Citi', '2015-07-03', '1.00'
'24', 'Citi', '2015-07-06', '0.88'
'25', 'Citi', '2015-07-07', '1.01'
'26', 'Amex', '2015-07-08', '5.23'
'27', 'Amex', '2015-07-09', '5.35'
'28', 'Amex', '2015-07-10', '5.55'
'29', 'Amex', '2015-07-13', '5.88'
'30', 'Amex', '2015-07-14', '6.01'
'31', 'Citi', '2015-07-08', '0.95'
'32', 'Citi', '2015-07-09', '0.83'
'33', 'Citi', '2015-07-10', '0.79'
'34', 'Citi', '2015-07-13', '0.72'
'35', 'Citi', '2015-07-14', '0.59'
The following snippet calculates the percentage change in price from one date to the next.
SELECT x.id, x.company, x.dt, x.price, (x.price - y.price)/y.price AS 'Change'
FROM
(
SELECT a.id AS aid, MAX(b.id) AS aPrevid
FROM price_practice a
INNER JOIN price_practice b
WHERE a.id > b.id
AND a.company = b.company
GROUP BY a.id
) Sub1
INNER JOIN price_practice x ON Sub1.aid = x.id
INNER JOIN price_practice y ON Sub1.aPrevid = y.id
ORDER BY x.id DESC
As intended, it returns
id company dt price Change
'35', 'Citi', '2015-07-14', '0.59', '-0.180556'
'34', 'Citi', '2015-07-13', '0.72', '-0.088608'
'33', 'Citi', '2015-07-10', '0.79', '-0.048193'
'32', 'Citi', '2015-07-09', '0.83', '-0.126316'
'31', 'Citi', '2015-07-08', '0.95', '-0.059406'
'30', 'Amex', '2015-07-14', '6.01', '0.022109'
'29', 'Amex', '2015-07-13', '5.88', '0.059459'
'28', 'Amex', '2015-07-10', '5.55', '0.037383'
'27', 'Amex', '2015-07-09', '5.35', '0.022945'
'26', 'Amex', '2015-07-08', '5.23', '0.242280'
'25', 'Citi', '2015-07-07', '1.01', '0.147727'
'24', 'Citi', '2015-07-06', '0.88', '-0.120000'
'23', 'Citi', '2015-07-03', '1.00', '-0.090909'
'22', 'Citi', '2015-07-02', '1.10', '0.100000'
'20', 'Amex', '2015-07-07', '4.21', '-0.284014'
'19', 'Amex', '2015-07-06', '5.88', '0.176000'
'18', 'Amex', '2015-07-03', '5.00', '-0.019608'
'17', 'Amex', '2015-07-02', '5.10', '0.020000'
The following snippet does something entirely different: it ranks observations by price for every company separately.
SELECT (
CASE company
WHEN @curType
THEN @curRow := @curRow + 1
ELSE @curRow := 1 AND @curType := company END
) + 1 AS rank,
id,
company,
dt,
price
FROM price_practice,
(SELECT @curRow := 0, @curType := '') r
ORDER BY company DESC, price DESC;
As intended, it returns
rank id company dt price
'1', '22', 'Citi', '2015-07-02', '1.10'
'2', '25', 'Citi', '2015-07-07', '1.01'
'3', '23', 'Citi', '2015-07-03', '1.00'
'4', '21', 'Citi', '2015-07-01', '1.00'
'5', '31', 'Citi', '2015-07-08', '0.95'
'6', '24', 'Citi', '2015-07-06', '0.88'
'7', '32', 'Citi', '2015-07-09', '0.83'
'8', '33', 'Citi', '2015-07-10', '0.79'
'9', '34', 'Citi', '2015-07-13', '0.72'
'10', '35', 'Citi', '2015-07-14', '0.59'
'1', '30', 'Amex', '2015-07-14', '6.01'
'2', '19', 'Amex', '2015-07-06', '5.88'
'3', '29', 'Amex', '2015-07-13', '5.88'
'4', '28', 'Amex', '2015-07-10', '5.55'
'5', '27', 'Amex', '2015-07-09', '5.35'
'6', '26', 'Amex', '2015-07-08', '5.23'
'7', '17', 'Amex', '2015-07-02', '5.10'
'8', '18', 'Amex', '2015-07-03', '5.00'
'9', '16', 'Amex', '2015-07-01', '5.00'
'10', '20', 'Amex', '2015-07-07', '4.21'
The question is:
How do I rank observations by percentage change?
I imagine I could save the percentage-change data in a new column and then rank it, but I suspect this is not the best method. I will do many similar calculations (e.g. weekly % change, variance, etc.), and I have around 3,000,000 observations, so the table would grow big quickly. If this is the only way to do it, I will, but I think combining the two snippets above to calculate the percentage change and the rank in one go would be better. Or what do you think?
As I'm sure you can tell from my question, I'm a beginner at MySQL. Any advice on how to proceed is appreciated!
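One way to combine the two snippets without persisting a new column is to treat the percentage-change query as a derived table and apply the same user-variable ranking to it, ordering by the change instead of the price. A sketch (the aliases `chg` and `pct_change` are mine; the same caveat as the ranking snippet above applies, since the evaluation order of user variables with ORDER BY is not guaranteed, and in MySQL 8+ `RANK() OVER (PARTITION BY company ORDER BY ...)` is the reliable replacement):

```sql
SELECT (
         CASE chg.company
           WHEN @curType THEN @curRow := @curRow + 1
           ELSE @curRow := 1 AND @curType := chg.company
         END
       ) + 1 AS rank,
       chg.id, chg.company, chg.dt, chg.price, chg.pct_change
FROM (
    SELECT x.id, x.company, x.dt, x.price,
           (x.price - y.price) / y.price AS pct_change
    FROM (
        SELECT a.id AS aid, MAX(b.id) AS aPrevid
        FROM price_practice a
        INNER JOIN price_practice b
          ON a.id > b.id AND a.company = b.company
        GROUP BY a.id
    ) Sub1
    INNER JOIN price_practice x ON Sub1.aid = x.id
    INNER JOIN price_practice y ON Sub1.aPrevid = y.id
) chg,
(SELECT @curRow := 0, @curType := '') r
ORDER BY chg.company DESC, chg.pct_change DESC;
```

Nothing is written back to the table, so the same pattern extends to weekly changes, variance, and similar derived metrics.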