How can I limit rows to just 1 entry per day? - MySQL

I'd like to limit my results to one row per day, specifically the newest row for each day, when I run:
SELECT * FROM reports WHERE item = :item_id ORDER BY date DESC
Only one record per day, and the record selected for each day needs to be the latest one on that day as well.
I really have no idea what I should try, and searching gave me no direction.
I am looking for a complete solution.
Here is example data from my table, in JSON, selected for just a single item:
[{
    "id": "62",
    "user": "7",
    "item": "19333",
    "instant_buy": "798000",
    "instant_sell": "675000",
    "upvotes": "0",
    "downvotes": "0",
    "created": "2017-06-18 14:01:32"
},
{
    "id": "61",
    "user": "7",
    "item": "19333",
    "instant_buy": "899999",
    "instant_sell": "735647",
    "upvotes": "0",
    "downvotes": "0",
    "created": "2017-06-18 11:48:25"
},
{
    "id": "55",
    "user": "4",
    "item": "19333",
    "instant_buy": "1387166",
    "instant_sell": "1050000",
    "upvotes": "0",
    "downvotes": "0",
    "created": "2017-06-17 12:11:30"
},
{
    "id": "38",
    "user": "4",
    "item": "19333",
    "instant_buy": "1850000",
    "instant_sell": "900000",
    "upvotes": "0",
    "downvotes": "0",
    "created": "2017-06-16 15:48:02"
},
{
    "id": "36",
    "user": "1",
    "item": "19333",
    "instant_buy": "1529350",
    "instant_sell": "900000",
    "upvotes": "1",
    "downvotes": "0",
    "created": "2017-06-16 14:26:41"
}]

You could use a join against a derived table that takes max(created) grouped by user and date(created):
SELECT r.*
FROM reports r
INNER JOIN (
    SELECT user, MAX(created) AS max_created
    FROM reports
    GROUP BY user, DATE(created)
) t ON t.user = r.user AND t.max_created = r.created
If you only want a single item, add the item = :item_id filter to both the outer query and the derived table.

You can use GROUP BY on the date column. Something similar to:
SELECT * FROM reports
WHERE item = :item_id
GROUP BY DATE_FORMAT(date,'%m-%d-%Y')
ORDER BY date DESC
Be aware that this only runs with ONLY_FULL_GROUP_BY disabled, and the non-grouped columns are taken from an arbitrary row of each group, so it is not guaranteed to return the newest record per day.

Try something like this:
select reports.*
from reports
inner join (
    select distinct date(Date),
           (select ID
            from reports
            where date(Date) = date(r1.Date) and item = :item_id
            order by Date desc
            limit 1) as ID
    from reports r1
    where item = :item_id
) s1 on reports.id = s1.id
Depending on whether you want the first or the last entry of each day, change the ordering in the s1 subquery.
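On MySQL 8.0 and later, a window function handles this directly and avoids the self-join. A minimal sketch, assuming the created column from the sample data above:
SELECT id, user, item, instant_buy, instant_sell, upvotes, downvotes, created
FROM (
    SELECT r.*,
           -- rank the rows within each calendar day, newest first
           ROW_NUMBER() OVER (PARTITION BY DATE(created) ORDER BY created DESC) AS rn
    FROM reports r
    WHERE item = :item_id
) ranked
WHERE rn = 1
ORDER BY created DESC;
With the example data this returns ids 62, 55 and 38: one row per day, each the latest for its day.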

Related

JSON in PostgreSQL

I'm learning PostgreSQL and JSON.
I have, for example, a database like this:
CREATE TABLE departments (
    department_id bigint primary key,
    name text
);
CREATE TABLE employees (
    employee_id serial primary key,
    department_id integer references departments(department_id),
    name text,
    start_date date,
    fingers integer,
    geom geometry(point, 4326)  -- requires the PostGIS extension
);
INSERT INTO departments
    (department_id, name)
VALUES
    (1, 'spatial'),
    (2, 'cloud');
INSERT INTO employees
    (department_id, name, start_date, fingers, geom)
VALUES
    (1, 'Paul', '2018/09/02', 10, 'POINT(-123.32977 48.40732)'),
    (1, 'Martin', '2019/09/02', 9, 'POINT(-123.32977 48.40732)'),
    (2, 'Craig', '2019/11/01', 10, 'POINT(-122.33207 47.60621)'),
    (2, 'Dan', '2020/10/01', 8, 'POINT(-122.33207 47.60621)');
How could I get the data out like this:
[
    {
        "department_name": "cloud",
        "employees": [
            {
                "name": "Craig",
                "start_date": "2019-11-01"
            },
            {
                "name": "Dan",
                "start_date": "2020-10-01"
            }
        ]
    },
    {
        "department_name": "spatial",
        "employees": [
            {
                "name": "Paul",
                "start_date": "2018-09-02"
            },
            {
                "name": "Martin",
                "start_date": "2019-09-02"
            }
        ]
    }
]
Follow this link: https://dba.stackexchange.com/questions/69655/select-columns-inside-json-agg/200240#200240
-- throwaway table: its row type is used to label the aggregated columns
CREATE TEMP TABLE x (
    name text,
    start_date date
);
WITH cte AS (
    SELECT
        d.name AS department_name,
        json_agg((e.name, e.start_date)::x) AS employees
    FROM departments d
    JOIN employees e ON d.department_id = e.department_id
    GROUP BY 1
)
SELECT json_agg(row_to_json(cte.*))
FROM cte;
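If you'd rather not create the helper table, json_build_object (available since PostgreSQL 9.4) lets you name the keys inline. A sketch over the same schema:
SELECT json_agg(dept ORDER BY dept.department_name) AS result
FROM (
    SELECT d.name AS department_name,
           -- one array of {name, start_date} objects per department
           json_agg(json_build_object('name', e.name,
                                      'start_date', e.start_date)
                    ORDER BY e.start_date) AS employees
    FROM departments d
    JOIN employees e ON e.department_id = d.department_id
    GROUP BY d.name
) dept;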

MySQL CONCAT and GROUP_CONCAT

I'm trying to come up with an SQL query that shows the client information as well as their orders.
This is the desired result:
{
    "success": true,
    "client": {
        "name": "General Kenobit",
        "email": "test@test.com",
        "contact": 123456789,
        "registerDate": "2022-04-06T16:00:05.000Z",
        "status": "activo",
        "orders": [
            {
                "orderId": 1002,
                "total": 19.5,
                "payment": "money",
                "products": [
                    {
                        "productId": 1,
                        "product": "Test",
                        "quantity": 4
                    }
                ]
            },
            {
                "orderId": 1006,
                "total": 67.5,
                "payment": "money",
                "products": [
                    {
                        "productId": 1,
                        "product": "Test",
                        "quantity": 4
                    },
                    {
                        "productId": 2,
                        "product": "Product 2",
                        "quantity": 3
                    }
                ]
            },
            {
                "orderId": 1009,
                "total": 134,
                "payment": "card",
                "products": [
                    {
                        "productId": 1,
                        "product": "Test",
                        "quantity": 4
                    },
                    {
                        "productId": 2,
                        "product": "Product 2",
                        "quantity": 4
                    },
                    {
                        "productId": 3,
                        "product": "Food",
                        "quantity": 5
                    }
                ]
            }
        ]
    }
}
And this is the query I'm trying to fix:
SELECT c.name, c.email, c.contact, c.registerDate, c.status,
CONCAT('[',
GROUP_CONCAT(JSON_OBJECT("orderId", o.orderId, "total", o.total, "payment", o.payment, "products",
CONCAT('[', GROUP_CONCAT(JSON_OBJECT("productId", p.productId, "product", p.product, "quantity", op.quantity) SEPARATOR ','), ']')
) SEPARATOR ','),
']') AS 'orders'
FROM t_client AS c
INNER JOIN t_order AS o ON o.email = c.email
INNER JOIN t_orderproduct AS op ON op.orderId = o.orderId
INNER JOIN t_product AS p ON p.productId = op.productId
WHERE c.clientId = 1
GROUP BY c.clientId
If I use the GROUP_CONCAT function before the second JSON_OBJECT, I get error #1111 (invalid use of group function)...
Otherwise, this is what comes back as the result:
{
    "success": true,
    "client": {
        "name": "General Kenobit",
        "email": "teste@teste.com",
        "contact": 123456789,
        "registerDate": "2022-04-06T16:00:05.000Z",
        "status": "activo",
        "orders": [
            {
                "orderId": 1002,
                "total": 19.5,
                "payment": "money",
                "products": [
                    {
                        "productId": 1,
                        "product": "Test",
                        "quantity": 4
                    }
                ]
            },
            {
                "orderId": 1006,
                "total": 67.5,
                "payment": "money",
                "products": [
                    {
                        "productId": 1,
                        "product": "Test",
                        "quantity": 4
                    }
                ]
            },
            {
                "orderId": 1009,
                "total": 134,
                "payment": "card",
                "products": [
                    {
                        "productId": 1,
                        "product": "Test",
                        "quantity": 4
                    }
                ]
            },
            {
                "orderId": 1006,
                "total": 67.5,
                "payment": "money",
                "products": [
                    {
                        "productId": 2,
                        "product": "Product 2",
                        "quantity": 3
                    }
                ]
            },
            {
                "orderId": 1009,
                "total": 134,
                "payment": "card",
                "products": [
                    {
                        "productId": 2,
                        "product": "Product 2",
                        "quantity": 4
                    }
                ]
            },
            {
                "orderId": 1009,
                "total": 134,
                "payment": "card",
                "products": [
                    {
                        "productId": 3,
                        "product": "Food",
                        "quantity": 5
                    }
                ]
            }
        ]
    }
}
I've turned the whole query upside down already and don't know where else to tweak.
Any suggestion or tip is appreciated.
You can't have nested aggregations in a query, so you need to do the aggregation of the order products in a subquery.
And instead of CONCAT() and GROUP_CONCAT(), you can use JSON_ARRAYAGG() if you're running at least MySQL 5.7.22.
SELECT c.name, c.email, c.contact, c.registerDate, c.status,
       JSON_ARRAYAGG(JSON_OBJECT("orderId", o.orderId, "total", o.total,
                                 "payment", o.payment, "products", op.products)) AS orders
FROM t_client AS c
INNER JOIN t_order AS o ON o.email = c.email
INNER JOIN (
    SELECT op.orderId,
           JSON_ARRAYAGG(JSON_OBJECT("productId", p.productId, "product", p.product,
                                     "quantity", op.quantity)) AS products
    FROM t_orderproduct AS op
    INNER JOIN t_product AS p ON p.productId = op.productId
    GROUP BY op.orderId
) AS op ON op.orderId = o.orderId
WHERE c.clientId = 1
GROUP BY c.clientId
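If you also want the client columns nested inside a single JSON document, as in the desired result above, you can additionally wrap them with JSON_OBJECT. A sketch under the same assumptions (same tables, clientId as the primary key of t_client):
SELECT JSON_OBJECT(
           'name', c.name,
           'email', c.email,
           'contact', c.contact,
           'registerDate', c.registerDate,
           'status', c.status,
           -- one array element per order, with the pre-aggregated product list
           'orders', JSON_ARRAYAGG(
               JSON_OBJECT('orderId', o.orderId, 'total', o.total,
                           'payment', o.payment, 'products', op.products))
       ) AS client
FROM t_client AS c
INNER JOIN t_order AS o ON o.email = c.email
INNER JOIN (
    SELECT op.orderId,
           JSON_ARRAYAGG(JSON_OBJECT('productId', p.productId,
                                     'product', p.product,
                                     'quantity', op.quantity)) AS products
    FROM t_orderproduct AS op
    INNER JOIN t_product AS p ON p.productId = op.productId
    GROUP BY op.orderId
) AS op ON op.orderId = o.orderId
WHERE c.clientId = 1
GROUP BY c.clientId;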

Turn Rows into Multiple elements in JSON

I have one order number which contains 4 SKUs, and each SKU is linked to 3 categories.
I need to write a SQL statement to convert this into JSON format, which involves a subquery and multiple elements in an array. Basically I need it to look like this:
{
    "order_number": "WEB000000000",
    "order_items": [
        {
            "sku": 111111111,
            "categories": ["Checked Shirts", "Mens", "Shirts"]
        },
        {
            "sku": 333333333,
            "categories": ["Accessories & Shoes", "Mens Accessories", "Socks"]
        },
        {
            "sku": 666666666,
            "categories": ["Checked Shirts", "Mens", "Shirts"]
        },
        {
            "sku": 999999999,
            "categories": ["Nightwear", "Slippers", "Womens"]
        }
    ]
}
Here's what I have so far but I just can't get it quite right:
DROP TABLE IF EXISTS ##Data;
CREATE TABLE ##Data
(
order_number varchar(100),
sku bigint,
categories varchar(100)
);
INSERT INTO ##Data
select 'WEB000000000', 111111111, 'Mens'
union all select 'WEB000000000', 111111111, 'Shirts'
union all select 'WEB000000000', 111111111, 'Checked Shirts'
union all select 'WEB000000000', 333333333, 'Accessories & Shoes'
union all select 'WEB000000000', 333333333, 'Mens Accessories'
union all select 'WEB000000000', 333333333, 'Socks'
union all select 'WEB000000000', 666666666, 'Mens'
union all select 'WEB000000000', 666666666, 'Shirts'
union all select 'WEB000000000', 666666666, 'Checked Shirts'
union all select 'WEB000000000', 999999999, 'Womens'
union all select 'WEB000000000', 999999999, 'Nightwear'
union all select 'WEB000000000', 999999999, 'Slippers'
SELECT * FROM ##Data;
select OUTER1.[order_number] as [order_number],
(select OSL.[order_number],
(select [sku],
(select [categories]
from ##Data skus
where order_item.[order_number] = skus.[order_number]
and order_item.[sku] = skus.[sku]
GROUP BY [categories]
FOR JSON PATH) as categories
from ##Data order_item
where order_item.[order_number] = OSL.[order_number]
GROUP BY [order_number], [sku]
FOR JSON PATH) as order_items
from ##Data OSL
where OSL.[order_number]=OUTER1.[order_number]
group by OSL.[order_number]
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER) AS JSON
from ##Data OUTER1
group by OUTER1.[order_number]
drop table ##Data
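One way to get there on SQL Server 2017+ is to build the scalar categories array with STRING_AGG and JSON_QUERY, since FOR JSON PATH cannot emit an array of bare strings on its own. A sketch against the ##Data table above:
SELECT d.order_number,
       (
           SELECT i.sku,
                  -- assemble ["a","b","c"] by hand; JSON_QUERY keeps it from being re-escaped
                  JSON_QUERY('["' + STRING_AGG(STRING_ESCAPE(i.categories, 'json'), '","')
                             WITHIN GROUP (ORDER BY i.categories) + '"]') AS categories
           FROM ##Data i
           WHERE i.order_number = d.order_number
           GROUP BY i.sku
           FOR JSON PATH
       ) AS order_items
FROM (SELECT DISTINCT order_number FROM ##Data) d
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER;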

Add count to N1QL query result in Couchbase

I have a N1QL query:
SELECT p.`ID`, p.`Name` FROM `Preferences` p WHERE `type` = "myType"
The result is a list of objects: [{"ID": "123", "Name": "John"}, ...]
I want to get a result JSON such as:
{
"count": 5,
"result": [{"ID": "123", "Name": "John"}, ...]
}
How could I do this using N1QL?
SELECT
COUNT(t.ID) AS count,
ARRAY_AGG(t) AS results
FROM
(
SELECT
p.`ID`, p.`Name`
FROM
`Preferences` p
WHERE `type` = "myType"
) AS t
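If you want the bare object rather than a one-element result row, you can combine this with SELECT RAW and an object constructor; a sketch assuming the same bucket:
SELECT RAW {
    "count": COUNT(t.ID),
    "result": ARRAY_AGG(t)
}
FROM (
    SELECT p.`ID`, p.`Name`
    FROM `Preferences` p
    WHERE `type` = "myType"
) AS t;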

WITH ROLLUP combined with multiple GROUP BY criteria

I have the following table:
CREATE TABLE PaL (
Event_Date DATE,
Country CHAR(2),
Category CHAR(255),
Revenue INTEGER(255),
Costs INTEGER(255)
);
INSERT INTO PaL
(Event_Date, Country, Category, Revenue, Costs)
VALUES
("2017-01-31", "DE", "Apparel", "692.09816652375", "-173.071989376023"),
("2017-02-28", "DE", "Apparel", "8419.9977988914", "-7622.61265984317"),
("2017-03-31", "DE", "Apparel", "2018.80471444031", "-1498.76213884283"),
("2017-04-30", "DE", "Apparel", "8863.15663035884", "-7965.69268589649"),
("2017-05-31", "DE", "Apparel", "6838.4514829573", "-1088.70351845663"),
("2017-06-30", "DE", "Apparel", "2025.73421386331", "-483.454199185678"),
("2017-07-31", "DE", "Apparel", "5389.0163788639", "-2643.93624645182"),
("2017-08-31", "DE", "Apparel", "6238.85870250446", "-1985.9879371866"),
("2017-09-30", "DE", "Apparel", "2294.62451106469", "-1864.98271539745"),
("2017-10-31", "DE", "Apparel", "4141.2074159951", "-197.773961036073"),
("2017-11-30", "DE", "Apparel", "1456.17584217397", "-1018.54129047119"),
("2017-12-31", "DE", "Apparel", "3623.54984724091", "-745.715567286581"),
("2017-01-31", "DE", "Shoes", "5955.20947079185", "-4745.39564508682"),
("2017-02-28", "DE", "Shoes", "9555.29563511224", "-5729.82601329738"),
("2017-03-31", "DE", "Shoes", "5490.36170257556", "-925.286457266534"),
("2017-04-30", "DE", "Shoes", "7652.35548686073", "-7335.32532050594"),
("2017-05-31", "DE", "Shoes", "9102.38987703511", "-5724.92574170071"),
("2017-06-30", "DE", "Shoes", "1703.95901703023", "-1678.19547060803"),
("2017-07-31", "DE", "Shoes", "3679.60045104324", "-2095.94207835501"),
("2017-08-31", "DE", "Shoes", "6672.43210841331", "-3475.55411014914"),
("2017-09-30", "DE", "Shoes", "7718.7744220635", "-1252.75877307055"),
("2017-10-31", "DE", "Shoes", "6976.01564153854", "-509.991595559256"),
("2017-11-30", "DE", "Shoes", "4725.46976319905", "-2835.09460170927"),
("2017-12-31", "DE", "Shoes", "8390.84483147949", "-7476.83516162742"),
("2017-01-31", "US", "Apparel", "939788.159047677", "-742666.846083707"),
("2017-02-28", "US", "Apparel", "826418.514009279", "-702997.151099908"),
("2017-03-31", "US", "Apparel", "775940.69563018", "-211238.971709086"),
("2017-04-30", "US", "Apparel", "516829.583069596", "-407521.856789393"),
("2017-05-31", "US", "Apparel", "635701.377748304", "-627829.016481388"),
("2017-06-30", "US", "Apparel", "757852.95599751", "-740948.867522139"),
("2017-07-31", "US", "Apparel", "154224.257732688", "-139805.456987081"),
("2017-08-31", "US", "Apparel", "102035.465731255", "-100103.875992667"),
("2017-09-30", "US", "Apparel", "880671.692714021", "-665324.083753931"),
("2017-10-31", "US", "Apparel", "187868.653562564", "-105676.793254039"),
("2017-11-30", "US", "Apparel", "994600.486892401", "-177382.899789215"),
("2017-12-31", "US", "Apparel", "813824.90461202", "-132527.311010471"),
("2017-01-31", "US", "Shoes", "661069.933966637", "-454778.427240679"),
("2017-02-28", "US", "Shoes", "675942.334464692", "-254489.623313569"),
("2017-03-31", "US", "Shoes", "473604.307973888", "-404226.047653847"),
("2017-04-30", "US", "Shoes", "872018.822577053", "-348781.396359871"),
("2017-05-31", "US", "Shoes", "718012.023481434", "-625306.312927362"),
("2017-06-30", "US", "Shoes", "688487.679029354", "-584512.575888519"),
("2017-07-31", "US", "Shoes", "690085.013711018", "-581753.771085971"),
("2017-08-31", "US", "Shoes", "204473.88894161", "-172301.871771595"),
("2017-09-30", "US", "Shoes", "516932.092423463", "-328002.737710081"),
("2017-10-31", "US", "Shoes", "609355.245817292", "-323624.391366703"),
("2017-11-30", "US", "Shoes", "313599.625504231", "-210253.020497022"),
("2017-12-31", "US", "Shoes", "723573.681040319", "-107333.764977439"),
("2017-01-31", "NZ", "Apparel", "81292.9610624533", "-27354.678748396"),
("2017-02-28", "NZ", "Apparel", "77473.6231986387", "-47920.2900396812"),
("2017-03-31", "NZ", "Apparel", "93819.4377266116", "-47582.1878235771"),
("2017-04-30", "NZ", "Apparel", "25580.8516093492", "-21277.2651303701"),
("2017-05-31", "NZ", "Apparel", "82842.9415935231", "-30714.5952859941"),
("2017-06-30", "NZ", "Apparel", "50878.6190715448", "-33047.3841488076"),
("2017-07-31", "NZ", "Apparel", "61423.3558286064", "-10811.2817583225"),
("2017-08-31", "NZ", "Apparel", "77517.2989019148", "-56818.7461336424"),
("2017-09-30", "NZ", "Apparel", "74046.1258000888", "-10108.0702908427"),
("2017-10-31", "NZ", "Apparel", "79490.650598675", "-68562.5753547413"),
("2017-11-30", "NZ", "Apparel", "65000.3971251097", "-25174.1329786955"),
("2017-12-31", "NZ", "Apparel", "99152.6457285608", "-42855.8431883814"),
("2017-01-31", "NZ", "Shoes", "20703.8970205884", "-11911.9616025915"),
("2017-02-28", "NZ", "Shoes", "72841.2537140946", "-14166.6747335237"),
("2017-03-31", "NZ", "Shoes", "45391.6550622383", "-40325.1638601903"),
("2017-04-30", "NZ", "Shoes", "58074.2843201579", "-54483.1122507654"),
("2017-05-31", "NZ", "Shoes", "52127.2701338519", "-28026.7984458694"),
("2017-06-30", "NZ", "Shoes", "32900.9222204099", "-22780.2637095601"),
("2017-07-31", "NZ", "Shoes", "18809.3868235169", "-11500.4020522949"),
("2017-08-31", "NZ", "Shoes", "67001.2729206886", "-53280.8129552599"),
("2017-09-30", "NZ", "Shoes", "26889.4058005421", "-24218.8734875798"),
("2017-10-31", "NZ", "Shoes", "56330.7544011198", "-51382.4201254223"),
("2017-11-30", "NZ", "Shoes", "60954.7129549264", "-19834.7256352672"),
("2017-12-31", "NZ", "Shoes", "97527.2014993995", "-83137.4844853141");
And I use the following query to get data from the table:
Select Country, Category, sum(Revenue) as Revenue, sum(Costs) as Costs
FROM Pal
WHERE Event_Date BETWEEN "2017-01-01" AND "2017-01-31"
GROUP BY Country, Category WITH ROLLUP
You can also find the table with data in the SQL fiddle here.
All this works fine so far.
Now, I was wondering how I can stop WITH ROLLUP from calculating a subtotal row below each country. Instead, it should calculate the column total only once, so that the result looks like this:
Country   Category   Revenue   Costs
DE        Apparel        692       -173
DE        Shoes         5955      -4745
...       ...            ...        ...
US        Shoes       661070    -454778
(null)    (null)     1709502   -1241630
What do I have to change in my SQL query to achieve this?
MySQL does not support GROUPING SETS, which is what you really want. Perhaps the simplest way is to use UNION ALL:
SELECT Country, Category, SUM(Revenue) as Revenue, SUM(Costs) as Costs
FROM Pal
WHERE Event_Date BETWEEN '2017-01-01' AND '2017-01-31'
GROUP BY Country, Category
UNION ALL
SELECT NULL, NULL, SUM(Revenue) as Revenue, SUM(Costs) as Costs
FROM Pal
WHERE Event_Date BETWEEN '2017-01-01' AND '2017-01-31';
You can use HAVING to filter out the subtotals for each country:
Select Country, Category, sum(Revenue) as Revenue, sum(Costs) as Costs
FROM Pal
WHERE Event_Date BETWEEN "2017-01-01" AND "2017-01-31"
GROUP BY Country, Category WITH ROLLUP
HAVING (Country IS NULL AND Category IS NULL) OR (Country IS NOT NULL AND Category IS NOT NULL)
The condition Country IS NULL AND Category IS NULL matches the grand total at the end, while the condition Country IS NOT NULL AND Category IS NOT NULL matches the individual rows for each country and category.
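On MySQL 8.0.1 and later you can do the same filtering more robustly with the GROUPING() function, which distinguishes the NULLs that ROLLUP generates from NULLs that occur in the data itself. A sketch of the same query:
SELECT Country, Category, SUM(Revenue) AS Revenue, SUM(Costs) AS Costs
FROM Pal
WHERE Event_Date BETWEEN '2017-01-01' AND '2017-01-31'
GROUP BY Country, Category WITH ROLLUP
HAVING (GROUPING(Country) = 1 AND GROUPING(Category) = 1)  -- keep the grand total
    OR (GROUPING(Country) = 0 AND GROUPING(Category) = 0); -- keep the detail rows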
Remove WITH ROLLUP:
Select Country, Category, sum(Revenue) as Revenue, sum(Costs) as Costs
FROM Pal
WHERE Event_Date BETWEEN "2017-01-01" AND "2017-01-31"
GROUP BY Country, Category
and then use UNION ALL, as @Gordon does in his answer above.