I have a table in which I defined one column to accept zero (0) as a value; zero represents "all values". The table is named agendas and the column is turmas_id. The column references the turmas table, but turmas_id in agendas is not a foreign key, because it can also hold 0, as said before.
The problem appears when I make a JOIN using these tables, because I need to return both the rows whose turmas_id is a valid key in turmas and the rows where it is zero.
I tried LEFT JOIN and INNER JOIN, but the result is not what I expect. The JOIN works when the id exists in both turmas and agendas, since that is a valid foreign key, but I can't also return the rows with 0 in the agendas turmas_id column, and that is exactly what I need.
How could I do this?
I need to display this result:
//table agendas
-----------------------------------------
turmas_id | descricao
-----------------------------------------
0 | this row contains zero; there is no matching id in table turmas
16 | table turmas contains id 16; it is a valid foreign key
0 | this row contains zero; there is no matching id in table turmas
23 | table turmas contains id 23; it is a valid foreign key
SQL
$agendamentos = $this->Agenda->query("SELECT * FROM responsavel_alunos RespAlunos "
. "INNER JOIN pessoas Responsavel ON (Responsavel.id = RespAlunos.pessoas_id) "
. "INNER JOIN pessoas Aluno ON (Aluno.id = RespAlunos.pessoas_id1) "
. "INNER JOIN matriculas Matricula ON (Matricula.pessoas_id = Aluno.id) "
. "RIGHT JOIN turmas Turma ON (Turma.id = Matricula.turmas_id OR Turma.id = 0) "
. "INNER JOIN escolas Escola ON (Escola.id = Matricula.escolas_id) "
. "INNER JOIN agendas Agenda ON (Agenda.turmas_id = Turma.id) "
. "WHERE Responsavel.id = ? ORDER BY Agenda.created DESC "
    , array($id)); // id of the responsável (guardian)
JSON result
{
"status": "1",
"result": [
{
"RespAlunos": {
"id": "5",
"pessoas_id": "8",
"pessoas_id1": "9",
"created": "2015-09-21 10:25:46",
"modified": "2015-09-21 10:25:46"
},
"Responsavel": {
"id": "8",
"nome": "responsavel ",
"email": "responsavel@hotmail.com",
"tipopessoas_id": "3",
"status": "1",
"created": "2015-09-21 10:17:17",
"modified": "2015-09-21 10:17:17"
},
"Aluno": {
"id": "9",
"nome": "aluno",
"email": "aluno@gmail.com",
"tipopessoas_id": "1",
"status": "1",
"created": "2015-09-21 10:18:41",
"modified": "2015-09-21 10:18:41"
},
"Matricula": {
"id": "6",
"referencia": "238",
"pessoas_id": "9",
"turmas_id": "4",
"escolas_id": "2",
"status": "1",
"created": "2015-09-21 10:35:08",
"modified": "2016-02-18 10:51:20"
},
"Turma": {
"id": "4",
"descricao": "4º ano",
"created": "2015-09-21 10:31:32",
"modified": "2015-09-21 10:31:32"
},
"Escola": {
"id": "2",
"descricao": "Santa Luz Unidade 2",
"created": "2015-09-17 23:09:38",
"modified": "2015-09-17 23:09:38"
},
"Agenda": {
"id": "34",
"data": "2016-02-29 14:40:00",
"descricao": "<p>teste 1</p>\r\n",
"escolas_id": "2",
"turmas_id": "4",
"created": "2016-02-29 14:40:21",
"modified": "2016-02-29 14:40:21"
}
},
{
"RespAlunos": {
"id": "5",
"pessoas_id": "8",
"pessoas_id1": "9",
"created": "2015-09-21 10:25:46",
"modified": "2015-09-21 10:25:46"
},
"Responsavel": {
"id": "8",
"nome": "responsavel ",
"email": "responsavel@hotmail.com",
"tipopessoas_id": "3",
"status": "1",
"created": "2015-09-21 10:17:17",
"modified": "2015-09-21 10:17:17"
},
"Aluno": {
"id": "9",
"nome": "aluno",
"email": "aluno@gmail.com",
"tipopessoas_id": "1",
"status": "1",
"created": "2015-09-21 10:18:41",
"modified": "2015-09-21 10:18:41"
},
"Matricula": {
"id": "6",
"referencia": "238",
"pessoas_id": "9",
"turmas_id": "4",
"escolas_id": "2",
"status": "1",
"created": "2015-09-21 10:35:08",
"modified": "2016-02-18 10:51:20"
},
"Turma": {
"id": "4",
"descricao": "4º ano",
"created": "2015-09-21 10:31:32",
"modified": "2015-09-21 10:31:32"
},
"Escola": {
"id": "2",
"descricao": "Santa Luz Unidade 2",
"created": "2015-09-17 23:09:38",
"modified": "2015-09-17 23:09:38"
},
"Agenda": {
"id": "27",
"data": "2016-02-29 08:24:00",
"descricao": "descricao",
"escolas_id": "2",
"turmas_id": "4",
"created": "2016-02-29 08:25:20",
"modified": "2016-02-29 08:25:20"
}
}
]
}
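One way to get the result shown above (a sketch, assuming the schema described in the question; not tested against the real data) is to drive the join from agendas and LEFT JOIN turmas, so that rows with turmas_id = 0 are kept with NULL turma columns:

```sql
-- Rows whose turmas_id matches a turmas.id get the turma data;
-- rows with turmas_id = 0 are kept, with Turma.* as NULL.
SELECT Agenda.turmas_id, Agenda.descricao, Turma.descricao AS turma_descricao
FROM agendas Agenda
LEFT JOIN turmas Turma ON Turma.id = Agenda.turmas_id
ORDER BY Agenda.created DESC;
```

The same idea carries over to the larger query: join agendas first and make the join to turmas a LEFT JOIN, rather than trying to match a row with id 0 inside turmas.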
Try using IN (?) instead of = ? in your WHERE clause, since you're passing an array variable.
Can you try RIGHT JOIN instead?
"RIGHT JOIN turmas Turma ON (Turma.id = Matricula.turmas_id OR Turma.id = 0)"
because TABLE1 LEFT JOIN TABLE2 means all rows from TABLE1 will be selected, even if there are no matching rows in TABLE2.
RIGHT JOIN is the opposite. So use a RIGHT JOIN so that your turmas records are returned based on the ON condition, even if there are no matching records in the preceding tables.
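As a minimal illustration of that difference (tables a and b here are made up for the example):

```sql
-- Suppose a(x) holds rows 1, 2 and b(x) holds rows 2, 3.
SELECT a.x, b.x FROM a LEFT JOIN  b ON a.x = b.x;  -- keeps (1, NULL) and (2, 2)
SELECT a.x, b.x FROM a RIGHT JOIN b ON a.x = b.x;  -- keeps (2, 2) and (NULL, 3)
```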
Try this one.
$agendamentos = $this->Agenda->query("SELECT * FROM responsavel_alunos RespAlunos "
. "INNER JOIN pessoas Responsavel ON (Responsavel.id = RespAlunos.pessoas_id) "
. "INNER JOIN pessoas Aluno ON (Aluno.id = RespAlunos.pessoas_id1) "
. "INNER JOIN matriculas Matricula ON (Matricula.pessoas_id = Aluno.id) "
    . "LEFT JOIN (SELECT * FROM turmas) AS Turma ON (Turma.id = Matricula.turmas_id OR Turma.id = 0) "
. "INNER JOIN escolas Escola ON (Escola.id = Matricula.escolas_id) "
. "INNER JOIN agendas Agenda ON (Agenda.turmas_id = Turma.id) "
    . "WHERE Responsavel.id IN (?) ORDER BY Agenda.created DESC "
    , array($id)); // id of the responsável (guardian)
Sorry for my bad English.
I am inserting JSON into MySQL like this:
set @json = '[{"name":"ivan","city":"london","kurs":"1"},{"name":"lena","city":"tokio","kurs":"5"},{"name":"misha","city":"kazan","kurs":"3"}]';
select * from json_table(@json,'$[*]' columns(name varchar(20) path '$.name',
city varchar(20) path '$.city',
kurs varchar(20) path '$.kurs')) as jsontable;
But now there is a task to insert an unknown number of additional properties:
set @json = '[{"name":"ivan","city":"london","kurs":"1","options": [{
"ao_id": 90630,
"name": "Высота предмета",
"value": "3.7 см"
}, {
"ao_id": 90673,
"name": "Ширина предмета",
"value": "4 см"
}, {
"ao_id": 90745,
"name": "Ширина упаковки",
"value": "4 см"
}]},{"name":"lena","city":"tokio","kurs":"5", "options": [{
"ao_id": 90630,
"name": "Высота предмета",
"value": "9.7 см"
}]},{"name":"misha","city":"kazan","kurs":"3", "options": [{
"ao_id": 90999,
"name": "Высота",
"value": "5.7 см"
}]}]';
How can I best do this so that I can access the table in the future (search, index, output)?
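Assuming MySQL 8.0 and the document assigned to @json as above, JSON_TABLE can flatten the nested options array with a NESTED PATH clause (a sketch; the column sizes are guesses):

```sql
SELECT jt.name, jt.city, jt.kurs, jt.ao_id, jt.opt_name, jt.opt_value
FROM JSON_TABLE(@json, '$[*]' COLUMNS(
    name varchar(20) PATH '$.name',
    city varchar(20) PATH '$.city',
    kurs varchar(20) PATH '$.kurs',
    -- one output row per element of the options array
    NESTED PATH '$.options[*]' COLUMNS(
        ao_id     INT         PATH '$.ao_id',
        opt_name  varchar(50) PATH '$.name',
        opt_value varchar(50) PATH '$.value'
    )
)) AS jt;
```

For later searching and indexing, the usual approach is to insert this flattened result into an ordinary child table (one row per option) rather than re-parsing the JSON on every query.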
I made a DB Fiddle showing roughly what the table looks like: https://www.db-fiddle.com/f/4jyoMCicNSZpjMt4jFYoz5/3382
The data in the table looks like this:
[
{
"id": 1,
"form_id": 1,
"questionnaire_response": [
{
"id": "1",
"title": "Are you alive?",
"value": "Yes",
"form_id": 0,
"shortTitle": "",
"description": ""
},
{
"id": "2",
"title": "Did you sleep good?",
"value": "No",
"form_id": 0,
"shortTitle": "",
"description": ""
},
{
"id": "3",
"title": "Whats favorite color(s)?",
"value": [
"Red",
"Blue"
],
"form_id": 0,
"shortTitle": "",
"description": ""
}
]
},
{
"id": 2,
"form_id": 1,
"questionnaire_response": [
{
"id": "1",
"title": "Are you alive?",
"value": "Yes",
"form_id": 0,
"shortTitle": "",
"description": ""
},
{
"id": "2",
"title": "Did you sleep good?",
"value": "Yes",
"form_id": 0,
"shortTitle": "",
"description": ""
},
{
"id": "3",
"title": "Whats favorite color(s)?",
"value": "Black",
"form_id": 0,
"shortTitle": "",
"description": ""
}
]
},
{
"id": 3,
"form_id": 1,
"questionnaire_response": [
{
"id": "1",
"title": "Are you alive?",
"value": "Yes",
"form_id": 0,
"shortTitle": "",
"description": ""
},
{
"id": "2",
"title": "Did you sleep good?",
"value": "No",
"form_id": 0,
"shortTitle": "",
"description": ""
},
{
"id": "3",
"title": "Whats favorite color(s)?",
"value": [
"Black",
"Red"
],
"form_id": 0,
"shortTitle": "",
"description": ""
}
]
}
]
I have a query select * from form_responses,jsonb_to_recordset(form_responses.questionnaire_response) as items(value text, id text) where (items.id = '3' AND items.value like '%Black%');
But I am unable to match on more than one object, as in select * from form_responses,jsonb_to_recordset(form_responses.questionnaire_response) as items(value text, id text) where (items.id = '3' AND items.value like '%Black%') AND (items.id = '2' AND items.value like '%Yes%');
The value field in an object can be either an array or a single value, which is unpredictable. I feel like I'm close, but I'm also not sure I'm using the correct kind of query in the first place.
Any help would be appreciated!
EDIT
select * from form_responses where(
questionnaire_response @> '[{"id": "2", "value":"No"},{"id": "3", "value":["Red"]}]')
It seems to work, but I'm not sure if this is the best way to do it.
Your current query returns one result row per item. None of these rows has both id = 3 and id = 2. If your goal is to select the entire form response, you need to use a subquery (or rather, two of them):
SELECT *
FROM form_responses
WHERE EXISTS(
SELECT *
FROM jsonb_to_recordset(form_responses.questionnaire_response) as items(value text, id text)
WHERE items.id = '3'
AND items.value like '%Black%'
)
AND EXISTS(
SELECT *
FROM jsonb_to_recordset(form_responses.questionnaire_response) as items(value text, id text)
WHERE items.id = '2'
AND items.value like '%Yes%'
);
or alternatively
SELECT *
FROM form_responses
WHERE (
SELECT value
FROM jsonb_to_recordset(form_responses.questionnaire_response) as items(value text, id text)
WHERE items.id = '3'
) like '%Black%'
AND (
SELECT value
FROM jsonb_to_recordset(form_responses.questionnaire_response) as items(value text, id text)
WHERE items.id = '2'
) like '%Yes%';
A nicer alternative would be using json path queries:
SELECT *
FROM form_responses
WHERE questionnaire_response @@ '$[*]?(@.id == "1").value == "Yes"'
AND questionnaire_response @@ '$[*]?(@.id == "3").value[*] == "Black"'
-- in one:
SELECT *
FROM form_responses
WHERE questionnaire_response @@ '$[*]?(@.id == "1").value == "Yes" && $[*]?(@.id == "3").value[*] == "Black"'
The [*] even has the correct semantics for that sometimes-string-sometimes-array value. And if you know the indices of the items with those ids, you can even simplify to
SELECT *
FROM form_responses
WHERE questionnaire_response @@ '$[0].value == "Yes" && $[2].value[*] == "Black"'
(dbfiddle demo)
Given the data model below:
{
"events": [
{
"customerId": "a",
"type": "credit" ,
"value": 10
},
{
"customerId": "a",
"type": "credit" ,
"value": 10
},
{
"customerId": "b",
"type": "credit" ,
"value": 5
},
{
"customerId": "b",
"type": "credit" ,
"value": 5
}
]
}
How can I query the sum of credits by customerId? I.e.:
[
{
"customerId": "a",
"total": 20
},
{
"customerId": "b",
"total": 10
}
]
Use a subquery expression for per-document aggregation:
SELECT d.*,
(SELECT e.customerId, SUM(e.`value`) AS total
FROM d.events AS e
WHERE ......
GROUP BY e.customerId) AS events
FROM default AS d
WHERE ...........;
Or for the whole query at once:
SELECT e.customerId, SUM(e.`value`) AS total
FROM default AS d
UNNEST d.events AS e
WHERE ......
GROUP BY e.customerId;
I have JSON stored in a SQL Server database table in the format below. I have been able to fudge a way to get the values I need, but it feels like there must be a better way to do it with T-SQL. The JSON is output from a report in the format below, where the names in "columns" correspond positionally to the values in each row's "data" array.
So column "Fiscal Month" corresponds to data value "11", "Fiscal Year" to "2019", etc.
{
"report": "Property ETL",
"id": 2648,
"columns": [
{
"name": "Fiscal Month",
"dataType": "int"
},
{
"name": "Fiscal Year",
"dataType": "int"
},
{
"name": "Portfolio",
"dataType": "varchar(50)"
},
{
"name": "Rent",
"dataType": "int"
}
],
"rows": [
{
"rowName": "1",
"type": "Detail",
"data": [
11,
2019,
"West Group",
10
]
},
{
"rowName": "2",
"type": "Detail",
"data": [
11,
2019,
"East Group",
10
]
},
{
"rowName": "3",
"type": "Detail",
"data": [
11,
2019,
"East Group",
10
]
},
{
"rowName": "Totals: ",
"type": "Total",
"data": [
null,
null,
null,
30
]
}
]
}
In order to get at the data in the 'data' array I currently have a two-step process in T-SQL: I create a temp table and insert the row key/values from '$.rows' into it. Then I can select the individual columns for each row:
CREATE TABLE #TempData
(
Id INT,
JsonData VARCHAR(MAX)
)
DECLARE @json VARCHAR(MAX);
DECLARE @LineageKey INT;
SET @json = (SELECT JsonString FROM Stage.Report);
SET @LineageKey = (SELECT LineageKey FROM Stage.Report);
INSERT INTO #TempData(Id, JsonData)
(SELECT [key], value FROM OPENJSON(@json, '$.rows'))
MERGE [dbo].[DestinationTable] TARGET
USING
(
SELECT
JSON_VALUE(JsonData, '$.data[0]') AS FiscalMonth,
JSON_VALUE(JsonData, '$.data[1]') AS FiscalYear,
JSON_VALUE(JsonData, '$.data[2]') AS Portfolio,
JSON_VALUE(JsonData, '$.data[3]') AS Rent
FROM #TempData
WHERE JSON_VALUE(JsonData, '$.data[0]') is not null
) AS SOURCE
...
etc., etc.
This works, but I want to know if there is a way to select the data values directly, without the intermediate step of putting them into the temp table. The documentation and examples I've read all seem to require the data to have a name associated with it in order to access it. When I try to access the data directly by index position, I just get NULL.
I hope I understand your question correctly. If you know the column names, you need one OPENJSON() call with an explicit schema, but if you want to read the JSON structure from $.columns, you need a dynamic statement.
JSON:
DECLARE @json nvarchar(max) = N'{
"report": "Property ETL",
"id": 2648,
"columns": [
{
"name": "Fiscal Month",
"dataType": "int"
},
{
"name": "Fiscal Year",
"dataType": "int"
},
{
"name": "Portfolio",
"dataType": "varchar(50)"
},
{
"name": "Rent",
"dataType": "int"
}
],
"rows": [
{
"rowName": "1",
"type": "Detail",
"data": [
11,
2019,
"West Group",
10
]
},
{
"rowName": "2",
"type": "Detail",
"data": [
11,
2019,
"East Group",
10
]
},
{
"rowName": "3",
"type": "Detail",
"data": [
11,
2019,
"East Group",
10
]
},
{
"rowName": "Totals: ",
"type": "Total",
"data": [
null,
null,
null,
30
]
}
]
}'
Statement for fixed structure:
SELECT *
FROM OPENJSON(@json, '$.rows') WITH (
[Fiscal Month] int '$.data[0]',
[Fiscal Year] int '$.data[1]',
[Portfolio] varchar(50) '$.data[2]',
[Rent] int '$.data[3]'
)
Dynamic statement:
DECLARE @stm nvarchar(max) = N''
SELECT @stm = CONCAT(
@stm,
N',',
QUOTENAME(j2.name),
N' ',
j2.dataType,
N' ''$.data[',
j1.[key],
N']'''
)
FROM OPENJSON(@json, '$.columns') j1
CROSS APPLY OPENJSON(j1.value) WITH (
name varchar(50) '$.name',
dataType varchar(50) '$.dataType'
) j2
SELECT @stm = CONCAT(
N'SELECT * FROM OPENJSON(@json, ''$.rows'') WITH (',
STUFF(@stm, 1, 1, N''),
N')'
)
PRINT @stm
EXEC sp_executesql @stm, N'@json nvarchar(max)', @json
Result:
------------------------------------------------
Fiscal Month   Fiscal Year   Portfolio    Rent
------------------------------------------------
11             2019          West Group   10
11             2019          East Group   10
11             2019          East Group   10
NULL           NULL          NULL         30
Yes, it is possible without a temporary table:
DECLARE @json NVARCHAR(MAX) =
N'
{
"report": "Property ETL",
"id": 2648,
"columns": [
{
"name": "Fiscal Month",
"dataType": "int"
},
{
"name": "Fiscal Year",
"dataType": "int"
},
{
"name": "Portfolio",
"dataType": "varchar(50)"
},
{
"name": "Rent",
"dataType": "int"
}
],
"rows": [
{
"rowName": "1",
"type": "Detail",
"data": [
11,
2019,
"West Group",
10
]
},
{
"rowName": "2",
"type": "Detail",
"data": [
11,
2019,
"East Group",
10
]
},
{
"rowName": "3",
"type": "Detail",
"data": [
11,
2019,
"East Group",
10
]
},
{
"rowName": "Totals: ",
"type": "Total",
"data": [
null,
null,
null,
30
]
}
]
}
';
And the query:
SELECT s.value,
rowName = JSON_VALUE(s.value, '$.rowName'),
[type] = JSON_VALUE(s.value, '$.type'),
s2.[key],
s2.value
FROM OPENJSON(JSON_QUERY(@json, '$.rows')) s
CROSS APPLY OPENJSON(JSON_QUERY(s.value, '$.data')) s2;
db<>fiddle demo
Or as a single row per detail:
SELECT s.value,
rowName = JSON_VALUE(s.value, '$.rowName'),
[type] = JSON_VALUE(s.value, '$.type'),
JSON_VALUE(s.value, '$.data[0]') AS FiscalMonth,
JSON_VALUE(s.value, '$.data[1]') AS FiscalYear,
JSON_VALUE(s.value, '$.data[2]') AS Portfolio,
JSON_VALUE(s.value, '$.data[3]') AS Rent
FROM OPENJSON(JSON_QUERY(@json, '$.rows')) s;
db<>fiddle demo 2
I have a problem with my SQL SELECT statement. I get the right drivers in the right order, but the other columns are incorrect, and I can't get them right.
I have data like this:
id, races_id, drivers_id, drive_nr, lap_nr, time, dnf
"231", "9", "41", "1", "1", "00:00:04.750", "0"
"232", "9", "41", "1", "2", "00:00:06.030", "0"
"233", "9", "41", "1", "3", "00:00:01.740", "0"
"234", "9", "42", "1", "1", "00:00:05.440", "0"
"235", "9", "42", "1", "2", "00:00:05.400", "0"
"236", "9", "42", "1", "3", "00:00:02.300", "0"
"237", "9", "43", "1", "1", "00:00:00.620", "0"
"238", "9", "43", "1", "2", "00:00:00.290", "0"
"239", "9", "43", "1", "3", "00:00:00.280", "0"
"240", "9", "44", "1", "1", "00:00:00.600", "0"
"241", "9", "44", "1", "2", "00:00:00.190", "0"
"242", "9", "44", "1", "3", "00:00:00.220", "0"
"243", "9", "45", "1", "1", "00:00:02.830", "0"
"244", "9", "45", "1", "2", "00:00:01.890", "0"
"245", "9", "45", "1", "3", "00:00:03.200", "0"
"246", "9", "46", "1", "1", "00:00:03.580", "0"
"247", "9", "46", "1", "2", "00:00:04.550", "0"
"248", "9", "46", "1", "3", "00:00:01.060", "0"
"249", "9", "47", "1", "1", "00:00:02.920", "0"
"250", "9", "47", "1", "2", "00:00:03.950", "0"
"251", "9", "47", "1", "3", "00:00:00.320", "0"
"252", "9", "48", "1", "1", "00:00:02.150", "0"
"253", "9", "48", "1", "2", "00:00:05.720", "0"
"254", "9", "48", "1", "3", "00:00:04.530", "0"
"255", "9", "49", "1", "1", "00:00:01.530", "0"
"256", "9", "49", "1", "2", "00:00:04.360", "0"
"257", "9", "49", "1", "3", "00:00:07.110", "0"
"258", "9", "50", "1", "1", "00:00:00.450", "0"
"259", "9", "50", "1", "2", "00:00:03.550", "0"
"260", "9", "50", "1", "3", "00:00:07.900", "0"
with this query:
SELECT `id` ,
`races_id` ,
`drivers_id` ,
`drive_nr` ,
`lap_nr` ,
MIN( `time` ) AS TIME,
`dnf`
FROM `laps`
WHERE `races_id` =9
GROUP BY `drivers_id`
ORDER BY MIN( `time` ) ASC
I get:
id, races_id, drivers_id, drive_nr, lap_nr, time, dnf
240, 9, 44, 1, 1, 00:00:00.190, 0
237, 9, 43, 1, 1, 00:00:00.280, 0
249, 9, 47, 1, 1, 00:00:00.320, 0
258, 9, 50, 1, 1, 00:00:00.450, 0
246, 9, 46, 1, 1, 00:00:01.060, 0
255, 9, 49, 1, 1, 00:00:01.530, 0
231, 9, 41, 1, 1, 00:00:01.740, 0
243, 9, 45, 1, 1, 00:00:01.890, 0
252, 9, 48, 1, 1, 00:00:02.150, 0
234, 9, 42, 1, 1, 00:00:02.300, 0
so I get the correct time column in the correct order, but not the other columns like id, drive_nr, lap_nr, dnf.
How can I fix my query to get distinct drivers_id values with the minimum time and the correct data in the other columns?
And what if you remove the "GROUP BY"?
SELECT `id` , `races_id` , `drivers_id` , `drive_nr` , `lap_nr` , MIN( `time` ) AS TIME, `dnf` FROM `laps` WHERE `races_id` = 9 ORDER BY MIN( `time` ) ASC
Take a look at this link on the use of GROUP BY and MIN.
The problem is that GROUP BY groups rows together for an aggregate function. In standard SQL, every column returned must either be mentioned in the GROUP BY clause or be an aggregate, but MySQL extends this.
However, although MySQL allows extra columns to be returned, it does not specify which row the values of those columns come from. While there is a pattern (it seems to be the last row inserted, I think), this is not defined and could change.
To get the other fields you have a couple of options.
The simplest is a subquery that gets the driver id and the minimum lap time for that driver, then joins that back against the laps table (joining on the driver id and the time) to get the values of the other fields from the matching row. There are a couple of minor downsides to this. Firstly, MySQL will not use an index on the fields of the subquery to join to the main table, though with limited data that is probably not an issue (beyond the annoyance of the queries popping up in the slow query log). Secondly, a driver's best lap time might be shared between two laps.
A simple example of the SQL:
SELECT a.id,
a.races_id,
a.drivers_id,
a.drive_nr,
a.lap_nr,
mt.min_time,
a.dnf
FROM laps a
INNER JOIN
(
SELECT drivers_id ,
MIN( `time` ) AS min_time
FROM laps
WHERE races_id = 9
GROUP BY drivers_id
) mt
ON a.drivers_id = mt.drivers_id
AND a.`time` = mt.min_time
WHERE a.races_id = 9
ORDER BY min_time ASC
If you do have two laps with the same minimum lap time then you need to specify which one's details to return (or you might not care, and could just misuse GROUP BY on the outer query as well).
A second solution is to generate a sequence number for each row of the results, ordered by driver id and lap time, resetting the sequence number on each change of driver id, and then discard any lap whose sequence number is not 1. However, this is harder to read and likely to be very slow when you have lots of data.
Example as follows (not tested):
SELECT id,
races_id,
drivers_id,
drive_nr,
lap_nr,
`time`,
dnf
FROM
(
SELECT id,
races_id,
drivers_id,
drive_nr,
lap_nr,
`time`,
dnf,
@ctr := IF(drivers_id = @did, @ctr + 1, 1) AS ctr,
@did := drivers_id
FROM
(
SELECT id,
races_id,
drivers_id,
drive_nr,
lap_nr,
`time`,
dnf
FROM laps
WHERE races_id = 9
ORDER BY drivers_id, `time`
) ordered
CROSS JOIN
(
SELECT @ctr := 0, @did := 0
) sub1
) sub2
WHERE ctr = 1
ORDER BY `time`
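On MySQL 8.0 or later (an assumption; the user-variable trick above was the usual workaround before window functions existed), the second solution can be written directly with ROW_NUMBER():

```sql
-- Rank each driver's laps by time; keep only each driver's fastest lap.
SELECT id, races_id, drivers_id, drive_nr, lap_nr, `time`, dnf
FROM (
    SELECT l.*,
           ROW_NUMBER() OVER (PARTITION BY drivers_id ORDER BY `time`) AS rn
    FROM laps l
    WHERE races_id = 9
) ranked
WHERE rn = 1
ORDER BY `time`;
```

Ties on the minimum time are still resolved arbitrarily unless you add a tiebreaker column (e.g. id) to the ORDER BY inside the window.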