Anyone see why the query below would yield the error
"#1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '%s)"?
SELECT SQL_CALC_FOUND_ROWS id
FROM (
SELECT taba.id
FROM (
SELECT alum.id
FROM cvm_education AS edu
JOIN cvm_alumni AS alum ON alum.id = edu.alumni_id
WHERE cvm_alumni.profile_status =1
AND highest_edu
IN (
SELECT name
FROM cvm_filter_educationlevels
JOIN cvm_educationlevel AS edulevels ON educationlevel_id = edulevels.id
WHERE filter_id = % s
)
) AS taba
Cheers!
You need to quote the string value and use LIKE for pattern matching:
WHERE filter_id LIKE '% s'
But if you really want to find % s literally, use =:
WHERE filter_id = '% s'
Try this:
SELECT count(taba.id)
FROM (
SELECT alum.id
FROM cvm_education AS edu
JOIN cvm_alumni AS alum ON alum.id = edu.alumni_id
WHERE alum.profile_status =1
AND highest_edu
IN (
SELECT name
FROM cvm_filter_educationlevels
JOIN cvm_educationlevel AS edulevels ON educationlevel_id = edulevels.id
WHERE filter_id = 1
)
) AS taba ;
http://www.sqlfiddle.com/#!2/f8adc/15
Two important points:
I don't understand the use of SQL_CALC_FOUND_ROWS(), so I have changed it to count(). I think this provides the same desired result (see the short sketch after these points).
You haven't provided sample data, so I wasn't able to try %s. I have substituted it with a binary value (1 or 0). Furthermore, I don't know your exact code, so I made some assumptions based on your query.
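For reference, a minimal sketch of the difference, assuming the real query pages results with LIMIT (the table and column names are taken from the sample data below):
-- SQL_CALC_FOUND_ROWS only makes sense together with a follow-up FOUND_ROWS() call:
SELECT SQL_CALC_FOUND_ROWS id FROM cvm_alumni WHERE profile_status = 1 LIMIT 5;
SELECT FOUND_ROWS();  -- total matching rows, ignoring the LIMIT
-- Without pagination, a plain COUNT() gives the same total in one statement:
SELECT COUNT(*) FROM cvm_alumni WHERE profile_status = 1;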
Sample data:
CREATE TABLE cvm_education(
ID int auto_increment primary key,
alumni_id int
);
CREATE TABLE cvm_alumni(
ID int auto_increment primary key,
profile_status int,
highest_edu varchar(30)
);
CREATE TABLE cvm_filter_educationlevels (
ID int auto_increment primary key,
educationlevel_id int,
name varchar(30)
);
CREATE TABLE cvm_educationlevel(
ID int auto_increment primary key,
filter_id int
);
INSERT INTO cvm_education (alumni_id)
VALUES (10), (1), (2), (3),(5), (6),(7),(8),(9);
INSERT INTO cvm_alumni (profile_status, highest_edu)
VALUES (1, "master"),
(0,"bachelor"),
(1,"bachelor"),
(0, "master"),
(1, "master"),
(0, "master"),
(1, "master"),
(1, "master"),
(1, "master"),
(1, "master");
INSERT INTO cvm_filter_educationlevels(educationlevel_id,name)
VALUES (1, "master"), (0,"bachelor");
INSERT INTO cvm_educationlevel(filter_ID)
VALUES (1), (0), (1), (0), (0), (1),(1),(1),(1);
The "% s" is invalid syntax. If that's a literal, then it needs to be enclosed in quotes:
WHERE filter_id = '% s'
(But that fix doesn't appear to be right. It almost looks as if the MySQL statement is being generated with a sprintf, and there was intended to be a '%s' placeholder that was supposed to be replaced with a value.)
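If that's the case, a server-side prepared statement sidesteps the quoting problem entirely; a minimal sketch (the @filter_id value here is just an illustrative stand-in for whatever %s was meant to carry):
PREPARE stmt FROM
  'SELECT name
     FROM cvm_filter_educationlevels
     JOIN cvm_educationlevel AS edulevels ON educationlevel_id = edulevels.id
    WHERE filter_id = ?';
SET @filter_id = 1;            -- hypothetical value in place of %s
EXECUTE stmt USING @filter_id;
DEALLOCATE PREPARE stmt;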
Also, there's a closing parenthesis and alias missing from the end of the statement:
) foo
And this:
WHERE cvm_alumni.profile_status = 1
should be changed to this:
WHERE alum.profile_status = 1
(The table is assigned an alias, so the column reference should be qualified with the alias, not the table name.)
It's also a good idea to qualify all column references, including educationlevel_id, highest_edu and name. (That's not necessarily a problem with the statement, unless MySQL is throwing an "ambiguous column" error, but I prefer to insulate my statements from any "ambiguous column" error that would crop up when new columns are added.)
SELECT SQL_CALC_FOUND_ROWS id
FROM (SELECT taba.id
FROM (
SELECT alum.id
FROM cvm_education edu
JOIN cvm_alumni alum
ON alum.id = edu.alumni_id
WHERE alum.profile_status = 1
AND `highest_edu` IN
(
SELECT `name`
FROM cvm_filter_educationlevels
JOIN cvm_educationlevel edulevels
ON `educationlevel_id` = edulevels.id
WHERE `filter_id` = '% s'
)
) taba
) foo
I have this table:
CREATE TABLE stackoverflow_question (
id int NOT NULL AUTO_INCREMENT,
name varchar(255) NOT NULL,
json_ob mediumtext default null,
PRIMARY KEY (id)
);
I do some inserts:
insert into stackoverflow_question values(null, 'albert', '[{name: "albert1", qt: 2},{name: "albert2", qt: 2}]');
insert into stackoverflow_question values(null, 'barbara', '[{name: "barbara1", qt: 4},{name: "barbara2", qt: 7}]');
insert into stackoverflow_question values(null, 'paul', '[{name: "paul1", qt: 9},{name: "paul2", qt: 11}]');
Eventually, I will need to sort this table by total quantity.
In the examples above, "paul" has quantity = 20, "barbara" has quantity = 11, and "albert" has quantity = 4.
Is it possible to create a select statement where a new field is created on the fly? Something like this:
SELECT
SUM (loop json_ob and sum all the quantity fields) AS total_quantity,
id,
name
FROM
stackoverflow_question
ORDER BY total_quantity
If json_ob is actually a valid json object then you can use JSON_TABLE() to extract the quantities and aggregate:
SELECT s.*, SUM(t.qt) total_quantity
FROM stackoverflow_question s,
JSON_TABLE(json_ob, '$[*]' COLUMNS (qt INTEGER PATH '$.qt')) t
GROUP BY s.id
ORDER BY total_quantity DESC;
See the demo.
According to jsonlint your JSON is not valid.
That's why this SQL returns an error (ERROR 3141 (22032): Invalid JSON text in argument 1 to function json_table: "Missing a name for object member." at position 2.")
SELECT
j.name, j.qt
FROM JSON_TABLE('[{name: "paul1", qt: 9},{name: "paul2", qt: 11}]',
"$[*]" COLUMNS (name varchar(20) PATH "$.name", qt int PATH "$.qt")) j ;
and this will return the values:
SELECT
j.name, j.qt
FROM JSON_TABLE('[{"name": "paul1", "qt": 9},{"name": "paul2", "qt": 11}]',
"$[*]" COLUMNS (name varchar(20) PATH "$.name", qt int PATH "$.qt")) j ;
output:
name  | qt
------|---
paul1 |  9
paul2 | 11
You can convert your relaxed JSON to valid JSON using tools like www.relaxedjson.org.
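If you first want to spot which stored rows would trip JSON_TABLE(), a minimal sketch (assuming MySQL 5.7+, where JSON_VALID() is available):
SELECT id, name, JSON_VALID(json_ob) AS is_valid_json
FROM stackoverflow_question;   -- 0 means that row's json_ob would make JSON_TABLE() fail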
I have one string element, for example "(1111, Tem1), (0000, Tem2)", and hope to generate a data table such as:
var1 | var2
-----|-----
1111 | Tem1
0000 | Tem2
This is my code: I created the lagged token and filtered to every other row element.
with var_ as (
select '(1111, Tem1), (0000, Tem2)' as pattern_
)
select tbb1.*, tbb2.result_string as result_string_previous
from(
select tb1.*,
min(token) over(partition by 1 order by token asc rows between 1 preceding and 1 preceding) as min_token
from
table (
strtok_split_to_table(1, var_.pattern_, '(), ')
returns (outkey INTEGER, token INTEGER, result_string varchar(20))
) as tb1) tbb1
inner join (select min_token, result_string from tbb1) tbb2
on tbb1.token = tbb2.min_token
where (token mod 2) = 0;
But it seems that I can't generate new variables in the "from" step and apply them directly in the "join" step.
So I want to ask: is it still possible to get the result I want with my procedure, or is there another suggestion?
Thanks for all your assistance.
I wouldn't split / recombine the groups. Split each group to a row, then split the values within the row, e.g.
with var_ as (
select '(1111, Tem1), (0000, Tem2)' as pattern_
),
split1 as (
select trim(leading '(' from result_string) as string_
from
table ( /* split at & remove right parenthesis */
regexp_split_to_table(1, var_.pattern_, '\)((, )|$)','c')
returns (outkey INTEGER, token_nbr INTEGER, result_string varchar(256))
) as tb1
)
select *
from table(
csvld(split1.string_, ',', '"')
returns (var1 VARCHAR(16), var2 VARCHAR(16))
) as tb2
;
Postgres (12.2) Setup:
CREATE TABLE public.test_table (
id int NOT NULL,
value_type text NOT NULL,
value text NOT NULL
);
INSERT INTO public.test_table
(id, value_type, value)
VALUES (1, 'string', 'a'),
(2, 'json', '{"hello":"world"}'),
(3, 'json', '{"color":"blue"}');
Initial Queries:
select value::jsonb as json_value from test_table where value_type = 'json'
json_value |
------------------|
{"hello": "world"}|
{"color": "blue"} |
But I'm only interested in ones with 'color'.
Moving it to a subquery so that I can get only 'color', also just fine:
select only_json.json_value
from(
select value::jsonb as json_value from test_table where value_type = 'json'
) only_json
where only_json.json_value ? 'color' = true
json_value |
------------------|
{"color": "blue"} |
Now let's break that main table up into two, and suddenly effectively the same query has trouble:
CREATE TABLE public.test_table (
id INT PRIMARY KEY,
value TEXT NOT NULL
);
CREATE TABLE public.test_types (
id INT PRIMARY KEY REFERENCES public.test_table (id),
value_type TEXT NOT NULL
);
INSERT INTO public.test_table
(id, value)
VALUES (1, 'a'),
(2, '{"hello":"world"}'),
(3, '{"color":"blue"}');
insert into public.test_types
(id, value_type)
values (1, 'string'),
(2, 'json'),
(3, 'json');
Now this query:
select id, value from (
select id, value::jsonb from public.test_table natural join public.test_types
where value_type = 'json') only_json
returns, as expected:
id|value |
--|------------------|
2|{"hello": "world"}|
3|{"color": "blue"} |
But as soon as I attach the where clause, it fails:
select id, value from (
select id, value::jsonb from public.test_table natural join public.test_types
where value_type = 'json') only_json
where only_json.value ? 'color' = true
SQL Error [22P02]: ERROR: invalid input syntax for type json
Detail: Token "a" is invalid.
Where: JSON data, line 1: a
It's somehow resurrected the value of 'a' that was well-eliminated prior to this where clause. So what gives? Why does the join cause it to apply the last where clause (which should happen logically last) too early? Failed workarounds I've tried:
Using left join instead of natural join.
Applying where value_type = 'json' to the joined table first, prior to the join.
Moving it to a "with".
Creating a view and then applying the where clause to a select from the view.
Creating a column via select called is_color_holder with SELECT only_json.value ? 'color' as is_color_holder. This column populates correctly, but if I use a where clause, WHERE is_color_holder = true, I receive the same error.
Repeating the value_type='json' expression in the problematic where clause.
Moving the cast up a subquery.
Replacing the join with where id in (select id from public.test_types where value_type = 'json')
Comma-style joins.
Centering the query around the types table first, then joining the value type after the types have already been filtered.
Is this a bug I should report to postgres? Am I missing something?
Edit: I managed one workaround. See my answer for more details. Still looking for a better answer, though.
I suspect that what you are seeing is premature optimization, caused by predicate pushdown.
In Postgres, a common strategy to avoid that is the offset 0 hack:
select id, value from (
select id, value
from public.test_table
inner join public.test_types using(id)
where value_type = 'json'
offset 0 -- (try to) prevent predicate pushdown
) only_json
where value::jsonb ? 'color'
Demo on DB Fiddle
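Another way to pin down the evaluation order is a materialized CTE; a minimal sketch, assuming PostgreSQL 12+ (plain CTEs are inlined by default there, which is likely why simply "moving it to a with" didn't help, but AS MATERIALIZED forces the CTE to be evaluated before the outer filter):
WITH only_json AS MATERIALIZED (
    SELECT id, value
    FROM public.test_table
    JOIN public.test_types USING (id)
    WHERE value_type = 'json'   -- only json rows survive into the materialized result
)
SELECT id, value
FROM only_json
WHERE value::jsonb ? 'color';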
I found a workaround. I'll post it as an answer here and edit the question above.
But if anyone has a better answer for me, I'll mark yours as the correct one.
Casting non-json values to json with "CASE" works fine:
select id, value from (
select id, case when value_type = 'json' then value::jsonb else to_jsonb(value) end as value, value_type from
public.test_table natural join public.test_types
where value_type = 'json') as_json
where value ? 'color' = true
id|value |
--|-----------------|
3|{"color": "blue"}|
CREATE TABLE fcc_consistency_check
(
cons_id VARCHAR2(30),
cons_desc VARCHAR2(4000),
cons_query CLOB,
module_id VARCHAR2(2),
main_tab_name VARCHAR2(30),
hist_tab_name VARCHAR2(30),
col_name VARCHAR2(4000),
col_type VARCHAR2(4000),
check_reqd VARCHAR2(1)
);
INSERT INTO fcc_consistency_check
VALUES ('CHK_BC003','Missing records in contract_event_log','select a.CONTRACT_REF_NO ,a.Latest_Event_Seq_No,
c.PREV_WORKING_DAY from cstb_contract A ,sttm_dates c
where module_code = 'BC'
and c.Branch_code='000'
and not exists (select * from cstb_contract_event_log B
where a.contract_ref_no = b.contract_ref_no
and latest_event_seq_no = event_seq_no);',
'BC','BCCC_EVENT_LOG_MISREC','BCCC_EVENT_LOG_MISREC_HISTORY','CONTRACT_REF_NO,LATEST_EVENT_SEQ_NO,EOD_DATE','VARCHAR2(16),NUMBER,DATE','Y');
I'm not able to insert the CLOB value; I'm getting this error:
ORA-00917: missing comma
When I tried inserting each column value individually, I found that the error is thrown for the column cons_query.
The problem is that you have quotes within your query:
'select a.CONTRACT_REF_NO
,a.Latest_Event_Seq_No,
c.PREV_WORKING_DAY from cstb_contract A ,sttm_dates
where module_code = 'BC'
^string starts here
^ends here; that's where the "missing comma" comes from
However, the actual issue is not that a comma is missing but that you have quotes you forgot to escape. You need to write module_code = ''BC'', for example, to escape those quotes (and you have additional quotes in there, not just around 'BC').
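A minimal sketch of the escaped insert, using Oracle's alternative quoting (q'[...]') so the embedded single quotes in the query text don't need to be doubled:
INSERT INTO fcc_consistency_check
VALUES ('CHK_BC003',
        'Missing records in contract_event_log',
        q'[select a.CONTRACT_REF_NO, a.Latest_Event_Seq_No, c.PREV_WORKING_DAY
           from cstb_contract a, sttm_dates c
           where module_code = 'BC'
             and c.Branch_code = '000'
             and not exists (select * from cstb_contract_event_log b
                             where a.contract_ref_no = b.contract_ref_no
                               and latest_event_seq_no = event_seq_no);]',
        'BC', 'BCCC_EVENT_LOG_MISREC', 'BCCC_EVENT_LOG_MISREC_HISTORY',
        'CONTRACT_REF_NO,LATEST_EVENT_SEQ_NO,EOD_DATE',
        'VARCHAR2(16),NUMBER,DATE', 'Y');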
Good Morning!
I am trying to combine two queries to make a table. (Please see code below)
CREATE TABLE Layer_Loss
(
dYear INT NOT NULL,
EventNum INT NOT NULL,
Loss INT NULL,
Rec_L1 BIGINT NULL,
Rec_L2 BIGINT NULL,
Rec_L3 BIGINT NULL,
Cap_CML_L1 BIGINT NULL,
Cap_CML_L2 BIGINT NULL,
Cap_CML_L3 BIGINT NULL,
)
INSERT INTO Layer_Loss (dYear,EventNum, Loss, Rec_L1, Rec_L2, Rec_L3, Capped_CML_L1, Capped_CML_L2, Capped_CML_L3)
WITH c AS (SELECT Row_number() OVER (ORDER BY dYear) AS rownum,*
FROM Layer_Loss_Capped2)
SELECT *
FROM
(
SELECT dYear, ROW_NUMBER() OVER (Partition by dYear Order by dYear) as Event_Number, Loss
, 'Recovery_L1'=CASE
WHEN Loss<10000000 THEN 0
WHEN Loss<30000000 THEN 20000000-(30000000-Loss)
ELSE 20000000
END
, 'Recovery_L2'=CASE
WHEN Loss<30000000 THEN 0
WHEN Loss<60000000 THEN 30000000-(60000000-Loss)
ELSE 30000000
END
, 'Recovery_L3'=CASE
WHEN Loss<60000000 THEN 0
WHEN Loss<100000000 THEN 40000000-(100000000-Loss)
ELSE 40000000
END
, (SELECT *, 'Capped_CML_L1'=CASE
WHEN d.CML_L1>40000000 THEN 4000000
ELSE d.CML_L1
END
, (SELECT *, 'Capped_CML_L2'=CASE
WHEN d.CML_L2>60000000 THEN 6000000
ELSE d.CML_L1
END
, (SELECT *, 'Capped_CML_L3'=CASE
WHEN d.CML_L1>80000000 THEN 8000000
ELSE d.CML_L1
END
FROM
(
SELECT a.dYear, a.EventNum, a.Loss, a.Rec_L1, SUM(b.Rec_L1) AS CML_L1, SUM(b.Rec_L2) AS CML_L2, SUM(b.Rec_L3) as CML_L3
FROM c a
LEFT JOIN c b ON a.dYear = b.dYear AND b.rownum <= a.rownum
GROUP BY a.dYear, a.rownum, a.EventNum, a.Rec_L1, a.Loss
) AS d
) AS e
FROM ['04_AIR_StdHU_DS_noSS_ByTerr$']
) AS a
DROP TABLE Layer_Loss
I have it so that the queries for 'Recovery_L1', 'Recovery_L2', and 'Recovery_L3' are part of the table "Layer_Loss", where I've called them "Rec_L1", "Rec_L2", and "Rec_L3". When I try to add the query that produces "Capped_CML_L1", "Capped_CML_L2", and "Capped_CML_L3", I get the following error:
"Msg 156, Level 15, State 1, Line 14
Incorrect syntax near the keyword 'WITH'.
Msg 319, Level 15, State 1, Line 14
Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon."
I have tried moving the 'WITH' clause around but end up with the same result.
Also, this is not my end result. My next step would be to take the difference between the current row and the previous row for the columns "Capped_CML_L1", "Capped_CML_L2", and "Capped_CML_L3", putting the results into columns called "Inc_Rec_L1", "Inc_Rec_L2", and "Inc_Rec_L3". I was thinking about using a cursor, but I have never used one before, so if you have any suggestions on this, that would be great too!
Thank you for your help!
EDIT:
CREATE TABLE Layer_Loss
(
dYear INT NOT NULL,
EventNum INT NOT NULL,
Loss INT NULL,
Rec_L1 BIGINT NULL,
Rec_L2 BIGINT NULL,
Rec_L3 BIGINT NULL,
Cap_CML_L1 BIGINT NULL,
Cap_CML_L2 BIGINT NULL,
Cap_CML_L3 BIGINT NULL,
)
;WITH c AS (SELECT Row_number() OVER (ORDER BY dYear) AS rownum,*
FROM Layer_Loss_Capped2)
INSERT INTO Layer_Loss (dYear,EventNum, Loss, Rec_L1, Rec_L2, Rec_L3, Capped_CML_L1, Capped_CML_L2, Capped_CML_L3)
SELECT *
FROM
(
SELECT dYear, ROW_NUMBER() OVER (Partition by dYear Order by dYear) as Event_Number, Loss
, 'Recovery_L1'=CASE
WHEN Loss<10000000 THEN 0
WHEN Loss<30000000 THEN 20000000-(30000000-Loss)
ELSE 20000000
END
, 'Recovery_L2'=CASE
WHEN Loss<30000000 THEN 0
WHEN Loss<60000000 THEN 30000000-(60000000-Loss)
ELSE 30000000
END
, 'Recovery_L3'=CASE
WHEN Loss<60000000 THEN 0
WHEN Loss<100000000 THEN 40000000-(100000000-Loss)
ELSE 40000000
END
, (SELECT *, 'Capped_CML_L1'=CASE
WHEN d.CML_L1>40000000 THEN 4000000
ELSE d.CML_L1
END
, (SELECT *, 'Capped_CML_L2'=CASE
WHEN d.CML_L2>60000000 THEN 6000000
ELSE d.CML_L1
END
, (SELECT *, 'Capped_CML_L3'=CASE
WHEN d.CML_L1>80000000 THEN 8000000
ELSE d.CML_L1
END
FROM
(
SELECT a.dYear, a.EventNum, a.Loss, a.Rec_L1, SUM(b.Rec_L1) AS CML_L1, SUM(b.Rec_L2) AS CML_L2, SUM(b.Rec_L3) as CML_L3
FROM c a
LEFT JOIN c b ON a.dYear = b.dYear AND b.rownum <= a.rownum
GROUP BY a.dYear, a.rownum, a.EventNum, a.Rec_L1, a.Loss
) AS d
FROM ['04_AIR_StdHU_DS_noSS_ByTerr$']
) AS e
) AS f
) AS g
) AS a
DROP TABLE Layer_Loss
When I put in the above edited code I get error:
Msg 156, Level 15, State 1, Line 58
Incorrect syntax near the keyword 'FROM'.
I would like to be able to reference Capped_CML_L1, Capped_CML_L2, and Capped_CML_L3 in another query, table, or cursor later on. I wanted it all to be under just 'e', but I'm not sure how to arrange the parentheses.
WITH must be separated from any preceding command by a ;.
When a CTE is used in a statement that is part of a batch, the statement before it must be followed by a semicolon.
Also, it must be the first part of the entire statement it's a part of, whether that be a plain SELECT or an INSERT. Try:
/* CREATE TABLE */
;WITH c AS (SELECT Row_number() OVER (ORDER BY dYear) AS rownum,*
FROM Layer_Loss_Capped2)
INSERT INTO Layer_Loss (dYear,EventNum, Loss, Rec_L1, Rec_L2, Rec_L3,
Capped_CML_L1, Capped_CML_L2, Capped_CML_L3)
SELECT *
FROM
(
SELECT dYear, ROW_NUMBER() OVER ...
(Have also moved the WITH before the INSERT, having realised what was being attempted)
See also Transact SQL Syntax conventions:
; Transact-SQL statement terminator. Although the semicolon is not required for most statements in this version of SQL Server, it will be required in a future version.
One of the main reasons for this is that there is a pre-existing use of the keyword WITH that modifies SELECT statements. Insisting on the ; makes the statement much easier to parse.
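As for the follow-up about deriving the incremental columns from the previous row, a window function normally avoids the cursor. A minimal sketch, assuming SQL Server 2012+ and the Cap_CML_* columns from the CREATE TABLE above (and that "incremental" means current row minus previous row within a year):
SELECT dYear,
       EventNum,
       -- LAG() reads the previous row per year; the third argument supplies 0 for the first row
       Cap_CML_L1 - LAG(Cap_CML_L1, 1, 0) OVER (PARTITION BY dYear ORDER BY EventNum) AS Inc_Rec_L1,
       Cap_CML_L2 - LAG(Cap_CML_L2, 1, 0) OVER (PARTITION BY dYear ORDER BY EventNum) AS Inc_Rec_L2,
       Cap_CML_L3 - LAG(Cap_CML_L3, 1, 0) OVER (PARTITION BY dYear ORDER BY EventNum) AS Inc_Rec_L3
FROM Layer_Loss;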