Join a JSON UUID value with a column in PostgreSQL

I want to join a column with a JSON value. The problem is that the JSON value is an array, and I want to join the UUID from the other column to whatever matches inside that array. The table with the JSON column (the column is named staffdep) is department, and the other table is staff, which has the staffId column.
The staffdep column's value would look like
{"departmentID":[],"staffID":["109ec36a-42bd-42fe-9b1f-c4f479c48fda","109ec36a-42bd-42fe-9b1f-c4f479c48fda"]}
The staffId column holds a unique UUID for each row, for example '109ec36a-42bd-42fe-9b1f-c4f479c48fda' or '84dfbc00-0ff4-4689-98de-1d7496bb9da1'.
The relevant part of the query I used was:
from department d
join staff s on s.staffId = (d.staffdep -> 'staffID' ->> 0)::uuid
The issue with the above query is, as mentioned, that the matching staffID might not always be the first value in the JSON array under d.staffdep. I need a solution that matches any position.
Any help is highly appreciated. Thanks

You can use a JSON path condition as the join condition:
from department d
join staff s on jsonb_path_exists(d.staffdep, '$.staffID[*] ? (@ == $id)', jsonb_build_object('id', s.staffid::text));
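If staffdep is a jsonb column (an assumption, not stated in the question), a shorter sketch with the same effect uses the jsonb ? existence operator, which checks whether a string occurs as an element of the array:
from department d
join staff s on (d.staffdep -> 'staffID') ? s.staffid::text
If your client library treats ? as a bind-parameter placeholder, the function form jsonb_exists(d.staffdep -> 'staffID', s.staffid::text) avoids the operator.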

Related

Unable to write WHERE condition for a JSON array field with array values in MySQL

I have a JSON column with array values in a MySQL database.
I want to compare multiple values against this categories field, e.g. ["49","27"].
How can I write a MySQL query for this?
I tried this query:
SELECT l.*, pc.name as cat_name,u.name as uname
FROM listing l
LEFT OUTER JOIN package_purchased_history ph ON ph.user_id = l.user_id AND ph.expired_date >= 1656095400 AND ph.purchase_date <= 1656095400
LEFT OUTER JOIN user u ON u.id = l.user_id
INNER JOIN category pc ON JSON_SEARCH(l.categories, 'one', pc.id) AND pc.parent = 26
WHERE JSON_CONTAINS(l.categories,'["49"]','$[0]') IS NOT NULL
AND l.status = 'active'
GROUP BY l.id
Unfortunately it is not working, so please suggest a better approach.
$[0] is the first element of the array, not the whole array. So you're testing whether an array is contained in a single scalar element, not whether one array is contained in the other array.
The whole array is $, since that refers to the top-level element of the JSON value. But you don't need to specify a path at all when you're searching the whole value:
WHERE JSON_CONTAINS(l.categories, '["49"]')
You don't need IS NOT NULL there, since JSON_CONTAINS() returns 0 or 1. The result will only be NULL if l.categories itself is NULL.
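If you need to match any of several categories at once (the ["49","27"] case from the question), one option is JSON_OVERLAPS, assuming MySQL 8.0.17 or later, which the question does not state:
WHERE JSON_OVERLAPS(l.categories, '["49","27"]')
AND l.status = 'active'
JSON_OVERLAPS() returns 1 as soon as the two arrays share at least one element, so it behaves like an OR over the individual JSON_CONTAINS() checks.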

SELECT statement inside a CASE statement in SNOWFLAKE

I have a query where "TEST"."TABLE" is LEFT JOINed to PUBLIC."SchemaKey". In my final SELECT I have a CASE expression where I check if c."Type" = 'FOREIGN'; if so, I want to grab a value from another table, but the table name used in that inner SELECT comes from the left-joined column c."FullParentTableName". I've tried multiple ways to make this work but I keep getting an error, although if I hard-code the table name it works. I need the table name to come from c."FullParentTableName". Is what I am trying to achieve possible in Snowflake, and is there a way to make it work? Any help would be appreciated!
SELECT
c."ParentColumn",
c."FullParentTableName",
a."new_value",
a."column_name"
CASE WHEN c."Type" = 'FOREIGN' THEN (SELECT "Name" FROM TABLE(c."FullParentTableName") WHERE "Id" = 'SOME_ID') ELSE null END "TestColumn" -- Need assistance on this line...
FROM "TEST"."TABLE" a
LEFT JOIN (
select s."Type", s."ParentSchema", s."ParentTable", s."ParentColumn", concat(s."ParentSchema",'.','"',s."ParentTable",'"') "FullParentTableName",s."ChildSchema", s."ChildTable", trim(s."ChildColumn",'"') "ChildColumn"
from PUBLIC."SchemaKey" as s
where s."Type" = 'FOREIGN'
and s."ChildTable" = 'SOMETABLENAME'
and "ChildSchema" = 'SOMESCHEMANAME'
) c
on a."column_name" = c."ChildColumn"
Thanks !
In Snowflake you cannot use partial query results as table names dynamically.
You can bind a single value to a table name via IDENTIFIER().
Alternatively, you could write a Snowflake Scripting block, but it would need to explicitly join the N parent tables, so if your N is fixed you should just join those tables directly.
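For the single-table case, a minimal sketch of the IDENTIFIER() approach, assuming the parent table name can be resolved into a session variable before the query runs (the variable name and lookup below are illustrative, not from the original post):
SET parent_table = (
  SELECT concat(s."ParentSchema", '.', '"', s."ParentTable", '"')
  FROM PUBLIC."SchemaKey" s
  WHERE s."Type" = 'FOREIGN'
    AND s."ChildTable" = 'SOMETABLENAME'
    AND s."ChildSchema" = 'SOMESCHEMANAME'
  LIMIT 1
);
SELECT "Name"
FROM IDENTIFIER($parent_table)   -- the variable is resolved to a table name
WHERE "Id" = 'SOME_ID';
This only works when all rows resolve to the same parent table; joining values from several different parent tables in one query still needs Snowflake Scripting or separate queries.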

Join a JSON value with a column in PostgreSQL

I want to join a column with a JSON value. The problem is that the JSON value is inside square brackets and it's a UUID. The table with the JSON column (the column is named json) is department and the other table is staff. The json column value would look like this:
{"title":"Manager","alternativeTitle":null,"departmentIds":["c8098u43-7d9a-3789-gt56-r78009v4r345"]}
I would like to query the departmentIds from the json column and join it with the staffdepartmentID column in the staff table.
My query for the join:
from staff s
join department d on d.json ->> departmentIds::json = s.staffdepartmentID
The problem I am facing is that I don't know how to remove those square brackets. Any help is highly appreciated. Thanks
Square brackets within JSON data denote an array.
You can access any element of the array by its position, starting with 0 for the first element: array -> 0.
So for your query you can do:
from staff s
inner join department d
on d.json -> 'departmentIds' ->> 0 = s.staffdepartmentID::text
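If departmentIds can ever hold more than one id, a sketch that matches against every element rather than only the first (this is an assumption about your data, not something stated in the question):
from staff s
inner join department d
on s.staffdepartmentID::text in (
  select json_array_elements_text(d.json -> 'departmentIds')
)
json_array_elements_text() unnests the JSON array into one text value per element, so the comparison no longer depends on the UUID being at position 0.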

Join with JSON fields

In the database (Postgres 10) I work with at the moment, I have a lot of columns containing JSON-formatted strings (text fields).
I have a hard time searching, filtering and joining on values inside those strings.
I created an example in a fiddle and I hope someone can help me solve it.
https://dbfiddle.uk/?rdbms=postgres_13&fiddle=d003c1cea832e35696260b10c6b4c047
Here are the two problems I find tricky.
In the object table, the configuration field contains a "date_kg_later" field. It could be [] or hold one or more dates. In my select I want to get the highest date if it's not empty.
In the object table, object_type_cd refers to a value inside the records string in code_table. Here I want to get the title from that string back.
My goal is an output like this:
id    name        object_type_name  date
1000  Headphones  tech              2022-04-30
1001  Pencil      null              null
Your first question: if the JSON structure of your configuration field conforms to the example you provided, then a solution can be:
Before Postgres 12:
SELECT Max((c.elt ->> 'date_from') :: date)
FROM object
CROSS JOIN LATERAL json_array_elements((configuration :: json) -> 'date_kg_later') AS c(elt)
GROUP BY your_object_table_key
From Postgres 12 onward:
SELECT Max((d #>> '{}') :: date)
FROM object
CROSS JOIN LATERAL jsonb_path_query((configuration :: jsonb) -> 'date_kg_later', '$[*].date_from') AS d
GROUP BY your_object_table_key
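Note that a CROSS JOIN drops objects whose date_kg_later array is empty, while the desired output keeps the Pencil row with a null date. A sketch that preserves those rows; the column names id and name are taken from the desired output and are assumptions about your table:
SELECT o.id,
       o.name,
       Max((d #>> '{}') :: date) AS date
FROM object o
LEFT JOIN LATERAL jsonb_path_query((o.configuration :: jsonb) -> 'date_kg_later', '$[*].date_from') AS d ON true
GROUP BY o.id, o.name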
Your second question is unclear to me; it needs more explanation.

How to create an empty JSON object in postgresql?

Datamodel
A person is represented in the database as a row in the meta table with a name, plus multiple attributes that are stored in the data table as key-value pairs (key and value are in separate columns).
Simplified data-model
Now there is a query to retrieve all users (name) with all their attributes (data). The attributes are returned as a JSON object in a separate column. Here is an example:
name     data
Florian  { "age":25 }
Markus   { "age":25, "color":"blue" }
Thomas   {}
The SQL command looks like this:
SELECT
name,
json_object_agg(d.key, d.value) AS data
FROM meta AS m
JOIN (
SELECT d.fk_id, d.key, d.value AS value FROM data AS d
) AS d
ON d.fk_id = m.id
GROUP BY m.name;
Problem
Now the problem I am facing is that users like Thomas, who do not have any attributes stored in the key-value table, are not returned by my select. This is because it does only a JOIN and not a LEFT OUTER JOIN.
If I use a LEFT OUTER JOIN, then I run into the problem that json_object_agg tries to aggregate NULL values and fails with an error.
Approaches
1. Return empty list of keys and values
So I tried to check if the key column of a user is NULL and, in that case, return an empty array so json_object_agg would just create an empty JSON object.
But there is not really a function to create an empty array in SQL. The nearest thing I found was this:
select '{}'::text[];
In combination with COALESCE the query looks like this:
json_object_agg(COALESCE(d.key, '{}'::text[]), COALESCE(d.value, '{}'::text[])) AS data
But if I try to use this I get the following error:
ERROR: COALESCE types text and text[] cannot be matched
LINE 10: json_object_agg(COALESCE(d.key, '{}'::text[]), COALES...
^
Query failed
PostgreSQL said: COALESCE types text and text[] cannot be matched
So it looks like at runtime d.key is a single value and not an array.
2. Split up JSON creation and return empty list
So I tried to replace json_object_agg with json_object, which does not aggregate the keys for me:
json_object(COALESCE(array_agg(d.key), '{}'::text[]), COALESCE(array_agg(d.value), '{}'::text[])) AS data
But there I get the error that a null value is not allowed for an object key. So COALESCE does not detect that the array is empty.
Question
So, is there a function to check whether a joined column is empty and, if yes, return just an empty JSON object?
Or is there any other solution that would solve my problem?
Use a left join with coalesce(), using '{}'::json as the default value.
select name, coalesce(d.data, '{}'::json) as data
from meta m
left join (
select fk_id, json_object_agg(d.key, d.value) as data
from data d
group by 1
) d
on m.id = d.fk_id;
name | data
---------+------------------------------------
Florian | { "age" : "25" }
Marcus | { "age" : "25", "color" : "blue" }
Thomas | {}
(3 rows)
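An alternative sketch that keeps the left join flat and skips the subquery, assuming PostgreSQL 9.4 or later for the FILTER clause (not stated in the question):
SELECT m.name,
       coalesce(
         json_object_agg(d.key, d.value) FILTER (WHERE d.key IS NOT NULL),
         '{}'::json
       ) AS data
FROM meta m
LEFT JOIN data d ON d.fk_id = m.id
GROUP BY m.name;
The FILTER clause keeps the NULL rows produced by the left join out of the aggregate, and coalesce() turns the resulting NULL into an empty JSON object for users like Thomas.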