QueryDSL with DB2: fetching a nested JSON object / JSON array aggregation response

I am trying to fetch nested JSON objects and a JSON list from the database using QueryDSL. I have used a native query with LISTAGG and JSON_OBJECT.
Native query:
SELECT b.id,
       b.bankName,
       b.account,
       b.branch,
       (SELECT CONCAT(CONCAT('[',
               LISTAGG(JSON_OBJECT('accountId' VALUE c.accountId,
                                   'name'      VALUE c.customer_name,
                                   'amount'    VALUE c.amount), ',')),
               ']')
        FROM CUSTOMER_DETAILS c
        WHERE c.bankId = b.id) AS customers
FROM BANK_DETAILS b
BANK_DETAILS
+----+----------+---------+---------+
| id | BankName | account | branch  |
+----+----------+---------+---------+
| 1  | bank1    | savings | branch1 |
| 2  | bank2    | current | branch2 |
+----+----------+---------+---------+
CUSTOMER_DETAILS
+----+-----------+---------------+--------+--------+
| id | accountId | customer_name | amount | BankId |
+----+-----------+---------------+--------+--------+
| 1  | 50123     | Abc1          | 150000 | 1      |
| 2  | 50124     | Abc2          | 25000  | 1      |
| 3  | 50125     | Abc3          | 50000  | 2      |
| 4  | 50126     | Abc4          | 250000 | 2      |
+----+-----------+---------------+--------+--------+
Expected Output for the above tables
[{
    "id": "1",
    "bankName": "bank1",
    "account": "savings",
    "branch": "branch1",
    "customers": [
        {
            "accountId": "50123",
            "name": "Abc1",
            "amount": 150000
        },
        {
            "accountId": "50124",
            "name": "Abc2",
            "amount": 25000
        }
    ]
}, {
    "id": "2",
    "bankName": "bank2",
    "account": "current",
    "branch": "branch2",
    "customers": [
        {
            "accountId": "50125",
            "name": "Abc3",
            "amount": 50000
        },
        {
            "accountId": "50126",
            "name": "Abc4",
            "amount": 250000
        }
    ]
}]
I have tried rewriting this native query in QueryDSL using the multiple queries below, building the same expected output with a forEach loop.
class Repository {

    private SQLQueryFactory queryFactory;

    public Repository(SQLQueryFactory queryFactory) {
        this.queryFactory = queryFactory;
    }

    public void fetchBankDetails() {
        // One query for the banks...
        List<BankDetails> bankList = queryFactory.select(QBankDetails.bankDetails)
                .from(QBankDetails.bankDetails)
                .fetch();

        // ...plus one query per bank for its customers (N+1 queries in total)
        bankList.forEach(bankData -> {
            List<CustomerDetails> customerList = queryFactory.select(QCustomerDetails.customerDetails)
                    .from(QCustomerDetails.customerDetails)
                    .where(QCustomerDetails.customerDetails.bankId.eq(bankData.bankId))
                    .fetch();
            bankData.setCustomerList(customerList);
        });
        System.out.println(bankList);
    }
}
I need to improve this code and convert it into a single QueryDSL query that returns the expected output.
Is there any other way, or any suggestions?
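One direction worth considering (a sketch only, not tested against Db2): the correlated subquery can be rewritten as a LEFT JOIN with GROUP BY, which keeps everything in one statement and maps more naturally onto a single QueryDSL query. The JSON_OBJECT/LISTAGG expression itself is not something QueryDSL generates for you; it would have to be embedded as a raw string template (for example via Expressions.stringTemplate), which you would need to verify against your QueryDSL and Db2 versions.
-- Single-statement form of the same query (sketch, reusing the
-- LISTAGG/JSON_OBJECT support shown in the original native query).
-- Note: banks with no customers produce one all-NULL joined row here,
-- so their "customers" value differs slightly from the subquery version.
SELECT b.id,
       b.bankName,
       b.account,
       b.branch,
       CONCAT(CONCAT('[',
              LISTAGG(JSON_OBJECT('accountId' VALUE c.accountId,
                                  'name'      VALUE c.customer_name,
                                  'amount'    VALUE c.amount), ',')),
              ']') AS customers
FROM BANK_DETAILS b
LEFT JOIN CUSTOMER_DETAILS c ON c.bankId = b.id
GROUP BY b.id, b.bankName, b.account, b.branch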

Related

SQL query to return an attribute as an array of objects

My DB (MySQL) looks as follows:
TASKS:
------------------
| id | desc      |
------------------
| 1  | 'dishes'  |
| 2  | 'dust'    |
------------------
IMAGES:
-----------------------------
| id | task_id | url        |
-----------------------------
| 1  | 1       | 'http1'    |
| 2  | 1       | 'http2'    |
-----------------------------
I would like to get a response in the following structure (nested array of objects with id, url):
"tasks": [
{
"id": 1,
"desc": "dishes",
"images": [
{
"id": 1,
"url": "http1"
},
{
"id": 2,
"url": "http2"
}
]
},
...
]
The closest I have got was with this code:
SELECT
    t.id,
    t.`desc`,
    JSON_ARRAYAGG(i.url) AS images
FROM tasks AS t
LEFT JOIN images AS i ON t.id = i.task_id
GROUP BY t.id
And got in return:
[
    {
        "id": 1,
        "desc": "dishes",
        "images": [
            "http1",
            "http2"
        ]
    },
    ...
]
The above response is problematic, as I also need the image ids.
I have also tried using JSON_OBJECTAGG (which is not ideal), but I got the SQL error below:
"JSON documents may not contain NULL member names."
Indeed, some tasks may not have any matching images, and I want them included in the response.
How should I refactor my code to get the desired response from the server?
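A minimal sketch of one way to get there, assuming a MySQL version where JSON_OBJECT and JSON_ARRAYAGG are both available (5.7.22+); the IF/COUNT guard is only there so tasks without images come back as an empty array instead of an object full of NULLs:
SELECT
    t.id,
    t.`desc`,
    -- Aggregate each image row into {"id": ..., "url": ...};
    -- the guard keeps imageless tasks (from the LEFT JOIN) as [].
    IF(COUNT(i.id) = 0,
       JSON_ARRAY(),
       JSON_ARRAYAGG(JSON_OBJECT('id', i.id, 'url', i.url))) AS images
FROM tasks AS t
LEFT JOIN images AS i ON t.id = i.task_id
GROUP BY t.id, t.`desc`;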

How to convert a query to a JSON object in MySQL

The immediate answer to the question is to use the json_object function.
However, this function is not available on my DB, as it is an older version.
We have plans to upgrade, but it will take a while.
How do I convert this:
SELECT name, phone, order_id FROM orders;
| name | phone | order_id |
| Jack | 12345 | 120      |
| Jack | 12345 | 121      |
To this:
[
    {
        "name": "Jack",
        "order_id": "120",
        "phone": 12345
    },
    {
        "name": "Jack",
        "order_id": "121",
        "phone": 12345
    }
]
in a SQL query, without using the json_object function?
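A rough sketch of the usual string-building workaround, assuming the values never contain double quotes or backslashes (otherwise they would need escaping, e.g. with REPLACE), and keeping in mind that GROUP_CONCAT output is capped by group_concat_max_len; the orders_json alias is just illustrative:
SELECT CONCAT('[',
       GROUP_CONCAT(
           -- Build one {"name": ..., "order_id": ..., "phone": ...} object per row
           CONCAT('{"name": "', name,
                  '", "order_id": "', order_id,
                  '", "phone": ', phone, '}')
           SEPARATOR ','),
       ']') AS orders_json
FROM orders;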

Hive JSON data parsing

My JSON data in the table json_table, column json_col, looks something like this:
{
    "href": "example.com",
    "Hosts": {
        "cluster_name": "test",
        "host_name": "test.iabc.com"
    },
    "metrics": {
        "cpu": {
            "cpu_user": [
                [
                    0.7,
                    1499795941
                ],
                [
                    0.3,
                    1499795951
                ]
            ]
        }
    }
}
I want to get this into a table json_data in the format below:
+-------------+-------+------------+
| metric_type | value | timestamp |
+-------------+-------+------------+
| cpu_user | 0.7 | 1499795941 |
+-------------+-------+------------+
| cpu_user | 0.3 | 1499795951 |
+-------------+-------+------------+
I tried getting the values using get_json_object:
select get_json_object(json_col,'$.metrics.cpu.cpu_user[1]') from json_table
which gives me
[0.3,1499795951]
How do I use the explode function from here to get the desired output?
select 'cpu_user' as metric_type
,val_ts[0] as val
,val_ts[1] as ts
from (select split(m.col,',') as val_ts
from json_table j
lateral view explode(split(regexp_replace(get_json_object(json_col,'$.metrics.cpu.cpu_user[*]'),'^\\[\\[|\\]\\]$',''),'\\],\\[')) m
) m
;
+-------------+-----+------------+
| metric_type | val | ts |
+-------------+-----+------------+
| cpu_user | 0.7 | 1499795941 |
| cpu_user | 0.3 | 1499795951 |
+-------------+-----+------------+
You can also implement the SerDe and InputFormat interfaces based on the JSON data, instead of using UDFs.
Here are some references:
http://blog.cloudera.com/blog/2012/12/how-to-use-a-serde-in-apache-hive/
https://github.com/xjtuzxh/inceptor-inputformat

How to use MuleSoft DataWeave to transform to JSON with grouping and string-to-array conversion

I have a database call that provides a payload as described below. How do I use DataWeave to transform that payload to JSON in the format shown below the example table?
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| company |status| license_id |acct status| last_inv_date | acctnum | owner | entlmt | roles |subscribed|attr_type| attr_key |attr_value|
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| company name 1|Active|02iq0000000xlBBAAY| Active |2016-02-25 22:50:04|A100001135|myemail#email.com|Standard|Admin;wcl_admin;wcl_support| 1 | cloud |cloud_num_247_t_streams| 1 |
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| company name 1|Active|02iq0000000xlBBAAY| Active |2016-02-25 22:50:04|A100001135|myemail#email.com|Standard|Admin;wcl_admin;wcl_support| 1 | cloud | api_access | 1 |
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| company name 1|Active|02iq0000000xlBBAAY| Active |2016-02-25 22:50:04|A100001135|myemail#email.com|Standard|Admin;wcl_admin;wcl_support| 1 | cloud |cloud_num_247_p_streams| 1 |
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| company name 2|Active|02iq0000000xlBBBBZ| Active |2016-02-25 22:50:04|A100001166|myblah1#email.com|Standard| Admin | 1 | cloud |cloud_num_247_p_streams| 0 |
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| company name 2|Active|02iq0000000xlBBBBZ| Active |2016-02-25 22:50:04|A100001166|myblah1#email.com|Standard| Admin | 1 | cloud | api_access | 1 |
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
Final output desired in json:
{
    "records": [
        {
            "company": "company name 1",
            "has_active_subscriptions": true,
            "license_status": "Active",
            "license_id": "02iq0000000xlBBAAY",
            "account_status": "Prospect",
            "last_invoice_date": "2016-02-25 22:50:04",
            "cloud_owner_email": "myemail#email.com",
            "role": [
                "Admin",
                "wcl_admin",
                "wcl_support"
            ],
            "account_number": "A100001135",
            "attributes": {
                "cloud": {
                    "api_access": 1,
                    "cloud_num_247_t_streams": 1,
                    "cloud_num_247_p_streams": 1
                }
            },
            "entitlement_plan": "Standard"
        },
        {
            "company": "company name 2",
            "has_active_subscriptions": true,
            "license_status": "Active",
            "license_id": "02iq0000000xlBBBBZ",
            "account_status": "Active",
            "last_invoice_date": "2016-02-25 22:50:04",
            "cloud_owner_email": "myblah#email.com",
            "role": [
                "Admin"
            ],
            "account_number": "A100001166",
            "attributes": {
                "cloud": {
                    "cloud_num_247_p_streams": 0,
                    "api_access": 1
                }
            },
            "entitlement_plan": "Standard"
        }
    ]
}
Supposing that the DataWeave component comes right after the database component and the result of the query is still in the payload: the payload is then an ArrayList of CaseInsensitiveHashMap, similar to the records object in your JSON.
So I would try something like:
%dw 1.0
%output application/json
records: payload
You don't need DataWeave if you just want to transform a result set into JSON.
You can use the ObjectToJson transformer to do that.

Writing nested JSON in Spark Scala

My Spark SQL query joins two tables with one-to-many cardinality, and I have to convert the resulting data into JSON.
This is what the output of the query looks like:
Address_id_parent | Address_id_child | Country_child | city_child
1                 | 1                | India         | Delhi
1                 | 1                | US            | NewYork
1                 | 1                | US            | NewJersey
The above data has to be converted to JSON in this way.
{
    "Address": {
        "Address_id_parent": "1"
    },
    "Address-details": [{
        "Address_id_child": "1",
        "location": [{
                "country": "India",
                "city": "Delhi"
            },
            {
                "country": "US",
                "city": "NewYork"
            },
            {
                "country": "US",
                "city": "NewJersey"
            }
        ]
    }]
}
How can I accomplish this?
Check the DataFrame write interface with JSON:
df.write.format("json").save(path)
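That writes one flat JSON object per row, so the nested Address / Address-details / location structure has to be built on the DataFrame first. A rough Spark SQL sketch of that shaping step, assuming the joined query result is registered as a temporary view named addr (the view name and the grouped alias are assumptions):
-- Inner query groups child rows into a "location" array per child id;
-- outer query wraps them into "Address" / "Address-details".
SELECT named_struct('Address_id_parent', Address_id_parent) AS Address,
       collect_list(named_struct('Address_id_child', Address_id_child,
                                 'location', location)) AS `Address-details`
FROM (
    SELECT Address_id_parent,
           Address_id_child,
           collect_list(named_struct('country', Country_child,
                                     'city', city_child)) AS location
    FROM addr
    GROUP BY Address_id_parent, Address_id_child
) grouped
GROUP BY Address_id_parent
Writing the resulting DataFrame with df.write.format("json") (or inspecting it via toJSON) should then produce one nested document per parent id.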