I am currently a happy user of ngx_postgres. However, I recently discovered I need to do something very weird. Basically I need to produce the following JSON output:
[
  {
    "PatientName": {
      "Tag": "00100010",
      "VR": "PN",
      "PersonName": [
        {
          "SingleByte": "Wang^XiaoDong",
          "Ideographic": "王^小東"
        }
      ]
    }
  },
  {
    "PatientName": {
      "Tag": "00100010",
      "VR": "PN",
      "PersonName": [
        {
          "SingleByte": "John^Doe"
        }
      ]
    }
  }
]
With a little reading of the DICOM standard, it is easy to create a (simplified) table of equivalences between Keyword, Tag, and VR:
CREATE TABLE equiv (
  "Keyword" varchar(64) PRIMARY KEY,
  "Tag" char(8) NOT NULL,
  "VR" char(2) NOT NULL
);
Now the tricky part is the indirection with PatientName, which I do not understand. I tried:
CREATE TABLE patientname (
  "SingleByte" varchar(64) PRIMARY KEY,
  "Ideographic" varchar(64),
  "Phonetic" varchar(64)
);
CREATE TABLE patientlevel_impl_detail (
  "PatientName" varchar(64) REFERENCES patientname ("SingleByte"),
  "PatientID" varchar(64) NOT NULL
);
CREATE VIEW patientlist AS
SELECT
  patientname."SingleByte",
  patientname."Ideographic",
  patientname."Phonetic",
  patientlevel_impl_detail."PatientID"
FROM patientlevel_impl_detail
JOIN patientname
  ON patientlevel_impl_detail."PatientName" = patientname."SingleByte";
However, a TABLE or a VIEW is always flat, so instead I am getting something like:
$ curl http://localhost:8080/patients
[
  {
    "Tag": "00100010",
    "VR": "PN",
    "SingleByte": "John^Doe",
    "Ideographic": null
  }
]
So I do not see how I can make PersonName an array of nested objects (a nested JSON tree).
Note: I am not on 9.3; I need to stay on 9.1 for now.
On 9.1, do yourself a favor and install the backported json extension. It will save you a lot of work.
The second thing you need to do is to create a nested data structure as a view which matches your json structure. You will use array_agg() for this:
CREATE VIEW patientlist AS
SELECT
  array_agg(patientname) AS "PatientName",
  patientlevel_impl_detail."PatientID"
FROM patientlevel_impl_detail
JOIN patientname
  ON patientlevel_impl_detail."PatientName" = patientname."SingleByte"
GROUP BY patientlevel_impl_detail."PatientID";
Then you should be able to:
SELECT row_to_json(patientlist) FROM patientlist;
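As a sketch of how the pieces fit together on 9.1 (assuming the backported json extension has already been built for your server — that setup step is not shown in the original answer):

```sql
-- Assumption: the json extension backport is available on this 9.1 server.
CREATE EXTENSION json;

-- array_agg(patientname) aggregates whole rows of the patientname table,
-- so row_to_json renders "PatientName" as an array of nested objects
-- rather than as flattened columns.
SELECT row_to_json(p) FROM patientlist p;
```

Each result row should then carry the nested PersonName-style structure instead of the flat columns the view used to expose.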
I am trying to develop a database table structure from the following JSON structure. The "requiredFields" are straightforward, and I have a table set up for those. However, I am stuck on how to create a schema to store "configuration" and its nested properties in a SQL database. Any help will be appreciated. Thanks.
{
  "requiredFields": [
    "hello",
    "world"
  ],
  "configuration": {
    "hello": {
      "fallbacks": [
        {
          "type": "constant",
          "value": "30"
        }
      ]
    },
    "world": {
      "fallbacks": [
        {
          "type": "fromInputFile",
          "value": "patientFirstName"
        },
        {
          "type": "fromInputFile",
          "value": "subscriberFirstName"
        },
        {
          "type": "constant",
          "value": "alpha"
        }
      ]
    }
  }
}
It looks like you could store that in a single table:
CREATE TABLE ConfigurationFallbacks (
field nvarchar(100),
type nvarchar(100),
value nvarchar(100)
);
You may also want to add another table to store just the field values, and the table above would be foreign-keyed to that.
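For example (a sketch; the ConfigurationFields name and its single-column key are assumptions, not from the original post):

```sql
-- Hypothetical parent table: one row per configuration field name.
CREATE TABLE ConfigurationFields (
    field nvarchar(100) PRIMARY KEY
);

-- Tie each fallback row to its field
-- (assumes the ConfigurationFallbacks table above already exists).
ALTER TABLE ConfigurationFallbacks
    ADD FOREIGN KEY (field) REFERENCES ConfigurationFields (field);
```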
You can insert like this:
INSERT ConfigurationFallbacks (field, type, value)
SELECT
keys.[key],
j.type,
j.value
FROM OPENJSON(@json, '$.configuration') keys
CROSS APPLY OPENJSON(keys.value, '$.fallbacks')
WITH (
type nvarchar(100),
value nvarchar(100)
) j;
The first OPENJSON call does not have a schema, so it returns a set of key value pairs.
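To see what that first call produces on its own (a sketch, assuming @json holds the document above):

```sql
-- Without a WITH schema, OPENJSON returns [key], [value], [type] columns.
-- Here [key] would be 'hello' and 'world', and each [value] the nested
-- object's JSON text, which the CROSS APPLY then breaks apart.
SELECT [key], [value], [type]
FROM OPENJSON(@json, '$.configuration');
```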
Today I started using MySQL JSON fields, and I'm a bit lost.
I have the following table:
CREATE TABLE `formulario` (
`id` int NOT NULL AUTO_INCREMENT,
`values` json NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=8 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;
The JSON "values" has a recursive structure, so that a "section" can have "sections" inside:
[
  {
    "id": 1,
    "sections": [
      {
        "idSection": 1,
        "name": "S1",
        "sections": [
          {
            "idSection": 2,
            "name": "S2",
            "sections": [...]
          }
        ]
      }
    ]
  }
]
Let's suppose that I want to update the "name" attribute of the "section" with id = 2. What I want is an UPDATE query that somehow updates the "name" value of a specific JSON object, but I don't know how to write a "WHERE" clause, or whether there is any way to filter the JSON to find it.
I have tried the "JSON_SET" function, but I can't get it to work, given that I don't know how deep the JSON can be.
Extra info: I am using Hibernate, but I also don't know whether HQL gives you any way to achieve this, or whether I can build some "NativeQuery" dynamically, constructing the path to pass to "JSON_SET" at runtime.
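When the depth is known for a given row, a native query with an explicit path does work; here is a sketch against the sample document above (the path and the new value are illustrative only):

```sql
-- $[0].sections[0].sections[0] is the object with idSection = 2
-- in the sample document; JSON_SET rewrites just its "name" member.
UPDATE formulario
SET `values` = JSON_SET(`values`, '$[0].sections[0].sections[0].name', 'S2-renamed')
WHERE id = 1;
```

For arbitrary depth, the concrete path generally has to be computed on the application side (for example, by walking the parsed JSON in Java and building the path string), because JSON_SET does not accept wildcard paths.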
So I'm trying to generate an XML from data that I receive in a JSON file.
What I've done so far is store each field in the JSON as a key pair. So a row would be FieldName, FieldValue, and JSON_PK.
I have a second table that I created in order to build the XML. It has the JSON FieldName, the equivalent XML FieldName, and the indentation. The plan was to create a loop in SSIS to manually create the XML.
It was suggested to me that I instead use FOR XML in my query.
However, I've run into an issue: every field is named FieldName. It's further complicated by fields that hold their values like this: <Form submittedDate="2020-01-01"/>
So before I go back to creating a loop to build my XML, I'm wondering: what are the best practices? I can't be the first one to run into this issue.
Thanks!
A quick follow-up, because it was requested:
This is approximately the form the JSON comes in, except it is far longer:
{
  "name": "Mylist",
  "id": "9e8-19c9-e5",
  "templateName": "VDashboard - Plus",
  "categories": [
    ""
  ],
  "attributes": [
    {
      "name": "Division ID",
      "value": "ABCD",
      "Id": "123",
      "units": null,
      "timestamp": "1970-01-01T00:00:00.0000000Z",
      "errors": null
    },
    {
      "name": "ETA ",
      "value": null,
      "Id": "123",
      "units": null,
      "timestamp": "2021-01-25T21:24:36.2514056Z",
      "errors": null
    },
    {
      "name": "ETA Destination - Estimated Time",
      "value": "1/11/2021 4:15:34 PM",
      "Id": "123",
      "units": null,
      "timestamp": "1970-01-01T00:00:00.0000000Z",
      "errors": null
    }
  ]
}
And I need to output it as an XML File.
I need to import it into the DB because I do transformations on certain fields.
Output should look a bit like this:
2020-12-03T08:00:00-05:00
0011
My table structure looks like this. It's done so that I won't have a different table for every report:
Name VARCHAR(4) NOT NULL,
ID VARCHAR(50),
TemplateName VARCHAR(50),
AttributeName VARCHAR(50),
AttributeSubName VARCHAR(50),
AttributeValue VARCHAR(50),
AttributeID VARCHAR(50),
AttributeUnits VARCHAR(50),
AttributeTimestamp DateTime,
AttributeErrors VARCHAR(50),
I've managed to resolve the issue.
Initially, I had hoped to put the JSON into a table and create the XML by putting those values into a string with XML tags around it.
This XML, though, requires both elements and attributes, and its contents are dynamic. An SQL query was therefore far too difficult to create and harder to troubleshoot.
I found that the easiest way was to create the whole thing in C#.
I used a script task to run C#. I also needed a table that functions like an XREF or data dictionary, which I could join against the contents of the JSON in order to specify what the XML should look like.
Using an XmlWriter object, and based on the format the XML should take, I wrote the XML with the methods WriteStartElement, WriteElementString, WriteAttributeString, WriteEndElement, and WriteEndDocument.
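A minimal sketch of that XmlWriter pattern (the element and attribute names here are illustrative stand-ins, not the real report fields):

```csharp
using System.Xml;

var settings = new XmlWriterSettings { Indent = true };
using (XmlWriter writer = XmlWriter.Create("report.xml", settings))
{
    writer.WriteStartDocument();
    writer.WriteStartElement("Report");                         // <Report>
    writer.WriteStartElement("Form");                           //   <Form ...>
    writer.WriteAttributeString("submittedDate", "2020-12-03"); //   attribute on <Form>
    writer.WriteElementString("DivisionID", "ABCD");            //     <DivisionID>ABCD</DivisionID>
    writer.WriteEndElement();                                   //   </Form>
    writer.WriteEndElement();                                   // </Report>
    writer.WriteEndDocument();
}
```

In the real script task, the element names and the element-versus-attribute decision come from joining the JSON key/value rows against the XREF table described above.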
I have a JSON as follows, but I need a way to remove nulls before putting it into Elasticsearch. I'm looking for a simple jq command to remove nulls that I can incorporate into my bash script, unless there's a way to do this in Elasticsearch.
{
  "master_no": {
    "master_no": 100000000,
    "barcode": "E00000000",
    "external_key": null,
    "umid": null
  },
  "cust_id": {
    "other_cust_id": null,
    "cust_reference": null,
    "external_key": null,
    "list_id": null,
    "cust_id": null
  },
  "customer_name": null,
  "master_desc": "test Custom Patch - test",
  "barcode": "E00000000",
  "container_master_no": null,
  "master_status": "I",
  "length": "0:00",
  "format_no": {
    "format_desc": null,
    "external_key": null,
    "format_no": null
  },
  "lib_master_audio": [
    {
      "master_no": 10000000,
      "audio_channel_no": {
        "audio_channel_no": 10,
        "audio_channel": "1",
        "external_key": null
      }
    },
    {
      "master_no": 100000000,
      "audio_channel_no": {
        "audio_channel_no": 10,
        "audio_channel": "2",
        "external_key": null
      }
    }
  ]
}
Thanks
This GitHub issue on removing null keys and values from JSON can help you. In short, a command like the following, mentioned in that link, might help:
del(.[][] | nulls)
Please note there are several methods of doing this; please check which one works for you.
As pointed out in the comments by @oguz, please use https://github.com/stedolan/jq/issues/104#issuecomment-289637207, which works with the latest version.
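For reference, the recipe from that linked comment (for jq versions where walk is built in, i.e. 1.6 and later) looks like this; it drops null-valued keys from objects at every nesting level:

```sh
jq 'walk(if type == "object" then with_entries(select(.value != null)) else . end)' input.json
```

Note it removes null-valued keys but leaves nulls that are plain array elements alone, which matches what the sample document needs.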
Hi, I'm trying to insert complex JSON data in MySQL Workbench, but my JSON data is being stored with its keys alphabetically sorted. How do I get the JSON back in the same order in which I passed it in the insert query?
Create table:
CREATE TABLE payload (
  `id` INT NOT NULL,
  `json` JSON NOT NULL,
  PRIMARY KEY (`id`)
);
Insert JSON:
INSERT INTO payload (id, json)
VALUES (2, '{
  "request": "release",
  "type": [
    {
      "type": 1
    }
  ],
  "start": [
    {
      "type": "sample",
      "id": "01",
      "content": [
        {
          "name": "jon",
          "email": "jon@gmail.com"
        }
      ]
    }
  ]
}');
The JSON stored in the database, after a SELECT * on the table:
'3', '{\"type\": [{\"type\": 1}], \"start\": [{\"id\": \"01\", \"type\": \"sample\", \"content\": [{\"name\": \"jon\", \"email\": \"jon@gmail.com\"}]}], \"request\": \"release\"}'
Actually, I want to store my JSON in the database exactly as I inserted it.
Is there a way to prevent the JSON keys from being alphabetically sorted?
Thanks.
MySQL automatically sorts the keys of a JSON object before it is stored.
I had the same problem!
The solution is to change the column type to TEXT; then the order of the keys will not be changed.
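A sketch of that TEXT variant of the table from the question (the trade-off: you lose JSON validation on insert, though functions like JSON_EXTRACT still accept valid JSON stored as text):

```sql
-- Stored as TEXT, the string is kept verbatim, so key order survives;
-- MySQL only normalizes (and sorts) keys for the native JSON type.
CREATE TABLE payload (
  `id` INT NOT NULL,
  `json` TEXT NOT NULL,
  PRIMARY KEY (`id`)
);
```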
Use PHP's serialize before storing. The conversion makes the array unrecognizable to MySQL, so when it is retrieved and restored with unserialize, it will be in the exact same order as before storage.
The drawback is that MySQL's native JSON functions and methods can't be used to search and manipulate the array. Serializing is good only for storing and retrieving.
The only other way is to add a key whose value represents the array's sort order. The drawback there is that the array must be re-sorted to restore it to its original state.