Read JSON value from a SQL Server table

I have a JSON value stored in a SQL Server table as ntext:
JSON (column: json_val):
[{"prime":{"image":{"id":"123","logo":"","productId":"4000","enable":true},"accountid":"78","productId":"16","parentProductId":"","aprx":"4.599"}}]
select JSON_VALUE(cast(json_val as varchar(8000)), '$.prime.aprx') as px
from table_1
where id = 1
Whenever I execute it, I receive a NULL. What's wrong with the query?
Thanks for your help!

The JSON string is an array with a single item. You need to specify the array index to retrieve a specific item, e.g.:
declare @t table (json_val nvarchar(4000))
insert into @t
values ('[{"prime":{"image":{"id":"123","logo":"","productId":"4000","enable":true},"accountid":"78","productId":"16","parentProductId":"","aprx":"4.599"}}]')
select JSON_VALUE(cast(json_val as varchar(8000)), '$[0].prime.aprx') as px
from @t
This returns 4.599
If you want to search all array entries, you'll have to use OPENJSON (a sketch follows at the end of this answer). If you need to do that though ...
Avoid JSON if possible
JSON storage is not an alternative to a proper table design though. JSON fields can't be indexed directly (you'd need a computed column per field), so filtering on a specific field will always result in a full table scan. Given how regular this JSON string is, you should consider using proper tables instead.
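As a sketch of that OPENJSON search, reusing the @t table variable from the example above (the filter value is illustrative):
select j.aprx
from @t
cross apply openjson(json_val)
    with (aprx varchar(20) '$.prime.aprx') j
where j.aprx = '4.599'
This scans every array entry of every row and keeps only the matches.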

As Panagiotis said in the comments:
As for the JSON path, this JSON string is an array with a single element
You can instead use OPENJSON, which inspects each array element:
DECLARE @JSON nvarchar(MAX) = N'[{"prime":{"image":{"id":"123","logo":"","productId":"4000","enable":true},"accountid":"78","productId":"16","parentProductId":"","aprx":"4.599"}}]';
SELECT aprx
FROM (VALUES(@JSON)) V(json_val)
CROSS APPLY OPENJSON(V.json_val)
WITH (aprx decimal(4,3) '$.prime.aprx');
As also mentioned, your JSON should be stored in a string data type (ideally nvarchar(MAX)) rather than the deprecated ntext; then there's no reason to CAST it.

Related

How do I update data inside a stringified JSON object in SQL?

So I have three databases: an Oracle one, a SQL Server one, and a Postgres one. I have a table that has two columns, name and value, both text. The value is a stringified JSON object. I need to update the nested value.
This is what I currently have:
name: 'MobilePlatform',
value:
'{
"iosSupported":true,
"androidSupported":false,
}'
I want to add {"enableTwoFactorAuth": false} into it.
In PostgreSQL you should be able to do this:
UPDATE mytable
SET value = jsonb_set(value::jsonb, '{enableTwoFactorAuth}', 'false')
WHERE name = 'MobilePlatform';
In Postgres, the plain concatenation operator || for jsonb could do it:
UPDATE mytable
SET value = value::jsonb || '{"enableTwoFactorAuth":false}'::jsonb
WHERE name = 'MobilePlatform';
If a top-level key "enableTwoFactorAuth" already exists, it is replaced. So it's an "upsert" really.
Or use jsonb_set() for manipulating nested values.
The cast back to text works implicitly as an assignment cast. (The result is in standard format, which effectively removes any insignificant whitespace.)
If the content is valid JSON, the storage type should be json to begin with. In Postgres, jsonb would be preferable as it's easier to manipulate, but that's not directly portable to the other two RDBMS mentioned.
(Or, possibly, a normalized design without JSON altogether.)
For Oracle 21:
update mytable
set json_col = json_transform(
json_col,
INSERT '$.value.enableTwoFactorAuth' = 'false'
)
where json_exists(json_col, '$?(@.name == "MobilePlatform")')
;
With json_col being a JSON column, or a VARCHAR2/CLOB column with an IS JSON constraint.
(It must be JSON, though, if you want a multivalue index on name:
create multivalue index ix_json_col_name on mytable t ( t.json_col.name.string() );
)
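To sanity-check the result, a quick read-back sketch with json_value, assuming the same mytable (a JSON boolean comes back as the text 'false'):
select json_value(json_col, '$.value.enableTwoFactorAuth')
from mytable
where json_exists(json_col, '$?(@.name == "MobilePlatform")');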
Two of the databases you are using support a native JSON data type, so it doesn't make sense to store stringified JSON objects in a text column.
Oracle: https://docs.oracle.com/en/database/oracle/oracle-database/21/adjsn/json-in-oracle-database.html
PostgreSQL: https://www.postgresql.org/docs/current/datatype-json.html
Apart from these, MS SQL Server also provides functions to work with JSON data.
MS SQL Server: https://learn.microsoft.com/en-us/sql/relational-databases/json/json-data-sql-server?view=sql-server-ver16
Using a JSON type column in any of the above databases would enable you to use their JSON functions to perform the tasks that you are looking for.
If you have to use text only, then you can use REPLACE to add the key-value pair at the end of your JSON. Note that REPLACE swaps every '}' in the string, which only works safely here because the stored JSON contains a single closing brace:
update dataTable set value = REPLACE(value, '}', ',"enableTwoFactorAuth": false}') where name = 'MobilePlatform'
Here dataTable is the name of table.
The cleaner and less risky way would be to connect to the DB from the application and use JSON methods such as JSON.parse in JavaScript or json.loads in Python. This gives you the JSON object (a dictionary in the case of Python) to work on. You can look for similar methods in other languages as well.
But I would suggest using JSON columns instead of text to store JSON values wherever possible.

Azure Data Factory copy data from json string nested in a json

I am fetching data from a third party API, which responds with a JSON payload. However, this JSON contains another JSON object, stored as a string including escape characters. Example:
{
"aggregationType": "IDENTITY",
"outputs": [
{
"name": "Sinusoid|Sinusoid"
}
],
"value": "{\"dataX\":[1,2,3,4],\"dataY\":[1,4,9,16]}"
}
In the first part of the file, we have some regular parameters like 'aggregationType' and 'outputs', but the last parameter 'value' is the JSON object I am talking about.
What I would like to do is to enter the 'dataX' and 'dataY' arrays together into a table on a SQL DB. I haven't found a straightforward way of doing it so far.
What I've tried:
Using a simple copy activity, but I can only access the whole 'value' field, not separate out 'dataX' from 'dataY', let alone the arrays' individual values.
Using the lookup activity to then store 'value' in a variable. From here I can get to a usable JSON object in ADF, but the only way I've found of then sending the data to the DB is to use a ForEach activity containing a copy activity. Since dataX and dataY are in reality much larger, this seems to take forever when I debug.
Copying only the 'value' object to a blob and trying to retrieve the data from there. This hasn't worked because the object always ends up getting stored with the initial " marks and the \ escape characters.
Is there any way of getting around this issue?
You can store the value in a staging table in SQL and then create a stored procedure to separate out the objects as arrays.
JSON_QUERY can extract the arrays (JSON_VALUE only returns scalar values, so it would return NULL here):
SELECT JSON_QUERY('{"dataX": [1,2,3,4]}', '$.dataX') AS 'Output';
In your stored procedure you can use the above query and insert the values into a SQL table.
To expand on the tip from @Pratik Somaiya, I've written up a stored procedure that does the work of inserting the data into a persistent table.
I've had to use a WHILE loop, which doesn't really feel right, so I'm still on the lookout for a better solution on that.
CREATE OR ALTER PROCEDURE dataset_outputs_from_adf
@json_output_value NVARCHAR(MAX),
@output_name NVARCHAR(100)
AS
DECLARE @i INT = 0
WHILE JSON_VALUE(@json_output_value, CONCAT('$.dataX[', @i, ']')) IS NOT NULL
BEGIN
INSERT INTO my_table
VALUES (
@output_name,
JSON_VALUE(
@json_output_value,
CONCAT('$.dataX[', @i, ']')
),
JSON_VALUE(
@json_output_value,
CONCAT('$.dataY[', @i, ']')
)
)
SET @i = @i + 1
END
GO
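For a quick local test outside ADF, the procedure can be invoked with the inner JSON and output name from the sample payload:
EXEC dataset_outputs_from_adf
    @json_output_value = N'{"dataX":[1,2,3,4],"dataY":[1,4,9,16]}',
    @output_name = N'Sinusoid|Sinusoid';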
I should be able to make the whole thing repeatable without repetitive code by parameterizing Data Factory with the output name.
As I understand it, you can retrieve the embedded "value" JSON but it retains its escape characters. You would like to pass the arrays [1,2,3,4] and [1,4,9,16] to a relational database (Microsoft SQL Server?) to store them.
The embedded "value" can be converted to referencable JSON using the expression json(). This handles the escaped characters.
@json(variables('payload')).dataX
will return the array [1,2,3,4] as expected.
How best to get this into SQL Server? We are limited to the activities ADF supports, which really comes down to a stored procedure (SP). Using a table-valued parameter would be ideal, but that's not possible in current ADF, so I would suggest passing it to the SP as a string.
@string(json(variables('payload')).dataX)
This will look much the same as above but will be a string not an array.
In the SP there are a couple of ways to parse this string. If your version supports it, STRING_SPLIT is convenient. Note that the passed string will retain its leading and trailing square brackets; these can be removed in ADF or in SQL, it doesn't much matter where.
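A minimal sketch of the STRING_SPLIT route, stripping the brackets on the SQL side (requires SQL Server 2016+; the variable value is illustrative):
declare @dataX varchar(4000) = '[1,2,3,4]';
select value
from STRING_SPLIT(substring(@dataX, 2, len(@dataX) - 2), ',');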
Since the data is JSON, it may make more sense to use OPENJSON instead. Let's say we pass the contents of dataX to an SP parameter @dataX varchar(4000). Inside the SP we write
create procedure dbo.HandleData
@dataX varchar(4000),
@dataY varchar(4000)
as
insert dbo.SomeTable(ColumnX, ColumnY)
select x.value, y.value
from OPENJSON(@dataX) x
inner join OPENJSON(@dataY) y
on y.[key] = x.[key];
Further code may be needed if the arrays could be of different lengths, or there are NULLs, or non-integer values etc. Of course the resultset from OPENJSON can be used for joining or any other purpose within the SP.
If both dataX and dataY from the original payload end up in the same DB table the SP can be called twice, once for each. Alternatively you can union their arrays in ADF and call the SP once.
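For testing outside ADF, the two-parameter procedure above can be invoked directly with the arrays from the sample payload:
EXEC dbo.HandleData
    @dataX = '[1,2,3,4]',
    @dataY = '[1,4,9,16]';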

Update the value of JSON elements in Postgresql

I have a table with the following structure:
create table instances(
id bigint,
createdate timestamp,
createdby bigint,
lastmodifieddate timestamp,
lastmodifiedby bigint,
context text
)
The context field contains JSON data, e.g.:
insert into instances values
(1, '2020-06-01 22:10:04', 20112,'2020-06-01 22:10:04',20112,
'{"id":1,"details":[{"binduserid":90182}]}')
I need to replace all values of the JSON element binduserid that equal 90182, using a Postgres query.
I have achieved this by using the REPLACE function:
update instances
set context = replace(context, '"binduserid":90182','"binduserid":1000619')
Is there any other way to do this using Postgres JSON functions?
First, consider storing the column as JSON or JSONB; those types are designed to hold the data properly and to be used productively, with no conversions needed between types (such as holding a DATE value as a DATE rather than a STRING).
Since context is defined as text here, cast it to jsonb on the fly.
You can use the JSONB_SET() function to get the desired result; its path argument addresses nested elements by index ('{details,0,binduserid}' targets binduserid inside the first element of details) in the DML statement below:
UPDATE instances
SET context =
JSONB_SET(context::jsonb, '{details,0,binduserid}', '1000619')
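If details can hold several objects, JSONB_SET() alone only addresses one index. A sketch that rebuilds the whole array with jsonb_array_elements(), assuming every row has a details array:
UPDATE instances i
SET context = jsonb_set(
    i.context::jsonb,
    '{details}',
    (SELECT jsonb_agg(
              CASE WHEN elem ->> 'binduserid' = '90182'
                   THEN jsonb_set(elem, '{binduserid}', '1000619')
                   ELSE elem
              END)
     FROM jsonb_array_elements(i.context::jsonb -> 'details') AS elem));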

JSON update single value in MySQL table

I have a JSON array in the details column of the MySQL payment table. I need to update a single value in this JSON array. What is the procedure to update JSON using MySQL?
JSON Array
{"items":[{"ca_id":18,"appointment_date":"2018-09-15 15:00:00","service_name":"Software Installation / Up-gradation","service_price":165}],"coupon":{"code":"GSSPECIAL","discount":"10","deduction":"0.00"},"subtotal":{"price":165,"deposit":0},"tax_in_price":"included","adjustments":[{"reason":"Over-time","amount":"20","tax":"0"}]}
I need to update the appointment_date from 2018-09-15 15:00:00 to 2018-09-28 15:00:00.
Here is a pure MySQL JSON way of doing this:
UPDATE yourTable
SET col = JSON_REPLACE(col, '$.items[0].appointment_date', '2018-09-28 15:00:00');
The best I could come up with is to address the first element of the JSON array called items, and then update the appointment_date field in that array element.
A quick demo confirms that the JSON replacement syntax/logic works.
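To read the value back and double-check, JSON_EXTRACT (or the -> shorthand) can be used, e.g.:
SELECT JSON_EXTRACT(col, '$.items[0].appointment_date') AS appointment_date
FROM yourTable;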
But you could equally well have done this JSON work in your PHP layer; it might make more sense to do it there.
If you want to do this in PHP, the steps to follow are:
Select the respective column from the table
Use json_decode to convert the string to an array
Now that you have the JSON object, apply your modifications
Use json_encode to convert your JSON object back to a string
Save this string in the table

Generate UUID for Postgres JSON document

I'm inserting a JSON document into a Postgres table and I want to generate a unique ID for the document. I can do that on my own, of course, but I was wondering if there was a way to have PG do it.
INSERT INTO test3 (data) VALUES ('{"key": "value", "unique": ????}')
The docs seem to indicate that JSON records fit into various SQL data types, but I don't see how that actually works.
How about just concatenating? Assuming your column is of type json/jsonb, something like the following should work (note that uuid_generate_v4() comes from the uuid-ossp extension; on Postgres 13+ the built-in gen_random_uuid() also works):
INSERT INTO test3 (data) VALUES (('{"key": "value", "unique": "' || uuid_generate_v4() || '"}')::jsonb)
If you're looking to generate a UUID and store it at the same time as a value within a JSON data field, here is something some may find to be a little more sane:
WITH
-- Create a temporary view named "new_entry" containing your data
new_entry
-- This is how you name the view's columns
("key", "unique")
AS (
VALUES
-- This is the actual row returned by the view
(
'value',
uuid_generate_v4()
)
)
INSERT INTO
test3(
data
)
SELECT
-- Convert row to JSON. Column name = key, column value = value.
ROW_TO_JSON(new_entry.*)
FROM
new_entry
First, we're creating a temporary view named new_entry, which contains all of the data we want to store in a JSON data field.
Second, we're grabbing that entry and passing it to the ROW_TO_JSON function, which converts it to a valid JSON data type. Once converted, the row is inserted into the test3 table.
My reasoning for the "sanity" is that, more than likely, your JSON object will end up containing more than just two key/value pairs. Rather, you'll end up with a handful of keys and values, where it'll be up to you to ensure you don't miss any quotes and that you escape user input appropriately. Why glue all of this together manually when you can have Postgres do it for you (with the help of ROW_TO_JSON()) while making it easier to read and debug at the same time?
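For flat objects like this one there is also a middle ground: a sketch using jsonb_build_object(), assuming a jsonb column and Postgres 13+ for the built-in gen_random_uuid() (older versions can keep uuid_generate_v4()):
INSERT INTO test3 (data)
VALUES (jsonb_build_object('key', 'value', 'unique', gen_random_uuid()));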