Read and Write JSON in SQL Server

How can I read and write JSON data in SQL Server?
Is it possible to add any kind of index to JSON fields in SQL Server?

SQL Server supports JSON starting with SQL Server 2016:
https://learn.microsoft.com/en-us/sql/t-sql/functions/json-functions-transact-sql?view=sql-server-ver15
JSON fields are stored and indexed as normal text fields, I believe.
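That matches the documentation: SQL Server has no native JSON column type, so JSON lives in NVARCHAR columns, and the documented way to index an individual JSON property is a computed column over JSON_VALUE. A minimal sketch (the table, constraint name, and JSON path are made up for illustration):

```sql
-- SQL Server 2016+: JSON is stored as plain NVARCHAR text.
CREATE TABLE Orders (
    Id   INT IDENTITY PRIMARY KEY,
    Info NVARCHAR(MAX)
         CONSTRAINT chk_orders_info_json CHECK (ISJSON(Info) = 1)
);

-- Expose a JSON property as a computed column, then index that column:
ALTER TABLE Orders
    ADD CustomerName AS JSON_VALUE(Info, '$.customer.name');

CREATE INDEX IX_Orders_CustomerName ON Orders (CustomerName);
```

Queries that filter on JSON_VALUE(Info, '$.customer.name') can then use the index like any ordinary text index.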

Related

How to avoid serialization of postgres json result in Sequel

In a web server, I'm running a Postgres query that performs a select json_build_object over an aggregation (json_agg). I'm using Sequel to fetch this data, and the value I get back is an instance of Sequel::Postgres::JSONHash. I then call to_json on it in order to send it to the web client.
The output of this query is very big, and I think it would be more performant if I could grab the raw JSON response from Postgres directly and send it to the client, instead of parsing it into a JSONHash (which Sequel does) and then converting it back to JSON.
How could I do it?
Cast the json_build_object result from json to text in the query: SELECT json_build_object(...)::text ...
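With the cast in place, the driver sees a plain text column, so Sequel returns a raw string and never builds the JSONHash. A sketch of what the query might look like (table and key names are hypothetical):

```sql
-- The ::text cast makes Postgres return the aggregated JSON
-- as an ordinary string rather than a json-typed value.
SELECT json_build_object(
         'items', json_agg(row_to_json(t))
       )::text AS payload
FROM my_table AS t;
```

The resulting string can be written straight into the HTTP response body without any intermediate parse/serialize round trip.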

Rails serialized JSON to string gives error on mysql JSON field

I have an old table which saved JSON to a string SQL field using the Rails ActiveRecord serializer. I created a new table with the field's data type as JSON, which was introduced in MySQL 5.7. But directly copying the data from the old field to the new one gives me a JSON error saying the JSON structure is wrong.
More specifically, the problem is with Unicode characters, which my database does not support yet, and the database is too large to simply migrate everything to support them.
I am looking for a way to migrate the data from the old field to the new JSON field.
I have seen that replacing \u for the Unicode character with \\u in the JSON string solves the issue, but I am not able to just do this:
update table_name set column_name=REPLACE(column_name, '\u', '\\u');
since it gives an error again:
ERROR 3140 (22032): Invalid JSON text: "Invalid escape character in string." at position 564 in value for column 'table_name.column_name'.
For the SQL UPDATE replacing \u with \\u, escape the backslashes inside the string literals:
update table_name set column_name=REPLACE(column_name, '\\u', '\\\\u') [where condition];
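The extra backslashes are needed because MySQL string literals use backslash escapes of their own: '\\u' in a literal denotes the two characters \u, and '\\\\u' denotes \\u. A sketch of the full migration into a JSON column, under the assumption of hypothetical table and column names:

```sql
-- '\\u' in the literal matches the two characters \u in the stored text;
-- '\\\\u' writes \\u, doubling the backslash so the JSON parser accepts it.
UPDATE legacy_table
SET json_col = CAST(REPLACE(text_col, '\\u', '\\\\u') AS JSON)
WHERE text_col IS NOT NULL;
```

Running the REPLACE and the CAST in one statement means any row that still fails the JSON check is reported immediately, instead of landing as bad data in the new column.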

How to save an array to a mysql database in laravel

So I wanted to save some tags data to the database from the UI form:
Tags: ["male","female","kids"]
I tried everything, but it saves as a string. I also tried checking whether I can alter the data type to array or JSON in MySQL, but I got this:
You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'JSON(255) NOT NULL' at line 1
I also tried json_encode() and json_decode(), but still no luck, so please, what can I do?
There isn't any data type like JSON or array in older MySQL/MariaDB versions.
What you can do is declare the tags field as TEXT, as @Rick James said, then encode your input as JSON with json_encode() before inserting, and decode it with json_decode() after retrieving the data from the DB.
JSON is basically a minimal, readable format for structuring data, which means it can be treated as a string.
Here is a good post about JSON, in case you need it:
What is JSON?
There is no data type called JSON here. Use TEXT instead, without the (255).
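Following that approach, the column stays TEXT and the application serializes the array on the way in and out. A sketch with hypothetical table and column names (note that even on servers that do have a JSON type, JSON takes no length specifier, which is why JSON(255) is a syntax error):

```sql
-- TEXT has no (255) length specifier; the serialized array is stored as a string.
ALTER TABLE products MODIFY tags TEXT NOT NULL;

-- The application writes the JSON-encoded array as a plain string, e.g.
-- the PHP side would insert the result of json_encode(["male","female","kids"]):
INSERT INTO products (tags) VALUES ('["male","female","kids"]');
```

On read, json_decode() turns the stored string back into an array.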

hadoop - Validate json data loaded into hive warehouse

I have JSON files totalling approximately 500 TB. I have loaded the complete set into a Hive data warehouse.
How would I validate or test the data that was loaded into the Hive warehouse? What should my testing strategy be?
The client wants us to validate the JSON data: whether the data loaded into Hive is correct or not. Is anything missing? If yes, which field is it?
Please help.
How is your data being stored in the Hive tables?
One option is to create a Hive UDF that receives the JSON string, validates the data, and returns another string containing the error message, or an empty string if the JSON is well formed.
Here is a Hive UDF tutorial: http://blog.matthewrathbone.com/2013/08/10/guide-to-writing-hive-udfs.html
With the Hive UDF in place you can execute queries like:
select strjson, validateJson(strjson) from jsonTable where validateJson(strjson) != "";
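Short of writing a custom UDF, Hive's built-in get_json_object can serve as a quick spot check, since it returns NULL when the input string does not parse as JSON. A sketch, assuming a hypothetical table jsonTable with a string column strjson:

```sql
-- Rows whose raw string fails to parse as JSON at all:
SELECT strjson
FROM jsonTable
WHERE strjson IS NOT NULL
  AND get_json_object(strjson, '$') IS NULL;

-- Rows missing a specific required field (path is an example):
SELECT strjson
FROM jsonTable
WHERE get_json_object(strjson, '$.customer.id') IS NULL;
```

This only detects parse failures and missing paths; a UDF is still the way to go for field-level validation rules.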

SQL Import and Export Wizard map nvarchar(max) to its equivalent data type in csv

Using the SQL Server Import and Export Wizard, I am importing data from a CSV file into SQL tables, but the SQL table has a column of data type nvarchar(max), and I have to map my source data type to an equivalent of nvarchar(max). I tried DT_STR and DT_WSTR, and in my research I found a link that I want to share with you:
=> http://msdn.microsoft.com/en-us/library/ms141036.aspx => Mapping of Integration Services Data Types to Database Data Types
So there is no such conversion listed for nvarchar(max). If anyone knows, please help. Thanks in advance.
You can use text stream [DT_TEXT] or Unicode text stream [DT_NTEXT].
Resources:
The SQL Server Integration Services Data Type Map
SSIS to SQL Server Data Type Translations