I want to create this JSON text with Delphi IDE:
{
    "uygulamaninAdi": "ZGVuZW1l",
    "uygulamaninSurumu1": "1",
    "uygulamaninSurumu2": "0",
    "uygulamaninSurumu3": "0",
    "uygulamaninSurumu4": "0",
    "uygulamaninMimarisi": "1",
    "calistirilmaDuzeyi": "1",
    "mesajGonderebilmeDurumu": "1",
    "konum": "RDpcUHJvZ3JhbWxhbWE="
}
Here is my code:
procedure TfrmManifestDosyasiOlusturucu.btnOluşturClick(Sender: TObject);
begin
  jsGönderilecekMesaj := TJSONObject.Create;
  jsGönderilecekMesaj.AddPair('uygulamaninAdi', TNetEncoding.Base64.Encode(edUygulamanınAdı.Text));
  jsGönderilecekMesaj.AddPair('uygulamaninSurumu1', edUygulamanınSürümü1.Text);
  jsGönderilecekMesaj.AddPair('uygulamaninSurumu2', edUygulamanınSürümü2.Text);
  jsGönderilecekMesaj.AddPair('uygulamaninSurumu3', edUygulamanınSürümü3.Text);
  jsGönderilecekMesaj.AddPair('uygulamaninSurumu4', edUygulamanınSürümü4.Text);
  jsGönderilecekMesaj.AddPair('uygulamaninMimarisi', IntToStr(cbUygulamanınMimarisi.ItemIndex));
  jsGönderilecekMesaj.AddPair('calistirilmaDuzeyi', IntToStr(cbÇalıştırılmaDüzeyi.ItemIndex));
  jsGönderilecekMesaj.AddPair('mesajGonderebilmeDurumu', IntToStr(cbMesajGönderebilmeDurumu.ItemIndex));
  jsGönderilecekMesaj.AddPair('konum', TNetEncoding.Base64.Encode(edManifestDosyasınınOluşturulacağıKonum.Text));
  strKomutSatırıParametreleri := '/olustur ' + jsGönderilecekMesaj.ToString;
  ShellExecute(0, 'open', 'bin\Manifest Dosyası Oluşturucu (Yardımcı Uygulama).exe', PWideChar(strKomutSatırıParametreleri), nil, SW_HIDE);
end;
But the problem is the Delphi IDE creates this:
{uygulamaninAdi:ZGVuZW1l,uygulamaninSurumu1:1,uygulamaninSurumu2:0,uygulamaninSurumu3:0,uygulamaninSurumu4:0,uygulamaninMimarisi:1,calistirilmaDuzeyi:1,mesajGonderebilmeDurumu:1,konum:RDpcUHJvZ3JhbWxhbWE=}
As far as I know, all of the keys should be wrapped in quotes, and all of the string values should be wrapped in quotes as well.
How can I fix my problem?
Try using TJSONObject.ToJSON instead of TJSONObject.ToString.
Also, JSON really isn't well suited for passing around on the command line. If the target program expects to receive the JSON as a single parameter, then since the JSON contains quotes and potentially also spaces, you should use AnsiQuotedStr() to add quotes around the JSON string and to escape the quotes inside it.
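Putting both suggestions together (ToJSON plus AnsiQuotedStr from System.SysUtils), a minimal sketch that reuses the variable names from the question could look like this; whether the doubled inner quotes come back out correctly depends on how the helper application splits its command line:

  // ToJSON keeps the quotes in the output; AnsiQuotedStr wraps the whole
  // JSON string in " and doubles every " that appears inside it.
  strKomutSatırıParametreleri := '/olustur ' +
    AnsiQuotedStr(jsGönderilecekMesaj.ToJSON, '"');
  ShellExecute(0, 'open',
    'bin\Manifest Dosyası Oluşturucu (Yardımcı Uygulama).exe',
    PWideChar(strKomutSatırıParametreleri), nil, SW_HIDE);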
I need a better understanding of stringify, escaping, and storing in a MySQL database. The task looked easy, but I ran into some trouble with the escaping, so I would be happy about a general explanation of the following questions:
What I am trying to do is store a JavaScript object in a MySQL DB. It works fine if I stringify it before sending; getting it back from the DB, I just parse it and everything is fine.
let myObj = {
    name: 'Paul',
    age: '24'
}
Now I additionally have a message in my object, which can contain special characters:
let myObj = {
    name: 'Paul',
    age: '24',
    message: 'message with special characters: ',`´"~'
}
Also no problem; I started escaping. The result:
let myObj = {
    name: 'Paul',
    age: '24',
    message: 'message with special characters: \'\,\`´\"\~'
}
If I stringify the object, I get the following result:
{
    "name": "Paul",
    "age": "24",
    "message": "message with special characters: \\'\\,\\`´\\\"\\~"
}
Sending it to the MySQL DB gives the following error:
(node:13077) UnhandledPromiseRejectionWarning: Error: ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '\,\`´\"\~"}
Because of the error, I manipulated the special characters and removed the additional '\', which gives the following result:
obj.message = obj.message.replace(/\\\\/g, '\\');
output:
{
    "name": "Paul",
    "age": "24",
    "message": "message with special characters: \'\,\`´\\"\~"
}
Everything is fine, the data is transferred to the DB, and my MySQL update query no longer fails.
Questions:
Is there a better way of dealing with escaping inside an object that will be stringified and sent to a MySQL DB?
If yes, how is it done? Or is there no other way than to remove the additional backslashes inserted by the stringify?
One step further: the message now also contains a newline (\n).
Stringified output:
{
    "name": "Paul",
    "age": "24",
    "message": "message with special characters: \'\,\`´\n\\"\~"
}
Sending it to the DB, I get the following entry (where the \n produces a new line):
{"name":"Paul","age":"24","message":"message with special characters: ',`´
\"~"}
This results in an error when parsing it back. Here is the server-side log prior to parsing (the error makes sense):
{"name":"Paul","age":"24","message":"\',`´\n' +
'\\"~"}
Question:
Regarding the part above, what do I have to do to get the \n escaped as well, so that the DB entry is correct and the DB doesn't interpret the \n as the start of a new line?
Happy about any explanation / help!
I don't know the correct way or the easy way, but this is how I did it when I needed to insert a user-generated field as JSON into a MySQL database:
function string_cleanJSON_preStringify(str)
{
    if(!str.replace) return str;
    str = str.replace(/'/g, "\\'");  // escape every ' at least once
    str = str.replace(/"/g, '\\"');  // escape every " at least once
    str = str.replace(/[\t\r\n\f]/g, ''); // remove problematic escape characters
    if(str.charAt(str.length-1) == '\\') str += ' '; // add a blank space at the end if \ is the last character - for example: {"var":"\"} would be problematic
    return str;
}
function string_cleanJSON_to_query(str)
{
    str = str.replace(/(\\)+\\/g, '\\'); // collapse runs of more than one \ into just one (\\ gets escaped again to just \ when the query is processed)
    str = str.replace(/(\\)+"/g, '\\\\\\"'); // replace runs of \ before " (e.g. \\\") - I don't know why \\\\\\ - this seems to work in my case, might need altering based on str manipulations before insert
    str = str.replace(/(\\)+'/g, "\\'"); // I don't know why \\ - this seems to work in my case, might need altering based on str manipulations before insert
    str = str.replace(/(\\)+t/g, "t"); // same process as above but with problematic escape characters
    str = str.replace(/(\\)+r/g, "r");
    str = str.replace(/(\\)+n/g, "n");
    str = str.replace(/(\\)+f/g, "f");
    return str;
}
How I use this to get a query:
let o = {field_data:string_cleanJSON_preStringify(user_gen_field_data)}
let j = string_cleanJSON_to_query(JSON.stringify(o));
let q = `INSERT INTO blabla (json) VALUES('${j}')`;
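For comparison, and only as a sketch under the assumption that the Node mysql (or mysql2) driver and the same blabla table are in use: a parameterized query lets the driver do the quoting, so no manual backslash handling is needed before or after JSON.stringify:

const mysql = require('mysql'); // assumption: the 'mysql' npm package; 'mysql2' works the same way
const connection = mysql.createConnection({ host: 'localhost', user: 'xxxx', password: 'xxxx', database: 'xxxx' });

let o = { field_data: user_gen_field_data }; // no pre-stringify cleaning
connection.query(
    'INSERT INTO blabla (json) VALUES (?)',  // the ? placeholder is escaped by the driver
    [JSON.stringify(o)],
    (err, result) => { if (err) throw err; }
);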
This works:
select json_value('{ "a": "b" }', '$.a')
This doesn't work:
select json_value('{ "a": "b" }', '$["a"]')
and neither does this:
select json_value('{ "a": "b" }', '$[''a'']')
In JavaScript, these are the same:
foo = { "a": "b" }
console.log(foo.a)
console.log(foo["a"])
What am I missing? I get an error trying to use bracket notation in SQL Server:
JSON path is not properly formatted. Unexpected character '"' is found at position 2
No sooner do I ask than I stumble on an answer. I couldn't find this in any documentation anywhere, but this works:
select json_value('{ "a": "b" }', '$."a"')
Bracket notation is not supported, but otherwise-invalid keys can be escaped with quotation marks, e.g.
select json_value('{ "I-m so invalid][": "b" }', '$."I-m so invalid]["')
whereas in JavaScript that would be foo["I-m so invalid]["].
MsSql reserves the brackets for array indexes. SQL parses all JSON as a string literal, rather than as an object (JSON or ARRAY) with addressable keys.
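A small sketch of what that reservation means in practice: the bracket form does work when the value really is an array, because then the brackets hold an index rather than a key, while object keys use the quoted-dot form from the answer above:

select json_value('["b","c"]', '$[1]');      -- returns c: here [] is an array index
select json_value('{ "a": "b" }', '$."a"');  -- returns b: quoted key in dot notation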
Some of what SQL can do will vary with the version. Here's a crash course on the annoying (but also really powerful, and fast once in place) requirements. I'm posting more than you need because a lot of the documentation for JSON in MsSql is lacking and doesn't do justice to how strong it is with JSON.
MsDoc here: https://learn.microsoft.com/en-us/sql/relational-databases/json/json-data-sql-server?view=sql-server-ver15
In this example, we are working with a JSON "object" to separate the data into columns. Note how calling a position inside of an array is weird.
declare @data nvarchar(max) = N'[{"a":"b","c":[{"some":"random","array":"value"},{"another":"random","array":"value"}]},{"e":"f","c":[{"some":"random","array":"value"},{"another":"random","array":"value"}]}]'

--make sure SQL is happy. It will not accept partial snippets
select ISJSON(@data)

--let's look at the data in tabular form
select
    json1.*
    , json2.*
from openjson(@data)
    with (
        a varchar --note there is no "path" specified here, as "a" is a key in the first layer of the object
        , c nvarchar(max) as JSON --must use "nvarchar(max)" and "as JSON" or SQL freaks out
        , c0 nvarchar(max) N'$.c[0]' as JSON
    ) as json1
cross apply openjson(json1.c) as json2
You can also pull out the individual values, if needed
select oj.value from openjson(@data) as oj where oj.[key] = 1;

select
    oj.value
    , JSON_VALUE(oj.value, N'$.e')
    , JSON_VALUE(oj.value, N'$.c[0].some')
    , JSON_VALUE(@data, N'$[1].c[0].some') --Similar to your first example, but uses index position instead of key value. Works because SQL views the "[]" brackets as an array while trying to parse.
from openjson(@data) as oj
where oj.[key] = 1
I have an incoming data structure that looks like this:
declare @json nvarchar(max) = '{
    "action": "edit",
    "data": {
        "2077-09-02": {
            "Description": "some stuff",
            "EffectDate": "2077-1-1"
        }
    }
}';
To make a long story short, I think T-SQL hates this JSON structure, because no matter what I have tried, I can't get to any values other than "action".
The data object contains another object whose key is "2077-09-02". That key will always be different; I can't rely on what the date will be.
This works:
select json_value(@json, '$.action');
None of this works when trying to get to the other values.
select json_value(@json, '$.data'); --returns null
select json_value(@json, '$.data[0]'); --returns null
select json_value(@json, 'lax $.data.[2077-09-02].Description');
--JSON path is not properly formatted. Unexpected character '[' is found at position 11.
select json_value(@json, 'lax $.data.2077-09-02.Description');
--JSON path is not properly formatted. Unexpected character '2' is found at position 11.
How do I get to the other values? Is the JSON not perfect enough for TSQL?
It is never a good idea to use the declarative part of a text-based container as data. "2077-09-02" is a valid JSON key, but it is hard to query.
You can try this:
declare @json nvarchar(max) = '{
    "action": "edit",
    "data": {
        "2077-09-02": {
            "Description": "some stuff",
            "EffectDate": "2077-1-1"
        }
    }
}';
SELECT A.[action]
,B.[key] AS DateValue
,C.*
FROM OPENJSON(@json)
WITH([action] NVARCHAR(100)
,[data] NVARCHAR(MAX) AS JSON) A
CROSS APPLY OPENJSON(A.[data]) B
CROSS APPLY OPENJSON(B.[value])
WITH (Description NVARCHAR(100)
,EffectDate DATE) C;
The result
action DateValue Description EffectDate
edit 2077-09-02 some stuff 2077-01-01
The idea:
The first OPENJSON will return the action and the data.
I use a WITH clause to tell the engine that action is a simple value, while data is nested JSON
The next OPENJSON dives into data
We can now use B.[key] to get the json key's value
Now we need another OPENJSON to dive into the columns within data.
However, if this JSON is under your control, I'd suggest changing its structure, for example along the lines of the sketch below.
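One possible restructuring (only a sketch; the Date property name is made up) moves the date into a value so the keys stay fixed and the WITH clauses become straightforward:

declare @json nvarchar(max) = '{
    "action": "edit",
    "data": [
        {
            "Date": "2077-09-02",
            "Description": "some stuff",
            "EffectDate": "2077-1-1"
        }
    ]
}';

SELECT A.[action], B.*
FROM OPENJSON(@json)
WITH([action] NVARCHAR(100)
    ,[data] NVARCHAR(MAX) AS JSON) A
CROSS APPLY OPENJSON(A.[data])
WITH ([Date] DATE
     ,Description NVARCHAR(100)
     ,EffectDate DATE) B;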
Use double quotes instead of []. JSON Path uses JavaScript's conventions where a string is surrounded by double quotes. The documentation's example contains this path $."first name".
In this case:
select json_value(@json, '$.data."2077-09-02".Description');
Returns:
some stuff
As for the other calls, JSON_VALUE can only return scalar values, not objects. You need to use JSON_QUERY to extract JSON objects, e.g.:
select json_query(@json, '$.data."2077-09-02"');
Returns:
{
    "Description": "some stuff",
    "EffectDate": "2077-1-1"
}
I have a column of text type that contains a JSON value.
{
    "customer": [
        {
            "details": {
                "customer1": {
                    "name": "john",
                    "addresses": {
                        "address1": {
                            "line1": "xyz",
                            "line2": "pqr"
                        },
                        "address2": {
                            "line1": "abc",
                            "line2": "efg"
                        }
                    }
                },
                "customer2": {
                    "name": "robin",
                    "addresses": {
                        "address1": null
                    }
                }
            }
        }
    ]
}
How can I extract the 'address1' JSON field of the column with a query?
First I am trying to fetch the JSON value; then I will go on to parsing it.
SELECT JSON customer from text_column;
With my query, I get the following error.
com.datastax.driver.core.exceptions.SyntaxError: line 1:12 no viable
alternative at input 'customer' (SELECT [JSON] customer...)
Cassandra version 2.1.13
You can't use SELECT JSON in Cassandra v2.1.x (CQL v3.2.x).
For Cassandra v2.1.x (CQL v3.2.x), the only supported operations after SELECT are:
DISTINCT
COUNT (*)
COUNT (1)
column_name AS new_name
WRITETIME (column_name)
TTL (column_name)
dateOf(), now(), minTimeuuid(), maxTimeuuid(), unixTimestampOf(), typeAsBlob() and blobAsType()
Cassandra v2.2.x (CQL v3.3.x) introduces SELECT JSON:
With SELECT statements, the new JSON keyword can be used to return each row as a single JSON-encoded map. The remainder of the SELECT statement's behavior is the same.
The result map keys are the same as the column names in a normal result set. For example, a statement like “SELECT JSON a, ttl(b) FROM ...” would result in a map with keys "a" and "ttl(b)". However, there is one notable exception: for symmetry with INSERT JSON behavior, case-sensitive column names with upper-case letters will be surrounded with double quotes. For example, “SELECT JSON myColumn FROM ...” would result in a map key "\"myColumn\"" (note the escaped quotes).
The map values will be JSON-encoded representations (as described below) of the result set values.
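As a sketch only (the table name and primary key are made up for illustration), on 2.2+ the query from the question would then become something like this, with each row returned as a single JSON-encoded map; since customer is a text column, its value still comes back as an escaped JSON string that has to be parsed on the client:

SELECT JSON customer FROM customer_table WHERE id = 1;

-- example result row:
-- {"customer": "{\"customer\": [ ... ]}"}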
If your Cassandra version is 2.1.x or below, you can use a Python-based approach.
Write a Python script using the Cassandra Python API.
Here you have to fetch your row first and then use Python's json.loads method, which will convert your JSON text column value into a JSON object, i.e. a dict in Python. Then you can work with the Python dictionary and extract your required nested keys. See the code snippet below.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider
import json

if __name__ == '__main__':
    auth_provider = PlainTextAuthProvider(username='xxxx', password='xxxx')
    cluster = Cluster(['0.0.0.0'],
                      port=9042, auth_provider=auth_provider)
    session = cluster.connect("keyspace_name")
    print("session created successfully")

    rows = session.execute('select * from user limit 10')
    for user_row in rows:
        customer_dict = json.loads(user_row.customer)
        print(customer_dict.keys())
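From there, reaching address1 (which is what the question asks for) is plain dictionary access; a sketch, assuming the JSON has the shape shown in the question:

for user_row in rows:
    customer_dict = json.loads(user_row.customer)
    address1 = customer_dict['customer'][0]['details']['customer1']['addresses']['address1']
    print(address1)  # {'line1': 'xyz', 'line2': 'pqr'}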
I am trying to manipulate a JSON value with a single quote inside, and I am having some trouble:
1.- When I have a function, I can't pass a JSON string with a single quote inside as a parameter:
This is the function:
CREATE OR REPLACE FUNCTION public.give_me_text
(
    IN text json
)
RETURNS JSON AS
$$
DECLARE
    v_text varchar;
begin
    RAISE NOTICE 'text: %', text;
    v_text := text || '- hello';
    return v_text;
end
$$
LANGUAGE 'plpgsql';
Calling it like this works:
SELECT * FROM give_me_text
(
'{"es":"name 1","en":"name 2","de":"name 3","fr":"name 4","pt":"name 5"}'
);
But when I have a single quote, it does not work:
SELECT * FROM give_me_text
(
'{"es":"nam'e 1","en":"name 2","de":"name 3","fr":"name 4","pt":"name 5"}'
);
2.- When I am trying to insert a JSON value with a single quote inside, it gives me the same error:
This is working:
INSERT INTO public.my_columns VALUES ('{ "name": "Book the First", "author": { "first_name": "Bob", "last_name": "White" } }');
But this is not working:
INSERT INTO public.my_columns VALUES ('{ "name": "Book's the First", "author": { "first_name": "Bob", "last_name": "White" } }');
Any ideas on how to escape this single quote? Thanks.
In SQL, a single quote must be escaped inside a string. This is not a valid string:
'{"es":"nam'e 1"}'
because the string literal ends right after "nam. You can escape the single quote by doubling it:
'{"es":"nam''e 1"}'
If you're doing dynamic SQL, you can pass parameters to execute:
execute 'insert into YourTable (col1) values ($1);' using json_var;