I have some data to be inserted into a MySQL column with the JSON datatype (blob_forms).
The value of the fields column is populated asynchronously, and if the document has multiple pages, then I need to append the data onto the existing row.
So a sample table is:
document
document_id INT
text_data JSON
blob_forms JSON
blob_data JSON
The first chunk of data is inserted correctly; a sample:
{"fields": [
{"key": "Date", "value": "01/01/2020"},
{"key": "Number", "value": "xxx 2416 xx"},
{"key": "Invoice Date", "value": "xx/xx/2020"},
{"key": "Reg. No.", "value": "7575855"},
{"key": "VAT", "value": "1,000.00"}
]}
I am using an AWS Lambda function (Python) to handle the database insert, with this query:
insertString = json.dumps(newObj)
sql = "INSERT INTO `document` (`document_id`, `blob_forms`) VALUES (%s, %s) ON DUPLICATE KEY UPDATE `blob_forms` = %s"
cursor.execute(sql, (self.documentId, insertString, insertString))
conn.commit()
The problem is, I also want to do an UPDATE, so that if blob_forms already has a value, the new items in the fields array are appended to the existing object's fields array.
So basically, use the original data input a second time, so that if it is sent again with the same document_id it appends to any existing data in blob_forms but preserves the JSON structure.
(Please note other processes write to this table, and possibly this row, due to the async nature; the data for the columns can be written in any order, but the document_id ties them all together.)
My failed attempt was something like this:
SET @j = '{"fields": [{"key": "Date", "value": "01/01/2020"},{"key": "Number", "value": "xxx 2416 xx"},{"key": "Invoice Date", "value": "xx/xx/2020"},{"key": "Reg. No.", "value": "7575855"},{"key": "VAT", "value": "1,000.00"}]}';
INSERT INTO `document` (`document_id`, `blob_forms`) VALUES ('DFGHJKfghj45678', @j) ON DUPLICATE KEY UPDATE blob_forms = JSON_INSERT(blob_forms, '$', @j)
I'm not sure that you can get the results that you want with 1 clean query in mysql. My advice would be to make the changes to the array on the client side (or wherever) and updating the entire field without delving into whether there is an existing value or not. I architect all of my api's in this way to keep the database interactions clean and fast.
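That client-side approach can be sketched in Python. This is an illustrative helper (the function name is mine, not from the question), assuming both chunks use the {"fields": [...]} shape shown above:

```python
import json

def merge_blob_forms(existing, incoming):
    """Merge the "fields" array of an incoming JSON chunk into the
    existing blob_forms value. Both arguments are JSON strings;
    existing may be None when the row has no value yet."""
    new = json.loads(incoming)
    if not existing:
        return json.dumps(new)
    current = json.loads(existing)
    # Append the new field entries, preserving the outer {"fields": [...]} shape
    current.setdefault("fields", []).extend(new.get("fields", []))
    return json.dumps(current)
```

You would read the current blob_forms value (inside a transaction or with SELECT ... FOR UPDATE, given the concurrent writers mentioned), merge, and write the whole column back.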
So far this looks closest:
SET @j = '{"fields": [{"key": "Date", "value": "01/01/2020"},{"key": "Number", "value": "xxx 2416 xx"},{"key": "Invoice Date", "value": "xx/xx/2020"},{"key": "Reg. No.", "value": "7575855"},{"key": "VAT", "value": "1,000.00"}]}';
INSERT INTO `document` (`document_id`, `blob_forms`) VALUES ('DFGHJKfghj45678', @j) ON DUPLICATE KEY UPDATE blob_forms = JSON_MERGE_PRESERVE(blob_forms, @j)
Related
I'm trying to delete rows from a table depending on a specific value on a details column which is of json type.
The column is expected to have a json value like this one:
{
"tax": 0,
"note": "",
"items": [
{
"price": "100",
"quantity": "1",
"description": "Test"
}
]
}
The objects inside items could have a name entry or not. I'd like to delete those that don't have that entry.
NOTE: All objects inside items have the same entries so all of them will have or will not have the name entry
You can use a JSON path expression.
delete from the_table
where details::jsonb @@ '$.items[*].name <> ""'
This checks whether there is at least one array element where the name is not empty. Note that this wouldn't delete rows with an array element having "name": "".
As you didn't use the recommended jsonb type (which is the one that supports all the nifty JSON path operators), you need to cast the column to jsonb.
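For readers doing this filter in application code instead, the jsonpath predicate is roughly equivalent to this Python check (an illustrative helper, assuming the {"items": [...]} shape above):

```python
import json

def has_nonempty_name(details_json):
    """Rough Python equivalent of the jsonpath predicate
    '$.items[*].name <> ""': true when at least one array element
    has a non-empty "name" entry. Elements without "name" are
    skipped, mirroring jsonpath's lax mode."""
    details = json.loads(details_json)
    return any(item.get("name") for item in details.get("items", []))
```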
I'm using Power Query in Excel 2013 to convert a huge JSON file (more than 100 MB) to a plain Excel sheet.
All the fields except one are converted correctly. The other fields have a fixed text value or values separated by commas, so their conversion is easy, but this one field contains a nested JSON record structure ("Field": "Value") and is recognized as a record.
This is an extract of the file:
{
"idTrad": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"images": {
"1": "SE1.JPG",
"2": "SE2.JPG"
},
"date": "2018-09-22",
"category": "MD",
"value": "Original text",
"language": "IT",
"contexts": [
""
],
"label": "Translated text",
"variantes": "1,23,45,23,32,232,2315,23131",
"theme": [
"XX_XXX"
]
}
The problematic field is "images": because it is recognized as a record, the resulting table shows Record instead of the values (screenshot: https://i.stack.imgur.com/EnHow.png).
My query so far is:
let
Source = Json.Document(File.Contents("filename.json")),
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Column1 développé" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"value", "contexts", "theme", "variantes", "category", "label", "language", "idTrad","images", "date"}, {"Column1.value", "Column1.contexts", "Column1.theme", "Column1.variantes", "Column1.category", "Column1.label", "Column1.language", "Column1.idTrad","Column1.images", "Column1.date"}),
#"Valeurs extraites" = Table.TransformColumns(#"Column1 développé", {"Column1.contexts", each Text.Combine(List.Transform(_, Text.From), ","), type text}),
#"Valeurs extraites1" = Table.TransformColumns(#"Valeurs extraites", {"Column1.theme", each Text.Combine(List.Transform(_, Text.From), ","), type text})
in
#"Valeurs extraites1"
I would like to have in the images field a text representation of the record, so something like "1: SE1.JPG, 2: SE2.JPG". Any ideas?
Sure, you can even do it in one step! If you convert a record to a table (Record.ToTable), it will create a table where the names of the fields in your record are in a column called "Name" and the values are in a column called "Value"; this is how you get your "1", "2", etc. from the JSON file. From there you can just combine the columns into the text you want, then convert and combine the resulting list like you did for the rest of your columns.
= Table.TransformColumns(#"Valeurs extraites1",{"Column1.images",
each
Text.Combine(
Table.ToList(
Table.CombineColumns(
Record.ToTable(_)
,{"Name", "Value"},Combiner.CombineTextByDelimiter(": ", QuoteStyle.None),"Merged")
)
, ", ")
})
I wouldn't think Record.ToTable localizes its column naming, but you can test it by just converting the record to a table first to see what it does:
Table.TransformColumns(#"Valeurs extraites1",{"Column1.images",each Record.ToTable(_)})
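For reference, the transformation the M expression performs is equivalent to this short Python sketch (the function name is mine; field order is preserved, as Record.ToTable preserves it):

```python
def record_to_text(record):
    """Join a record's name/value pairs as "name: value, name: value",
    mirroring Record.ToTable -> Table.CombineColumns -> Text.Combine."""
    return ", ".join(f"{name}: {value}" for name, value in record.items())
```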
I have a json column named Data in my user table in the database.
Example of content:
[
{
"id": 10,
"key": "mail",
"type": "male"
},
{
"id": 5,
"key": "name",
"type": "female"
},
{
"id": 8,
"key": "mail",
"type": "female"
}
]
Let's assume that many rows in the table may have the same content, so the item should be removed from all of those rows too. What I want to do is remove an item by key and value. The last thing I could come up with is this query; for example, I want to remove the item where id equals 10:
UPDATE
user
SET
`Data` =
JSON_REMOVE(`Data`,JSON_SEARCH(`Data`,'all',10,NULL,'$[*].id'),10)
but this query removes all the content of the column.
If anyone could help, it would be much appreciated.
By the way, I went down this route because I can't seem to find a way to do it using the query builder in Laravel, so it will be a raw query.
Thank you guys.
After a lot of manual reading and experimenting I found the answer; I will post it here to help anyone in need:
UPDATE
user
SET
`Data` = JSON_REMOVE(
`Data`,
REPLACE(
REPLACE
(
JSON_SEARCH(
Data,
'all',
'10',
NULL,
'$**.id'
),
'.id',
''
),
'"',
''
)
)
==> Some explanation, as I searched and updated the query and the content itself many times:
I noticed that JSON_SEARCH only works on string values; if the value is an int it will not find it, so I cast the id values to strings. JSON_SEARCH then returns something like "$[the matched index].id", but since I need the path of the whole item I have to remove ".id"; the inner REPLACE is for that purpose. The outer REPLACE strips the quotes from the result, because JSON_SEARCH returns, for example, "$[0]" while JSON_REMOVE wants $[0]. Finally the item itself is removed and the data is updated.
I hope the Laravel team can support these things in the future; I searched for long hours without much help, but we can get through with raw statements.
==> BE AWARE THAT IF THE ITEM YOU SEARCH FOR DOESN'T EXIST IN THE JSON CONTENT, THE WHOLE COLUMN WILL BE SET TO NULL (JSON_SEARCH returns NULL, and JSON_REMOVE with a NULL path yields NULL).
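If a round trip through application code is acceptable, the same removal is much simpler there. A Python sketch (the helper name is illustrative), assuming the array-of-objects shape shown in the question:

```python
import json

def remove_item_by_id(data_json, target_id):
    """Drop every object whose "id" equals target_id from a JSON array,
    returning the re-encoded JSON string."""
    items = json.loads(data_json)
    kept = [item for item in items if item.get("id") != target_id]
    return json.dumps(kept)
```

This avoids the NULL-path pitfall entirely: when no item matches, the data simply comes back unchanged.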
This is the Laravel way:
$jsonString = '[{
"id": 10,
"key": "mail",
"type": "male"
},
{
"id": 5,
"key": "name",
"type": "female"
},
{
"id": 8,
"key": "mail",
"type": "female"
}
]';
// decode json string to array
$data = json_decode($jsonString);
// remove item that id = 10
$data = array_filter($data, function ($item) {
return $item->id != 10;
});
// write the filtered array back to the JSON column
// ($userId is a placeholder for the id of the row being updated;
// array_values re-indexes the array after array_filter)
DB::table('user')
    ->where('id', $userId)
    ->update(['Data' => json_encode(array_values($data))]);
I have a mysql JSON column like:
column value
data [{ "report1": { "result": "5"}, "report2": {"result": "6"}, "report3": {"a": "4"}}, {"report1": { "result": "9"},"report4": {"details": "<b>We need to show the details here</b>"}, "report3": {"result": "5"}}]
another instance of data is:
[{ "report1": { "result": "5"}, "report2": {"result": "6"}, "report3": {"a": "4"}}, {"report1": { "result": "9"}, "report3": {"result": "5"},"report4": {"details": "<b>We need to show the details here</b>"}}]
In the record above, the key is present in the second array element.
And in this:
[{ "report1": { "result": "5"}, "report2": {"result": "6"}, "report3": {"a": "4"}}, {"report1": { "result": "9"}, "report3": {"result": "5"}}]
The key is not present.
I need to replace {"details": "<b>We need to show the details here</b>"}, i.e. the value of key report4, with just []; I no longer need the data in that report.
Actually, the logic for generating the data has been changed from XML to JSON for that key only, so we need to replace it with a blank array (the target type now) without affecting the other data.
Is there any direct solution to that? I'm avoiding creating procedures here.
So, The Target data will be:
[{ "report1": { "result": "5"}, "report2": {"result": "6"}, "report3": {"a": "4"}}, {"report1": { "result": "9"},"report4": [], "report3": {"result": "5"}}]
And yes, the keys in the JSON are not consistent: a key may be present in the next or previous record in the table but not in this record.
The column should be of type JSON to use MySQL's JSON features efficiently. Then use the JSON modification functions, such as JSON_REPLACE.
Since each value contains a JSON array whose size may not be known in advance, you can create a small utility function to modify each element in the array.
create function modify_json(val json)
returns json
deterministic
begin
declare len int default json_length(val);
declare i int default 0;
while i < len do
# Replace the report4 property of the i'th element with an empty array
# (json_array() yields a JSON array; passing the string '[]' would store "[]")
set val = json_replace(val, concat('$[', i, '].report4'), json_array());
set i = i + 1;
end while;
return val;
end;
With your utility function, update the records:
update the_table set data = modify_json(data)
where json_contains_path(data, 'one', '$[*].report4');
The records containing at least one element with a report4 property will be updated according to the modify_json function in this case. You could achieve the same thing with multiple update commands that operate on each index of the JSON array separately.
If the column can't be of type JSON for some reason, then you can allow MySQL to coerce the data or your program can marshall the string into a JSON object, modify the data, then serialize it to a string, and update the row.
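If you do end up marshalling in application code, the per-element replacement can be sketched like this in Python (the helper name is mine, assuming the array-of-objects shape shown in the question):

```python
import json

def blank_report4(data_json):
    """Replace the value of an existing "report4" key with an empty
    list in every element of the JSON array; other keys, and elements
    without report4, are left untouched."""
    reports = json.loads(data_json)
    for element in reports:
        if "report4" in element:
            element["report4"] = []
    return json.dumps(reports)
```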
A MySQL table has a JSON column containing a large amount of JSON data.
For example:
SELECT nodes FROM `Table` LIMIT 1;
results in:
'{"data": {"id": "Node A", "state": true, "details": [{"value": "Value","description": "Test"}, {"value": "Value2", "description": "Test2"}, {"value": "Value 7", "description": "Test 7"}, {"value": "Value 9", "description": "Test 9"}]}}'
How can I write queries that return rows in accordance with the following examples:
Where Node A state is True. In this case "Node A" is the value of key "id" and "state" contains True or False.
Where "value" is "Value2" or where "description" is "Test2." Note that these values are in a list that contains key value pairs.
I doubt you can achieve the above task with a direct MySQL query. You would have to load the string data from the MySQL db, then parse that string into a JSON object upon which you can perform your custom query operations to get your output.
But in this case I will suggest you use MongoDB, which would be an ideal database storage solution here and lets you make such queries directly.
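The load-then-filter approach this answer describes can be sketched in Python. Both helpers are illustrative (names are mine), assuming the nodes structure shown in the question:

```python
import json

def node_state_is_true(nodes_json, node_id):
    """True when the row's "id" matches node_id and "state" is true."""
    data = json.loads(nodes_json)["data"]
    return data["id"] == node_id and data["state"] is True

def has_detail(nodes_json, value=None, description=None):
    """True when any entry in the "details" list matches the given
    value or description."""
    details = json.loads(nodes_json)["data"].get("details", [])
    return any(d.get("value") == value or d.get("description") == description
               for d in details)
```

You would run these checks over the rows returned by a plain SELECT and keep the ones that match.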