Update query to convert an existing complex JSON object to an array using SQL (MySQL)

I have a table that contains a JSON column. Currently, the column holds data with the schema below. The input (source) JSON looks like this:
{
"Q1": {
"id": 1,
"value": 12
},
"Q2": [{
"id": 1,
"value": 12
}, {
"id": 1,
"value": 12
}],
"Q3": {
"id": 1,
"additional": true
}
}
I have to perform a migration in which the Q1 key's value must be converted from a JSON object to a JSON array. I can't find any MySQL utility function that does this directly. Please suggest a suitable approach.
The result JSON should look like this:
{
"Q1": [{
"id": 1,
"value": 12
}],
"Q2": [{
"id": 1,
"value": 12
}, {
"id": 1,
"value": 12
}],
"Q3": {
"id": 1,
"additional": true
}
}
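There is no direct object-to-array conversion function, but one workable approach on MySQL 5.7+ is to wrap the existing value with JSON_ARRAY() and write it back with JSON_SET(). A minimal sketch, assuming the table is named my_table and the JSON column is named data (both names are assumptions):
-- Wrap the current Q1 object in a one-element array.
-- The JSON_TYPE guard skips rows where Q1 is already an array,
-- so the migration is safe to re-run.
UPDATE my_table
SET data = JSON_SET(data, '$.Q1', JSON_ARRAY(JSON_EXTRACT(data, '$.Q1')))
WHERE JSON_TYPE(JSON_EXTRACT(data, '$.Q1')) = 'OBJECT';
Q2 and Q3 are left untouched because the path only targets $.Q1.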

Related

PySpark read JSON: column name has invalid characters

I'm reading JSON in PySpark and seeing the issue below.
Column name "change(me)" contains invalid characters, please use alias to rename it
I have tried withColumnRenamed, but that does not seem to help:
df = spark.read.option("multiline","true").json("json_file")
df = df.withColumnRenamed("change(me)", "change_me")
Here is my sample JSON:
{
"1": {
"task": [
"wakeup",
"getready"
]
},
"2": {
"task": [
"brush",
"shower"
]
},
"3": {
"task": [
"brush",
"shower"
]
},
"activites": ["standup", "play", "sitdown"],
"statuscheck": {
"time": 60,
"color": 1002,
"change(me)": 9898
},
"action": ["1", "2", "3", "4"]
}
When I check the columns of my DataFrame, I do not see change(me), but it still complains about the invalid character.
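The field that triggers the error is nested inside the statuscheck struct, and withColumnRenamed only renames top-level columns, which is why it appears to do nothing. A hedged sketch that rebuilds the struct with the offending field renamed (the target name change_me is an assumption):
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.option("multiline", "true").json("json_file")

# Rebuild the statuscheck struct so the nested field gets a clean name;
# getField() can reference a field whose name contains parentheses.
df = df.withColumn(
    "statuscheck",
    F.struct(
        F.col("statuscheck.time").alias("time"),
        F.col("statuscheck.color").alias("color"),
        F.col("statuscheck").getField("change(me)").alias("change_me"),
    ),
)

df.printSchema()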

In T-SQL on SQL Server 2016, how do I reference the data after the # in the data.extendedProperty?

I receive the following JSON from a vendor API call. I can pull data.extendedProperty into a varchar column, but then I need extra code to break it down into entityId and reference. Is there a simpler way to have SQL reference that data?
Sample payload:
[
{
"data": {
"extendedProperty": "#{entityId=a56e2696-03ea-4550-8bd4-e52e52a49f9f; reference=/reference/extendedProperty/a56e2696-03ea-4550-8bd4-e52e52a49f9f; body=}",
"name": "A",
"displayOrder": 2
},
"id": {
"entityId": "6d960f08-1a6c-4b10-a8ab-dd6ce8cf5126",
"entityVersion": 1,
"entityOrigin": "BulkUpload",
"createdOn": "2015-04-29T18:51:31.77",
"deleted": false
}
},
{
"data": {
"extendedProperty": "#{entityId=f658af92-cb88-4f3d-9e71-e436e445a158; reference=/reference/extendedProperty/f658af92-cb88-4f3d-9e71-e436e445a158; body=}",
"name": "B",
"displayOrder": 2
},
"id": {
"entityId": "71cc0553-1d33-45b5-9097-2d0dbc11d84d",
"entityVersion": 1,
"entityOrigin": "BulkUpload",
"createdOn": "2018-12-10T21:44:38.607",
"deleted": false
}
}
]
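Since extendedProperty is not itself JSON but a "#{key=value; ...}" string, one option on SQL Server 2016 (compatibility level 130+) is to shred the outer JSON with OPENJSON and then cut the pieces out with CHARINDEX/SUBSTRING. A rough sketch, assuming the raw payload sits in dbo.VendorPayload(payload nvarchar(max)); the table and column names are assumptions:
SELECT
    j.name,
    -- entityId sits between 'entityId=' and the first ';'
    SUBSTRING(j.extendedProperty,
              CHARINDEX('entityId=', j.extendedProperty) + 9,
              CHARINDEX(';', j.extendedProperty)
                  - CHARINDEX('entityId=', j.extendedProperty) - 9) AS entityId,
    -- reference sits between 'reference=' and the following ';'
    SUBSTRING(j.extendedProperty,
              CHARINDEX('reference=', j.extendedProperty) + 10,
              CHARINDEX(';', j.extendedProperty, CHARINDEX('reference=', j.extendedProperty))
                  - CHARINDEX('reference=', j.extendedProperty) - 10) AS reference
FROM dbo.VendorPayload AS p
CROSS APPLY OPENJSON(p.payload)
     WITH (
         name             nvarchar(100) '$.data.name',
         extendedProperty nvarchar(max) '$.data.extendedProperty'
     ) AS j;
OPENJSON returns one row per array element, so each extendedProperty value is parsed independently.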

Insert JSON data type in Lighthouse (PHP)

I'm working on a GraphQL back-end API with Laravel and Lighthouse.
I have a table with a column named "config" that stores JSON data.
I'm trying to insert a new record into that table.
I have this mutation:
createOrderTemplate(name: String!, config: JSON!): OrderTemplate @create
The schema is:
type OrderTemplate {
id: ID!
name: String!
config: JSON!
}
I've tried the mutation in GraphQL Playground:
mutation{
createOrderTemplate(
name:"SomeName",
config:
[{
"w_name": "Name1",
"w_code": "001",
"place": "Place1",
"job": "job1",
"data": [
{
"name": "name1",
"quantity": 2
},
{
"name": "name2",
"quantity": 2
},
{
"name": "name3",
"quantity": 1
},
{
"name": "name4",
"quantity": 2
}
]
},
{
"w_name": "Name2",
"w_code": "002",
"place": "Place2",
"job": "job2",
"data": [
{
"name": "name1",
"quantity": 2
},
{
"name": "name2",
"quantity": 2
},
{
"name": "name3"
"quantity": 1
},
{
"name": "name4",
"quantity": 2
}
]
}]
){
id
name
config
}
}
When I type this, I get an error: everything is highlighted in red and nothing executes.
What am I doing wrong?
First, try running the mutation in Insomnia so you can see the exception that is thrown.
Also, I don't think config is receiving a valid value: a raw JSON literal with quoted keys is not valid GraphQL syntax. Try a simple value first, such as {key: "value"}.
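One hedged option is to move the payload out of the query document and into a variable, since raw JSON with double-quoted keys is not valid GraphQL literal syntax (which is why Playground marks everything red). A sketch, where the operation name is arbitrary and, depending on how the JSON scalar is implemented, the array may instead have to be passed as a JSON-encoded string:
mutation CreateOrderTemplate($name: String!, $config: JSON!) {
  createOrderTemplate(name: $name, config: $config) {
    id
    name
    config
  }
}
In the Query Variables pane, set name to "SomeName" and config to the array from the question.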

Convert JSON to XLS and then back to JSON

I have a JSON example that I would like to transform into an Excel file so that I can modify all the fields, and afterwards export the Excel file to get back an updated JSON file.
I tried an online tool (http://www.convertcsv.com/csv-to-json.htm), but the result is not good: I am able to create a CSV file, but not to convert the CSV file back to JSON.
Do you know a tool that can convert to CSV and then back to JSON?
JSON example:
[
{
"key": "keyExample",
"type": "typeExample",
"ref": "refExample",
"items": [
{
"itemRef": "aaa",
"count": 1,
"desc": "aaaaaaaaa"
},
{
"itemRef": "bbb",
"count": 2,
"desc": "bbbbbbb"
},
{
"itemRef": "ccc",
"count": 2,
"desc": "ccccccc"
}
]
},
{
"key": "keyExample2",
"type": "typeExample2",
"ref": "refExample2",
"items": [
{
"itemRef": "aaa",
"count": 1,
"desc": "aaaaaaaaa"
},
{
"itemRef": "bbb",
"count": 2,
"desc": "bbbbbbb"
},
{
"itemRef": "ccc",
"count": 2,
"desc": "ccccccc"
}
]
}
]
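If no online tool fits, a short script can round-trip the structure. A hedged Python sketch using pandas (the file names input.json, editable.xlsx and output.json are placeholders, and to_excel/read_excel need the openpyxl package installed): it flattens each nested items entry onto its own spreadsheet row and rebuilds the nesting afterwards.
import json

import pandas as pd

# Flatten: one spreadsheet row per item, with the parent key/type/ref repeated.
with open("input.json") as f:
    data = json.load(f)

flat = pd.json_normalize(data, record_path="items", meta=["key", "type", "ref"])
flat.to_excel("editable.xlsx", index=False)

# ...edit editable.xlsx in Excel, then rebuild the original nested shape...
edited = pd.read_excel("editable.xlsx")
rebuilt = [
    {
        "key": key,
        "type": typ,
        "ref": ref,
        "items": [
            {"itemRef": row["itemRef"], "count": int(row["count"]), "desc": row["desc"]}
            for _, row in grp.iterrows()
        ],
    }
    for (key, typ, ref), grp in edited.groupby(["key", "type", "ref"], sort=False)
]

with open("output.json", "w") as f:
    json.dump(rebuilt, f, indent=2)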

How do I save multiple JSON objects into CouchDB? Or: why does CouchDB not accept outer brackets in JSON?

I'm trying to save this to a Cloudant database:
[
{
"_id": "document_one",
"heynow": [
{
"Name": "one",
"Duration": 2,
"DurationUnit": "Hours"
},
{
"Name": "two",
"Duration": 40,
"DurationUnit": "Minutes"
}
]
},
{
"_id": "document_two",
"heynow": [
{
"Name": "three",
"Duration": 2,
"DurationUnit": "Hours"
},
{
"Name": "four",
"Duration": 40,
"DurationUnit": "Minutes"
}
]
}
]
But apparently it doesn't like the outer brackets, because it tells me:
"error":"bad_request","reason":"Document must be a JSON object"
JSONLint says the document is valid, so I'm asking whether anyone knows how to format this so it can be stored in CouchDB, since the outer brackets seem to be causing the problem.
CouchDB has a bulk documents API;
https://wiki.apache.org/couchdb/HTTP_Bulk_Document_API
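For reference, the bulk endpoint expects the array wrapped in a top-level object under a docs key, which is also what the "Document must be a JSON object" error is hinting at. A sketch with curl (the server URL and database name mydb are placeholders; on Cloudant use your account URL and credentials):
# _bulk_docs takes a top-level object with a "docs" array, not a bare array.
curl -X POST "http://localhost:5984/mydb/_bulk_docs" \
  -H "Content-Type: application/json" \
  -d '{
    "docs": [
      {
        "_id": "document_one",
        "heynow": [
          { "Name": "one", "Duration": 2, "DurationUnit": "Hours" },
          { "Name": "two", "Duration": 40, "DurationUnit": "Minutes" }
        ]
      },
      {
        "_id": "document_two",
        "heynow": [
          { "Name": "three", "Duration": 2, "DurationUnit": "Hours" },
          { "Name": "four", "Duration": 40, "DurationUnit": "Minutes" }
        ]
      }
    ]
  }'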