How to store a JSON file using Lift Mapper in MySQL - mysql

I am new to Liftweb. I want to store a JSON file in a MySQL database using Lift Mapper.
My JSON file looks like this:
[
  {
    "name": "Root Category",
    "Id": "1",
    "dispName": "",
    "childs": [
      {
        "name": "Sub Category",
        "Id": "",
        "dispName": "",
        "childs": [
          {
            "name": "Spec1",
            "Id": "",
            "dispName": "",
            "childs": []
          }
        ]
      }
    ]
  },
  {
    "name": "Root Category",
    "Id": "",
    "dispName": "",
    "childs": [
      {
        "name": "Sub Category",
        "Id": "",
        "dispName": "",
        "childs": [
          {
            "name": "Spec1",
            "Id": "",
            "dispName": "",
            "childs": []
          }
        ]
      }
    ]
  }
]
Is it possible to store a JSON file with Lift Mapper? Please give me suggestions. It would be great if someone could provide a sample.
Best Regards,
GSY

At the moment there is no good support for storing JSON in MySQL; it's not going to provide the capabilities that MongoDB provides, for example. However, there are some community-provided JSON processing functions if you want them. Given all that, you can store the JSON in a VARCHAR, TEXT, or BLOB column as plain text. Here is a Mapper example:
import net.liftweb.mapper._
import net.liftweb.common._

class SomeDbClass extends LongKeyedMapper[SomeDbClass] with IdPK {
  def getSingleton = SomeDbClass

  // set limit of chars - can be used in `validate()`
  object quota_type extends MappedString(this, 1024)
}
object SomeDbClass extends SomeDbClass with LongKeyedMetaMapper[SomeDbClass]
For one of my projects I store JSON as a string in Postgres in the same way, because I just need to read and write it without parsing it in the DB or querying by fields. Whenever I need efficient JSON storage with query and update support, I use MongoDB with Record + (Casbah or Rogue).
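To round-trip the document, here is a minimal sketch (assuming lift-json is on the classpath; JsonStorage, saveJson, and loadJson are illustrative names, not Lift API):

import net.liftweb.json._
import net.liftweb.mapper._
import net.liftweb.common._

object JsonStorage {
  // Parse first so invalid JSON fails before it reaches the DB,
  // then store the normalized compact text in the column above.
  def saveJson(raw: String): SomeDbClass = {
    val parsed = parse(raw)
    SomeDbClass.create.quota_type(compact(render(parsed))).saveMe()
  }

  // Read the row back and re-parse the stored text into a JValue.
  def loadJson(id: Long): Box[JValue] =
    SomeDbClass.find(By(SomeDbClass.id, id))
      .map(row => parse(row.quota_type.get)) // `.is` on older Lift versions
}

For documents that can exceed the 1024-character limit above, MappedText (an unbounded TEXT column) is the usual substitute for MappedString.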

Related

Angular JSON pipe with angular-datatables

I'm using angular-datatables to display NoSQL de-normalized data in a grid for visualization purposes. I have a few complex nested JSON objects and want to display specific cells with prettified JSON using the built-in JsonPipe.
I'm using datatables with data binding as follows:
<table id="dtTable" datatable [dtOptions]="data"></table>
Sample JSON
[
  {
    "Id": "1",
    "Name": "Test",
    "Account": {
      "Id": "12",
      "Name": "Stackoverflow",
      "Contact": {
        "Id": "23",
        "Name": "stack exchange",
        "Phone1": "712426",
        "Phone2": "490591",
        "Address": {
          "Id": "12",
          "Name": "Address 1",
          "AddressType": "commercial"
        }
      }
    },
    "CreatedBy": {
      "Id": "123",
      "Name": "User 1"
    },
    "CreatedDate": "2022-04-11T10:42:28.7525823Z",
    "ModifiedBy": {
      "Id": "124",
      "Name": "User 2"
    },
    "ModifiedDate": "2022-04-11T10:42:28.7525823Z"
  },
  {
    ...
  },
  ...
]
I want to render it as:

Id | Name | Account           | Created By
1  | Test | {prettified JSON} | {JSON}
Is there any option to render entire JSON content in a specific column cell using angular-datatables? Or is there any option other than JsonPipe to display formatted JSON content in angular-datatables?
Yes, you can use the built-in pipe as you mentioned.
In HTML directly:
...
<td>{{accountInfo | json}}</td>
...
Or in the TS file, using the transform function:
...
this.accountInfo = this.jsonPipe.transform(response.accountInfo)
...
If you use the transform function, be sure that the pipe is provided and injected properly in the TS file.
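Since a dtOptions-driven table renders its cells through the DataTables config rather than through Angular templates, interpolation like the <td> above won't reach plugin-rendered cells. A sketch of doing it inside dtOptions instead (the column titles and field names below are assumptions based on the sample JSON, not a confirmed setup):

import { JsonPipe } from '@angular/common';

const jsonPipe = new JsonPipe();
const rows: any[] = [/* the records from the sample JSON */];

// Bound in the template as [dtOptions]="data"
const data: any = {
  data: rows,
  columns: [
    { title: 'Id', data: 'Id' },
    { title: 'Name', data: 'Name' },
    {
      title: 'Account',
      data: 'Account',
      // DataTables render callback: pretty-print the nested object and
      // wrap it in <pre> so the indentation survives inside the cell
      render: (cell: any) => '<pre>' + jsonPipe.transform(cell) + '</pre>'
    },
    {
      title: 'Created By',
      data: 'CreatedBy',
      render: (cell: any) => '<pre>' + jsonPipe.transform(cell) + '</pre>'
    }
  ]
};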

replace "key" name in whole JSON python for bulk data in efficient way

I am pushing data to another system, but before pushing I have to change the "key" names in the whole JSON. The JSON may contain 200, 10,000, or 250,000 records.
Sample JSON:
{
  "insert": "table",
  "contacts": [
    {
      "testName": "testname",
      "ContactID": 212121
    },
    {
      "testName": "testname",
      "ContactID": 2146354564
    },
    {
      "testName": "testname",
      "ContactID": 12312
    },
    {
      "testName": "testname",
      "ContactID": 211221
    },
    {
      "testName": "testname",
      "ContactID": 10218550
    }
  ]
}
I need to change the keys in the contacts array. These contacts may come in bulk, so I need to handle this efficiently with minimal complexity.
The above JSON should be converted to the following:
{
  "insert": "table",
  "contacts": [
    {
      "name": "testname",
      "phone": 212121
    },
    {
      "name": "testname",
      "phone": 2146354564
    },
    {
      "name": "testname",
      "phone": 12312
    },
    {
      "name": "testname",
      "phone": 211221
    },
    {
      "name": "testname",
      "phone": 10218550
    }
  ]
}
Here is my code, trying it with a loop:
ini_dict = request.data
contact_data = ini_dict['contacts']
for i in contact_data:
    i['name'] = i.pop('testName')
    i['phone'] = i.pop('ContactID')  # the second key needs the same treatment
print(contact_data)
Please suggest how I can change the key names efficiently for bulk data, i.e. for 50,000 entries in contacts. A "for loop" will lead to a performance issue, so please let me know an efficient way to achieve this.
I don't know how fast you need it to be, nor how you are choosing to store your JSON. One simple solution is to just store it as a string and then replace all instances of your attribute names.
# Something like this, using a JSON string. Note that str.replace
# returns a new string, so the result must be reassigned.
jsonstring = jsonstring.replace('"testName":', '"name":')
jsonstring = jsonstring.replace('"ContactID":', '"phone":')
If you want to do this in bulk, you may need to create a batch process that can fetch multiple existing records and make the changes at once. I have done this before with the Java equivalent of https://pypi.org/project/JayDeBeApi/, but that was more for modifying existing records in a database.
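Putting the pieces together, a minimal end-to-end sketch (rename_keys is an illustrative helper; it assumes the literal substrings "testName" and "ContactID" never occur inside values):

import json

def rename_keys(payload: dict) -> dict:
    # Serialize once, swap the quoted key names, parse back.
    # str.replace runs in C, so this is usually faster than a
    # Python-level loop over very large contact lists.
    raw = json.dumps(payload)
    raw = raw.replace('"testName":', '"name":')
    raw = raw.replace('"ContactID":', '"phone":')
    return json.loads(raw)

# Usage sketch: data = rename_keys(request.data)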

JSON: is it best practice to give each element in an array an id attribute?

Is it best practice in JSON to give objects in an array an id, similar to below? I'm trying to decide on a JSON format for a RESTful service I'm implementing, and whether to include it or not... If the data is to be modified by CRUD operations, is it a good idea?
{
  "tables": [
    {
      "id": 1,
      "tablename": "Table1",
      "columns": [
        {
          "name": "Col1",
          "data": "-5767703747778052096"
        },
        {
          "name": "Col2",
          "data": "-5803732544797016064"
        }
      ]
    },
    {
      "id": 2,
      "tablename": "Table2",
      "columns": [
        {
          "name": "Col1",
          "data": "-333333"
        },
        {
          "name": "Col2",
          "data": "-44444"
        }
      ]
    }
  ]
}
Client-Generated IDs

A server MAY accept a client-generated ID along with a request to create a resource. An ID MUST be specified with an "id" key, the value of which MUST be a universally unique identifier. The client SHOULD use a properly generated and formatted UUID as described in RFC 4122 [RFC4122].

(Source: jsonapi.org)
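For illustration, a create request with a client-generated ID in the JSON:API style might look like this (a sketch following jsonapi.org conventions rather than the question's exact schema; the UUID is just an example value):

{
  "data": {
    "type": "tables",
    "id": "550e8400-e29b-41d4-a716-446655440000",
    "attributes": {
      "tablename": "Table1"
    }
  }
}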

Talend: parse JSON string to multiple outputs

I'm aware of this question, but I don't believe there is no solution with standard components. I'm using Talend ESB Studio 5.4.
I have to parse a JSON string from a REST web service into multiple outputs and add them to a database.
The database has two tables:
User (user_id, name, card, card_id, points)
Action (user_id, action_id, description, used_point)
My JSON structure is something like this:
{
  "users": [
    {
      "name": "foo",
      "user_id": 1,
      "card": {
        "card_id": "AAA",
        "points": 10
      },
      "actions": [
        {
          "action_id": 1,
          "description": "buy",
          "used_points": 2
        },
        {
          "action_id": 3,
          "description": "buy",
          "used_points": 1
        }
      ]
    },
    {
      "name": "bar",
      "user_id": 2,
      "card": {
        "card_id": "BBB",
        "points": -1
      },
      "actions": [
        {
          "id": 2,
          "description": "sell",
          "used_point": 5
        }
      ]
    }
  ]
}
I have tried to add a JSON Schema metadata entry, but it is not clear to me how to "flatten" the JSON. I have tried to look at tXMLMap and tExtractJSONFields... but no luck so far.
I also had a look at tJavaRow, but I don't understand how to build a schema for that.
It's a pity, because up to now I have been loving Talend! Any advice?
You can save a JSON file to your disk, then create a new JSON file in the metadata of Talend Studio; the wizard retrieves the schema for you. After saving, you can copy the schema into a generic schema in the metadata, and it's done: use that generic schema wherever you want, for example in the tRestClient component.

Can we add an array of objects in Amazon CloudSearch in JSON format?

I am trying to create a domain and upload sample data which looks like this:
[
  {
    "type": "add",
    "id": "1371964",
    "version": 1,
    "lang": "eng",
    "fields": {
      "id": "1371964",
      "uid": "1200983280",
      "time": "2013-12-23 13:00:26",
      "orderid": "1200983280",
      "callerid": "66580662",
      "is_called": "1",
      "is_synced": "1",
      "is_sent": "1",
      "allcaller": [
        {
          "sno": "1085770",
          "uid": "1387783883.30547",
          "lastfun": null,
          "callduration": "00:00:46",
          "request_id": "1371964"
        }
      ]
    }
  }
]
When I upload the sample data while creating a domain, CloudSearch does not accept it.
If I remove the allcaller array, then it accepts the data smoothly.
If CloudSearch does not allow object arrays, then how should I format this JSON?
I just found, after searching the AWS forums, that CloudSearch does not allow nested JSON (object arrays) :(
https://forums.aws.amazon.com/thread.jspa?messageID=405879&#405879
Time to try Elasticsearch.
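One common workaround before switching engines (a sketch; the allcaller_* field names are made up, and each would need to be defined as a multi-value field in the domain) is to flatten the nested objects into parallel arrays of plain values, which CloudSearch does accept:

[
  {
    "type": "add",
    "id": "1371964",
    "version": 1,
    "lang": "eng",
    "fields": {
      "uid": "1200983280",
      "callerid": "66580662",
      "allcaller_sno": ["1085770"],
      "allcaller_uid": ["1387783883.30547"],
      "allcaller_callduration": ["00:00:46"],
      "allcaller_request_id": ["1371964"]
    }
  }
]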