How to preserve JSON object order in Robot Framework

I am working with an API that requires the JSON key/value pairs to be ordered when creating resources. The API provides a method (called new) that lets you make a GET request that returns an object model. I would like to update the model with values within my Robot Framework test cases. Is there a native way within Robot Framework to make a GET request and preserve the JSON object order sent by the server? Here's an example of the JSON response to the GET new method:
{
    "account": {
        "#id": "",
        "#uri": "",
        "#oldID": "",
        "person": {
            "#id": "",
            "#uri": ""
        },
        "accountType": {
            "#id": "",
            "#uri": "",
            "name": null
        },
        "accountName": "",
        "createdDate": null,
        "createdByPerson": {
            "#id": "",
            "#uri": ""
        },
        "lastUpdatedDate": null,
        "lastUpdatedByPerson": {
            "#id": "",
            "#uri": ""
        }
    }
}
If I use the following, the key values automatically get sorted:
${r}=    GET Request    MySession    /accounts/new

For anyone else who needs ordered JSON in Robot Framework, I was able to achieve it with the following (thanks to help in the comments):
&{object}=    Evaluate    json.loads('''${r.text}''', object_pairs_hook=collections.OrderedDict)    modules=json, collections
Ultimately, though, this kind of logic is probably best suited to a custom helper library.
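A minimal sketch of what such a helper library could look like, assuming the endpoint is reachable with plain requests; the file, class and keyword names here (OrderedJsonLibrary, Get Ordered Json) are made up for illustration and are not part of RequestsLibrary:

# OrderedJsonLibrary.py -- illustrative custom keyword library
import collections
import json

import requests


class OrderedJsonLibrary:
    """Keywords for JSON whose object (key) order must be preserved."""

    def get_ordered_json(self, base_url, uri):
        """GET ``base_url + uri`` and return the body as an OrderedDict."""
        response = requests.get(base_url + uri)
        response.raise_for_status()
        return json.loads(response.text,
                          object_pairs_hook=collections.OrderedDict)

A suite would import it with Library    OrderedJsonLibrary.py and call it as ${model}=    Get Ordered Json    https://example.test    /accounts/new. Note that on Python 3.7+ json.loads already returns dicts that preserve insertion order, so the object_pairs_hook is mainly needed on older interpreters.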

Related

ADF pipeline: get dynamic count of input parameters

I have a pipeline to which I want to pass a JSON from a 3rd-party application.
A simple JSON example (the keys could be different for each call):
{
    "name": "Here is a name",
    "guaid": "123456-123456-123456-111112",
    "owner": "my.email@example.com",
    "description": "here comes my description"
}
Passing the JSON is not the problem, but I do not want to configure each key in the JSON as a pipeline parameter inside my ADF, because that is not needed.
I do not modify the JSON inside my pipeline; I just have to surround it with some other parameters, so I need it as a whole.
I could of course define a parameter for each key in the JSON and concatenate them again, but why make the effort if it is not needed?
I also cannot modify how the JSON is passed to my pipeline, so it is not possible to pass the whole JSON inside one parameter like this:
{
    "inputParam": "{\"name\": \"Here is a name\",\"guaid\": \"123456-123456-123456-111112\",\"owner\": \"my.email@example.com\",\"description\":\"here comes my description\"}"
}
So is it possible to receive the whole input without "knowing" its keys and use it in my activities?
My pipeline is simple; I only need Set variable and Web activities, so I want to avoid a complex solution.
The JSON output should look like this:
{
    "processingMode": "full",
    "version": "1.0.0",
    "content": [
        {
            "type": "Application",
            "id": "akjhajf-ffsfsfs-sf-sf-sf",
            "data": {
                "name": "Here is a name",
                "guaid": "123456-123456-123456-111112",
                "owner": "my.email@example.com",
                "description": "here comes my description"
            }
        }
    ]
}
I can achieve this by putting the following in a Set variable activity, using the method I mentioned above that is not allowed:
{
    "processingMode": "full",
    "version": "1.0.0",
    "content": [
        {
            "type": "Application",
            "id": "@{guid()}",
            "data": @{pipeline().parameters.inputParam}
        }
    ]
}
I then just use the resulting JSON to call an external web service via a Web activity.
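For reference, the transformation the pipeline has to perform is nothing more than wrapping the untouched input object in a fixed envelope. Here is a minimal sketch of the same logic outside ADF, in Python; build_payload is a made-up helper name used only to illustrate the shape of the output:

import json
import uuid


def build_payload(input_param):
    """Wrap the raw input JSON object in the envelope the web service expects."""
    envelope = {
        "processingMode": "full",
        "version": "1.0.0",
        "content": [
            {
                "type": "Application",
                "id": str(uuid.uuid4()),  # counterpart of ADF's guid()
                "data": input_param,      # the input object, untouched
            }
        ],
    }
    return json.dumps(envelope)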

How to search an Azure Table Storage JSON response in a Logic App?

I am getting the response below from Azure Table Storage. I need to search the response using the GeneralId and get back true or false depending on whether the entity is available in the table or not.
{
    "odata.metadata": "https://google.net/$metadata#GetStudents",
    "value": [
        {
            "odata.etag": "W/\"datetime'2019-05-01T18%3A04%3A37.5904256Z'\"",
            "PartitionKey": "mypartitionkey",
            "RowKey": "myrowkey",
            "Timestamp": "",
            "GeneralId": "456265d8-6c3b-11e9-a923-1681be663d3e",
            "Inc": "PIR165461",
            "Name": "",
            "StudentId": "c17a3c42-6c48-11e9-a923-1681be663d3e",
            "Subject": ""
        },
        {
            "odata.etag": "W/\"datetime'2019-04-30T16%3A49%3A10.0746254Z'\"",
            "PartitionKey": "par1",
            "RowKey": "row1",
            "Timestamp": "2019-04-30T16:49:10.0746254Z",
            "Generald": "fada7dd0-6c48-11e9-a923-1681be663d3e",
            "Inc": "PIR4237341",
            "Name": "",
            "StudentId": "c70c5de9-ac8d-4432-9f3c-1f8bede83504",
            "Subject": ""
        }
    ]
}
I guess you want to check whether one entity exists in the table, so you could take the partition key and row key values from the JSON and check against them. Below is the workflow.
After the Get entities action, add a For each action and, in its Input, choose the dynamic content Get entities result List of Entities.
Then add a Condition action to determine whether the entity you want is in the table. Use two conditions: one on the partition key and the other on the row key value.
After this you can add actions under If true or If false.
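For reference, the overall check the question asks for (does an entity with a given GeneralId exist?) is equivalent to this minimal Python sketch; entity_exists is an assumed helper name and response_text is the Table Storage JSON shown above:

import json


def entity_exists(response_text, general_id):
    """Return True if any entity in the response carries the given GeneralId."""
    body = json.loads(response_text)
    return any(entity.get("GeneralId") == general_id
               for entity in body.get("value", []))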

Parse value from JSON object with dynamic key in VBScript

I want to get a value from JSON with a dynamic key in VBScript. I tried to find a similar question in case anybody had already asked this, but found nothing for VBScript.
Below is a sample JSON:
{
    "assessmenttype": [{
        "id": "129666",
        "formattedvalue": "wT",
        "value": "WT"
    }],
    "jobid": "2017-2752",
    "jobtitle": "XYZ",
    "links": [{
        "rel": "self",
        "title": "The current profile being viewed.",
        "url": "https://dummyUrl.com/customers"
    }],
    "field33005": {
        "id": "C121",
        "formattedvalue": "XYZ",
        "value": "XYZ"
    }
}
In the above JSON (which is client-specific), for one client the node name is field33005, but for another client this field name might be field38045, and so on. So the challenge is to get the value of the "value" sub-field inside this field33005 custom field.
Please help me, as I am not experienced in JSON parsing with VBScript.
Note: for JSON parsing I am using the json2-min.js library.
To answer my own question, I wrote a function in JavaScript, since we can call a JS function from VBScript in classic ASP.
<script runat="server" language="javascript">
function getJSONObject(targetJSONObject, propName)
{
    // Walk the object's properties and return the "value" sub-field
    // of the property whose name matches propName.
    for (var prop in targetJSONObject)
    {
        if (prop === propName)
        {
            return targetJSONObject[prop].value;
        }
    }
    return "";
}
</script>
To the above method we pass the actual JSON object and the name of the custom field, and it returns the "value" sub-node of that custom field.

Reading complex JSON data without iteration

I am working with data that is often nested, and I am required to perform CRUD operations based on the structure of the data I have. For instance, I have this JSON structure:
{
    "_id": "KnNLkJEhrDsvWedLu",
    "createdAt": {
        "$date": "2016-10-13T11:24:13.843Z"
    },
    "services": {
        "password": {
            "bcrypt": "$2a$30$1/cniPwPNCuwZ/MQDPQkLej..cAATkoGX.qD1TS4iHgf/pwZYE.j."
        },
        "email": {
            "verificationTokens": [
                {
                    "token": "qxe_T9IS7jW7gntpK0Q7UQ35RJ9jO9m2lclnokO3z87",
                    "address": "drwho@gmail.com",
                    "when": {
                        "$date": "2016-10-13T11:24:14.428Z"
                    }
                }
            ]
        },
        "resume": {
            "loginTokens": []
        }
    },
    "username": "doctorwho",
    "emails": [
        {
            "address": "drwho@gmail.com",
            "verified": false
        }
    ],
    "persodata": {
        "lastlogin": {
            "$date": "2016-10-13T11:29:36.816Z"
        },
        "fname": "Doctor",
        "lname": "Who",
        "mobile": "+4480000000",
        "identity": "1",
        "email": "drwho@gmail.com",
        "gender": null
    }
}
I have several data sets with such a complex structure. I need to read the data, edit it and also delete parts of it. Before I get to iteration, I was wondering how I can read the data without iteration and then iterate only when I absolutely have to.
What rules should I keep in mind when reading such complex JSON structures, so that I can read any complex structure I come across?
I am currently using JavaScript, but I am looking for rules that apply in other languages as well.
Parsing JSON in JavaScript should be easy: http://www.json.org/js.html.
"Since JSON is a proper subset of JavaScript, the compiler will correctly parse the text and produce an object structure". Just follow the examples on that page.
If you want to use another language, in Java you could use Jackson or Gson to map those JSON strings to objects; using them then becomes easy. Both libraries are annotation-based and wouldn't be difficult to set up.
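To make the "read directly, iterate only when you must" rule concrete, here is a minimal sketch against a trimmed-down version of the document above. It is shown in Python for brevity; the same key/index path access works in JavaScript with dot or bracket notation:

import json

# A trimmed-down version of the document from the question.
raw = """
{
  "services": {"email": {"verificationTokens": [{"address": "drwho@gmail.com"}]}},
  "persodata": {"fname": "Doctor", "lname": "Who"},
  "emails": [{"address": "drwho@gmail.com", "verified": false}]
}
"""

doc = json.loads(raw)

# Read: follow the path of keys and indexes directly, no loop needed.
address = doc["services"]["email"]["verificationTokens"][0]["address"]

# Edit: assign through the same path.
doc["emails"][0]["verified"] = True

# Delete: remove a key through the same path.
del doc["persodata"]["lname"]

# Iterate only when the keys are genuinely unknown or you need all of them.
for key, value in doc["persodata"].items():
    print(key, value)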

How to store a JSON file using Lift Mapper in MySQL

I am new to Lift. I want to store a JSON file in a MySQL database using Lift Mapper.
My JSON file looks like this:
[
    {
        "name": "Root Category",
        "Id": "1",
        "dispName": "",
        "childs": [
            {
                "name": "Sub Category",
                "Id": "",
                "dispName": "",
                "childs": [
                    {
                        "name": "Spec1",
                        "Id": "",
                        "dispName": "",
                        "childs": []
                    }
                ]
            }
        ]
    },
    {
        "name": "Root Category",
        "Id": "",
        "dispName": "",
        "childs": [
            {
                "name": "Sub Category",
                "Id": "",
                "dispName": "",
                "childs": [
                    {
                        "name": "Spec1",
                        "Id": "",
                        "dispName": "",
                        "childs": []
                    }
                ]
            }
        ]
    }
]
Is it possible to store a JSON file with Lift Mapper? Please give me suggestions; it would be great if someone could provide a sample.
At the moment there is no good support for storing JSON in MySQL; it's not going to provide the capabilities MongoDB provides, for example. However, there are some community-provided JSON processing functions if you want them. Given all that, you can store the JSON in a VARCHAR, TEXT or BLOB field as plain text. Here is a Mapper example:
import net.liftweb.mapper._
import net.liftweb.common._

class SomeDbClass extends LongKeyedMapper[SomeDbClass] with IdPK {
    def getSingleton = SomeDbClass

    // set limit of chars - can be used in `validate()`
    object quota_type extends MappedString(this, 1024)
}

object SomeDbClass extends SomeDbClass with LongKeyedMetaMapper[SomeDbClass]
For one of my projects I similarly store JSON as a string in Postgres, because I just need to read and write it without parsing it in the DB or querying by fields. Whenever I need efficient JSON storage with query and update support, I use MongoDB with Record plus Casbah or Rogue.