Using JSON Data with ColdFusion

I've worked with JSON data in the past - mainly 'fudging' my way to a solution, and not really understanding why or how things work. I've come across an issue where the data being returned looks somewhat different from what I've seen before, and I can't find any examples that match it.
Here's an example of the data returned (via an API):
{"domain.co.uk":{"status":"available","classkey":"thirdleveldotuk"},"domain.net":{"status":"available","classkey":"dotnet"},"domain.com":{"status":"available","classkey":"domcno"}}
On my front-end, I need to output something like this:
domain.co.uk - available
domain.net - available
domain.com - available
Because the 'domain.com' etc. values will always change, I can't map the names as I usually would (although there will always be 3 'rows' returned).
I've checked every CF Book I own, and read the online CF Docs, but I'm totally at a loss as to where to even start with this one!
Pointers much appreciated!

If you run this through deserializeJSON(data), you'll see that you just end up with a structure containing nested structures. So, you can loop through your struct, grab the keys, and then grab each key's status. In JSON terms, your JSON object has nested objects.
<cfset data = deserializeJSON(apiData) />
<cfset formattedData = [] />
<cfset tmp = {} />
<!--- loop over the struct keys (the domain names) and collect each domain/status pair --->
<cfloop collection="#data#" item="domain">
    <cfset tmp.domain = domain />
    <cfset tmp.status = data[domain]["status"] />
    <!--- duplicate() so each array element gets its own copy rather than a reference to tmp --->
    <cfset arrayAppend(formattedData, duplicate(tmp)) />
</cfloop>
<cfdump var="#formattedData#" />

(This is really more of a comment, but is a bit too long ... )
I've worked with JSON data in the past - mainly 'fudging' my way to a solution, and not really understanding why or how things work
JSON strings are essentially just representations of two kinds of objects:
arrays which are denoted by [] and
structures (or objects) which are denoted by {}
Looking at the API string, the braces {} indicate you are dealing with a structure:
{ "theKey": "theValue" }
In your case, the domain name is the structure key:
{ "domain.co.uk": "theValue" }
.. and the value is a nested structure containing two static keys: "status" and "classkey"
{ "theKey": {"status":"available","classkey":"thirdleveldotuk"} }
As with any structure, you can iterate through the keys dynamically using a for..in loop in cfscript, or a collection loop if you prefer tag-based CFML.
for (theKey in theStruct) {
    WriteDump( theKey ); // ie "domain.co.uk"
}
Then inside the loop use associative array notation to grab the value, ie:
theStatus = theStruct[ theKey ]["status"]; // "available"
// ... OR
theValue = theStruct[ theKey ];
theStatus = theValue.status;
That is all there is to it. You can use similar logic to access any type of nested structures.
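Putting those pieces together, here is a minimal cfscript sketch (assuming the raw API response is held in a variable named apiData, which is an illustrative name, not necessarily yours) that produces the "domain - status" output from the question:
<cfscript>
    // apiData is assumed to hold the raw JSON string returned by the API
    theStruct = deserializeJSON( apiData );

    // the keys are the (changing) domain names; each value is a struct with "status" and "classkey"
    for ( theKey in theStruct ) {
        writeOutput( theKey & " - " & theStruct[ theKey ].status & "<br>" );
    }
</cfscript>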

Related

How to access the key of a jsoncpp Value

I kind of feel stupid for asking this, but I haven't been able to find a way to get the key of a JSON value. I know how to retrieve the key if I have an iterator over the object. I also know of operator[].
In my case the key is not a known value, so I can't use get(const char *key) or operator[]. I also can't find a getKey() method.
My JSON looks like this:
{Obj_Array: [{"122":{"Member_Array":["241", "642"]}}]}
For the piece of code that parses {"122":{"Member_Array":["241", "642"]}}, I want to use a get_key()-like function just to retrieve "122", but it seems like I have to use an iterator, which to me seems to be overkill.
I might have a fundamental lack of understanding of how jsoncpp is representing a JSON file.
First, what you have won't parse in JsonCPP. Keys must always be enclosed in double quotes:
{"Obj_Array": [{"122":{"Member_Array":["241", "642"]}}]}
Assuming that was just an oversight, if we add whitespace and tag the elements:
{
root->        "Obj_Array" : [
elem0->           {
key0->                "122":
val0->                    {
key0.1->                      "Member_Array" :
val0.1->                          [
elem0.1.0->                           "241",
elem0.1.1->                           "642"
                                  ]
                          }
                  }
              ]
}
Assuming you have managed to read your data into a Json::Value (let's call it root), each of the tagged values can be accessed like this:
elem0     = root[0];              // assuming root is the "Obj_Array" array tagged above
val0      = elem0["122"];
val0_1    = val0["Member_Array"];
elem0_1_0 = val0_1[0];
elem0_1_1 = val0_1[1];
You'll notice that this only retrieves values; the keys were known a priori. This is not unusual: the keys define the schema of the data, and you have to know them to access the values directly.
In your question, you state that this is not an option, because the keys are not known. Applying semantic meaning to unknown keys could be challenging, but you already came to the answer. If you want to get the key values, then you do have to iterate over the elements of the enclosing Json::Value.
So, to get to key0, you need something like this (untested):
elem0_members = elem0.getMemberNames();
key0 = elem0_members[0];
This isn't production quality, by any means, but I hope it points in the right direction.
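For completeness, here is a rough, compilable sketch along those lines (untested; it assumes the corrected JSON above is available as a string literal and uses the classic Json::Reader API):
#include <json/json.h>
#include <iostream>
#include <string>

int main() {
    // The corrected JSON from above, as a string literal.
    const std::string text =
        "{\"Obj_Array\": [{\"122\":{\"Member_Array\":[\"241\", \"642\"]}}]}";

    Json::Value doc;
    Json::Reader reader;
    if (!reader.parse(text, doc)) {
        std::cerr << reader.getFormattedErrorMessages();
        return 1;
    }

    // elem0 from the tagged layout above: the first element of "Obj_Array".
    const Json::Value& elem0 = doc["Obj_Array"][0u];

    // The key ("122") is not known in advance, so iterate the member names.
    for (const std::string& key : elem0.getMemberNames()) {
        std::cout << "key: " << key << '\n';                    // "122"
        const Json::Value& members = elem0[key]["Member_Array"];
        for (const Json::Value& m : members) {
            std::cout << "  member: " << m.asString() << '\n';  // "241", "642"
        }
    }
    return 0;
}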

What's the correct JsonPath expression to search a JSON root object using Newtonsoft.Json.NET?

Most examples deal with the book store example from Stefan Gössner; however, I'm struggling to define the correct JSONPath expression for a simple object (no array):
{ "Id": 1, "Name": "Test" }
I want to check whether this JSON contains Id = 1.
I tried the following expression: $..?[(@.Id == 1]), but this does not find any matches using Json.NET.
I also tried Manatee.Json for parsing, and there it seems the JSONPath expression could be something like $[?($.Id == 1)]?
The path that you posted is not valid. I think you meant $..[?(@.Id == 1)] (some characters were out of order). My answer assumes this.
The JSON Path that you're using indicates that the item you're looking for should be in an array.
$           start
..          recursive search (1)
[           array item specification
?(          item-based query
@.Id == 1   where the item is an object with an "Id" with value == 1 at the root
)           end item-based query
]           end array item specification
(1) the conditions following this could match a value no matter how deep in the hierarchy it exists
You want to just navigate the object directly. Using $.Id will return 1, which you can validate in your application.
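For example, a minimal Json.NET sketch of that direct navigation (variable names are purely illustrative):
using Newtonsoft.Json.Linq;

var obj = JObject.Parse("{ \"Id\": 1, \"Name\": \"Test\" }");
var id = (int?)obj.SelectToken("$.Id");   // returns 1
bool hasIdOne = (id == 1);                // validate in your application code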
All of that said...
It sounds to me like you want to validate that the Id property is 1 rather than to search an array for an object where the Id property is 1. To do this, you want JSON Schema, not JSON Path.
JSON Path is a query language for searching for values which meet certain conditions (e.g. an object where Id == 1).
JSON Schema is for validating that the JSON meets certain requirements (i.e. that your data is in the right shape). A JSON Schema to validate that your object has an Id of 1 could be something like:
{
  "properties": {
    "Id": { "const": 1 }
  }
}
Granted this isn't very useful because it'll only validate that the Id property is 1, which ideally should only be true for one object.

Azure tables unable to store flattened JSON

I am using the npm flat package, and arrays/objects are flattened, but the object/array keys are surrounded by quotes, like in 'task_status.0.date', using the object below.
These specific fields do not get stored into Azure Tables - other fields go through, but these are silently ignored. How would I fix this?
var obj1 = {
    "studentId": "abc",
    "task_status": [
        {
            "status": "Current",
            "date": 516760078
        },
        {
            "status": "Late",
            "date": 1516414446
        }
    ],
    "student_plan": "n"
}
Here is how I am using it (simplified code example). Again, the entity successfully gets written to the table, but the flattened properties are not written (see further below):
var flatten = require('flat')
newObj1 = flatten(obj1);
var entGen = azure.TableUtilities.entityGenerator;
newObj1.PartitionKey = entGen.String(uniqueIDFromMyDB);
newObj1.RowKey = entGen.String(uniqueStudentId);
tableService.insertEntity(myTableName, newObj1, myCallbackFunc);
In the above example, the flattened object would look like:
var obj1 = {
    studentId: "abc",
    'task_status.0.status': 'Current',
    'task_status.0.date': 516760078,
    'task_status.1.status': 'Late',
    'task_status.1.date': 1516414446,
    student_plan: "n"
}
Then I would add PartitionKey and RowKey.
All of the task_status fields would silently fail to be inserted.
EDIT: This does not have anything to do with the actual flattening process - I just tested a perfectly good JSON object with keys containing 'x.y.z', i.e. Azure Tables doesn't seem to accept these column names... which almost completely destroys the value proposition of storing schema-less data without significant rework.
A . in a column name is not supported. You can use a custom delimiter when flattening your objects instead.
For example:
newObj1 = flatten(obj1, {delimiter: '__'});
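With that change, the rest of the code from the question should work as-is. A quick sketch of what the flattened entity would then look like (the property names in the comment are what flat produces with the '__' delimiter):
var flatten = require('flat');

// flatten with a delimiter that Azure Table storage accepts in property names
var newObj1 = flatten(obj1, { delimiter: '__' });
// newObj1 now looks like:
// {
//   studentId: 'abc',
//   task_status__0__status: 'Current',
//   task_status__0__date: 516760078,
//   task_status__1__status: 'Late',
//   task_status__1__date: 1516414446,
//   student_plan: 'n'
// }

newObj1.PartitionKey = entGen.String(uniqueIDFromMyDB);
newObj1.RowKey = entGen.String(uniqueStudentId);
tableService.insertEntity(myTableName, newObj1, myCallbackFunc);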

Parse complex Json string contained in Hadoop

I want to parse a string of complex JSON in Pig. Specifically, I want Pig to understand my JSON array as a bag instead of as a single chararray. I found that complex JSON can be parsed using Twitter's Elephant Bird or Mozilla's Akela library. (I found some additional libraries, but I cannot use a 'Loader'-based approach since I use HCatLoader to load data from Hive.)
But the problem is the structure of my data: each value of the MAP column contains the value part of some complex JSON. For example,
1. My table looks like this (WARNING: the type of 'complex_data' is not STRING, but MAP<STRING, STRING>!):
CREATE TABLE temp_table
(
    user_id      BIGINT               COMMENT 'user ID.',
    complex_data MAP<STRING, STRING>  COMMENT 'complex json data'
)
COMMENT 'temp data.'
PARTITIONED BY (created_date STRING)
STORED AS RCFILE;
2. And 'complex_data' contains the following (the values I want to get are marked with two *s, so basically #'d'#'f' from each element of PARSED_STRING(complex_data#'c')):
{ "a": "[]",
"b": "\"sdf\"",
"**c**":"[{\"**d**\":{\"e\":\"sdfsdf\"
,\"**f**\":\"sdfs\"
,\"g\":\"qweqweqwe\"},
\"c\":[{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"}]
},
{\"**d**\":{\"e\":\"sdfsdf\"
,\"**f**\":\"sdfs\"
,\"g\":\"qweqweqwe\"},
\"c\":[{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"},
{\"d\":21321,\"e\":\"ewrwer\"}]
},]"
}
3. So, I tried... (same approach for Elephant Bird)
REGISTER '/path/to/akela-0.6-SNAPSHOT.jar';
DEFINE JsonTupleMap com.mozilla.pig.eval.json.JsonTupleMap();
data = LOAD 'temp_table' USING org.apache.hive.hcatalog.pig.HCatLoader();
values_of_map = FOREACH data GENERATE complex_data#'c' AS attr:chararray; -- IT WORKS
-- dump values_of_map shows correct chararray data per each row
-- eg) ([{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
--       {"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
--       {"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... }])
--      ([{"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
--       {"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... },
--       {"d":{"e":"sdfsdf","f":"sdfs","g":"sdf"},... }]) ...
attempt1 = FOREACH data GENERATE JsonTupleMap(complex_data#'c'); -- THIS LINE CAUSES AN ERROR
attempt2 = FOREACH data GENERATE JsonTupleMap(CONCAT(CONCAT('{\\"key\\":', complex_data#'c'), '}')); -- IT ALSO DOES NOT WORK
I guessed that "attempt1" failed because the value doesn't contain full JSON. However, when I CONCAT as in "attempt2", I generate additional \ marks (so each line starts with {\"key\":). I'm not sure whether these additional marks break the parsing rule or not. In any case, I want to parse the given JSON string so that Pig can understand it. If you have any method or solution, please feel free to let me know.
I finally solved my problem by using the jyson library with a Jython UDF.
I know that I could solve it by using Java or other languages.
But I think that Jython with jyson is the simplest answer to this issue.
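For anyone landing on this later, a rough sketch of what such a Jython UDF could look like (untested; the jar path, script name, field names, and output schema are illustrative assumptions, not the asker's actual code):
# json_udfs.py -- Jython UDF that uses jyson to parse the JSON string held in the map value
from com.xhaus.jyson import JysonCodec as json

@outputSchema("fs:bag{t:tuple(f:chararray)}")
def extract_d_f(json_string):
    if not json_string:
        return None
    out = []
    for item in json.loads(json_string):   # the 'c' value parses to a list of objects
        out.append((item['d']['f'],))      # grab #'d'#'f' from each element
    return out
...and on the Pig side, registering the jar and the script (paths are assumptions):
REGISTER '/path/to/jyson.jar';
REGISTER 'json_udfs.py' USING jython AS json_udfs;

data   = LOAD 'temp_table' USING org.apache.hive.hcatalog.pig.HCatLoader();
result = FOREACH data GENERATE user_id, json_udfs.extract_d_f(complex_data#'c');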

MongoDB - Dynamically update an object in nested array

I have a document like this:
{
    Name : val,
    AnArray : [
        {
            Time : SomeTime
        },
        {
            Time : AnotherTime
        }
        ...arbitrarily more elements
    ]
}
I need to update "Time" to a Date type (right now it is a string).
I would like to do something pseudo-code like:
foreach record in document.AnArray { record.Time = new Date(record.Time) }
I've read the documentation on $ and "dot" notation, as well as several similar questions here, and I tried this code:
db.collection.update({_id:doc._id},{$set : {AnArray.$.Time : new Date(AnArray.$.Time)}});
I was hoping that $ would iterate over the indexes of the "AnArray" property, as I don't know its length for each record. But I am getting the error:
SyntaxError: missing : after property id (shell):1
How can I update each member of the array's nested values with a dynamic value?
There's no direct way to do that, because MongoDB doesn't support an update expression that references the existing document. Moreover, the $ operator only applies to the first match, so you'd have to repeat the update as long as there are still fields where AnArray.Time is of $type string.
You can, however, perform that update client side, in your favorite language or in the mongo console using JavaScript:
db.collection.find({}).forEach(function (doc) {
    for (var i in doc.AnArray) {
        doc.AnArray[i].Time = new Date(doc.AnArray[i].Time);
    }
    db.outcollection.save(doc);
})
Note that this will store the migrated data in a different collection. You can also update the collection in-place by replacing outcollection with collection.