Converting complex PowerShell objects into JSON with arrays of single elements - json

I am trying to construct a complex PowerShell object containing nested hashtables and arrays, and then use the ConvertTo-Json cmdlet to convert the whole lot into JSON.
I am experiencing a problem: if an array has only a single element, it appears in the JSON not as an array but as a single object. That is to say, it is missing the enclosing square brackets.
Here is some sample code that demonstrates the problem:
function GetMyArrayData
{
    param($myData)
    $myArrayOfInfo = $myData -split '\s*[,;]\s*'
    $myReturnArray = @()
    foreach ($infoItem in $myArrayOfInfo)
    {
        $hashObj = @{
            field1 = "standard-value"
            field2 = $infoItem
        }
        $myReturnArray += $hashObj
    }
    return $myReturnArray
}
$myFirstArray = GetMyArrayData -myData "abc;def"
$mySecondArray = GetMyArrayData -myData "xyz"
$myBigObject = @{
    item1 = "my-first-item"
    item2 = "my-second-item"
    myArray1 = $myFirstArray
    myArray2 = $mySecondArray
}
$myJsonVersion = ConvertTo-Json $myBigObject -Depth 5
Write-Output $myJsonVersion
This code then outputs the following JSON:
{
    "item2": "my-second-item",
    "myArray1": [
        {
            "field1": "standard-value",
            "field2": "abc"
        },
        {
            "field1": "standard-value",
            "field2": "def"
        }
    ],
    "myArray2": {
        "field1": "standard-value",
        "field2": "xyz"
    },
    "item1": "my-first-item"
}
Notice that "myArray2" is converted to a single object, rather than an array containing a single object.
It should be rendered thus:
{
    "item2": "my-second-item",
    "myArray1": [
        {
            "field1": "standard-value",
            "field2": "abc"
        },
        {
            "field1": "standard-value",
            "field2": "def"
        }
    ],
    "myArray2": [
        {
            "field1": "standard-value",
            "field2": "xyz"
        }
    ],
    "item1": "my-first-item"
}
Can anyone advise what is wrong with this code that prevents it from outputting myArray2 as an array containing a single object?
Thanks heaps,
David :-)

When you return arrays and collections from a function, PowerShell will "unroll" them. If the array contains a single object, it will return that object rather than an array.
You can work around this in a couple of ways. Inside your function you can do this:
return @(, $myReturnArray)
which wraps your return value in an outer array; PowerShell then unrolls that outer array rather than your inner one, so your single-item array is returned intact. Alternatively, you can do this at the call site:
$myFirstArray = @( GetMyArrayData -myData "abc;def" )
which wraps the return value in a new array. If the return value is already an array this basically creates a shallow copy, but if it's a single object it will be bundled into an array.
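To see the unrolling behavior in isolation, here is a minimal sketch (the function names are hypothetical) you can paste into a console:

```powershell
# Minimal sketch of PowerShell's unrolling behavior (hypothetical function names)
function Get-PlainReturn {
    $arr = @( @{ field2 = 'xyz' } )   # one-element array
    return $arr                       # unrolled: the caller gets the lone hashtable
}

function Get-WrappedReturn {
    $arr = @( @{ field2 = 'xyz' } )
    return ,$arr                      # the outer wrapper is unrolled instead
}

(Get-PlainReturn)   -is [array]       # False - the array was flattened away
(Get-WrappedReturn) -is [array]       # True  - the inner array survived
```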

In this case, you can use the @() array subexpression operator or the unary comma operator to force the variable into an array:
myArray1 = @($myFirstArray)
myArray2 = @($mySecondArray)
or
myArray1 = ,$myFirstArray
myArray2 = ,$mySecondArray
(Prefer @() when the value may already be an array: the unary comma would nest an existing array one level deeper.) Note that you will still lose the [] if you pipe $myBigObject through to the ConvertTo-Json cmdlet. Using -InputObject (as you implicitly do) is the correct way here.
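A quick way to confirm the difference, assuming a single hashtable like the one the function returns:

```powershell
# Sketch: @() around the value keeps the brackets in the emitted JSON
$single = @{ field1 = 'standard-value'; field2 = 'xyz' }   # what GetMyArrayData returned

$without = ConvertTo-Json @{ myArray2 = $single }    -Depth 5 -Compress
$with    = ConvertTo-Json @{ myArray2 = @($single) } -Depth 5 -Compress

$without   # myArray2 serializes as a bare object: {...}
$with      # myArray2 keeps its brackets: [{...}]
```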

Related

Array of structures vs structure of arrays in JSON

Let I have an array of structures in JSON like this:
[
    {
        "name": "Amrit",
        "second_name": "Valentine"
    },
    {
        "name": "Beatriz",
        "second_name": "Carty"
    }
    // And so on...
]
It is human readable, but the issue with it is that "name" and "second_name" are repeated for all entities in this array. And if the array is big, it creates about 40% size overhead.
On the other hand, I could translate this data structure:
{
    "names": ["Amrit", "Beatriz" /* ... */],
    "second_names": ["Valentine", "Carty" /* ... */]
}
But it is no longer human readable, though the size is optimal. It is also error prone, since it is not guaranteed that the sizes of "names" and "second_names" are the same.
Is there any trade-off in JSON that lets me use an array of structures without repeating field names, while keeping the size optimal?
Not as part of JSON itself, no. What I've done on projects is a generic system where the JSON would look like this:
{
    "__keys__": ["name", "second_name"],
    "values": [
        ["Amrit", "Valentine"],
        ["Beatriz", "Carty"]
    ]
}
...where once I've parsed the JSON, I throw a utility function at it to consume that and turn it into an array of objects. Along these lines:
const json = `{
    "__keys__": ["name", "second_name"],
    "values": [
        ["Amrit", "Valentine"],
        ["Beatriz", "Carty"]
    ]
}`;
const parsed = JSON.parse(json);
const expanded = expand(parsed);
console.log(expanded);

function expand(data) {
    const keys = data.__keys__;
    return data.values.map(entry => {
        const obj = {};
        keys.forEach((key, index) => {
            obj[key] = entry[index];
        });
        return obj;
    });
}
Or, of course, you just leave off __keys__ and assume your endpoint knows what the keys should be, but it impairs readability/debugging even more than the above does. :-)
(You can shoehorn that forEach into a reduce, because you can shoehorn just about any array operation into a reduce, but it doesn't buy you anything.)

Writing a proto file for a JSON input containing random field names

A newbie to protobuf here. I am working on compressing a JSON file using protobuf. The problem is that this JSON file comes as a response from a web server and contains certain fields whose names are random, i.e. with each request posted to the server, the key names differ. For example, consider the JSON below:
{
    "field1": [
        {
            "abc": "vala",
            "def": "valb",
            "ghi": "valc"
        }
    ],
    "field2": "val2",
    "field3": "val3"
}
In the above JSON, the field names "abc", "def" and "ghi" can vary each time. Is there a way in protobuf to capture field1's value in its entirety (as a single string or anything else) without losing the random fields inside it?
I think you want "struct.proto", i.e.
syntax = "proto3";
import "google/protobuf/struct.proto";

message Foo {
    .google.protobuf.Struct field1 = 1;
    string field2 = 2;
    string field3 = 3;
}
or possibly (because of the array):
syntax = "proto3";
import "google/protobuf/struct.proto";

message Foo {
    repeated .google.protobuf.Struct field1 = 1;
    string field2 = 2;
    string field3 = 3;
}
However, I should emphasize that protobuf isn't well suited to parsing arbitrary JSON; for that you should use a JSON library, not a protobuf library.

JSON.parse SyntaxError with multiple values on same key

Here is my sample JSON data:
var testData = {
    "Level1": [
        {
            "Level2": [
                {
                    "Level3": [
                        {
                            "body": "AAAAA"
                        },
                        {
                            "body": "BBBBB"
                        }
                    ]
                }
            ]
        }
    ]
};
When I use JSON.stringify like this:
var x = JSON.stringify(testData).replace(/[\[\]]/g,"");
console.log(x);
It works as expected and correctly replaces the square brackets and returns this result:
{"Level1":{"Level2":{"Level3":{"body":"AAAAA"},{"body":"BBBBB"}}}}
The error occurs when I add JSON.parse like this:
var x = JSON.parse(JSON.stringify(testData).replace(/[\[\]]/g,""));
The specific error is SyntaxError: Unexpected token { in JSON. What seems to be happening is that JSON.parse is treating the comma inside the key/value list as the end of the JSON string, when it is not the end.
Does anyone have any idea why this is happening?
{"Level1":{"Level2":{"Level3":{"body":"AAAAA"},{"body":"BBBBB"}}}}
This is not valid JSON.
Level 3 should be:
"Level3": [{"body":"AAAAA"}, {"body":"BBBBB"}]
For your first levels, you have arrays with only one element, so the array brackets [] can be removed without consequence. Level 3 is an actual array with two elements, so removing the [] breaks your valid JSON syntax.

Nested objects in Json

I am a newbie with JSON objects and JSON arrays. I want to access a nested object in a JSON object, but I am making a slight error somewhere. I have spent two hours searching and reading lots of Stack Overflow questions on this, but I can't find where I am making the error. Please help me out.
Response
{ __v: 0,
friends_in:
[ { friends_in_email: '12',
friends_in_gcm_regId: '12'
} ]
}
My code
console.log(JSON.stringify(doc));
Output:
{
    "__v": " 0",
    "friends_in": [
        {
            "friends_in_email": "12",
            "friends_in_gcm_regId": "12"
        }
    ]
}
Here is the code that generates an error saying undefined.
My code
console.log(JSON.stringify(doc[0].__v));
console.log(JSON.stringify(doc[0].friends_in));
Output
0 //Correct
undefined //Why ?
There are some errors in your stringified JSON (maybe some mistakes in pasting?). But using the below JSON, everything works as expected.
var rawString = '{ "__v":" 0", "friends_in": [{ "friends_in_email": "12", "friends_in_gcm_regId": "12"}] }';
var x = JSON.parse(rawString);
console.log(JSON.stringify(x.__v));
console.log(JSON.stringify(x.friends_in));
The above results in the following output:
0
[{"friends_in_email":"12","friends_in_gcm_regId":"12"}]
You seem to be mixing up JSON objects (things in curly braces { ... }) and JSON arrays (things in square brackets [ ... ]). Only arrays should be indexed the way you have done:
var y = [22, 24, 28];
y[0] // do something with it ...
Objects should have their members accessed by name:
var z = { test: 22, another_number: 24 };
z.test // do something with it ...

Merging json objects in powershell

I have json that looks like this:
{
    "Workflow": [
        {
            "Parameters": {
                "Project": "/Path/To/File",
                "OtherParam": "True"
            }
        }
    ],
    "Overrides": [
        {
            "Special": {
                "Parameters": {
                    "NewParam": "NewStuffGoesHere",
                    "OtherParam": "False"
                }
            }
        }
    ]
}
... where I want to use the Overrides.Special section to add or update fields in the workflow object. In other words, given the json above, I want to do something like this:
$config = Get-Content workflow.json | out-string | ConvertFrom-Json
$configWithOverrides = Merge-Object $config.Workflow $config.Overrides.Special
And end up with something like this:
$configWithOverrides
Parameters
----------
@{Project=/Path/To/File; NewParam=NewStuffGoesHere; OtherParam=False}
I can certainly write the Merge-Object function above to add or update values as needed based on what's in the override section, but it seems there should (could?) be a built-in or one-liner way to handle this.
I tried this:
$test = $config.Workflow + $config.Overrides.Special
...but that doesn't quite work.
$test
Parameters
----------
@{Project=/Path/To/File; OtherParam=True}
@{NewParam=NewStuffGoesHere; OtherParam=False}
This enables adding parameters:
>$test.Parameters.NewParam
NewStuffGoesHere
...but it's not so great for updating them
>$test.Parameters.OtherParam
True
False
Note - in this example, I'm choosing to handle the merge after converting the json to a psobject, but that's not a requirement.
I have a one-liner to do what you're asking for. Note that, as far as I know, PowerShell does not deal directly with JSON strings; but once the JSON is converted to PowerShell objects, it's like any other object.
So, first, point at your JSON file and read it as a single string:
# Requires -Version 4
$jsonFile = 'c:\temp\jsonfile.json'
$jsonObj = @(gc $jsonFile -Raw) | ConvertFrom-Json
Define the property upon which you want to merge the json's objects, and the 1st and 2nd objects:
$property = 'Parameters'
$1 = $jsonObj.Workflow.$property
$2 = $jsonObj.Overrides.Special.$property
Now, see the one-liner (which I've split into three lines for the sake of clarity):
$MergedJson = [pscustomobject]@{
    $property = $2.psobject.properties | % {$11 = $1} {$11 | add-member $_.name $_.value -ea Ignore} {$11}
} | ConvertTo-Json
You see? $MergedJson holds the following string (using your json string):
{
    "Parameters": {
        "Project": "/Path/To/File",
        "OtherParam": "True",
        "NewParam": "NewStuffGoesHere"
    }
}
Is that what you're looking for?
P.S.: if you swap the roles of $1 and $2, you change which of the common parameters' values (like OtherParam) prevails.
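If the one-liner is hard to follow, the same merge can be written as an explicit helper function. This is a sketch (the function name Merge-Parameters is hypothetical); here the override's values win for common parameters, which is what the question asked for:

```powershell
# Sketch of an explicit merge helper (hypothetical name); override values win
function Merge-Parameters {
    param($Base, $Override)
    $merged = [ordered]@{}
    foreach ($p in $Base.psobject.Properties)     { $merged[$p.Name] = $p.Value }
    foreach ($p in $Override.psobject.Properties) { $merged[$p.Name] = $p.Value }  # overwrites duplicates
    [pscustomobject]$merged
}

# Inline copy of the question's JSON, so the sketch is self-contained
$config = '{"Workflow":[{"Parameters":{"Project":"/Path/To/File","OtherParam":"True"}}],
            "Overrides":[{"Special":{"Parameters":{"NewParam":"NewStuffGoesHere","OtherParam":"False"}}}]}' |
          ConvertFrom-Json

$merged = Merge-Parameters $config.Workflow[0].Parameters $config.Overrides[0].Special.Parameters
$merged.OtherParam   # False - the override prevailed
```

Swapping the argument order reverses which side wins, mirroring the $1/$2 swap in the one-liner above.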