Nested JSON to CSV in PowerShell

I tried to solve this on my own, but my knowledge of PowerShell and JSON is not good enough to adapt other solutions to my problem.
The task seems quite simple, but the JSON structure I get from a REST API is awkward to work with.
Example JSON (the real data has more columns than test-id and test-name, but every column in Fields follows exactly the same logic):
{
"entities": [
{
"Fields": [
{
"Name": "test-id",
"values": [
{
"value": "1851"
}
]
},
{
"Name": "test-name",
"values": [
{
"value": "01_DUMMY"
}
]
}
],
"Type": "run",
"children-count": 0
},
{
"Fields": [
{
"Name": "test-id",
"values": [
{
"value": "1852"
}
]
},
{
"Name": "test-name",
"values": [
{
"value": "02_DUMMY"
}
]
}
],
"Type": "run",
"children-count": 0
}
],
"TotalResults": 2
}
I tried the following PowerShell script:
Get-Content $file -raw |
convertfrom-json | select -ExpandProperty entities |
select -ExpandProperty Fields |
select -ExpandProperty Values |
Export-CSV $OutputFile -NoTypeInformation
But my CSV result looks like this:
"value"
"1851"
"01_DUMMY"
"N"
I would like to receive the following result:
test-id,test-Name,run,children-count
1851,01_DUMMY,run,0
1852,02_DUMMY,run,0
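For what it's worth, here is a minimal sketch of one way to get that layout. The -ExpandProperty chain above discards each field's Name and flattens every value into its own row, which is why the CSV ends up with a single value column. Instead, you can pivot each entity's Fields array into the properties of one output object ($file and $OutputFile are assumed to be set as in the script above):
$json = Get-Content $file -Raw | ConvertFrom-Json
$rows = foreach ($entity in $json.entities) {
    # one output object per entity, with a column per Fields[].Name
    $row = [ordered]@{}
    foreach ($field in $entity.Fields) {
        $row[$field.Name] = $field.values[0].value
    }
    # carry the flat properties over as extra columns
    $row['Type'] = $entity.Type
    $row['children-count'] = $entity.'children-count'
    [pscustomobject]$row
}
$rows | Export-Csv $OutputFile -NoTypeInformation
With the sample data this yields one row per entity with test-id, test-name, Type and children-count columns; rename or reorder them with Select-Object if you need the exact header shown above.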

Related

Can we use dynamic values in static json file

I have a JSON file for application configuration like below.
[
{
"name": "environment",
"value": "prod"
},
{
"name": "deployment_date",
"value": "2022-12-21"
}
]
I want the deployment_date value to be dynamic, set to the current UTC date. Can we use any programming language to achieve this? Something like getUTCDate().toString() instead of "2022-12-21"?
Using jq:
jq '(.[] | select(.name == "deployment_date")).value |= (now | todate)' file.json
Output
[
{
"name": "environment",
"value": "prod"
},
{
"name": "deployment_date",
"value": "2022-12-21T12:46:11Z"
}
]
jq '(.[] | select(.name == "deployment_date")).value |= (now | strflocaltime("%Y-%m-%d"))' file.json
Output
[
{
"name": "environment",
"value": "prod"
},
{
"name": "deployment_date",
"value": "2022-12-21"
}
]
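Note that jq writes the modified document to standard output rather than editing file.json in place; a common pattern for persisting the change is to write to a scratch file and move it over the original (file.tmp below is just a placeholder name):
jq '(.[] | select(.name == "deployment_date")).value |= (now | todate)' file.json > file.tmp && mv file.tmp file.json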

Remove an attribute from the JSON object in powershell

I have the following JSON and I would like to remove streets from the objects under address, which is an array. I am trying to do this in PowerShell.
{
"Customer": [
{
"id": "123"
}
],
"Nationality": [
{
"name": "US",
"id": "456"
}
],
"address": [
{
"$type": "Home",
"name": "Houston",
"streets": [
{
"name": "Union",
"postalCode": "10",
}
]
},
{
"$type": "Office",
"name": "Hawai",
"streets": [
{
"name": "Rock",
"postalCode": "11",
}
]
}
],
"address": [
{
"$type": "Home1",
"name": "Houston",
"streets": [
{
"name": "Union1",
"postalCode": "14",
}
]
},
{
"$type": "Office1",
"name": "Hawaii1",
"streets": [
{
"name": "Rock1",
"postalCode": "15",
}
]
}
],
}
I would like to remove streets from the JSON object. Here is my PowerShell script, but it is not working. I am trying to convert the JSON into objects and then loop over the properties to remove them.
$FileContent = Get-Content -Path "Test.json" -Raw | ConvertFrom-Json
foreach ($content in $FileContent) {
#Write-Host $content.address
$content.address = $content.address | Select-Object * -ExcludeProperty streets
}
$FileContent | ConvertTo-Json -Depth 100 | Out-File "Test.json" -Force
When you use ConvertFrom-Json it automatically converts everything to objects for you. Once they are objects, you can use Select-Object to specify which properties to keep in the pipeline: set $FileContent.address (an array of objects) to itself with the streets property excluded from each object in the array.
$FileContent = Get-Content -Path "Test.json" -Raw | ConvertFrom-Json
$FileContent.address = $FileContent.address | Select-Object * -ExcludeProperty streets
$FileContent | ConvertTo-Json
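One caveat worth adding: ConvertTo-Json only serializes two levels deep by default, so for deeply nested documents keep the -Depth parameter from the original script when writing the result back, for example:
$FileContent | ConvertTo-Json -Depth 100 | Out-File "Test.json" -Force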

jq sort using the value of a nested array element

I need some help using jq to sort an array of elements where each element contains a nested tags array. My input JSON looks like this:
{
"result": [
{
"name": "ct-1",
"tags": [
{
"key": "service_name",
"value": "BaseCT"
},
{
"key": "sequence",
"value": "bb"
}
]
},
{
"name": "ct-2",
"tags": [
{
"key": "service_name",
"value": "BaseCT"
},
{
"key": "sequence",
"value": "aa"
}
]
}
]
}
I would like to sort using the value of the sequence tag in the nested tags array so that the output looks like this:
{
"result": [
{
"name": "ct-2",
"tags": [
{
"key": "service_name",
"value": "BaseCT"
},
{
"key": "sequence",
"value": "aa"
}
]
},
{
"name": "ct-1",
"tags": [
{
"key": "service_name",
"value": "BaseCT"
},
{
"key": "sequence",
"value": "bb"
}
]
}
]
}
I have tried the following jq command:
$ jq '.result |= ([.[] | .tags[] | select(.key == "sequence") | .value] | sort_by(.))' input.json
but I get the following result:
{
"result": [
"aa",
"bb"
]
}
Please let me know if you know how to deal with this scenario.
from_entries converts an array of key-value pairs into an object; you can use it with sort_by like this:
.result |= sort_by(.tags | from_entries | .sequence)
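Applied to the sample input (using the same input.json file name as the attempt above), the full call would look something like:
$ jq '.result |= sort_by(.tags | from_entries | .sequence)' input.json
Unlike the attempted filter, sort_by keeps each whole element and only uses the extracted sequence value as the sort key, so the original objects in .result stay intact.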

How to get the specific variable value from nested JSON based on the specific variable

Below is the JSON output from one of the API calls. I need to get the date based on the name value: for example, I want to iterate over the JSON, find the name value equal to 1.0, and then get the corresponding date, which is 2018-12-13T18:04:42-0500.
VERBOSE: {
"paging": {
"pageIndex": 1,
"pageSize": 100,
"total": 2
},
"analyses": [
{
"key": "xxxx",
"date": "2019-06-07T18:04:56-0400",
"events": [
{
"key": "xxxxxx",
"category": "VERSION",
"name": "01.00"
}
]
},
{
"key": "yyyyyy",
"date": "2018-12-13T18:04:42-0500",
"events": [
{
"key": "yyyyyy",
"category": "VERSION",
"name": "1.0"
}
]
}
]
}
Assuming you have the entire JSON text in a string variable, say $Json, would this work for you?
$obj = $Json | ConvertFrom-Json
$obj.analyses | ForEach-Object { if ($_.Events.Name -eq "1.0"){$_.Date}}
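An equivalent one-liner using Where-Object (just another way to express the same filter, not from the original answer):
($obj.analyses | Where-Object { $_.events.name -eq "1.0" }).date
For the sample data this returns 2018-12-13T18:04:42-0500.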

Convert JSON to CSV - string manipulation (jq, bash, awk, sed, etc.)

I'm in dire need of help with a script to convert JSON text to CSV, in an attempt to copy users from one AWS Cognito user pool to another.
The export JSON looks like this:
{
"Users": [
{
"Username": "user.name",
"Attributes": [
{
"Name": "sub",
"Value": "some-value"
},
{
"Name": "email_verified",
"Value": "true"
},
{
"Name": "custom:jobtitle",
"Value": Director"
},
{
"Name": "custom:user_id",
"Value": "38"
},
{
"Name": "email",
"Value": "foo.bar#email.com"
}
],
"UserCreateDate": some-value,
"UserLastModifiedDate": some-value,
"Enabled": some-value,
"UserStatus": "some-value"
}
[more lines down here]...
] }
Then the CSV file would contain these lines:
,,,,,,,,,foo.bar#email.com,TRUE,,,,,,FALSE,,,Director,,38,FALSE,foo.bar
[more lines down here]...
So, the variables would be like this for JSON:
{
"Users": [
{
"Username": "%USERNAME%",
"Attributes": [
{
"Name": "sub",
"Value": "some-value"
},
{
"Name": "email_verified",
"Value": "true"
},
{
"Name": "custom:jobtitle",
"Value": %JOB_TITLE%"
},
{
"Name": "custom:user_id",
"Value": "%USER_ID%"
},
{
"Name": "email",
"Value": %EMAIL%"
}
],
"UserCreateDate": some-value,
"UserLastModifiedDate": some-value,
"Enabled": some-value,
"UserStatus": "some-value"
}
...
]
}
And like this for CSV:
,,,,,,,,,%EMAIL%,TRUE,,,,,,FALSE,,,%JOB_TITLE%,,%USER_ID%,FALSE,%USERNAME%
where %EMAIL%, %JOB_TITLE%, %USER_ID%, and %USERNAME% are variables; everything else should be literal text.
Appreciate your help in advance, guys.
Consider first this filter:
.Users[].Attributes
| map(select(.Name | . == "custom:jobtitle" or . == "custom:user_id" or . == "email") )
| from_entries
| [ .email, .["custom:jobtitle"], .["custom:user_id"] ]
| @csv
The trick used here is the use of from_entries to convert the array of Name/Value pairs to an object with the Names as keys.
Assuming valid JSON input along the lines shown in the Q, invoking jq with the -r option would yield:
"foo.bar#email.com","Director","38"
Unfortunately the precise requirements are not so clear to me, but you should be able to adapt the above in accordance with your needs.
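For completeness, a sketch of the full invocation with the filter above used verbatim (export.json is a placeholder file name; the extra empty columns and boolean fields from the desired CSV layout would still need to be added to the array):
$ jq -r '
  .Users[].Attributes
  | map(select(.Name | . == "custom:jobtitle" or . == "custom:user_id" or . == "email"))
  | from_entries
  | [ .email, .["custom:jobtitle"], .["custom:user_id"] ]
  | @csv
' export.json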