With PowerShell 7.2 there seems to be a change in how JSON is deserialized into an object with respect to dates: instead of a string, a date is now a datetime. But I want to have the "old" behavior, i.e. that it is handled as a string and NOT a datetime.
How can I achieve that, when using ConvertFrom-Json in PowerShell 7.2, all dates are deserialized as strings and not datetime?
EDIT:
$val = '{ "date":"2022-09-30T07:04:23.571+00:00" }' | ConvertFrom-Json
$val.date.GetType().FullName
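In PowerShell 7.2 this prints:
System.DateTime
whereas Windows PowerShell reports System.String for the same input, which is the behavior I want back.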
This is actually a known issue, see GitHub issue #13598: Add a -DateKind parameter to ConvertFrom-Json to control how System.DateTime / System.DateTimeOffset values are constructed. Yet I think there is no easy solution for this. One thing you might do is simply invoke (Windows) PowerShell, which currently isn't straightforward either; therefore I have created a small wrapper to send and receive complex objects between PowerShell sessions (see also GitHub issue #18460 on the purpose of Invoke-PowerShell):
function Invoke-PowerShell ($Command) {
$SerializeOutput = #"
`$Output = $Command
[System.Management.Automation.PSSerializer]::Serialize(`$Output)
"#
$Bytes = [System.Text.Encoding]::Unicode.GetBytes($SerializeOutput)
$EncodedCommand = [Convert]::ToBase64String($Bytes)
$PSSerial = PowerShell -EncodedCommand $EncodedCommand
[System.Management.Automation.PSSerializer]::Deserialize($PSSerial)
}
Usage:
Invoke-PowerShell { '{ "date":"2022-09-30T07:04:23.571+00:00" }' | ConvertFrom-Json }
date
----
2022-09-30T07:04:23.571+00:00
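To verify that the date survived as a string (assuming the wrapper above), you can check the type of the round-tripped value:
(Invoke-PowerShell { '{ "date":"2022-09-30T07:04:23.571+00:00" }' | ConvertFrom-Json }).date.GetType().FullName
# System.String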
Update
As commented by mklement0, I clearly overcomplicated the answer:
Calling via powershell.exe is a pragmatic workaround (albeit slow and Windows-only), but note that you don't need a helper function: if you pass a script block to powershell.exe (or pwsh.exe) from PowerShell, Base64 CLIXML-based serialization happens automatically behind the scenes; try: powershell.exe -noprofile { $args | ConvertFrom-Json } -args '{ "date":"2022-09-30T07:04:23.571+00:00" }'. For that reason, I don't think there's a need for an Invoke-PowerShell cmdlet.
$Json = '{ "date":"2022-09-30T07:04:23.571+00:00" }'
powershell.exe -noprofile { $args | ConvertFrom-Json } -args $Json
date
----
2022-09-30T07:04:23.571+00:00
iRon's helpful answer provides a pragmatic solution via the Windows PowerShell CLI, powershell.exe, relying on the fact that ConvertFrom-Json there does not automatically transform ISO 8601-like timestamp strings to [datetime] instances.
Hopefully, the proposal in the GitHub issue he links to, #13598, will be implemented in the future, which would then simplify the solution to:
# NOT YET IMPLEMENTED as of PowerShell 7.2.x
'{ "date":"2022-09-30T07:04:23.571+00:00" }' |
ConvertFrom-Json -DateTimeKind None
However, a powershell.exe workaround has two disadvantages: (a) it is slow (a separate PowerShell instance must be launched in a child process), and (b) it is Windows-only. The solution below is a generalization of your own in-process approach that avoids these problems:
It injects a NUL character ("`0") at the start of each string that matches the pattern of a timestamp - the assumption is that the input itself never contains such characters, which is fair to assume.
This, as in your approach, prevents ConvertFrom-Json from recognizing timestamp strings as such, and leaves them untouched.
The [pscustomobject] graph that ConvertFrom-Json outputs must then be post-processed in order to remove the injected NUL characters again.
This is achieved with a ForEach-Object call that contains a helper script block that recursively walks the object graph, which has the advantage that it works with JSON input whose timestamp strings may be at any level of the hierarchy (i.e. they may also be in properties of nested objects).
Note: The assumption is that the timestamp strings are only ever contained as property values in the input; more work would be needed if you wanted to handle input JSON such as '[ "2022-09-30T07:04:23.571+00:00" ]' too, where the strings are input objects themselves.
# Sample JSON.
$val = '{ "date":"2022-09-30T07:04:23.571+00:00" }'
$val -replace '"(?=\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}[+-]\d{2}:\d{2}")', "`"`0" |
ConvertFrom-Json |
ForEach-Object {
# Helper script block that walks the object graph
$sb = {
foreach ($o in $args[0]) {
if ($o -is [Array]) { # nested array -> recurse
foreach ($el in $o) { & $sb $el } # recurse
}
elseif ($o -is [System.Management.Automation.PSCustomObject]) {
foreach ($prop in $o.psobject.Properties) {
if ($prop.Value -is [Array]) {
foreach ($o in $prop.Value) { & $sb $o } # nested array -> recurse
}
elseif ($prop.Value -is [System.Management.Automation.PSCustomObject]) {
& $sb $prop.Value # nested custom object -> recurse
}
elseif ($prop.Value -is [string] -and $prop.Value -match '^\0') {
$prop.Value = $prop.Value.Substring(1) # Remove the NUL again.
}
}
}
}
}
# Call the helper script block with the input object.
& $sb $_
# Output the modified object.
if ($_ -is [array]) {
# Input object was array as a whole (implies use of -NoEnumerate), output as such.
, $_
} else {
$_
}
}
Based on the input from @zett42, here is my solution:
Assuming we know the regex pattern of the dates used in the JSON: I get the JSON as a string, add a prefix so that ConvertFrom-Json keeps the dates as strings instead of converting them to datetime, convert it with ConvertFrom-Json to a PSCustomObject, do whatever I need to do on the object, serialize it back to a JSON string with ConvertTo-Json, and then remove the prefix again.
[string]$json = '{ "date":"2022-09-30T07:04:23.571+00:00", "key1": "value1" }'
[string]$jsonWithDatePrefix = $json -replace '"(\d+-\d+.\d+T\d+:\d+:\d+\.\d+\+\d+:\d+)"', '"#$1"'
[pscustomobject]$jsonWithDatePrefixAsObject = $jsonWithDatePrefix | ConvertFrom-Json
$jsonWithDatePrefixAsObject.key1 = "value2"
[string]$updatedJsonString = $jsonWithDatePrefixAsObject | ConvertTo-Json
[string]$updatedJsonStringWithoutPrefix = $updatedJsonString -replace '"(#)(\d+-\d+.\d+T\d+:\d+:\d+\.\d+\+\d+:\d+)"', '"$2"'
Write-Host $updatedJsonStringWithoutPrefix
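With the default ConvertTo-Json formatting this should print something roughly like the following - the date is still a plain string and key1 now holds the updated value:
{
  "date": "2022-09-30T07:04:23.571+00:00",
  "key1": "value2"
}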
Two additional ways to change the date format:
Get-Node
Using this Get-Node function, which is quite similar to mklement0's recursive function:
$Data = ConvertFrom-Json $Json
$Data |Get-Node -Where { $_.Value -is [DateTime] } | ForEach-Object {
$_.Value = Get-Date $_.Value -Format 'yyyy-MM-ddTHH\:mm\:ss.fffzzz' -AsUTC
}
$Data
DIY
Or do-it-yourself and build your own Json deserializer:
function ConvertFrom-Json {
[CmdletBinding()][OutputType([Object[]])] param(
[Parameter(ValueFromPipeLine = $True, Mandatory = $True)][String]$InputObject,
[String]$DateFormat = 'yyyy-MM-ddTHH\:mm\:ss.fffffffzzz', # Default: ISO 8601, https://www.newtonsoft.com/json/help/html/datesinjson.htm
[Switch]$AsLocalTime,
[Switch]$AsOrdered
)
function GetObject($JObject) {
switch ($JObject.GetType().Name) {
'JValue' {
switch ($JObject.Type) {
'Boolean' { $JObject.Value }
'Integer' { 0 + $JObject.Value } # https://github.com/PowerShell/PowerShell/issues/14264
'Date' { Get-Date $JObject.Value -Format $DateFormat -AsUTC:(!$AsLocalTime) } # https://github.com/PowerShell/PowerShell/issues/13598
Default { "$($JObject.Value)" }
}
}
'JArray' {
,@( $JObject.ForEach{ GetObject $_ } )
}
'JObject' {
$Properties = [Ordered]@{}
$JObject.ForEach{ $Properties[$_.Name] = GetObject $_.Value }
if ($AsOrdered) { $Properties } else { [PSCustomObject]$Properties } # https://github.com/PowerShell/PowerShell/pull/17405
}
}
}
GetObject ([Newtonsoft.Json.Linq.JObject]::Parse($InputObject))
}
Usage:
ConvertFrom-Json $Json -DateFormat 'yyyy-MM-ddTHH\:mm\:ss.fffzzz' |ConvertTo-Json -Depth 9
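Assuming $Json holds the sample from the question, this round trip should produce output roughly like the following, with the date kept as a plain string in the requested format:
{
  "date": "2022-09-30T07:04:23.571+00:00"
}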
Related
One requirement of mine is: using Windows, and not using any tools that aren't already available as part of the AWS CLI or Windows.
For example, I have this JSON file test.json with the below content:
"My number is $myvar"
I read this into a powershell variable like so:
$myobj=(get-content .\test.json | convertfrom-json)
$myvar=1
From here, I would like to do something with this $myobj which will enable me to get this output:
$myobj | tee json_with_values_from_environment.json
My number is 1
I got some limited success with iex, but I'm not sure if it can be made to work for this example.
You can use $ExecutionContext.InvokeCommand.ExpandString()
$myobj = '{test: "My number is $myvar"}' | ConvertFrom-Json
$myvar = 1
$ExecutionContext.InvokeCommand.ExpandString($myobj.test)
Output
My number is 1
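If you want to expand every string property of a converted object and write the result back out, a minimal sketch along the same lines (not part of the original answer; it assumes a flat object whose property values are strings) could look like this:
$myvar = 1
$myobj = '{ "test": "My number is $myvar" }' | ConvertFrom-Json
# Expand each string-valued property in place.
foreach ($prop in $myobj.PSObject.Properties) {
    if ($prop.Value -is [string]) {
        $prop.Value = $ExecutionContext.InvokeCommand.ExpandString($prop.Value)
    }
}
# Write the expanded object back out as JSON (file name is just an example).
$myobj | ConvertTo-Json | Tee-Object json_with_values_from_environment.json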
Here is one way to do it using the Parser to find all VariableExpressionAst and replace them with the values in your session.
Given the following test.json:
{
"test1": "My number is $myvar",
"test2": {
"somevalue": "$env:myothervar",
"someothervalue": "$anothervar !!"
}
}
We want to find and replace $myvar, $myothervar and $anothervar with their corresponding values defined in the current session, so the code looks like this (note that we do the replacement before converting the JSON string into an object; this way is much easier):
using namespace System.Management.Automation.Language
$isCore7 = $PSVersionTable.PSVersion -ge '7.2'
# Define the variables here
$myvar = 10
$env:myothervar = 'hello'
$anothervar = 'world'
# Read the Json
$json = Get-Content .\test.json -Raw
# Now parse it
$ast = [Parser]::ParseInput($json, [ref] $null, [ref] $null)
# Find all variables in it, and enumerate them
$ast.FindAll({ $args[0] -is [VariableExpressionAst] }, $true) |
Sort-Object { $_.Extent.Text } -Unique | ForEach-Object {
# now replace the text with the actual value
if($isCore7) {
# in PowerShell Core is very easy
$json = $json.Replace($_.Extent.Text, $_.SafeGetValue($true))
return
}
# in Windows PowerShell not so much
$varText = $_.Extent.Text
$varPath = $_.VariablePath
# find the value of the var (here we use the path)
$value = $ExecutionContext.SessionState.PSVariable.GetValue($varPath.UserPath)
if($varPath.IsDriveQualified) {
$value = $ExecutionContext.SessionState.InvokeProvider.Item.Get($varPath.UserPath).Value
}
# now replace the text with the actual value
$json = $json.Replace($varText, $value)
}
# now we can safely convert the string to an object
$json | ConvertFrom-Json
If we were to convert it back to Json to see the result:
{
"test1": "My number is 10",
"test2": {
"somevalue": "hello",
"someothervalue": "world !!"
}
}
I feel like this is something simple and I'm just not getting it, and I'm not sure if my explanation is great.
I have this below JSON file, and I want to get "each App" (App1, App2, App3) under the "New" object.
In the script line below I'm essentially trying to replace "TestApp2" with some variable. I guess I'm trying to get TestApp2 as an object without knowing the name.
And I realize that the foreach loop doesn't do anything right now.
Write-Host $object.Value.TestApp2.reply_urls
JSON:
{
"New": {
"App1": {
"reply_urls": [
"https://testapp1url1"
]
},
"App2": {
"reply_urls": [
"https://testapp2url1",
"https://testapp2url2"
]
},
"App3": {
"reply_urls": [
"https://testapp3url1",
"https://testapp3url2",
"https://testapp3url3"
]
}
},
"Remove": {
"object_id": [
""
]
}
}
Script:
$inputFile = Get-Content -Path $inputFilePath -Raw | ConvertFrom-Json
foreach ($object in $inputFile.PsObject.Properties)
{
switch ($object.Name)
{
New
{
foreach ($app in $object.Value)
{
Write-Host $object.Value.TestApp2.reply_urls
# essentially want to replace this line with something like
# Write-Host $app.reply_urls
}
}
Remove
{
}
}
}
Output:
https://testapp2url1 https://testapp2url2
You can access the object's PSObject.Properties to get the property Names and property Values, which you can use to iterate over.
For example:
foreach($obj in $json.New.PSObject.Properties) {
$out = [ordered]@{ App = $obj.Name }
foreach($url in $obj.Value.PSObject.Properties) {
$out[$url.Name] = $url.Value
}
[pscustomobject] $out
}
Produces the following output:
App reply_urls
--- ----------
App1 {https://testapp1url1}
App2 {https://testapp2url1, https://testapp2url2}
App3 {https://testapp3url1, https://testapp3url2, https://testapp3url3}
If you just want to output the URL you can skip the construction of the PSCustomObject:
foreach($obj in $json.New.PSObject.Properties) {
foreach($url in $obj.Value.PSObject.Properties) {
$url.Value
}
}
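With the sample JSON this simply prints each URL on its own line:
https://testapp1url1
https://testapp2url1
https://testapp2url2
https://testapp3url1
https://testapp3url2
https://testapp3url3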
Complementing Santiago Squarzon's helpful answer, I've looked for a generalized approach to get JSON properties without knowing the names of their parents in advance.
Pure PowerShell solution
I wrote a little helper function Expand-JsonProperties that "flattens" the JSON. This allows us to use simple non-recursive Where-Object queries for finding properties, regardless of how deeply nested they are.
Function Expand-JsonProperties {
[CmdletBinding()]
param (
[Parameter(Mandatory, ValueFromPipeline)] [PSCustomObject] $Json,
[Parameter()] [string] $Path,
[Parameter()] [string] $Separator = '/'
)
process {
$Json.PSObject.Properties.ForEach{
$propertyPath = if( $Path ) { "$Path$Separator$($_.Name)" } else { $_.Name }
if( $_.Value -is [PSCustomObject] ) {
Expand-JsonProperties $_.Value $propertyPath
}
else {
[PSCustomObject]@{
Path = $propertyPath
Value = $_.Value
}
}
}
}
}
Given your JSON sample we can now write:
$inputFile = Get-Content -Path $inputFilePath -Raw | ConvertFrom-Json
$inputFile | Expand-JsonProperties | Where-Object Path -like '*/reply_urls'
Output:
Path Value
---- -----
New/App1/reply_urls {https://testapp1url1}
New/App2/reply_urls {https://testapp2url1, https://testapp2url2}
New/App3/reply_urls {https://testapp3url1, https://testapp3url2, https://testapp3url3}
Optimized solution using inline C#
Out of curiosity I've tried out a few different algorithms, including ones that don't require recursion.
One of the fastest algorithms is written in inline C# but can be called through an easy-to-use PowerShell wrapper cmdlet (see below). The C# code basically works the same as the pure PowerShell function but turned out to be more than 9 times faster!
This requires at least PowerShell 7.x.
# Define inline C# class that does most of the work.
Add-Type -TypeDefinition @'
using System;
using System.Collections.Generic;
using System.Management.Automation;
public class ExpandPSObjectOptions {
public bool IncludeObjects = false;
public bool IncludeLeafs = true;
public string Separator = "/";
}
public class ExpandPSObjectRecursive {
public static IEnumerable< KeyValuePair< string, object > > Expand(
PSObject inputObject, string parentPath, ExpandPSObjectOptions options ) {
foreach( var property in inputObject.Properties ) {
var propertyPath = parentPath + options.Separator + property.Name;
if( property.Value is PSObject ) {
if( options.IncludeObjects ) {
yield return new KeyValuePair< string, object >( propertyPath, property.Value );
}
// Recursion
foreach( var prop in Expand( (PSObject) property.Value, propertyPath, options ) ) {
yield return prop;
}
continue;
}
if( options.IncludeLeafs ) {
yield return new KeyValuePair< string, object >( propertyPath, property.Value );
}
}
}
}
'@
# A PowerShell cmdlet that wraps the C# class.
Function Expand-PSObjectRecursive {
[CmdletBinding()]
param (
[Parameter(Mandatory, ValueFromPipeline)] [PSObject] $InputObject,
[Parameter()] [string] $Separator = '/',
[Parameter()] [switch] $IncludeObjects,
[Parameter()] [switch] $ExcludeLeafs
)
process {
$options = [ExpandPSObjectOptions]::new()
$options.IncludeObjects = $IncludeObjects.ToBool()
$options.IncludeLeafs = -not $ExcludeLeafs.ToBool()
$options.Separator = $Separator
[ExpandPSObjectRecursive]::Expand( $InputObject, '', $options )
}
}
The C# code is wrapped by a normal PowerShell cmdlet, so you can basically use it in the same way as the pure PowerShell function, with minor syntactic differences:
$inputFile | Expand-PSObjectRecursive | Where-Object Key -like '*/reply_urls'
I've added some other useful parameters that allow you to define the kind of elements that the cmdlet should output:
$inputFile |
Expand-PSObjectRecursive -IncludeObjects -ExcludeLeafs |
Where-Object Key -like '*/App*'
Parameter -IncludeObjects also includes PSObject properties from the input, while -ExcludeLeafs excludes the value-type properties, resulting in this output:
Key Value
--- -----
/New/App1 @{reply_urls=System.Object[]}
/New/App2 @{reply_urls=System.Object[]}
/New/App3 @{reply_urls=System.Object[]}
While the table format output in itself is not too useful, you could use the output objects for further processing, e.g.:
$apps = $inputFile |
Expand-PSObjectRecursive -IncludeObjects -ExcludeLeafs |
Where-Object Key -like '*/App*'
$apps.Value.reply_urls
Prints:
https://testapp1url1
https://testapp2url1
https://testapp2url2
https://testapp3url1
https://testapp3url2
https://testapp3url3
Implementation notes:
The C# code uses the yield return statement to return properties one by one, similar to what we are used to from PowerShell pipelines. In inline C# code we can't use the pipeline directly (and it wouldn't be advisable for performance reasons), but yield return allows us to smoothly interface with PowerShell code that can be part of a pipeline.
PowerShell automatically inserts the return values of the C# function into the success output stream, one by one. The difference from returning an array from C# is that we may exit early at any point without having to process the whole input object first (e.g. using Select-Object -First n). This would actually cancel the C# function, something that would otherwise only be possible using multithreading (with all its complexities)! But all of this is just single-threaded code.
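For instance (hypothetical usage, reusing $inputFile from the example above), a downstream Select-Object -First stops the C# enumeration as soon as the first match has been emitted:
$inputFile |
    Expand-PSObjectRecursive |
    Where-Object Key -like '*/reply_urls' |
    Select-Object -First 1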
I have an extra large log file in CSV format which includes JSON-formatted data inside. What I'm trying to do is extract the JSON parts from the data and store them in a separate file.
The real problem is that the file size is almost 70 GB, which causes some interesting problems to tackle.
The file size makes it impossible to read the whole file in one chunk. With PowerShell's Get-Content combined with -ReadCount and Foreach-Object I can take smaller chunks and run a regex pattern over them, chunk by chunk.
$Path = <pathToFile>
$outPath = <pathToOutput>
Out-File -Encoding utf8 -FilePath $outPath
$JsonRegex = "(?smi)\{.*?\}"
Get-Content -Path $Path -ReadCount 100000 | Foreach-Object {
( "$_" | Select-String -Pattern $JsonRegex -AllMatches | Foreach-Object { $_.Matches } | Foreach-Object { $_.Value } ) | Add-Content $outPath
}
But what happens here is that every 100k lines the chunk boundary falls in the middle of a JSON object, thus skipping said object and continuing from the next object.
Here is an example of what this log data looks like. It includes some columns on the first row and then JSON-formatted data which is not consistent, so I cannot use any fixed ReadCount value to avoid ending up in the middle of a JSON object.
"5","5","9/10/2019 12:00:46 AM","2","some","data","removed","comment","{
"message": "comment",
"level": "Information",
"logType": "User",
"timeStamp": "2019-09-10T03:00:46.5573047+03:00",
"fingerprint": "some",
}","11"
"5","5","9/10/2019 12:00:46 AM","2","some","data","removed","comment","{
"message": "comment",
"level": "Information",
"logType": "User",
"timeStamp": "2019-09-10T03:00:46.5672713+03:00",
"fingerprint": "some",
"windowsIdentity": "LOCAL\\WinID",
"machineName": "TK-141",
"processVersion": "1.0.71",
"jobId": "24a8",
"machineId": 11
}","11"
Is there any way to accomplish this without missing any data rows from the gigantous logfile?
Use a switch statement with the -Regex and -File parameters to efficiently (by PowerShell standards) read the file line by line and keep state across multiple lines.
For efficient writing to a file, use a .NET API, namely a System.IO.StreamWriter instance.
The following code assumes:
Each JSON string spans multiple lines and is non-nested.
On a given line, an opening { / closing } unambiguously marks the start / end of a (multi-line) JSON string.
# Input file path
$path = '...'
# Output file path
# Important: specify a *full* path
$outFileStream = [System.IO.StreamWriter] "$PWD/out.txt"
$json = ''
switch -Regex -File $path {
'\{.*' { $json = $Matches[0]; continue }
'.*\}' {
$json += "`n" + $Matches[0]
$outFileStream.WriteLine($json)
$json = ''
continue
}
default { if ($json) { $json += "`n" + $_ } }
}
$outFileStream.Close()
If you can further assume that no part of the JSON string follows the opening { / precedes the closing } on the same line, as your sample data suggest, you can simplify (and speed up) the switch statement:
$json = ''
switch -Regex -File $path {
'\{$' { $json ='{'; continue }
'^\}' { $outFileStream.WriteLine(($json + "`n}")); $json = ''; continue }
default { if ($json) { $json += "`n" + $_ } }
}
$outFileStream.Close()
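As a quick sanity check (hypothetical, not part of the original answer), you can count the extracted objects afterwards by counting the lines that open a JSON block in the output file:
(Select-String -Path "$PWD/out.txt" -Pattern '^\{').Count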
Doug Maurer had a solution attempt involving a System.Text.StringBuilder instance, so as to optimize the iterative concatenation of the parts making up each JSON string.
However, at least with an input file crafted from many repetitions of the sample data, I saw only a small performance gain in my informal tests.
For the sake of completeness, here's the System.Text.StringBuilder solution:
$json = [System.Text.StringBuilder]::new(512) # tweak the buffer size as needed
switch -Regex -File $path {
'\{$' { $null = $json.Append('{'); continue }
'^\}' { $outFileStream.WriteLine($json.Append("`n}").ToString()); $null = $json.Clear(); continue }
default { if ($json.Length) { $null = $json.Append("`n").Append($_) } }
}
$outFileStream.Close()
As I am new to PowerShell, can someone please help with the looping part?
Below is the json format from Test.json file:
{
"Pre-Production_AFM": {
"allowedapps": ["app1", "app2"]
},
"Production_AFM": {
"allowedapps": ["app1", "app2"]
}
}
I am reading the json file as below
$json = (Get-Content "Test.json" -Raw) | ConvertFrom-Json
I need to loop and get the 1st and 2nd objects - "Pre-Production_AFM" and "Production_AFM" one after another dynamically.
Right now I have written the code as below:
foreach($i in $json){
if($i -contains "AFM"){
Write-Host "execute some code"
}
}
My doubt is: will $i hold the object "Pre-Production_AFM" dynamically?
If not, please suggest a way to get the objects one after another dynamically for further execution.
# read the json text
$json = @"
{
"Pre-Production_AFM": {
"allowedapps": ["app1", "app2"]
},
"Production_AFM": {
"allowedapps": ["app1", "app2"]
}
}
"#
# convert to a PSCustomObject
$data = $json | ConvertFrom-Json
# just to prove it's a PSCustomObject...
$data.GetType().FullName
# System.Management.Automation.PSCustomObject
# now we can filter the properties by name like this:
$afmProperties = $data.psobject.Properties | where-object { $_.Name -like "*_AFM" };
# and loop through all the "*_AFM" properties
foreach( $afmProperty in $afmProperties )
{
$allowedApps = $afmProperty.Value.allowedApps
# do stuff
}
For "Get-Msoldomain" powershell command-let I get the below output (lets call it Output#1) where Name, Status and Authentication are the property names and below are their respective values.
Name Status Authentication
myemail.onmicrosoft.com Verified Managed
When I use the command with "ConvertTo-Json" like below
Get-MsolDomain | ConvertTo-Json
I get the below output (let's call it Output #2) in JSON format.
{
"ExtensionData": {
},
"Authentication": 0,
"Capabilities": 5,
"IsDefault": true,
"IsInitial": true,
"Name": "myemail.onmicrosoft.com",
"RootDomain": null,
"Status": 1,
"VerificationMethod": 1
}
However, the problem is that if you look at the Status property in both outputs, it's different. The same happens for the VerificationMethod property. Without ConvertTo-Json, PowerShell gives the text, and with ConvertTo-Json it gives the integer.
When I give the below command
get-msoldomain | Select-Object @{Name='Status';Expression={"$($_.Status)"}} | ConvertTo-Json
I get the output as
{
"Status": "Verified"
}
However, I want something so that I don't have to specify any specific property name for it to be converted, the way I am specifying above with
Select-Object @{Name='Status';Expression={"$($_.Status)"}}
This line is transforming only the Status property and not the VerificationMethod property, because that is what I am providing as input.
Question: Is there something generic that I can give to the "ConvertTo-Json" cmdlet, so that it returns ALL the enum properties as text and not integers, without explicitly naming them, so that I get something like below as the output:
{
"ExtensionData": {
},
"Authentication": 0,
"Capabilities": 5,
"IsDefault": true,
"IsInitial": true,
"Name": "myemail.onmicrosoft.com",
"RootDomain": null,
"Status": "Verified",
"VerificationMethod": "DnsRecord"
}
Well, if you don't mind taking a little trip :) you can convert it to CSV, which will force string output, then convert it back from CSV to a PS object, then finally back to JSON.
Like this:
Get-MsolDomain | ConvertTo-Csv | ConvertFrom-Csv | ConvertTo-Json
If you need to keep the original types instead of converting everything to strings, see mklement0's helpful answer...
PowerShell Core (PowerShell versions 6 and above) offers a simple solution via ConvertTo-Json's -EnumsAsStrings switch.
Get-MsolDomain | ConvertTo-Json -EnumsAsStrings # PS *Core* (v6+) only
Unfortunately, this switch isn't supported in Windows PowerShell.
Avshalom's answer provides a quick workaround that comes with a big caveat, however: all property values are invariably converted to strings in the process, which is generally undesirable (e.g., the Boolean IsDefault property's value of true would turn into the string 'True', and numeric values would likewise become strings).
Here's a more generic workaround based on a filter function that recursively introspects the input objects and outputs ordered hashtables that reflect the input properties with enumeration values converted to strings and all other values passed through, which you can then pass to ConvertTo-Json:
Filter ConvertTo-EnumsAsStrings ([int] $Depth = 2, [int] $CurrDepth = 0) {
if ($_ -is [enum]) { # enum value -> convert to symbolic name as string
$_.ToString()
} elseif ($null -eq $_ -or $_.GetType().IsPrimitive -or $_ -is [string] -or $_ -is [decimal] -or $_ -is [datetime] -or $_ -is [datetimeoffset]) {
$_
} elseif ($_ -is [Collections.IEnumerable] -and $_ -isnot [Collections.IDictionary]) { # enumerable (other than a dictionary)
, ($_ | ConvertTo-EnumsAsStrings -Depth $Depth -CurrDepth ($CurrDepth+1))
} else { # non-primitive type or dictionary (hashtable) -> recurse on properties / entries
if ($CurrDepth -gt $Depth) { # depth exceeded -> return .ToString() representation
Write-Warning "Recursion depth $Depth exceeded - reverting to .ToString() representations."
"$_"
} else {
$oht = [ordered] @{}
foreach ($prop in $(if ($_ -is [Collections.IDictionary]) { $_.GetEnumerator() } else { $_.psobject.properties })) {
if ($prop.Value -is [Collections.IEnumerable] -and $prop.Value -isnot [Collections.IDictionary] -and $prop.Value -isnot [string]) {
$oht[$prop.Name] = @($prop.Value | ConvertTo-EnumsAsStrings -Depth $Depth -CurrDepth ($CurrDepth+1))
} else {
$oht[$prop.Name] = $prop.Value | ConvertTo-EnumsAsStrings -Depth $Depth -CurrDepth ($CurrDepth+1)
}
}
$oht
}
}
}
Caveat: As with ConvertTo-Json, the recursion depth (-Depth) is limited to 2 by default, to prevent infinite recursion / excessively large output (as you would get with types such as [System.IO.FileInfo] via Get-ChildItem, for instance). Similarly, values that exceed the implied or specified depth are represented by their .ToString() value. Use -Depth explicitly to control the recursion depth.
Example call:
PS> [pscustomobject] @{ p1 = [platformId]::Unix; p2 = 'hi'; p3 = 1 } |
ConvertTo-EnumsAsStrings -Depth 2 |
ConvertTo-Json
{
"p1": "Unix", # Enum value [platformId]::Unix represented as string.
"p2": "hi", # Other types of values were left as-is.
"p3": 1,
"p4": true
}
Note: -Depth 2 isn't necessary here, given that 2 is the default value (and given that the input has depth 0), but it is shown here as a reminder that you may want to control it explicitly.
If you want to implement custom representations for additional types, such as [datetime] and [datetimeoffset] (using the ISO 8601-compatible .NET round-trip date-time string format, o, as PowerShell (Core) v6+ automatically does), as well as [timespan], [version], [guid] and [ipaddress], see Brett's helpful variation of this answer.
I needed to serialize pwsh objects to JSON, and was not able to use the -EnumsAsStrings parameter of ConvertTo-Json, as my code is running on PS v5. As I encountered infinite loops while using @mklement0's code (editor's note: since fixed), I rewrote it. My revised code also deals with the serialization of some other types such as dates, serializing them into the ISO 8601 format, which is generally the accepted way to represent dates in JSON. Feel free to use this, and let me know if you encounter any issues.
Filter ConvertTo-EnumsAsStrings ([int] $Depth = 10, [int] $CurrDepth = 0) {
if ($CurrDepth -gt $Depth) {
Write-Error "Recursion exceeded depth limit of $Depth"
return $null
}
Switch ($_) {
{ $_ -is [enum] -or $_ -is [version] -or $_ -is [IPAddress] -or $_ -is [Guid] } {
$_.ToString()
}
{ $_ -is [datetimeoffset] } {
$_.UtcDateTime.ToString('o')
}
{ $_ -is [datetime] } {
$_.ToUniversalTime().ToString('o')
}
{ $_ -is [timespan] } {
$_.TotalSeconds
}
{ $null -eq $_ -or $_.GetType().IsPrimitive -or $_ -is [string] -or $_ -is [decimal] } {
$_
}
{ $_ -is [hashtable] } {
$ht = [ordered]@{}
$_.GetEnumerator() | ForEach-Object {
$ht[$_.Key] = ($_.Value | ConvertTo-EnumsAsStrings -Depth $Depth -CurrDepth ($CurrDepth + 1))
}
if ($ht.Keys.Count) {
$ht
}
}
{ $_ -is [pscustomobject] } {
$ht = [ordered]@{}
$_.PSObject.Properties | ForEach-Object {
if ($_.MemberType -eq 'NoteProperty') {
Switch ($_) {
{ $_.Value -is [array] -and $_.Value.Count -eq 0 } {
$ht[$_.Name] = @()
}
{ $_.Value -is [hashtable] -and $_.Value.Keys.Count -eq 0 } {
$ht[$_.Name] = @{}
}
Default {
$ht[$_.Name] = ($_.Value | ConvertTo-EnumsAsStrings -Depth $Depth -CurrDepth ($CurrDepth + 1))
}
}
}
}
if ($ht.Keys.Count) {
$ht
}
}
Default {
Write-Error "Type not supported: $($_.GetType().ToString())"
}
}
}
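Usage mirrors the original filter; for example (a hypothetical call, analogous to the example further above):
[pscustomobject] @{ p1 = [platformId]::Unix; p2 = (Get-Date); p3 = 1 } |
    ConvertTo-EnumsAsStrings |
    ConvertTo-Json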
}