A problem passing a PSCustomObject from PowerShell to Windows PowerShell - json

I'm trying to set up an IIS application pool via PowerShell 7.1.1.
I read the configuration from a JSON file into the variable $configuration, which is handed over to Windows PowerShell because the WebAdministration module isn't natively supported in PowerShell 7.1.1.
A script block is defined in the top-level function; the configuration is injected into the script block as a PSCustomObject and executed in Windows PowerShell.
function Set-AxisAppPool
{
    Write-Message 'Setting up a resource pool for Axis...'
    $executeInWindowsPowerShellForCompatibilityReasons = {
        param (
            [Parameter(Mandatory)]
            [ValidateNotNullOrEmpty()]
            [PSCustomObject]
            $Configuration
        )

        Import-Module WebAdministration

        Remove-WebAppPool -Name $Configuration.AppPool.Name -Confirm:$false -ErrorAction:SilentlyContinue
        New-WebAppPool -Name $Configuration.AppPool.Name -Force | Write-Verbose

        $params = @{
            Path  = "IIS:\AppPools\$($Configuration.AppPool.Name)"
            Name  = 'processModel'
            Value = @{
                userName     = $Configuration.AxisUser.Name
                password     = $Configuration.AxisUser.Password
                identityType = 'SpecificUser'
            }
        }
        Set-ItemProperty @params
    }

    powershell -NoLogo -NoProfile $executeInWindowsPowerShellForCompatibilityReasons -Args $configuration # This is line 546
}
When the configuration JSON exceeds a certain nesting depth, PowerShell can't pass this deserialized JSON, the PSCustomObject, through to Windows PowerShell.
Program 'powershell.exe' failed to run: The Process object must have the UseShellExecute property set to false in order to use environment variables.
At C:\Users\JohnDoe\Desktop\Localhost automatization\Set-AxisEnvironment.ps1:546 char:5
+     powershell -NoLogo -NoProfile $executeInWindowsPowerShellForCompa …
It literally works with n levels of objects in the JSON and doesn't with n+1 levels of objects in the configuration JSON. The JSON schema is validated, and deserialization works as expected.
When I use Start-Process to invoke Windows PowerShell, I receive a different problem. Does anybody have a hint on this one?
Update
This seems to be a bug in PowerShell.

I suspect the argument list is overflowing its size limit into other fields, thus giving you weird error messages. From the Start-Process documentation:
The length of the string assigned to the Arguments property must be less than 32,699.
If you are passing a configuration that is larger than 32,699 characters (including spaces), then that is likely your problem. The first 32,699 characters would be consumed, and the remainder would spill over into the next field, UseShellExecute, which would receive a value that is neither zero nor false, and thus true. This would trip the "wrong", misleading error message.
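If that's the cause, one way around it is to bypass the argument list altogether. Here's a minimal sketch of a file-based handover (hedged: the 32,000 threshold, the temp-file handling, and the -Depth value are illustrative, not part of the original code):

$json = $configuration | ConvertTo-Json -Depth 100 -Compress
if ($json.Length -gt 32000) {
    # Too large for the argument list: hand the configuration over via a temp file.
    $tempFile = New-TemporaryFile
    Set-Content -LiteralPath $tempFile -Value $json -Encoding UTF8
    $fromFile = {
        param([string] $Path)
        $Configuration = Get-Content -Raw -LiteralPath $Path | ConvertFrom-Json
        # ... same body as $executeInWindowsPowerShellForCompatibilityReasons ...
    }
    powershell -NoLogo -NoProfile $fromFile -Args $tempFile.FullName
    Remove-Item -LiteralPath $tempFile
}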

Related

Powershell convertto-json Date mismatch [duplicate]

I'm using Invoke-RestMethod to get data from a REST API. One of the attributes in the response is a date. When using Postman or other tools to get the data, the date is returned correctly, but when I'm using PowerShell (version 5.1.19041.906) and its Invoke-RestMethod like this:
$response = Invoke-RestMethod -Method Get -Uri $url -Headers $requestHeaders
All values of the date attribute are automatically converted to UTC. Is there any way to disable this shift? I need the original values returned by the API.
Invoke-RestMethod, when given a JSON response, automatically parses it into a [pscustomobject] graph; in a manner of speaking, it has ConvertFrom-Json built in.
When ConvertFrom-Json recognizes what are invariably string representations of dates in the input JSON, it converts them to [datetime] instances.
In Windows PowerShell (v5.1, the latest and final version) and as of PowerShell (Core) 7.2, you get NO control over what kind of [datetime] instances are constructed, as reflected in their .Kind property:
In Windows PowerShell, which recognizes only a custom date-string format (e.g. "\/Date(1633984531266)\/"), you invariably get Utc instances.
In PowerShell (Core) 7+, which additionally recognizes string values that are (variations of) ISO 8601 date-time strings (e.g. "2021-10-11T13:27:12.3318432-04:00"), the .Kind value depends on the specifics of the string value:
If the string ends in Z, denoting UTC, you get a Utc instance.
If the string ends in a UTC offset, e.g. -04:00, you get a Local instance (even if the offset value is 00:00).
Note that this means that the timestamp is translated to the caller's local time zone, so the original offset information is lost (unless the caller's time zone's offset happens to match).
Otherwise you get an Unspecified instance.
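To see those rules in action in PowerShell (Core) 7+ (a quick sketch; the date strings are illustrative):

('{ "d": "2021-10-11T13:27:12Z" }'      | ConvertFrom-Json).d.Kind  # Utc
('{ "d": "2021-10-11T13:27:12-04:00" }' | ConvertFrom-Json).d.Kind  # Local
('{ "d": "2021-10-11T13:27:12" }'       | ConvertFrom-Json).d.Kind  # Unspecified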
While Windows PowerShell will see no new features, there is hope for PowerShell (Core): GitHub issue #13598 proposes adding a -DateTimeKind parameter to ConvertFrom-Json, so as to allow explicitly requesting the kind of interest, and to alternatively construct [datetimeoffset] instances, which are preferable.
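Until then, if preserving the original UTC offset matters, you can parse raw date strings yourself as [datetimeoffset] (a sketch; presupposes access to the raw JSON text, per the note below):

$dto = [datetimeoffset]::Parse('2021-10-11T13:27:12.3318432-04:00', [cultureinfo]::InvariantCulture)
$dto.Offset  # -04:00:00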
Workaround:
Note: In the event that you need access to the raw string values, exactly as defined, the solution below won't work. You'll have to retrieve the raw JSON text and perform your own parsing, using Invoke-WebRequest and the response's .Content property, as Mathias R. Jessen notes.
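For example, with the variables from the question:

# Retrieve the raw JSON text, bypassing Invoke-RestMethod's automatic parsing.
$rawJson = (Invoke-WebRequest -Uri $url -Headers $requestHeaders).Content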
The following snippet walks a [pscustomobject] graph, as returned from Invoke-RestMethod and explicitly converts any [datetime] instances encountered to Local instances in place (Unspecified instances are treated as Local):
# Call Invoke-RestMethod to retrieve and parse a web service's JSON response.
$fromJson = Invoke-RestMethod ...
# Convert any [datetime] instances in the object graph that aren't already
# local dates (whose .Kind value isn't already 'Local') to local ones.
& {
    # Helper script block that walks the object graph.
    $sb = {
        foreach ($el in $args[0]) { # iterate over the elements (if an array)
            foreach ($prop in $el.psobject.Properties) { # iterate over the properties
                if ($dt = $prop.Value -as [datetime]) {
                    switch ($dt.Kind) {
                        'Utc' { $prop.Value = $dt.ToLocalTime() }
                        # Note: calling .ToLocalTime() is not an option for 'Unspecified',
                        # because it interprets an 'Unspecified' [datetime] as UTC.
                        'Unspecified' { $prop.Value = [datetime]::new($dt.Ticks, 'Local') }
                    }
                }
                elseif ($prop.Value -is [Array] -or $prop.Value -is [System.Management.Automation.PSCustomObject]) {
                    & $sb $prop.Value # recurse
                }
            }
        }
    }
    # Start walking.
    & $sb $args[0]
} $fromJson
# Output the transformed-in-place object graph
# that now contains only Local [datetime] instances.
$fromJson
$response = Invoke-RestMethod -Method Get -Uri $url -Headers $requestHeaders
$changeddate = $response.fields.'System.ChangedDate'
$datetime = ([DateTime]$changeddate).ToLocalTime()

Set property value on object loaded from json containing comments

When loading an object from a json file one can normally set the value on properties and write the file back out like so:
$manifest = (gc $manifestPath) | ConvertFrom-Json -AsHashtable
$manifest.name = "$($manifest.name)-sxs"
$manifest | ConvertTo-Json -depth 100 | Out-File $manifestPath -Encoding utf8NoBOM
But if the source json file contains comments, the object's properties can't be set:
// *******************************************************
// GENERATED FILE - DO NOT EDIT DIRECTLY
// *******************************************************
{
"name": "PublishBuildArtifacts"
}
Running the code above throws an error:
$manifest
id                 : 1D341BB0-2106-458C-8422-D00BCEA6512A
name               : PublishBuildArtifacts
friendlyName       : ms-resource:loc.friendlyName
description        : ms-resource:loc.description
category           : Build
visibility         : {Build}
author             : Microsoft Corporation
version            : @{Major=0; Minor=1; Patch=71}
demands            : {}
inputs             : {@{name=CopyRoot; type=filePath; label=ms-resource:loc.input.label.CopyRoot; defaultValue=;
                     required=False; helpMarkDown=Root folder to apply copy patterns to. Empty is the root of the
                     repo.}, @{name=Contents; type=multiLine; label=ms-resource:loc.input.label.Contents;
                     defaultValue=; required=True; helpMarkDown=File or folder paths to include as part of the
                     artifact.}, @{name=ArtifactName; type=string; label=ms-resource:loc.input.label.ArtifactName;
                     defaultValue=; required=True; helpMarkDown=The name of the artifact to create.},
                     @{name=ArtifactType; type=pickList; label=ms-resource:loc.input.label.ArtifactType;
                     defaultValue=; required=True; helpMarkDown=The name of the artifact to create.; options=}…}
instanceNameFormat : Publish Artifact: $(ArtifactName)
execution          : @{PowerShell=; Node=}
$manifest.name
PublishBuildArtifacts
$manifest.name = "sxs"
InvalidOperation: The property 'name' cannot be found on this object. Verify that the property exists and can be set.
When I strip the comments, I can overwrite the property.
Is there a way I can coax PowerShell to ignore the comments while loading the json file/convert the object and generate a writable object?
I'm not sure if this is intended, but it seems like ConvertFrom-Json treats the comments in the JSON as $null when converting it to an object. This only happens if it's receiving an object[] from the pipeline; with a string or multi-line string it works fine.
A simple way to demonstrate this using the exact same Json posted in the question:
$contentAsArray = Get-Content test.json | Where-Object {
-not $_.StartsWith('/')
} | ConvertFrom-Json -AsHashtable
$contentAsArray['name'] = 'hello' # works
Here you can see the differences and the workaround; it is definitely recommended to use -Raw with Get-Content so that you're passing a multi-line string to ConvertFrom-Json:
$contentAsString = Get-Content test.json -Raw | ConvertFrom-Json -AsHashtable
$contentAsArray = Get-Content test.json | ConvertFrom-Json -AsHashtable
$contentAsString.PSObject, $contentAsArray.PSObject | Select-Object TypeNames, BaseObject
TypeNames BaseObject
--------- ----------
{System.Collections.Hashtable, System.Object} {name}
{System.Object[], System.Array, System.Object} {$null, System.Collections.Hashtable}
$contentAsArray['name'] # null
$null -eq $contentAsArray[0] # True
$contentAsArray[1]['name'] # PublishBuildArtifacts
$contentAsArray[1]['name'] = 'hello'
$contentAsArray[1]['name'] # hello
Santiago Squarzon's helpful answer shows an effective solution and analyzes the symptom. Let me complement it with some background information.
tl;dr
You're seeing a variation of a known bug, still present as of PowerShell 7.2.1, where a blank line or a single-line comment as the first input object unexpectedly causes $null to be emitted (first) - see GitHub issue #12229.
Using -Raw with Get-Content isn't just a workaround; it is the right - and faster - thing to do when piping a file containing JSON to be parsed as a whole to ConvertFrom-Json.
For multiple input objects (strings), ConvertFrom-Json has a(n unfortunate) heuristic built in that tries to infer whether the multiple strings represent either (a) the lines of a single JSON document or (b) separate, individual JSON documents, each on its own line, as follows:
If the first input string is valid JSON by itself, (b) is assumed, and an object representing the parsed JSON ([pscustomobject] or, with -AsHashtable, [hashtable], or an array of either) is output for each input string.
Otherwise, (a) is assumed and all input strings are collected first, in a multi-line string, which is then parsed.
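A quick illustration of the heuristic (PowerShell 7.x; the inputs are illustrative):

# (b): the first string is valid JSON by itself -> each line is parsed separately.
'{"a":1}', '{"b":2}' | ConvertFrom-Json   # -> two objects
# (a): the first string is NOT valid JSON by itself -> the lines are joined, then parsed.
'{', '"a": 1', '}' | ConvertFrom-Json     # -> one object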
The aforementioned bug is that if the first string is an empty/blank line or a single-line comment[1] (applies to both // ... comments, which are invariably single-line, and /* ... */ if they happen to be single-line), (b) is (justifiably) assumed, but an extraneous $null is emitted before the remaining (non-blank, non-comment) lines are parsed and their object representation(s) are output.
As a result, an array is invariably returned whose first element is $null - which isn't obvious, but results in subtle changes in behavior, as you've experienced:
Notably, attempting to set a property on what is presumed to be a single object then fails: because an array is actually being accessed, the property access becomes an instance of member-access enumeration - implicitly applying property access to all elements of a collection rather than the collection itself - which only works for getting property values and fails obscurely when setting is attempted - see this answer for details.
A simplified example:
# Sample output hashtable parsed from JSON
$manifest = @{ name = 'foo' }
# This is what you (justifiably) THOUGHT you were doing.
$manifest.name = 'bar' # OK
# Due to the bug, this is what you actually attempted.
($null, $manifest).name = 'bar' # !! FAILS - member-access enumeration doesn't support setting.
As noted above, the resulting error message - The property 'name' cannot be found on this object. ... - is unhelpful, as it doesn't indicate the true cause of the problem.
Improving it would be tricky, however, as the user's intent is inherently ambiguous: the property name may EITHER be a situationally unsuccessful attempt to reference a nonexistent property of the collection itself OR, as in this case, a fundamentally unsupported attempt at setting properties via member-access enumeration.
Conceivably, the following would help if the target object is a collection (enumerable) from PowerShell's perspective: The property 'name' cannot be found on this collection, and setting a collection's elements' properties by member-access enumeration isn't supported.
[1] Note that comments in JSON are supported in PowerShell (Core) v6+ only.
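As an aside: in Windows PowerShell 5.1 you'd therefore have to strip the comments yourself before parsing. A minimal sketch, assuming comments only ever occupy whole lines, as in the generated file above (note that -AsHashtable also requires v6+, so it is omitted here):

$manifest = (Get-Content $manifestPath | Where-Object { $_ -notmatch '^\s*//' }) -join "`n" |
    ConvertFrom-Json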

Verify a function in PowerShell has run successfully

I'm writing a script to back up existing BitLocker keys to the associated device in Azure AD. I've created a function which goes through the BitLocker-enabled volumes and backs up the keys to Azure; however, I'd like to know how I can check that the function has completed successfully, without any errors. Here is my code. I've added a try/catch into the function to catch any errors in the function itself; but how can I verify that the function as a whole completed successfully? Currently I have an if statement checking that the last command has run ($?) - is this correct, or how else can I verify, please?
function Invoke-BackupBDEKeys {
    ## Get all current BitLocker volumes - this ensures keys are backed up for devices which may have additional data drives.
    $BitLockerVolumes = Get-BitLockerVolume | Select-Object MountPoint
    foreach ($BDEMountPoint in $BitLockerVolumes.MountPoint) {
        try {
            # Get the key protectors for each of the BDE mount points on the device.
            $BDEKeyProtector = Get-BitLockerVolume -MountPoint $BDEMountPoint | Select-Object -ExpandProperty KeyProtector
            # Get the RecoveryPassword protector - this is what is backed up to AAD and used to recover access to the drive if needed.
            $KeyId = $BDEKeyProtector | Where-Object { $_.KeyProtectorType -eq 'RecoveryPassword' }
            # Back up the recovery password to the device in AAD.
            BackupToAAD-BitLockerKeyProtector -MountPoint $BDEMountPoint -KeyProtectorId $KeyId.KeyProtectorId
        }
        catch {
            Write-Host "An error has occurred" $Error[0]
        }
    }
}
#Run function
Invoke-BackupBDEKeys
if ($? -eq $true) {
    $ErrorActionPreference = "Continue"
    # No errors occurred running the last command - reg key can be set, as keys have been backed up successfully.
    $RegKeyPath = 'custom path'
    $Name = 'custom name'
    New-ItemProperty -Path $RegKeyPath -Name $Name -Value 1 -Force
    Exit
}
else {
    Write-Host "The backup of BDE keys was not successful"
    #Exit
}
Unfortunately, as of PowerShell 7.2.1, the automatic $? variable has no meaningful value after calling a written-in-PowerShell function (as opposed to a binary cmdlet). (More immediately, even inside the function, $? only reflects $false at the very start of the catch block, as Mathias notes.)
If PowerShell functions had feature parity with binary cmdlets, then emitting at least one (non-script-terminating) error, such as with Write-Error, would set $? in the caller's scope to $false, but that is currently not the case.
You can work around this limitation by using $PSCmdlet.WriteError() from an advanced function or script, but that is quite cumbersome. The same applies to $PSCmdlet.ThrowTerminatingError(), which is the only way to create a statement-terminating error from PowerShell code. (By contrast, the throw statement generates a script-terminating error, i.e. terminates the entire script and its callers - unless a try / catch or trap statement catches the error somewhere up the call stack).
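For illustration, a minimal sketch of the $PSCmdlet.WriteError() workaround (the function name and error details are made up):

function Test-WriteError {
    [CmdletBinding()] # advanced function, so $PSCmdlet is available
    param()
    $errorRecord = [System.Management.Automation.ErrorRecord]::new(
        [System.InvalidOperationException]::new('Something failed.'),
        'SomethingFailed',
        [System.Management.Automation.ErrorCategory]::InvalidOperation,
        $null
    )
    $PSCmdlet.WriteError($errorRecord) # non-terminating error
}
Test-WriteError 2>$null # suppress display of the error
$?                      # $false in the caller - unlike with Write-Error from a function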
See this answer for more information and links to relevant GitHub issues.
As a workaround, I suggest:
Make your function an advanced one, so as to enable support for the common -ErrorVariable parameter - it allows you to collect all non-terminating errors emitted by the function in a self-chosen variable.
Note: The self-chosen variable name must be passed without the $; e.g., to collect in variable $errs, use -ErrorVariable errs; do NOT use Error / $Error, because $Error is the automatic variable that collects all errors that occur in the entire session.
You can combine this with the common -ErrorAction parameter to initially silence the errors (-ErrorAction SilentlyContinue), so you can emit them later on demand. Do NOT use -ErrorAction Stop, because it will render -ErrorVariable useless and instead abort your script as a whole.
You can let the errors simply occur - no need for a try / catch statement: since there is no throw statement in your code, your loop will continue to run even if errors occur in a given iteration.
Note: While it is possible to trap terminating errors inside the loop with try / catch and then relay them as non-terminating ones with $_ | Write-Error in the catch block, you'll end up with each such error twice in the variable passed to -ErrorVariable. (If you didn't relay, the errors would still be collected, just not printed.)
After invocation, check if any errors were collected, to determine whether at least one key wasn't backed up successfully.
As an aside: Of course, you could alternatively make your function output (return) a Boolean ($true or $false) to indicate whether errors occurred, but that wouldn't be an option for functions designed to output data.
Here's the outline of this approach:
function Invoke-BackupBDEKeys {
    # Make the function an *advanced* function, to enable
    # support for -ErrorVariable (and -ErrorAction).
    [CmdletBinding()]
    param()
    # ...
    foreach ($BDEMountPoint in $BitLockerVolumes.mountpoint) {
        # ... Statements that may cause errors.
        # If you need to short-circuit a loop iteration immediately
        # after an error occurred, check each statement's return value; e.g.:
        #   if (-not $BDEKeyProtector) { continue }
    }
}
# Call the function and collect any
# non-terminating errors in variable $errs.
# IMPORTANT: Pass the variable name *without the $*.
Invoke-BackupBDEKeys -ErrorAction SilentlyContinue -ErrorVariable errs
# If $errs is an empty collection, no errors occurred.
if (-not $errs) {
    "No errors occurred"
    # ...
}
else {
    "At least one error occurred during the backup of BDE keys:`n$errs"
    # ...
}
Here's a minimal example, which uses a script block in lieu of a function:
& {
    [CmdletBinding()] param() Get-Item NoSuchFile
} -ErrorVariable errs -ErrorAction SilentlyContinue
"Errors collected:`n$errs"
Output:
Errors collected:
Cannot find path 'C:\Users\jdoe\NoSuchFile' because it does not exist.
As stated elsewhere, the try/catch you're using is what is preventing the relay of the error condition. That is by design and the very intentional reason for using try/catch.
What I would do in your case is either create a variable or a file to capture the error info. My apologies to anyone named 'Bob'. It's the variable name that I always use for quick stuff.
Here is a basic sample that works:
$bob = (1, 2, "blue", 4, "notit", 7)
$bobout = @{} # create a hashtable for errors
foreach ($tempbob in $bob) {
    $tempbob
    try {
        $tempbob - 2 # this will fail for a string
    } catch {
        $bobout.Add($tempbob, "not a number") # store a key/value pair (current, msg)
    }
}
$bobout # output the errors
Here we created an array just to use a foreach. Think of it like your $BDEMountPoint variable.
Go through each one, do what you want. In the catch block, you just want to say "not a number" when it fails. Here's the output of that:
-1
0
2
5
Name Value
---- -----
notit not a number
blue not a number
All the numbers worked (you can obviously suppress the output; this is just for demo).
More importantly, we stored custom text on failure.
Now, you might want a more informative error. You can grab the actual error that happened like this:
$bob = (1, 2, "blue", 4, "notit", 7)
$bobout = @{} # create a hashtable for errors
foreach ($tempbob in $bob) {
    $tempbob
    try {
        $tempbob - 2 # this will fail for a string
    } catch {
        $bobout.Add($tempbob, $PSItem) # store a key/value pair (current, error)
    }
}
$bobout
Here we used the automatic variable $PSItem (also commonly referenced as $_), which inside a catch block holds the current error.
-1
0
2
5
Name Value
---- -----
notit Cannot convert value "notit" to type "System.Int32". Error: "Input string was not in ...
blue Cannot convert value "blue" to type "System.Int32". Error: "Input string was not in a...
You can also parse the actual error and take action based on it or store custom messages. But that's outside the scope of this answer. :)
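Still, as a quick hedged sketch of what that could look like (the match pattern is illustrative):

foreach ($entry in $bobout.GetEnumerator()) {
    # Each value is the ErrorRecord captured in the catch block.
    if ($entry.Value.Exception.Message -match 'Cannot convert value') {
        Write-Host "'$($entry.Key)' is not a number - skipping."
    }
}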

How to serialize an object in powershell to json and get identical result in PS desktop and core?

Prologue
It turns out that in my case it is important to understand the source of the objects - it is a JSON payload from a REST API response. Unfortunately, JSON -> object conversion produces different results on PS desktop vs PS core. On desktop the numbers are deserialized into Int32 types, but on core into Int64 types. From that it follows that I cannot use Export-CliXml, because the binary layout of the objects is different.
Main question
I have a unit test that needs to compare the actual result with an expected. The expected result is saved in a json file, so the procedure is:
Convert the actual result to json string
Read the expected result from disk to string
Compare the actual and the expected as strings
Unfortunately, this scheme does not work because PS desktop ConvertTo-Json and PS core ConvertTo-Json do not produce identical results. So, if the expected result was saved on desktop and the test runs on core - boom, failure. And vice versa.
One way is to keep two versions of jsons. Another way is to use a library to create the json.
First I tried the Newtonsoft-Json PowerShell module, but it just does not work. I think the problem is that whatever C# library we use, it must be aware of PSCustomObject and the like and treat them specially. So we cannot just take any C# JSON library.
At this point I am left with having two jsons - one per PS edition, which is kind of sad.
Are there better options?
EDIT 1
I guess I can always read the json, convert to object and then back to json again. That sucks.
EDIT 2
I tried ConvertTo-Json -Compress. This eliminates the difference in spacing, but the problem is that, for some reason, the desktop version escapes certain characters (such as apostrophes) to their \u00xx representation. The core version does not do this.
Please, observe:
Desktop
C:\> @{ x = "'a'" } | ConvertTo-Json -Compress
{"x":"\u0027a\u0027"}
C:\>
Core
C:\> @{ x = "'a'" } | ConvertTo-Json -Compress
{"x":"'a'"}
C:\>
Now the core version has the flag -EscapeHandling, so:
C:\> @{ x = "'a'" } | ConvertTo-Json -Compress -EscapeHandling EscapeHtml
{"x":"\u0027a\u0027"}
C:\>
Bingo! Same result. But now this code does not run on the desktop version, which does not have this flag. More massaging is needed. I will check if that is the only problem.
EDIT 3
It is impossible to reconcile the differences between the core and the desktop versions without expensive post processing. Please, observe:
Desktop
C:\> @{ x = '"a"'; y = "'b'" } | ConvertTo-Json -Compress
{"y":"\u0027b\u0027","x":"\"a\""}
C:\>
Core
C:\> @{ x = '"a"'; y = "'b'" } | ConvertTo-Json -Compress -EscapeHandling EscapeHtml
{"y":"\u0027b\u0027","x":"\u0022a\u0022"}
C:\> @{ x = '"a"'; y = "'b'" } | ConvertTo-Json -Compress
{"y":"'b'","x":"\"a\""}
C:\>
Any suggestions on how to salvage the json approach?
EDIT 4
The Export-CliXml approach does not work either, because of the differences between the PS versions.
Desktop
C:\> ('{a:1}' | ConvertFrom-Json).a.gettype()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Int32 System.ValueType
C:\>
Core
C:\> ('{a:1}' | ConvertFrom-Json).a.gettype()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Int64 System.ValueType
C:\>
So the same JSON is represented using different numeric types - Int32 in desktop and Int64 in core. That puts to bed the option of using Export-CliXml.
Unless I am missing something.
I believe there is no other choice but to do the double conversion - json -> object -> json - and then I will have two jsons created on the same PS edition. That sucks big time.
On converting from the original JSON, use the third-party Newtonsoft.Json PowerShell wrapper's ConvertFrom-JsonNewtonsoft cmdlet - this should ensure cross-edition compatibility (which the built-in ConvertFrom-Json does not guarantee across PowerShell editions, because Windows PowerShell uses a custom parser, whereas PowerShell [Core] v6+ uses Newtonsoft.Json up to at least v7.1, though a move to the new(ish) .NET Core System.Text.Json API is coming).
Important: ConvertFrom-JsonNewtonsoft returns (arrays of) nested ordered hashtables ([ordered] @{ ... }, System.Collections.Specialized.OrderedDictionary), unlike the nested [pscustomobject] graphs that the built-in ConvertFrom-Json outputs. Similarly, ConvertTo-JsonNewtonsoft expects only (arrays of) hashtables (dictionaries) as input, and notably does not support [pscustomobject] instances, as you've learned yourself.
Caveat: As of this writing, the wrapper module was last updated in May 2019, and the version of the underlying bundled Newtonsoft.Json.dll assembly is quite old (8.0, whereas 12.0 is current as of this writing). See the module's source code.
Note that in order to parse JSON obtained from a RESTful web service manually, you mustn't use Invoke-RestMethod, as it implicitly parses and returns [pscustomobject] object graphs. Instead, use Invoke-WebRequest and access the returned response's .Content property.
On converting to a format suitable for storing on disk, you have two options:
(A) If you do need the serialized format to be JSON also, you must convert all [pscustomobject] graphs to (ordered) hashtables before passing them to ConvertTo-JsonNewtonsoft.
See below for function ConvertTo-OrderedHashTable, which does just that.
(B) If the specific serialization format isn't important, i.e. if all that matters is that the formats are identical across PowerShell editions for the purpose of comparison, no extra work is needed: use the built-in Export-Clixml cmdlet, which can handle any type and produces PowerShell's native, XML-based serialization format called CLIXML (as notably used in PowerShell remoting), which should be cross-edition-compatible (at least with v5.1 on the Windows PowerShell side and as of PowerShell [Core] v7.1, both of which use the same version of the serialization protocol, 1.1.0.1, as reported by $PSVersionTable.SerializationVersion).
While you could re-convert such a persisted file to objects with Import-Clixml, the potential loss of type fidelity on deserialization makes comparing the serialized (CLIXML) representations advisable.
Also note that as of PowerShell v7.1 there is no cmdlet-based way to create an in-memory CLIXML representation, so you'll have to use the PowerShell API directly for now: System.Management.Automation.PSSerializer.Serialize. However, providing in-memory counterparts to Import-CliXml / Export-CliXml in the form of ConvertFrom-CliXml / ConvertTo-CliXml cmdlets has been green-lighted as a future enhancement.
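In the meantime, the direct API calls look like this (assuming $someObject holds the data to serialize):

# In-memory CLIXML serialization (what Export-CliXml does, minus the file)...
$clixml = [System.Management.Automation.PSSerializer]::Serialize($someObject)
# ... and deserialization, the Import-CliXml counterpart.
$roundTripped = [System.Management.Automation.PSSerializer]::Deserialize($clixml)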
Re (A): Here's function ConvertTo-OrderedHashtable, which converts (potentially nested) [pscustomobject] objects to ordered hashtables while passing other types through, so you should be able to simply insert it into a pipeline as follows:
# CAVEAT: ConvertTo-JsonNewtonSoft only accepts a *single* input object.
[pscustomobject] @{ foo = 1 }, [pscustomobject] @{ foo = 2 } |
ConvertTo-OrderedHashtable |
ForEach-Object { ConvertTo-JsonNewtonSoft $_ }
function ConvertTo-OrderedHashtable {
    <#
    .SYNOPSIS
    Converts custom objects to ordered hashtables.

    .DESCRIPTION
    Converts PowerShell custom objects (instances of [pscustomobject]) to
    ordered hashtables (instances of [System.Collections.Specialized.OrderedDictionary]),
    which is useful for to-JSON serialization via the Newtonsoft.Json library.

    Note:
    * Custom objects are processed recursively.
    * Any scalar non-custom objects are passed through as-is.
    * Any (non-dictionary) collections in property values are converted to
      [object[]] arrays.

    .EXAMPLE
    1, [pscustomobject] @{ foo = [pscustomobject] @{ bar = 'none' }; other = 2 } | ConvertTo-OrderedHashtable

    Passes integer 1 through, and converts the custom object to a nested ordered
    hashtable.
    #>
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline)] $InputObject
    )

    begin {
        # Recursive helper function
        function convert($obj) {
            if ($obj -is [System.Management.Automation.PSCustomObject]) {
                # A custom object: recurse on its properties.
                $oht = [ordered] @{ }
                foreach ($prop in $obj.psobject.Properties) {
                    $oht.Add($prop.Name, (convert $prop.Value))
                }
                return $oht
            }
            elseif ($obj -isnot [string] -and $obj -is [System.Collections.IEnumerable] -and $obj -isnot [System.Collections.IDictionary]) {
                # A collection of sorts (other than a string or dictionary (hashtable)): recurse on its elements.
                return @(foreach ($el in $obj) { convert $el })
            }
            else {
                # A non-custom object, including .NET primitives and strings: use as-is.
                return $obj
            }
        }
    }

    process {
        convert $InputObject
    }
}
Re (B): A demonstration of the Export-CliXml approach (you can run this code from either PS edition):
$sb = {
    Install-Module -Scope CurrentUser Newtonsoft.json
    if (-not $IsCoreCLR) {
        # Workaround for PS Core's $env:PSModulePath overriding WinPS'
        Import-Module $HOME\Documents\WindowsPowerShell\Modules\newtonsoft.json
    }
    @'
{
  "results": {
    "users": [
      {
        "userId": 1,
        "emailAddress": "jane.doe@example.com",
        "date": "2020-10-05T08:08:43.743741-04:00",
        "attributes": {
          "height": 165,
          "weight": 60
        }
      },
      {
        "userId": 2,
        "emailAddress": "john.doe@example.com",
        "date": "2020-10-06T08:08:43.743741-04:00",
        "attributes": {
          "height": 180,
          "weight": 72
        }
      }
    ]
  }
}
'@ | ConvertFrom-JsonNewtonsoft | Export-CliXml "temp-$($PSVersionTable.PSEdition).xml"
}

# Execute the script block in both editions.
Write-Verbose -vb 'Running in Windows PowerShell...'
powershell -noprofile $sb
Write-Verbose -vb 'Running in PowerShell Core...'
pwsh -noprofile $sb

# Compare the resulting CLIXML files.
Write-Verbose -vb "Comparing the resulting files: This should produce NO output,`n  indicating that the files have identical content."
Compare-Object (Get-Content 'temp-Core.xml') (Get-Content 'temp-Desktop.xml')

Write-Verbose -vb 'Cleaning up...'
Remove-Item 'temp-Core.xml', 'temp-Desktop.xml'
You should see the following verbose output:
VERBOSE: Running in Windows PowerShell...
VERBOSE: Running in PowerShell Core...
VERBOSE: Comparing the resulting files: This should produce NO output,
indicating that the files have identical content.
VERBOSE: Cleaning up...

Add windows env variables to json object using powershell

I'm quite new to PowerShell and just need it for a small task, so please excuse my complete and utter ineptitude with the language. I was wondering if it's possible to form a JSON object based on environment variables and a variable that was declared earlier in my script. The previously declared variable is based on a JSON config named optionsConfig.json, whose contents are shown here.
{"test1": ["options_size", "options_connection", "options_object"],
"test2":["options_customArgs", "options_noUDP", "options_noName"]}
The purpose of the $Options variable in the code below is to take each element in the list value for the respective test and assume that those elements are environment variables in the system, then find their values and form a dictionary object that will be used in the json.
Here is what I have so far.
# Read the JSON file into a custom object.
$configObj = Get-Content -Raw optionsConfig.json |
ConvertFrom-Json
# Retrieve the environment variables whose
# names are listed in the $env:test property
# as name-value pairs.
Get-Item -Path env:* -Include $configObj.$env:testTool
$Options = Get-Item -Path env:* -Include $configObj.$env:testTool |
    % { $hash = @{} } { $hash[$_.Name] = $_.Value } { $hash }
The $Options variable looks like so when converted to json
{
"options_size": "default",
"options_object": "forward open",
"options_connection": "connected"
}
I have a few other environment variable values that I would like to be a part of the json object. Those 3 other environment variables I would like the value of are listed below.
$env:testTool = "test1"
$env:RecordName = "Record1"
$env:Target = "Target1"
How would I construct a powershell statement to get the json object to be formatted like this? -
data = {"test": $Env.testTool, "target": "$Env.Target",
"options": "$Options", "RecordName': "$Env.RecordName"}
The keys are all predefined strings and $Options is the dict object from up above. Am I able to form a Json object like this in powershell and how would it be done? Any help would be appreciated. This appears to be the last step in my struggle with powershell.
Here is what I have done.
$jObj = [ordered]@{ test = $Env:testTool }
When I change this variable to $jObj = [ordered]@{test= $Env:testTool,options= $Options} I get an error saying missing expression after ','
Entries of a hashtable literal (@{ ... } or, in its ordered form, [ordered] @{ ... }) must be separated:
either by newlines (each entry on its own line)
or by ; if placed on the same line.
Thus, the following literals are equivalent:
# Multiline form
@{
    test       = $env:testTool
    RecordName = $env:RecordName
    Target     = $env:Target
    options    = $Options
}

# Single-line form; separator is ";"
@{ test = $env:testTool; RecordName = $env:RecordName; Target = $env:Target; options = $Options }
Get-Help about_Hashtables has more information.
$jObj = @{
    test       = $env:testTool
    RecordName = $env:RecordName
    Target     = $env:Target
    options    = $Options
}
$jObj | ConvertTo-Json | Set-Content jsonStuff.json
jsonStuff.json is the new JSON file for the new JSON object. This syntax for forming $jObj seems to have done the trick.
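One caveat worth knowing: ConvertTo-Json only serializes down to its default -Depth of 2 (sufficient for $jObj here, since $Options is only one level deep); for deeper graphs, pass -Depth explicitly:

$jObj | ConvertTo-Json -Depth 10 | Set-Content jsonStuff.json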