I'm using Invoke-RestMethod to get data from a REST API. One of the attributes in the response is a date. When I use Postman or other tools, the date is returned correctly, but when I use PowerShell (version 5.1.19041.906) and its Invoke-RestMethod like this:
$response = Invoke-RestMethod -Method Get -Uri $url -Headers $requestHeaders
All values of the date attribute are automatically converted to UTC. Is there any way to disable this shift? I need the original values returned by the API.
Invoke-RestMethod, when given a JSON response, automatically parses it into a [pscustomobject] graph; in a manner of speaking, it has ConvertFrom-Json built in.
When ConvertFrom-Json recognizes what are invariably string representations of dates in the input JSON, it converts them to [datetime] instances.
In Windows PowerShell (v5.1, the latest and final version) and as of PowerShell (Core) 7.2, you get NO control over what kind of [datetime] instances are constructed, as reflected in their .Kind property:
In Windows PowerShell, which requires a custom date-string format (e.g. "\/Date(1633984531266)\/"), you invariably get Utc instances.
In PowerShell (Core) 7+, which additionally recognizes string values that are (variations of) ISO 8601 date-time strings (e.g. "2021-10-11T13:27:12.3318432-04:00"), the .Kind value depends on the specifics of the string value:
If the string ends in Z, denoting UTC, you get a Utc instance.
If the string ends in a UTC offset, e.g. -04:00, you get a Local instance (even if the offset value is 00:00).
Note that this means that the timestamp is translated to the caller's local time zone, so the original offset information is lost (unless the caller's time zone's offset happens to match).
Otherwise you get an Unspecified instance.
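To illustrate (a quick, self-contained demonstration; the sample timestamps are made up):
# Windows PowerShell 5.1: only the custom "\/Date(<ms>)\/" format is recognized,
# and the resulting [datetime] is invariably Utc.
('{ "date": "\/Date(1633984531266)\/" }' | ConvertFrom-Json).date.Kind   # -> Utc
# PowerShell 7+: ISO 8601-style strings are recognized too; .Kind depends on the string.
$parsed = '{ "utc": "2021-10-11T17:27:12Z", "offset": "2021-10-11T13:27:12-04:00", "plain": "2021-10-11T13:27:12" }' | ConvertFrom-Json
$parsed.utc.Kind      # -> Utc
$parsed.offset.Kind   # -> Local (translated to the caller's time zone)
$parsed.plain.Kind    # -> Unspecified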
While Windows PowerShell will see no new features, there is hope for PowerShell (Core): GitHub issue #13598 proposes adding a -DateTimeKind parameter to ConvertFrom-Json, so as to allow explicitly requesting the kind of interest, and to alternatively construct [datetimeoffset] instances, which are preferable.
Workaround:
Note: In the event that you need access to the raw string values, exactly as defined, the solution below won't work. You'll have to retrieve the raw JSON text and perform your own parsing, using Invoke-WebRequest and the response's .Content property, as Mathias R. Jessen notes.
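For example (a minimal sketch; $url and $requestHeaders are the same values you would pass to Invoke-RestMethod):
# Retrieve the raw JSON text, leaving the date strings untouched,
# so you can apply your own parsing:
$rawJson = (Invoke-WebRequest -Uri $url -Headers $requestHeaders).Content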
The following snippet walks a [pscustomobject] graph, as returned from Invoke-RestMethod and explicitly converts any [datetime] instances encountered to Local instances in place (Unspecified instances are treated as Local):
# Call Invoke-RestMethod to retrieve and parse a web service's JSON response.
$fromJson = Invoke-RestMethod ...
# Convert any [datetime] instances in the object graph that aren't already
# local dates (whose .Kind value isn't already 'Local') to local ones.
& {
  # Helper script block that recursively walks the object graph.
  $sb = {
    foreach ($el in $args[0]) { # iterate over elements (if an array)
      foreach ($prop in $el.psobject.Properties) { # iterate over properties
        if ($prop.Value -is [datetime]) { # a [datetime] instance
          switch ($prop.Value.Kind) {
            'Utc' { $prop.Value = $prop.Value.ToLocalTime() }
            # Note: for 'Unspecified' instances, calling .ToLocalTime() is not an option,
            #       because it would interpret an 'Unspecified' [datetime] as UTC.
            'Unspecified' { $prop.Value = [datetime]::new($prop.Value.Ticks, 'Local') }
          }
        }
        elseif ($prop.Value -is [Array] -or $prop.Value -is [System.Management.Automation.PSCustomObject]) {
          & $sb $prop.Value # recurse
        }
      }
    }
  }
  # Start walking.
  & $sb $args[0]
} $fromJson
# Output the transformed-in-place object graph
# that now contains only Local [datetime] instances.
$fromJson
$response = Invoke-RestMethod -Method Get -Uri $url -Headers $requestHeaders
$changeddate = $response.fields.'System.ChangedDate'
$datetime = ([DateTime]$changeddate).ToLocalTime()
Related
Say I have JSON like:
{
"a" : {
"b" : 1,
"c" : 2
}
}
Now ConvertFrom-Json will happily create PSObjects out of that. If I want to access an item I could do $json.a.b and get 1 - nicely nested properties.
Now, if I have the string "a.b", the question is how to use that string to access the same item in that structure. It seems like there should be some special syntax I'm missing, like & for dynamic function calls, because otherwise you have to interpret the string yourself using Get-Member repeatedly, I expect.
No, there is no special syntax, but there is a simple workaround, using iex, the built-in alias[1] for the Invoke-Expression cmdlet:
$propertyPath = 'a.b'
# Note the ` (backtick) before $json, to prevent premature expansion.
iex "`$json.$propertyPath" # Same as: $json.a.b
# You can use the same approach for *setting* a property value:
$newValue = 'foo'
iex "`$json.$propertyPath = `$newValue" # Same as: $json.a.b = $newValue
Caveat: Do this only if you fully control or implicitly trust the value of $propertyPath.
Only in rare situation is Invoke-Expression truly needed, and it should generally be avoided, because it can be a security risk.
Note that if the target property contains an instance of a specific collection type that you want to preserve as-is (which is not common) - e.g., a strongly typed array such as [int[]], or an instance of a list type such as [System.Collections.Generic.List`1] - use the following:
# "," constructs an aux., transient array that is enumerated by
# Invoke-Expression and therefore returns the original property value as-is.
iex ", `$json.$propertyPath"
Without the , technique, Invoke-Expression enumerates the elements of a collection-valued property and you'll end up with a regular PowerShell array, which is of type [object[]] - typically, however, this distinction won't matter.
Note: If you were to send the result of the , technique directly through the pipeline, a collection-valued property value would be sent as a single object instead of getting enumerated, as usual. (By contrast, if you save the result in a variable first and then send it through the pipeline, the usual enumeration occurs). While you can force enumeration simply by enclosing the Invoke-Expression call in (...), there is no reason to use the , technique to begin with in this case, given that enumeration invariably entails loss of the information about the type of the collection whose elements are being enumerated.
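A quick illustration of the difference, using a hypothetical $obj with a strongly typed array property:
$obj = [pscustomobject] @{ nums = [int[]] (1, 2, 3) }
(iex "`$obj.nums").GetType().Name     # -> Object[] - enumeration yielded a regular PowerShell array
(iex ", `$obj.nums").GetType().Name   # -> Int32[]  - the original collection type was preserved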
Read on for packaged solutions.
Note:
The following packaged solutions originally used Invoke-Expression combined with sanitizing the specified property paths in order to prevent inadvertent/malicious injection of commands. However, the solutions now use a different approach, namely splitting the property path into individual property names and iteratively drilling down into the object, as shown in Gyula Kokas's helpful answer. This not only obviates the need for sanitizing, but turns out to be faster than use of Invoke-Expression (the latter is still worth considering for one-off use).
The no-frills, get-only, always-enumerate version of this technique would be the following function:
# Sample call: propByPath $json 'a.b'
function propByPath { param($obj, $propPath) foreach ($prop in $propPath.Split('.')) { $obj = $obj.$prop }; $obj }
What the more elaborate solutions below offer: parameter validation, the ability to also set a property value by path, and - in the case of the propByPath function - the option to prevent enumeration of property values that are collections (see next point).
The propByPath function offers a -NoEnumerate switch to optionally request preserving a property value's specific collection type.
By contrast, this feature is omitted from the .PropByPath() method, because there is no syntactically convenient way to request it (methods only support positional arguments). A possible solution is to create a second method, say .PropByPathNoEnumerate(), that applies the , technique discussed above - a sketch follows the ETS-based definition below.
Helper function propByPath:
function propByPath {
param(
[Parameter(Mandatory)] $Object,
[Parameter(Mandatory)] [string] $PropertyPath,
$Value, # optional value to SET
[switch] $NoEnumerate # only applies to GET
)
Set-StrictMode -Version 1
# Note: Iteratively drilling down into the object turns out to be *faster*
# than using Invoke-Expression; it also obviates the need to sanitize
# the property-path string.
$props = $PropertyPath.Split('.') # Split the path into an array of property names.
if ($PSBoundParameters.ContainsKey('Value')) { # SET
$parentObject = $Object
if ($props.Count -gt 1) {
foreach ($prop in $props[0..($props.Count-2)]) { $parentObject = $parentObject.$prop }
}
$parentObject.($props[-1]) = $Value
}
else { # GET
$value = $Object
foreach ($prop in $props) { $value = $value.$prop }
if ($NoEnumerate) {
, $value
} else {
$value
}
}
}
Instead of the Invoke-Expression call you would then use:
# GET
propByPath $obj $propertyPath
# GET, with preservation of the property value's specific collection type.
propByPath $obj $propertyPath -NoEnumerate
# SET
propByPath $obj $propertyPath 'new value'
You could even use PowerShell's ETS (extended type system) to attach a .PropByPath() method to all [pscustomobject] instances (PSv3+ syntax; in PSv2 you'd have to create a *.types.ps1xml file and load it with Update-TypeData -PrependPath):
'System.Management.Automation.PSCustomObject',
'Deserialized.System.Management.Automation.PSCustomObject' |
Update-TypeData -TypeName { $_ } `
-MemberType ScriptMethod -MemberName PropByPath -Value {
param(
[Parameter(Mandatory)] [string] $PropertyPath,
$Value
)
Set-StrictMode -Version 1
$props = $PropertyPath.Split('.') # Split the path into an array of property names.
if ($PSBoundParameters.ContainsKey('Value')) { # SET
$parentObject = $this
if ($props.Count -gt 1) {
foreach ($prop in $props[0..($props.Count-2)]) { $parentObject = $parentObject.$prop }
}
$parentObject.($props[-1]) = $Value
}
else { # GET
# Note: Iteratively drilling down into the object turns out to be *faster*
# than using Invoke-Expression; it also obviates the need to sanitize
# the property-path string.
$value = $this
foreach ($prop in $PropertyPath.Split('.')) { $value = $value.$prop }
$value
}
}
You could then call $obj.PropByPath('a.b') or $obj.PropByPath('a.b', 'new value')
Note: Type Deserialized.System.Management.Automation.PSCustomObject is targeted in addition to System.Management.Automation.PSCustomObject in order to also cover deserialized custom objects, which are returned in a number of scenarios, such as using Import-CliXml, receiving output from background jobs, and using remoting.
.PropByPath() will be available on any [pscustomobject] instance in the remainder of the session (even on instances created prior to the Update-TypeData call [2]); place the Update-TypeData call in your $PROFILE (profile file) to make the method available by default.
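As noted earlier, a companion .PropByPathNoEnumerate() method could apply the , technique; here is a minimal, get-only sketch (the method name is just a suggestion):
foreach ($typeName in 'System.Management.Automation.PSCustomObject',
                      'Deserialized.System.Management.Automation.PSCustomObject') {
  Update-TypeData -TypeName $typeName -MemberType ScriptMethod -MemberName PropByPathNoEnumerate -Value {
    param([Parameter(Mandatory)] [string] $PropertyPath)
    Set-StrictMode -Version 1
    $value = $this
    foreach ($prop in $PropertyPath.Split('.')) { $value = $value.$prop }
    , $value # the "," technique: wrap in a transient array so a collection's type is preserved on output
  }
}
You could then call, e.g., $obj.PropByPathNoEnumerate('a.b') to retrieve a collection-valued property with its original type intact.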
[1] Note: While it is generally advisable to limit aliases to interactive use and use full cmdlet names in scripts, use of iex to me is acceptable, because it is a built-in alias and enables a concise solution.
[2] Verify with (all on one line) $co = New-Object PSCustomObject; Update-TypeData -TypeName System.Management.Automation.PSCustomObject -MemberType ScriptMethod -MemberName GetFoo -Value { 'foo' }; $co.GetFoo(), which outputs foo even though $co was created before Update-TypeData was called.
This workaround may be useful to somebody.
The $result variable drills down one level per iteration, until it reaches the right object.
$json=(Get-Content ./json.json | ConvertFrom-Json)
$result=$json
$search="a.c"
$search.split(".")|% {$result=$result.($_) }
$result
You can have 2 variables.
$json = '{
"a" : {
"b" : 1,
"c" : 2
}
}' | convertfrom-json
$a,$b = 'a','b'
$json.$a.$b
1
I'm trying to setup an IIS application pool via PowerShell 7.1.1.
I read the configuration from a JSON file into the variable $configuration, which is handed over to Windows PowerShell because the WebAdministration module isn't natively supported in PowerShell 7.1.1.
A script block is defined in the top-level function; the configuration is injected as a PSCustomObject into the script block, which is executed in Windows PowerShell.
function Set-AxisAppPool
{
Write-Message 'Setting up a resource pool for Axis...'
$executeInWindowsPowerShellForCompatibilityReasons = {
param (
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[PSCustomObject]
$Configuration
)
Import-Module WebAdministration
Remove-WebAppPool -Name $Configuration.AppPool.Name -Confirm:$false -ErrorAction:SilentlyContinue
New-WebAppPool -Name $Configuration.AppPool.Name -Force | Write-Verbose
$params = @{
Path = "IIS:\AppPools\$($Configuration.AppPool.Name)"
Name = 'processModel'
Value = @{
userName = $Configuration.AxisUser.Name
password = $Configuration.AxisUser.Password
identitytype = 'SpecificUser'
}
}
Set-ItemProperty @params
}
powershell -NoLogo -NoProfile $executeInWindowsPowerShellForCompatibilityReasons -Args $configuration # This is line 546
}
When the configuration JSON file exceeds a certain nesting level, PowerShell can't pass this deserialized JSON (the PSCustomObject) through to Windows PowerShell.
Program 'powershell.exe' failed to run: The Process object must have the UseShellExecute property set to false in order to use environment
| variables.At C:\Users\JohnDoe\Desktop\Localhost automatization\Set-AxisEnvironment.ps1:546 char:5 + powershell -NoLogo -NoProfile
| $executeInWindowsPowerShellForCompa … +
It literally works with n levels of objects in the JSON and it doesn't with n+1 levels of objects in the configuration JSON. The JSON schema is validated; deserialization works as expected.
When I use Start-Process to invoke Windows PowerShell, I run into a different problem. Does anybody have a hint on this one?
Update
This seems to be a bug in PowerShell.
I suspect it is the size of the argument list overflowing into other fields, thus giving you weird error messages. From the Start-Process documentation:
The length of the string assigned to the Arguments property must
be less than 32,699.
If you are passing a configuration that is larger than 32,699 characters (including spaces), then that may well be your problem. The first 32,699 characters would be consumed, and the remainder would likely spill over into the next field, -UseShellExecute, which would receive a value that is neither zero nor false, and would therefore be treated as true. That would trip the wrong, misleading error message.
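If you want to sanity-check this, you could measure the size of the serialized configuration before handing it over (a rough sketch; -Depth 10 is just an example, and 32,699 is the documented limit quoted above):
# Roughly how many characters would the configuration occupy on the command line?
($configuration | ConvertTo-Json -Depth 10 -Compress).Length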
Prolog
It turns out that in my case it is important to understand the source of the objects - it is a JSON payload from a REST API response. Unfortunately, JSON -> Object conversion produces different results on PS desktop vs PS core. On desktop the numbers are deserialized into Int32 types, but on core - to Int64 types. From that it follows that I cannot use Export-CliXml, because the binary layout of the objects is different.
Main question
I have a unit test that needs to compare the actual result with an expected. The expected result is saved in a json file, so the procedure is:
Convert the actual result to json string
Read the expected result from disk to string
Compare the actual and the expected as strings
Unfortunately, this scheme does not work because PS desktop ConvertTo-Json and PS core ConvertTo-Json do not produce identical results. So, if the expected result was saved on desktop and the test runs on core - boom, failure. And vice versa.
One way is to keep two versions of jsons. Another way is to use a library to create the json.
First I tried the Newtonsoft-Json powershell module, but it just does not work. I think the problem is that whatever C# library we use, it must be aware of PSCustomObject and alike and treat them specially. So, we cannot just take any C# JSON library.
At this point I am left with having two jsons - one per PS edition, which is kind of sad.
Are there better options?
EDIT 1
I guess I can always read the json, convert to object and then back to json again. That sucks.
EDIT 2
I tried to use ConvertTo-Json -Compress. This eliminates the difference in spacing, but the problem is that for some reason the desktop version translates certain characters (the ' character, for example) to their \u00... escaped representation. The core version does not do it.
Please, observe:
Desktop
C:\> @{ x = "'a'" } |ConvertTo-Json -Compress
{"x":"\u0027a\u0027"}
C:\>
Core
C:\> @{ x = "'a'" } |ConvertTo-Json -Compress
{"x":"'a'"}
C:\>
Now the core version has the flag -EscapeHandling, so:
C:\> @{ x = "'a'" } |ConvertTo-Json -Compress -EscapeHandling EscapeHtml
{"x":"\u0027a\u0027"}
C:\>
Bingo! Same result. But now this code does not run on the desktop version, which does not have this flag. More massaging is needed. I will check if that is the only problem.
EDIT 3
It is impossible to reconcile the differences between the core and the desktop versions without expensive post processing. Please, observe:
Desktop
C:\> @{ x = '"a"';y = "'b'" } |ConvertTo-Json -Compress
{"y":"\u0027b\u0027","x":"\"a\""}
C:\>
Core
C:\> @{ x = '"a"';y = "'b'" } |ConvertTo-Json -Compress -EscapeHandling EscapeHtml
{"y":"\u0027b\u0027","x":"\u0022a\u0022"}
C:\> @{ x = '"a"';y = "'b'" } |ConvertTo-Json -Compress
{"y":"'b'","x":"\"a\""}
C:\>
Any suggestions on how to salvage the json approach?
EDIT 4
The Export-CliXml approach does not work either, because of the differences between the PS versions.
Desktop
C:\> ('{a:1}' | ConvertFrom-Json).a.gettype()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Int32 System.ValueType
C:\>
Core
C:\> ('{a:1}' | ConvertFrom-Json).a.gettype()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Int64 System.ValueType
C:\>
So the same JSON is represented using different numeric types - Int32 in desktop and Int64 in core. That puts to bed the option of using Export-CliXml.
Unless I am missing something.
I believe there is no other choice but to do the double conversion - json -> object -> json - and then I will have two JSONs created on the same PS edition. That sucks big time.
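A sketch of that double conversion, normalizing both sides through the currently running edition's parser and serializer before comparing (expected.json and $actualResult are placeholders; -Depth 10 is just an example):
$expectedNormalized = ConvertTo-Json -Compress -Depth 10 -InputObject (
    Get-Content -Raw expected.json | ConvertFrom-Json
)
$actualNormalized = ConvertTo-Json -Compress -Depth 10 -InputObject $actualResult
$actualNormalized -eq $expectedNormalized   # $true if the normalized representations match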
On converting from the original JSON, use the third-party Newtonsoft.Json PowerShell wrapper's ConvertFrom-JsonNewtonsoft cmdlet - this should ensure cross-edition compatibility (which the built-in ConvertFrom-Json does not guarantee across PowerShell editions, because Windows PowerShell uses a custom parser, whereas PowerShell [Core] v6+ uses Newtonsoft.json up to at least v7.1, though a move to the new(ish) .NET Core System.Text.Json API is coming).
Important: ConvertFrom-JsonNewtonsoft returns (arrays of) nested ordered hashtables ([ordered] @{ ... }, System.Collections.Specialized.OrderedDictionary), unlike the nested [pscustomobject] graphs that the built-in ConvertFrom-Json outputs. Similarly, ConvertTo-JsonNewtonsoft expects only (arrays of) hashtables (dictionaries) as input, and notably does not support [pscustomobject] instances, as you've learned yourself.
Caveat: As of this writing, the wrapper module was last updated in May 2019, and the version of the underlying bundled Newtonsoft.Json.dll assembly is quite old (8.0, whereas 12.0 is current as of this writing). See the module's source code.
Note that in order to parse JSON obtained from a RESTful web service manually, you mustn't use Invoke-RestMethod, as it implicitly parses and returns [pscustomobject] object graphs. Instead, use Invoke-WebRequest and access the returned response's .Content property.
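For example (a sketch; $url stands in for your endpoint):
# Get the raw JSON text (no automatic parsing into [pscustomobject]s)...
$rawJson = (Invoke-WebRequest -Uri $url).Content
# ...and parse it into (ordered) hashtables with the Newtonsoft wrapper:
$parsed = $rawJson | ConvertFrom-JsonNewtonsoft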
On converting to a format suitable for storing on disk, you have two options:
(A) If you do need the serialized format to be JSON also, you must convert all [pscustomobject] graphs to (ordered) hashtables before passing them to ConvertTo-JsonNewtonsoft.
See below for function ConvertTo-OrderedHashTable, which does just that.
(B) If the specific serialization format isn't important, i.e. if all that matters is that the formats are identical across PowerShell editions for the purpose of comparison, no extra work is needed: use the built-in Export-Clixml cmdlet, which can handle any type and produces PowerShell's native, XML-based serialization format called CLIXML (as notably used in PowerShell remoting), which should be cross-edition-compatible (at least with v5.1 on the Windows PowerShell side and as of PowerShell [Core] v7.1, both of which use the same version of the serialization protocol, 1.1.0.1, as reported by $PSVersionTable.SerializationVersion).
While you could re-convert such a persisted file to objects with Import-Clixml, the potential loss of type fidelity on deserialization makes comparing the serialized (CLIXML) representations advisable.
Also note that as of PowerShell v7.1 there is no cmdlet-based way to create an in-memory CLIXML representation, so you'll have to use the PowerShell API directly for now: System.Management.Automation.PSSerializer.Serialize. However, providing in-memory counterparts to Import-CliXml / Export-CliXml in the form of ConvertFrom-CliXml / ConvertTo-CliXml cmdlets has been green-lighted as a future enhancement.
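For example, an in-memory CLIXML string can be obtained via that API as follows ($someObject is a placeholder; 10 is an example serialization depth):
$clixmlString = [System.Management.Automation.PSSerializer]::Serialize($someObject, 10)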
Re (A): Here's function ConvertTo-OrderedHashtable, which converts (potentially nested) [pscustomobject] objects to ordered hashtables while passing other types through, so you should be able to simply insert it into a pipeline as follows:
# CAVEAT: ConvertTo-JsonNewtonSoft only accepts a *single* input object.
[pscustomobject] @{ foo = 1 }, [pscustomobject] @{ foo = 2 } |
ConvertTo-OrderedHashtable |
ForEach-Object { ConvertTo-JsonNewtonSoft $_ }
function ConvertTo-OrderedHashtable {
<#
.SYNOPSIS
Converts custom objects to ordered hashtables.
.DESCRIPTION
Converts PowerShell custom objects (instances of [pscustomobject]) to
ordered hashtables (instances of [System.Collections.Specialized.OrderedDictionary]),
which is useful for to-JSON serialization via the Newtonsoft.JSON library.
Note:
* Custom objects are processed recursively.
* Any scalar non-custom objects are passed through as-is.
* Any (non-dictionary) collections in property values are converted to
[object[]] arrays.
.EXAMPLE
1, [pscustomobject] @{ foo = [pscustomobject] @{ bar = 'none' }; other = 2 } | ConvertTo-OrderedHashtable
Passes integer 1 through, and converts the custom object to a nested ordered
hashtable.
#>
[CmdletBinding()]
param(
[Parameter(ValueFromPipeline)] $InputObject
)
begin {
# Recursive helper function
function convert($obj) {
if ($obj -is [System.Management.Automation.PSCustomObject]) {
# a custom object: recurse on its properties
$oht = [ordered] @{ }
foreach ($prop in $obj.psobject.Properties) {
$oht.Add($prop.Name, (convert $prop.Value))
}
return $oht
}
elseif ($obj -isnot [string] -and $obj -is [System.Collections.IEnumerable] -and $obj -isnot [System.Collections.IDictionary]) {
# A collection of sorts (other than a string or dictionary (hash table)), recurse on its elements.
return @(foreach ($el in $obj) { convert $el })
}
else {
# a non-custom object, including .NET primitives and strings: use as-is.
return $obj
}
}
}
process {
convert $InputObject
}
}
Re (B): A demonstration of the Export-CliXml approach (you can run this code from either PS edition):
$sb = {
Install-Module -Scope CurrentUser Newtonsoft.json
if (-not $IsCoreClr) {
# Workaround for PS Core's $env:PSModulePath overriding WinPS'
Import-Module $HOME\Documents\WindowsPowerShell\Modules\newtonsoft.json
}
@'
{
"results": {
"users": [
{
"userId": 1,
"emailAddress": "jane.doe@example.com",
"date": "2020-10-05T08:08:43.743741-04:00",
"attributes": {
"height": 165,
"weight": 60
}
},
{
"userId": 2,
"emailAddress": "john.doe@example.com",
"date": "2020-10-06T08:08:43.743741-04:00",
"attributes": {
"height": 180,
"weight": 72
}
}
]
}
}
'@ | ConvertFrom-JsonNewtonsoft | Export-CliXml "temp-$($PSVersionTable.PSEdition).xml"
}
# Execute the script block in both editions
Write-Verbose -vb 'Running in Windows PowerShell...'
powershell -noprofile $sb
Write-Verbose -vb 'Running in PowerShell Core...'
pwsh -noprofile $sb
# Compare the resulting CLIXML files.
Write-Verbose -vb "Comparing the resulting files: This should produce NO output,`n indicating that the files have identical content."
Compare-Object (Get-Content 'temp-Core.xml') (Get-Content 'temp-Desktop.xml')
Write-Verbose -vb 'Cleaning up...'
Remove-Item 'temp-Core.xml', 'temp-Desktop.xml'
You should see the following verbose output:
VERBOSE: Running in Windows PowerShell...
VERBOSE: Running in PowerShell Core...
VERBOSE: Comparing the resulting files: This should produce NO output,
indicating that the files have identical content.
VERBOSE: Cleaning up...
I need to HTTP POST a JSON body to an ASP.NET Core Web API endpoint (controller) using a PowerShell script.
$CurrentWindowsIdentity = New-Object Security.Principal.WindowsPrincipal([Security.Principal.WindowsIdentity]::GetCurrent())
$CurrentPrincipalName = $CurrentWindowsIdentity.Identity.Name
# Build JSON payload
$JsonString = @"
{
"CurrentPrincipalName":"$CurrentPrincipalName"
}
"@
$response = Invoke-RestMethod -Uri "https://webapiendpoint.tld/api/somecontroller" -Method Post -Body $JsonString -ContentType "application/json"
Since the value of the variable $CurrentPrincipalName can be domain\username, the JSON becomes invalid because of the backslash, which is not properly escaped.
Error in the log of the web api:
JSON input formatter threw an exception: 'C' is an invalid escapable character within a JSON string. The string should be correctly escaped. Path: $.CurrentPrincipalName | LineNumber: 15 | BytePositionInLine: 36.
System.Text.Json.JsonException: 'C' is an invalid escapable character within a JSON string. The string should be correctly escaped. Path: $.CurrentPrincipalName
How can I make sure that when creating a JSON string and adding variables - whose values of course cannot be controlled - the JSON string gets properly escaped?
I also tried ConvertTo-Json like:
$JsonConverted = $JsonString | ConvertTo-Json
and then HttpPost that object, but that was even worse:
JSON input formatter threw an exception: The JSON value could not be converted to solutionname.model. Path: $ | LineNumber: 0 | BytePositionInLine: 758.
The robust way to create JSON text is to construct your data as a hash table (@{ ... }) or custom object ([pscustomobject] @{ ... }) first and pipe to ConvertTo-Json:
$JsonString = @{
CurrentPrincipalName = $CurrentPrincipalName
} | ConvertTo-Json
That way, PowerShell performs any necessary escaping of values for you, notably including doubling the literal \ character in your $CurrentPrincipalName value to ensure that it is treated as a literal.
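For example (assuming $CurrentPrincipalName contains DOMAIN\username):
@{ CurrentPrincipalName = 'DOMAIN\username' } | ConvertTo-Json
# -> { "CurrentPrincipalName": "DOMAIN\\username" } - the backslash has been escaped for you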
Note:
Depending on how deeply nested the hashtable is, you may have to add a -Depth argument to the ConvertTo-Json call to prevent data from getting truncated - see this post for more information (a brief example follows this list).
If you have multiple properties and want to preserve their definition order in the JSON representation, use an ordered hash table ([ordered] @{ ... }) or a custom object.
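For instance, with a deeply nested object you might use something like this ($nestedData is a placeholder; the depth value is just an example):
$JsonString = $nestedData | ConvertTo-Json -Depth 10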
Following on from a question yesterday (JSON and references to other JSON objects).
Is it possible to merge JSON objects at runtime in a similar fashion?
In my test.json I wish to insert the $Defaults.wimos object into WIMS.wimos at runtime similar to what I did for the $Paths.drive value in $Defaults.
{
Paths: {drive: "W:"},
Defaults: {wimos: {dstdrive: "$($Paths.drive)"}
},
WIMS: {
winos: "$($Defaults.wimos)",
wimre: {dstdrive: "$($Paths.drive)"}
}
}
In the following code I cannot work out the syntax to have the object replaced at runtime.
$JSONConfig="test.json"
$rawJSON = (Get-Content $JSONConfig -Raw)
$pathJSON = $rawJSON | ConvertFrom-Json
#
# Load Paths from JSON into $Paths
#
$Paths=$pathJSON.Paths
#
# Merge JSON objects to have the defaults replaced
#
$DefaultsJSON=($ExecutionContext.InvokeCommand.ExpandString($rawJSON)) | ConvertFrom-Json
#
# Load Defaults into $Defaults
#
$Defaults=$DefaultsJSON.Defaults
#
# Merge JSON objects to have the System replaced
#
$JSON = ($ExecutionContext.InvokeCommand.ExpandString($rawJSON)) | ConvertFrom-Json
$JSON.WIMS
write-host ("JSON.WIMS.wimre.dstdrive =" + $JSON.WIMS.wimre.dstdrive)
write-host ("JSON.WIMS.wimos.dstdrive =" + $JSON.WIMS.wimos.dstdrive) #This fails to access and print "W:"
I would like to be able to replace $($Defaults.wimos) and be able to query the dstdrive member.
Is this possible to achieve this in powershell? Any suggestions on how?
Thanks
Stuart
There are a couple of issues in your code. First, the value that would be inserted into your template would be interpolated as a string, not as a JSON object.
To fix this, you can use ConvertTo-Json within the template. I'll show you how in a minute.
If you're going to be recreating and using different "templates" within the same file (like Paths, Defaults, etc.), I would create a helper function that re-expands the template against your current session variables, so you avoid repeating code.
Create the following function:
function Get-ParsedJsonTemplate(){
return ($ExecutionContext.InvokeCommand.ExpandString((Get-Content $JSONConfig -Raw))) | ConvertFrom-Json
}
Change your template JSON to the following (this shows that you can use PowerShell code within your template file):
{
Paths: {drive: "W:"},
Defaults: {wimos: {dstdrive: "$($Paths.drive)"}},
WIMS: {
winos: $(
if($Defaults -and $Defaults.wimos){
$Defaults.wimos|ConvertTo-Json -Compress
} else {
@{"dstdrive"=""}|ConvertTo-Json -compress
}
),
wimre: {dstdrive: "$($Paths.drive)"}
}
}
You also had a typo in either your template or your printout (I see that you used wimos in your "printout" in PowerShell and winos in your template). I modified the code and the following should work for you (given the above test.json):
$JSONConfig="c:\tmp\test.json"
function Get-ParsedJsonTemplate(){
return ($ExecutionContext.InvokeCommand.ExpandString((Get-Content $JSONConfig -Raw))) | ConvertFrom-Json
}
$pathJSON = Get-ParsedJsonTemplate;
$Paths=$pathJSON.Paths
$DefaultsJSON=Get-ParsedJsonTemplate;
$Defaults=$DefaultsJSON.Defaults
$JSON = Get-ParsedJsonTemplate;
$JSON.WIMS
write-host ("JSON.WIMS.wimre.dstdrive =" + $JSON.WIMS.wimre.dstdrive)
write-host ("JSON.WIMS.wimos.dstdrive =" + $JSON.WIMS.winos.dstdrive)
The reason your script was failing is that $Defaults.wimos was not the string { dstdrive: "W:" }; it was instead a PSCustomObject, which would be interpolated to the value @{dstdrive=W:}. To obtain the correct behavior, you need to convert the $Defaults.wimos value back to a JSON string.
$Defaults = $DefaultsJSON.Defaults
$Defaults.wimos = $Defaults.wimos | ConvertTo-Json -Compress