Create JSON in Powershell with Empty array/list [duplicate]

Why do I get unexpected ConvertTo-Json results, why do I get values like System.Collections.Hashtable and/or why does a round-trip ($Json | ConvertFrom-Json | ConvertTo-Json) fail?
Meta issue
Stack Overflow has a good mechanism to prevent duplicate questions, but as far as I can see there is no mechanism to prevent questions that have a duplicate cause. Take this question as an example: almost every week a new question comes in with the same cause, yet it is often difficult to mark it as a duplicate because the question itself is just slightly different.
Nevertheless, I wouldn't be surprised if this question/answer itself ends up as a duplicate (or off-topic), but unfortunately Stack Overflow offers no way to write an article that stops other programmers from continuing to write questions caused by this “known” pitfall.
Duplicates
A few examples of similar questions with the same common cause:
PowerShell ConvertTo-Json does not convert Array as expected
Powershell ConvertTo-json with embedded hashtable
powershell “ConvertTo-Json” has messed json format output
Nested arrays and ConvertTo-Json
Powershell ConvertTo-JSON missing nested level
How to save a JSON object to a file using Powershell?
Cannot convert PSCustomObjects within array back to JSON correctly
ConvertTo-Json flattens arrays over 3 levels deep
Add an array of objects to a PSObject at once
Why does ConvertTo-Json drop values
How to round-trip this JSON to PSObject and back in Powershell
…
Different
So, where does this “self-answered” question differ from the above duplicates?
It names the common cause in the title, and with that it might better prevent repeated questions due to the same cause.

Answer
ConvertTo-Json has a -Depth parameter:
Specifies how many levels of contained objects are included in the
JSON representation.
The default value is 2.
Example
To do a full round-trip with a JSON file you need to increase the -Depth for the ConvertTo-Json cmdlet:
$Json | ConvertFrom-Json | ConvertTo-Json -Depth 9
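For instance, here is a minimal sketch (with made-up key names) of how the default depth truncates a nested hashtable, and how raising -Depth restores the full tree:
$Object = @{ L1 = @{ L2 = @{ L3 = @{ Value = 1 } } } }
$Object | ConvertTo-Json          # the innermost hashtable degrades to "System.Collections.Hashtable"
$Object | ConvertTo-Json -Depth 9 # serializes all four levels faithfully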
TL;DR
Probably because ConvertTo-Json terminates branches that are deeper than the default -Depth (2) with a (.NET) full type name, programmers assume a bug or a cmdlet limitation and do not read the help or the about_ topics.
Personally, I think a string with a simple ellipsis (three dots: …) at the end of the cut-off branch would have a clearer meaning (see also GitHub issue #8381).
Why?
This issue often ends up in another discussion as well: Why is the depth limited at all?
Some objects have circular references, meaning that a child object can refer to its parent (or one of its grandparents), causing an infinite loop if it were serialized to JSON.
Take for example the following hash table with a parent property that refers to the object itself:
$Test = @{Guid = New-Guid}
$Test.Parent = $Test
If you execute $Test | ConvertTo-Json, it will conveniently stop at a depth level of 2 by default:
{
    "Guid": "a274d017-5188-4d91-b960-023c06159dcc",
    "Parent": {
        "Guid": "a274d017-5188-4d91-b960-023c06159dcc",
        "Parent": {
            "Guid": "a274d017-5188-4d91-b960-023c06159dcc",
            "Parent": "System.Collections.Hashtable"
        }
    }
}
This is why it is not a good idea to automatically set -Depth to a large value.

Update: PowerShell 7.1 introduced a warning when truncation occurs. While that is better than the previous quiet truncation, the solution suggested below seems much preferable to me.
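For example (a sketch; the exact warning text may differ between versions), serializing a tree deeper than -Depth in PowerShell 7.1+ looks like this:
@{ a = @{ b = @{ c = @{ d = 1 } } } } | ConvertTo-Json
# WARNING: Resulting JSON is truncated as serialization has exceeded the set depth of 2.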
Your helpful question and answer clearly illustrate how much of a pain point the current default ConvertTo-Json behavior is.
As for the justification of the behavior:
While -Depth can be useful to intentionally truncate an input object tree whose full depth you don't need, -Depth defaulting to 2 and quietly truncating the output amounts to quiet de facto failure of the serialization from the unsuspecting user's perspective, a failure that may not be discovered until later.
The seemingly arbitrary and quiet truncation is surprising to most users, and having to account for it in every ConvertTo-Json call is an unnecessary burden.
I've created GitHub issue #8393 containing a proposal to change the current behavior, specifically as follows:
Ignore -Depth for [pscustomobject] object graphs (a hierarchy of what are conceptually DTOs (data-transfer objects, "property bags"), such as returned from ConvertFrom-Json), specifically.
By contrast, it does make sense to have an automatic depth limit for arbitrary .NET types, as they can be object graphs of excessive depth and may even contain circular references; e.g., Get-ChildItem | ConvertTo-Json can quickly get out of hand, even with -Depth values as low as 4. That said, it is generally ill-advised to use arbitrary .NET types with JSON serialization: JSON is not designed to be a general-purpose serialization format for a given platform's types; instead, it is focused on DTOs, comprising properties only, with a limited set of data types.
Note that nested collections, including hashtables, are not themselves subject to the depth limit; only their (scalar) elements are.
This distinction between DTOs and other types is, in fact, employed by PowerShell itself behind the scenes, namely in the context of serialization for remoting and background jobs.
Use of -Depth is then only needed to intentionally truncate the input object tree at the specified depth (or, mostly hypothetically, in order to serialize to a deeper level than the internal maximum-depth limit, 100).
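To illustrate the round-trip pain point under the current behavior (a minimal sketch with a made-up JSON string):
$Json = '{ "a": { "b": { "c": { "d": 1 } } } }'
($Json | ConvertFrom-Json) | ConvertTo-Json          # truncated: "c" degrades to something like "@{d=1}"
($Json | ConvertFrom-Json) | ConvertTo-Json -Depth 9 # full-fidelity round-trip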

Related

Powershell custom object to Json [duplicate]


Adding array to a JSON object in PowerShell [duplicate]


Unable to create proper json object using powershell [duplicate]


How to properly render nested hashtables in Powershell [duplicate]


Unexpected ConvertTo-Json results? Answer: it has a default -Depth of 2
