How to stop PowerShell 7 ConvertFrom-Json reformatting timestamps?

Get-Content 'file.json' | ConvertFrom-Json
This produces different results in PowerShell 5 and PowerShell 7.
v5 gives me the actual timestamp values from the JSON, e.g. 2018-01-26T17:48:51.220Z
v7 gives me reprocessed timestamp values, e.g. 26/01/2018 17:48:51
How can I get v7 to behave like v5? I need the original values from the JSON.

The behavior of ConvertFrom-Json changed in PowerShell [Core] v6+: string values formatted with the o (round-trip) standard date/time format are now converted to [datetime] instances rather than being left as strings - this is a convenient way to round-trip timestamps via (v6+) ConvertTo-Json, without having to do explicit to/from string conversions.
If you need the old behavior back, convert the resulting [datetime] instances back to strings explicitly, using .ToString('o').
Here's a simple example:
# v6+
PS> ('{ "timestamp": "2018-01-26T17:48:51.220Z" }' |
ConvertFrom-Json).timestamp.ToString('o')
2018-01-26T17:48:51.2200000Z
There is some flexibility around variations in the input format: the fractional seconds are optional and, if present, the number of decimal places is allowed to vary.
By contrast, the o format always uses 7 decimal places, which differs from your input.
You're free to apply custom formatting based on a fixed number of decimal places, but note that you won't be able to tell how many decimal places were actually used in the input.
E.g., to get 3 decimal places:
[datetime]::UtcNow.ToString("yyyy-MM-dd'T'HH':'mm':'ss'.'fffK")
If you want to prevent the to-[datetime] conversion at the source, you'll have to use a lower-level approach - ConvertFrom-Json doesn't offer a solution.
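One possible lower-level sketch (an assumption on my part, not something ConvertFrom-Json offers): current PowerShell 7.x versions ship the Json.NET (Newtonsoft.Json) assembly that backs ConvertFrom-Json, so you may be able to call that library directly with DateParseHandling set to None, which leaves date-like strings untouched. Note that the result is a Json.NET JObject rather than a [pscustomobject]:
# Hedged sketch: call Newtonsoft.Json directly so date-like strings are not converted.
# Assumes a PowerShell 7.x version where ConvertFrom-Json is backed by Newtonsoft.Json.
$null = '{}' | ConvertFrom-Json    # ensure the Newtonsoft.Json assembly is loaded
$settings = [Newtonsoft.Json.JsonSerializerSettings]::new()
$settings.DateParseHandling = [Newtonsoft.Json.DateParseHandling]::None
$parsed = [Newtonsoft.Json.JsonConvert]::DeserializeObject((Get-Content -Raw 'file.json'), $settings)
$parsed['timestamp'].ToString()    # original string, e.g. 2018-01-26T17:48:51.220Z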

Related

ERROR: malformed array literal in PostgreSQL

I want to filter on an integer array in PostgreSQL, but when I execute the query below it gives me a malformed array literal error.
select * from querytesting where 1111111111 = any((jsondoc->>'PhoneNumber')::integer[]);
Image for reference:
https://i.stack.imgur.com/Py3Z2.png
any(x) wants a PostgreSQL array as x. (jsondoc->>'PhoneNumber'), however, is giving you a text representation of a JSON array. A PostgreSQL array would look like this as text:
'{1,2,3}'
but the JSON version you get from ->> would look like:
'[1,2,3]'
You can't mix the two types of array.
You could use a JSON operator instead:
jsondoc->'PhoneNumber' @> 1111111111::text::jsonb
Using -> instead of ->> gives you a JSON array rather than text. Then you can check whether the number you're looking for is in that array with the containment operator @>. The double cast (::text::jsonb) is needed to convert the PostgreSQL number to a JSON number for the @> operator.
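Putting that together as a full query (a sketch only, using the table and column names from the question and assuming jsondoc is of type jsonb; cast it with ::jsonb first if the column is plain json):
-- Rows whose PhoneNumber JSON array contains the number 1111111111
SELECT *
FROM   querytesting
WHERE  jsondoc->'PhoneNumber' @> 1111111111::text::jsonb;

-- to_jsonb() builds the same jsonb number and may read more clearly:
-- SELECT * FROM querytesting WHERE jsondoc->'PhoneNumber' @> to_jsonb(1111111111);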
As an aside, storing phone numbers as numbers might not be the best idea. You don't do arithmetic on phone numbers so they're not really numbers at all, they're really strings that contain digit characters. Normalizing the phone number format to international standards and then treating them as strings will probably serve you better in the long term.

PowerShell Core deserializes numbers in JSON as Int64 vs Windows PowerShell, which does it as Int32

Please observe:
Windows PowerShell
C:\> ("1" | ConvertFrom-Json).gettype().name
Int32
C:\>
PowerShell Core
C:\> ("1" | ConvertFrom-Json).gettype().name
Int64
C:\>
This is not benign. Consider a map whose keys are integers:
$x = @{
    123 = 1
}
The key 123 is an Int32, not an Int64. So if 123 comes from parsed JSON, it is of a different type in each shell. Now:
C:\> $x[[Int32]123]
1
C:\> $x[[Int64]123]
C:\>
And this is true in both shells. This change in behavior wreaks havoc in our automation scripts that manipulate things via REST APIs.
Can this behavior of PowerShell Core be turned off?
The two PowerShell editions use different implementations, causing the divergent behavior you observed:
Windows PowerShell uses a custom implementation, whereas PowerShell [Core] v6+, as of v7.1, uses the Json.NET library behind the scenes; see this answer for that library's rationale for deserializing to System.Int64 ([long]) by default.
As of PowerShell 7.1, there is a planned move to the native .NET JSON functionality (available in .NET Core 3+ via the System.Text.Json namespace), which may restore deserializing to System.Int32 ([int]) by default, given that breaking changes are inevitable anyway:
See the discussion in GitHub issue #14264 (created by iRon based on this question) and GitHub PR #11198, which is preparing the move to System.Text.Json.
A related problem is that numbers too large to fit into a System.Int64 ([long]) are also deserialized differently (see GitHub issue #9207):
Windows PowerShell: first chooses System.Decimal ([decimal]) and for even larger numbers System.Double ([double]).
PowerShell [Core] as of v7.1: always chooses System.Numerics.BigInteger ([bigint]).
Also, there are differences with respect to the number formats supported (a quick demo follows below this list):
Windows PowerShell does not recognize hexadecimal numbers (e.g., 0x10), in accordance with the JSON spec[1], whereas PowerShell [Core] as of v7.1 does, as an extension to the spec; both editions support scientific notation (e.g., 1e2 for 100), which the spec does allow, and parse it as [double].
Windows PowerShell, as another extension to the spec, does support +-prefixed numbers (e.g., +10), whereas PowerShell [Core] as of v7.1 does not.
(Also, both editions support single-quoted strings as an extension.)
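A quick demo of the differences listed above, with the per-edition behavior as described in this answer (a hedged sketch, not re-verified here):
# Numbers too large for [long]:
('99999999999999999999' | ConvertFrom-Json).GetType().Name  # Decimal in Windows PowerShell, BigInteger in PowerShell 7.x

# Hexadecimal numbers: PowerShell [Core] extension only
'0x10' | ConvertFrom-Json        # 16 in PowerShell 7.x; error in Windows PowerShell

# Scientific notation: both editions, parsed as [double]
('1e2' | ConvertFrom-Json).GetType().Name   # Double

# +-prefixed numbers: Windows PowerShell extension only
'+10' | ConvertFrom-Json         # 10 in Windows PowerShell; error in PowerShell 7.x

# Single-quoted strings: extension supported by both editions
"'hi'" | ConvertFrom-Json        # hi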
Workaround:
Generally, note that the problem may often not surface, given PowerShell's ability to mix different numeric types and widen types on demand.
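For instance (a minimal illustration, not from the original answer), comparison and arithmetic reconcile the two types transparently:
[int] 123 -eq [long] 123            # True - the comparison widens the [int] operand
(1 + [long] 1).GetType().Name       # Int64 - arithmetic widens to the larger type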
However, as the question shows, when numbers are used as the keys of a hashtable (dictionary), a type-exact value must be passed for lookup in order to locate entries.
Therefore, the simplest workaround is to cast the hashtable key to [int], which allows later lookups with just, say, [123] (or even .123):
# Works in both Windows PowerShell and PowerShell [Core]
# Without the [int] cast, the lookup would fail in PowerShell [Core] as of v7.1
PS> $key = '123' | ConvertFrom-Json; @{ [int] $key = 'bingo' }[123]
bingo
Another option is to use a [pscustomobject] rather than a hashtable, in which case the numeric keys implicitly become property names, which are always strings.
# Note the use of .123, i.e. property access.
PS> $key = '123' | ConvertFrom-Json; ([pscustomobject] @{ [int] $key = 'bingo' }).123
bingo
This would work even with a numeric variable:
$lookup = 123; ([pscustomobject] @{ [int] $key = 'bingo' }).$lookup
Caveat: When a key is implicitly stringified to become a property name, it is invariably the decimal representation that is used; e.g., [pscustomobject] @{ [int] 0x10 = 'bingo' } results in an object whose property name is '16'.[2]
However, note that hashtables / dictionaries are more lightweight than [pscustomobject]s.
[1] However, the JSON5 format, which is meant to improve on JSON, does support hex. numbers, along with other notable improvements, such as support for comments, extraneous trailing commas, and single-quoted strings.
[2] Also, with [double] values the conversion is culture-sensitive, so that 1.2 can in some cultures result in '1.2' (as of v7.1, which is unexpected - see GitHub issue #14278); also, large [double]s can end up in scientific notation, so that 1000000000000000.1 results in '1E+15'. That said, using [double]s as dictionary keys is generally ill-advised, given the accuracy limits of its from-decimal-number conversion.

ConvertFrom-Json converting lowercase e's into capital case (sometimes)

I'm processing JSON files in PowerShell, and it seems that ConvertFrom-Json changes case on its inputs only on some (rare) occasions.
For example, when I do:
$JsonStringSrc = '{"x":2.2737367544323206e-13,"y":1759,"z":33000,"width":664}'
$JsonStringTarget = $JsonStringSrc | ConvertFrom-Json | ConvertTo-Json -Depth 100 -Compress
$JsonStringTarget
It returns:
{"x":2.2737367544323206E-13,"y":1759,"z":33000,"width":664}
Lower case e became an uppercase E, messing up my hashes when validating proper i/o during processing.
Is this expected behavior (perhaps a regional setting)? Is there a setting for ConvertFrom-Json to leave my inputs alone for the output?
The problem lies in the way PowerShell's JSON library outputs CLR floating-point numbers. By converting from JSON you turn the JSON string into a CLR/PowerShell object, with associated types for numbers, strings, and so on. Converting back to JSON serializes that object to JSON again, but uses the default .NET formatter configuration to do so; there is no metadata from the original JSON document to aid the conversion. Rounding errors, truncation, and a different ordering of elements may occur here, too.
The JSON spec for canonical form (the form you want to use when hashing) is as follows:
MUST represent all non-integer numbers in exponential notation
including a nonzero single-digit significant integer part, and
including a nonempty significant fractional part, and
including no trailing zeroes in the significant fractional part (other than as part of a “.0” required to satisfy the preceding point), and
including a capital “E”, and
including no plus sign in the exponent, and
including no insignificant leading zeroes in the exponent
Source: https://gibson042.github.io/canonicaljson-spec/
The JSON spec itself, though, supports both options (e and E):
exponent
""
'E' sign digits
'e' sign digits
Source: https://www.crockford.com/mckeeman.html
You may be able to convert the object to JSON using the Newtonsoft.Json classes directly, passing in a custom converter:
https://stackoverflow.com/a/28743082/736079
A better solution would probably be to use a specialized formatter component that directly manipulates the existing JSON document without converting it to CLR objects first.
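If all you need is a stable re-serialized string to hash, one crude workaround (my own sketch, not from the answer, and not a real JSON-aware transformation) is to lowercase the exponent marker in the ConvertTo-Json output with a regex; note that it could in principle also touch string values that happen to match the pattern:
# Lowercase the exponent marker (E -> e) in numeric literals of the serialized JSON
$JsonStringSrc = '{"x":2.2737367544323206e-13,"y":1759,"z":33000,"width":664}'
$roundTripped = $JsonStringSrc | ConvertFrom-Json | ConvertTo-Json -Depth 100 -Compress
$JsonStringTarget = $roundTripped -replace '(\d)E([+-]?\d+)', '${1}e${2}'
$JsonStringTarget   # {"x":2.2737367544323206e-13,"y":1759,"z":33000,"width":664}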

Value that is printed by "jq ." is different from value that is present in JSON file [duplicate]

Why this ("Filter" in jqplay.org):
{"key":633447818234478180}
returns this ("Result" in jqplay.org):
{"key": 633447818234478200}
Original JSON doesn't matter.
Why is it changing 180 into 200? How can I overcome this? Is this a bug? A number too big?
I believe this is because jq can only represent legal JSON data and the number you've given is outside the range that can be represented without loss of precision. See also
What is JavaScript's highest integer value that a number can go to without losing precision?
If you need to work with larger numbers as strings in jq you may want to try this library:
jq-bigint: A big integer library for working with possibly-signed, arbitrarily long decimal strings. Written by Peter Koppstein (@pkoppstein) and released under the MIT license.

Getting a wrong result when converting a JSON string containing a large number using jsonlite

Here is the MWE; how can I get the correct number as a character value?
require(jsonlite)
j <- "{\"id\": 323907258301939713}"
a <- fromJSON(j)
print(a$id, digits = 20)
class(a$id)
a$id <- as.character(a$id)
a$id
class(a$id)
Here is the output.
Loading required package: jsonlite
Loading required package: methods
[1] 323907258301939712
[1] "numeric"
[1] "323907258301939712"
[1] "character"
I want to get the exact number 323907258301939713 as a character value in a.
In JavaScript, numbers are double precision floating point, even when they look like integers, so they only have about 16 decimal digits of precision. In particular, the JavaScript code:
console.log(12345678901234567 == 12345678901234568)
prints "true".
The JSON standard inherits this limitation from JavaScript, so jsonlite is actually correctly interpreting your JSON by reading the number as a double.
Because this is actually a limitation of the JSON standard, if you have control over the program generating the JSON, you will save yourself pain and heartache down the road if you fix your JSON (for example, by representing the id attribute as a string):
{ "id": "323907258301939713" }
But, if you absolutely must parse this badly formed JSON, then you're in luck. The fromJSON function takes an undocumented boolean argument bigint_as_char which reads these large numbers into R as character values:
> a <- fromJSON(j, bigint_as_char=TRUE)
> print(a$id)
[1] "323907258301939713"
>
However, you must be prepared to handle both plain numbers and character values in the rest of your R code, since fromJSON will still read small integers as normal numbers and only read these too-big integers as strings.