JSON formatting, either from file or variable

I have a PowerShell script which gets JSON into a variable and then saves it to a file.
Unfortunately, it gets the value as a single line, like this:
{ "persistentdataapi": "https://somevalue.azurewebsites.net/", "collectioncountapi": "https://anothervalue.azurewebsites.net/", "eventserviceapi": "https://thirdvalue.azurewebsites.net/", "securityserviceapi": "https://fourthvalue.azurewebsites.net/" }
Is there any way to process this value through some (preferably PowerShell) JSON formatting, to get this instead:
{
"persistentdataapi": "https://somevalue.azurewebsites.net/",
"collectioncountapi": "https://anothervalue.azurewebsites.net/",
"eventserviceapi": "https://thirdvalue.azurewebsites.net/",
"securityserviceapi": "https://fourthvalue.azurewebsites.net/"
}
The code that gets the value in Jenkins:
Import-Module "C:\Program Files\WindowsPowerShell\Modules\Octopus-Cmdlets\0.4.4\Octopus-Cmdlets.psd1"
connect-octoserver http://internal-Octopus.azure.com:8082 API-123456789012345678
$raw = (Get-OctoVariable var.Portal.Web DataAPIJson | Where-Object { $_.Environment -eq "QA" } )
$raw.Value | Out-File "$env:WORKSPACE\portal\var.Portal.Web\dataapi.json"

PowerShell pretty-prints any JSON it produces by default.
So the correct way to do pretty-printing is to parse the JSON string into an object and immediately convert it back to a JSON string:
$json = '{ "persistentdataapi": "https://somevalue.azurewebsites.net/", "collectioncountapi": "https://anothervalue.azurewebsites.net/", "eventserviceapi": "https://thirdvalue.azurewebsites.net/", "securityserviceapi": "https://fourthvalue.azurewebsites.net/" }'
$json | ConvertFrom-Json | ConvertTo-Json
produces
{
"persistentdataapi": "https://somevalue.azurewebsites.net/",
"collectioncountapi": "https://anothervalue.azurewebsites.net/",
"eventserviceapi": "https://thirdvalue.azurewebsites.net/",
"securityserviceapi": "https://fourthvalue.azurewebsites.net/"
}
or in your case
$file = "$env:WORKSPACE\portal\var.Portal.Web\dataapi.json"
$raw.Value | ConvertFrom-Json | ConvertTo-Json | Out-File $file -Encoding UTF8
As a side-effect this also makes sure that the JSON in the file is valid, because otherwise ConvertFrom-Json will throw an error.
Please always explicitly specify UTF8 encoding when reading and writing JSON files.
$data = Get-Content $file -Encoding UTF8 | ConvertFrom-Json
$data | ConvertTo-Json | Set-Content $file -Encoding UTF8
The reason for that is
By widely-accepted convention, JSON files ought to be UTF-8.
Unless specified otherwise, Get-Content and Set-Content will use the system's default encoding to read/write text files.
The system default is very seldom UTF-8, most of the time it will be a legacy single-byte encoding like Windows-1252.
This creates the risk of
mangling Unicode characters, which are legal in JSON, when reading a JSON file.
creating JSON files that are not UTF-8, making them hard to consume by others.
In fact, always specify an encoding explicitly when working with text files, not only in the case of JSON.
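One more pitfall to watch for when round-tripping: ConvertTo-Json only serializes two levels deep by default, silently flattening anything nested further. A minimal sketch (the nested object is illustrative):

```powershell
# ConvertTo-Json defaults to -Depth 2; anything deeper is flattened to a
# string representation. Pass an explicit -Depth to preserve the structure.
$json = '{ "a": { "b": { "c": { "d": 1 } } } }'
$obj  = $json | ConvertFrom-Json

$pretty    = $obj | ConvertTo-Json -Depth 10
$roundtrip = $pretty | ConvertFrom-Json
$roundtrip.a.b.c.d   # 1
```

Without -Depth, the inner object would come back as a flattened string like "@{d=1}" instead of a nested object.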

Related

Adjust json file using ConvertFrom-Json

I have a json file with the following content
{
"name":"myApp",
"version":"1"
}
I would like to use PowerShell to read this JSON file, adjust the version number, then output the new JSON to the same file.
I currently have this, but it doesn't work and only outputs an empty file
Get-Content file.json | ConvertFrom-Json | ForEach{$_.version = "2"} | ConvertTo-Json | Out-File file.json -Encoding ascii
I am getting stuck on how to update values on the Json object created by ConvertFrom-Json and pipe this back into ConvertTo-Json
I would like to do this all in one line if possible.
Once you assign the JSON to an object, you should be able to change it from there then write it to a file (if desired).
Create JSON object from file contents:
$jsonObject = Get-Content file.json -Raw | ConvertFrom-Json
Change the current object. In your example the "version" property:
#update object
$jsonObject | ForEach-Object { $_.version = "2" }
Write object to file:
$jsonObject | ConvertTo-Json | Out-File "file.json" -Encoding ascii
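For the one-liner the question asked for, the key detail is that an assignment inside ForEach-Object produces no pipeline output, so the object has to be re-emitted explicitly. A sketch using the same file.json:

```powershell
# Parse, mutate, and write back in a single pipeline. The trailing $_ after
# the assignment re-emits the object, which the original one-liner was missing.
(Get-Content file.json -Raw | ConvertFrom-Json) |
    ForEach-Object { $_.version = "2"; $_ } |
    ConvertTo-Json |
    Set-Content file.json -Encoding ascii
```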

How to convert cyrillic into utf16

tl;dr Is there a way to convert cyrillic stored in hashtable into UTF-16?
Like кириллица into \u043a\u0438\u0440\u0438\u043b\u043b\u0438\u0446\u0430
I need to import a file, parse it into id and value, then convert it into .json, and now I'm struggling to find a way to convert the value into UTF escape codes.
And yes, it is needed that way.
cyrillic.txt:
1 кириллица
PowerShell:
clear-host
foreach ($line in (Get-Content C:\Users\users\Downloads\cyrillic.txt)){
$nline = $line.Split(' ', 2)
$properties = @{
'id'= $nline[0] #stores "1" from file
'value'=$nline[1] #stores "кириллица" from file
}
$temp+=New-Object PSObject -Property $properties
}
$temp | ConvertTo-Json | Out-File "C:\Users\user\Downloads\data.json"
Output:
[
{
"id": "1",
"value": "кириллица"
}
]
Needed:
[
{
"id": "1",
"value": "\u043a\u0438\u0440\u0438\u043b\u043b\u0438\u0446\u0430"
}
]
At this point, as a newcomer to PowerShell, I have no idea even how to search for this properly.
Building on Jeroen Mostert's helpful comment, the following works robustly, assuming that the input file contains no NUL characters (which is usually a safe assumption for text files):
# Sample value pair; loop over file lines omitted for brevity.
$nline = '1 кириллица'.Split(' ', 2)
$properties = [ordered] @{
id = $nline[0]
# Insert aux. NUL characters before the 4-digit hex representations of each
# code unit, to be removed later.
value = -join ([uint16[]] [char[]] $nline[1]).ForEach({ "`0{0:x4}" -f $_ })
}
# Convert to JSON, then remove the escaped representations of the aux. NUL chars.,
# resulting in proper JSON escape sequences.
# Note: ... | Out-File ... omitted.
(ConvertTo-Json @($properties)) -replace '\\u0000', '\u'
Output (pipe to ConvertFrom-Json to verify that it works):
[
{
"id": "1",
"value": "\u043a\u0438\u0440\u0438\u043b\u043b\u0438\u0446\u0430"
}
]
Explanation:
[uint16[]] [char[]] $nline[1] converts the [char] instances of the string stored in $nline[1] into the underlying UTF-16 code units (a .NET [char] is effectively an unsigned 16-bit integer encoding a UTF-16 code unit).
Note that this works even with Unicode characters that have code points above 0xFFFF, i.e. that are too large to fit into a [uint16]. Such characters outside the so-called BMP (Basic Multilingual Plane), e.g. 👍, are simply represented as pairs of UTF-16 code units, so-called surrogate pairs, which a JSON processor should recognize (ConvertFrom-Json does).
However, on Windows such characters may not render correctly, depending on your console window's font. The safest option is to use Windows Terminal, available in the Microsoft Store.
The call to the .ForEach() array method processes each resulting code unit:
"`0{0:x4}" -f $_ uses an expandable string to create a string that starts with a NUL character ("`0"), followed by a 4-digit hex. representation (x4) of the code unit at hand, created via -f, the format operator.
This trick of temporarily replacing what should ultimately be a verbatim \u prefix with a NUL character is needed because a verbatim \ embedded in a string value would invariably be doubled in its JSON representation, given that \ acts as the escape character in JSON.
The result is something like "<NUL>043a", which ConvertTo-Json transforms as follows, given that it must escape each NUL character as \u0000:
"\u0000043a"
The result from ConvertTo-Json can then be transformed into the desired escape sequences simply by replacing \u0000 (escaped as \\u0000 for use with the regex-based -replace operator) with \u, e.g.:
"\u0000043a" -replace '\\u0000', '\u' # -> "\u043a", i.e. к
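To make the surrogate-pair behavior concrete, here is a quick sketch ([char]::ConvertFromUtf32 is used so no emoji literal is needed; U+1F44D is 👍):

```powershell
# A code point above U+FFFF occupies two UTF-16 code units (a surrogate pair).
$thumbsUp = [char]::ConvertFromUtf32(0x1F44D)   # the 👍 character
$units    = [uint16[]] [char[]] $thumbsUp
'{0:x4} {1:x4}' -f $units[0], $units[1]         # d83d dc4d
```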
Here's a way that simply saves the string to a UTF-16BE file and then reads the bytes back out and formats them, skipping the first 2 bytes, which are the BOM (\ufeff). ($_ didn't work by itself.) Note that there are two UTF-16 encodings with different byte orders: big endian and little endian. The range of Cyrillic is U+0400..U+04FF. Added -NoNewline.
'кириллица' | Set-Content utf16be.txt -Encoding BigEndianUnicode -NoNewline
$list = Get-Content utf16be.txt -Encoding Byte -ReadCount 2 |
    ForEach-Object { '\u{0:x2}{1:x2}' -f $_[0], $_[1] } | Select-Object -Skip 1
-join $list
\u043a\u0438\u0440\u0438\u043b\u043b\u0438\u0446\u0430
There must be a simpler way of doing this, but this could work for you:
$temp = foreach ($line in (Get-Content -Path 'C:\Users\users\Downloads\cyrillic.txt')){
$nline = $line.Split(' ', 2)
# output an object straight away so it gets collected in variable $temp
[PsCustomObject]@{
id = $nline[0] #stores "1" from file
value = (([system.Text.Encoding]::BigEndianUnicode.GetBytes($nline[1]) |
ForEach-Object {'{0:x2}' -f $_ }) -join '' -split '(.{4})' -ne '' |
ForEach-Object { '\u{0}' -f $_ }) -join ''
}
}
($temp | ConvertTo-Json) -replace '\\\\u', '\u' | Out-File 'C:\Users\user\Downloads\data.json'
Simpler using .ToCharArray():
$temp = foreach ($line in (Get-Content -Path 'C:\Users\users\Downloads\cyrillic.txt')){
$nline = $line.Split(' ', 2)
# output an object straight away so it gets collected in variable $temp
[PsCustomObject]@{
id = $nline[0] #stores "1" from file
value = ($nline[1].ToCharArray() | ForEach-Object {'\u{0:x4}' -f [uint16]$_ }) -join ''
}
}
($temp | ConvertTo-Json) -replace '\\\\u', '\u' | Out-File 'C:\Users\user\Downloads\data.json'
Value "кириллица" will be converted to \u043a\u0438\u0440\u0438\u043b\u043b\u0438\u0446\u0430

Using Powershell to convert a file's contents into a string that can be transferred using JSON

How does one convert a text file's contents into a string and then insert this string into a JSON file?
For example, if a file contains:
this
is
a
sample
file
The script would generate:
"this\r\nis\r\na\r\nsample\r\nfile"
To insert into a JSON template:
"something":"<insertPoint>"
To produce:
"something":"this\r\nis\r\na\r\nsample\r\nfile"
I'm using PowerShell 5 and have managed to load the file, generate some JSON, and insert it by running:
# get contents and convert to JSON
$contentToInsert = Get-Content $sourceFilePath -raw | ConvertTo-Json
# write in output file
(Get-Content $outputFile -Raw).replace('<insertPoint>', $contentToInsert) | Set-Content $outputFile
However, a lot of other, unwanted fields are also added.
"something":"{
"value": "this\r\nis\r\na\r\nsample\r\nfile"
"PSPath": "C:\\src\\intro.md",
"PSParentPath": "C:\\src",
"PSChildName": "intro.md",
etc...
Ultimately, I'm trying to send small rich text segments to a web page via JSON but want to edit and store them locally using Markdown. If this doesn't make sense and there's a better way of sending these then please let me know also.
iRon's answer helpfully suggests not using string manipulation to create JSON in PowerShell, but to use hashtables (or custom objects) to construct the data and then convert it to JSON.
However, that alone does not solve your problem:
PS> @{ something = Get-Content -Raw $sourceFilePath } | ConvertTo-Json
{
"something": {
"value": "this\nis\na\nsample\nfile\n",
"PSPath": "/Users/mklement/Desktop/pg/lines.txt",
# ... !! unwanted properties are still there
}
}
The root cause of the problem is that Get-Content decorates the strings it outputs with metadata in the form of NoteProperty properties, and ConvertTo-Json currently invariably includes these.
A proposal to allow opting out of this decoration when calling Get-Content can be found in GitHub issue #7537.
Complementarily, GitHub issue #5797 suggests that ConvertTo-Json should ignore the extra properties for primitive .NET types such as strings.
The simplest workaround is to access the underlying .NET instance with .psobject.baseobject, which bypasses the invisible wrapper object PowerShell uses to supply the extra properties:
PS> @{ something = (Get-Content -Raw $sourceFilePath).psobject.baseobject } |
ConvertTo-Json
{
"something": "this\nis\na\nsample\nfile\n"
}
Just a general recommendation, apart from the actual issue described by @mklement0 and the metadata added to the Get-Content results:
Do not poke around (replace, insert, etc.) in any serialized JSON content.
Instead, modify the object (if necessary, use ConvertFrom-Json to restore the object) prior to converting it back into a JSON file with ConvertTo-Json.
In this example, I would use a hash-table with a here-string for this:
@{'something' = @'
this
is
a
sample
file
'@
} | ConvertTo-Json
Result:
{
"something": "this\nis\na\nsample\nfile"
}
You can use the Out-String cmdlet to coerce the output of Get-Content into a flat string first:
@{ "something" = (Get-Content lines.txt | Out-String) } | ConvertTo-Json
This produces:
{
"something": "this\r\nis\r\na\r\nsample\r\nfile\r\n"
}
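Note that Out-String appends a trailing newline (visible as the final \r\n above). If that is unwanted, trimming it first is a small extra step; a sketch, assuming the same lines.txt:

```powershell
# TrimEnd() drops the trailing newline that Out-String appends.
$flat = (Get-Content lines.txt | Out-String).TrimEnd()
@{ "something" = $flat } | ConvertTo-Json
```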

Powershell: Get value of a raw JSON

I have a JSON file which looks like this:
{
"body": {
"mode": "raw",
"raw": "{\n \"id\": \"middleware://so/998655555{{sguid}}\",\n \"reference\": \"998655555{{sguid}}\",\n \"branchCode\": \"1234\"\n }"
}
}
Now, I want to output the value of e.g. "branchCode".
I tried it with the following PowerShell commands:
$path = "file.json"
$raw = Get-Content $path -raw
$obj = ConvertFrom-Json $raw
$obj.body.raw.branchCode
Unfortunately, I don't get the value. It seems like PowerShell has a problem retrieving the value from the raw JSON.
I would be very happy if someone could help me.
The value of "raw" is itself JSON.
You have to convert from JSON twice.
$data = Get-Content "file.json" -Raw -Encoding UTF8 | ConvertFrom-Json
$body = $data.body.raw | ConvertFrom-Json
$body.branchCode
prints
1234
Note that you should always explicitly specify an encoding when reading from and writing to text files. For the purposes of JSON (and, frankly, for pretty much every other purpose), UTF-8 is the appropriate choice.
Unfortunately Get-Content/Set-Content do not default to UTF-8. Save yourself from trouble by always specifying an encoding when working with text files.
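Going the other way, i.e. editing a value inside the nested JSON and writing it back, reverses the double conversion. A sketch based on the file layout shown above (the new branchCode value is just illustrative):

```powershell
# Parse the outer JSON, then the inner JSON string; edit; re-serialize the
# inner object back into the "raw" property; write the outer JSON out again.
$data = Get-Content "file.json" -Raw -Encoding UTF8 | ConvertFrom-Json
$body = $data.body.raw | ConvertFrom-Json
$body.branchCode = "5678"
$data.body.raw = $body | ConvertTo-Json -Compress
$data | ConvertTo-Json -Depth 5 | Set-Content "file.json" -Encoding UTF8
```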

PowerShell: ConvertTo-Json problem containing special characters

I am writing a script to make changes to a JSON file, but when the file is converted back to JSON it escapes special characters.
For example, the JSON file contains passwords with "&". A quick way to replicate the problem is with the following command:
PS> "Password&123" | ConvertTo-Json
The output is: "Password\u0026123"
# Here is how I import the JSON file:
$jsonfile = (Get-Content .\example.json -Encoding Ascii) -join "`n" | ConvertFrom-Json
# Exporting the JSON file without modifying it:
$jsonfile | ConvertTo-Json | Out-File "new.json"
Here is an example of a simplified JSON file:
{
"Server1":
{
"username":"root",
"password":"Password&dfdf"
},
"Server2":
{
"username":"admin",
"password":"Password&1234"
}
}
Try the Unescape() method:
$jsonfile | ConvertTo-Json | % { [System.Text.RegularExpressions.Regex]::Unescape($_) } | Out-File "new.json"
This is caused by the automatic character escaping of ConvertTo-Json, and it affects several symbols such as <>\'&.
ConvertFrom-Json will read the escaped characters properly. Using your example:
PS C:\> '"Password\u0026123"' | ConvertFrom-Json
Password&123
And your example code results in a file that has escaped characters, but ConvertFrom-Json can read it back to the original passwords. See below:
PS C:\> (Get-Content .\example.json -Encoding Ascii) -join "`n" | ConvertFrom-Json
Server1 Server2
------- -------
@{username=root; password=Password&dfdf} @{username=admin; password=Password&1234}
PS C:\> (Get-Content .\new.json -Encoding Ascii) -join "`n" | ConvertFrom-Json
Server1 Server2
------- -------
@{username=root; password=Password&dfdf} @{username=admin; password=Password&1234}
If you need the passwords to be stored unescaped, some fancier work may be needed; see this thread about converting Unicode strings to escaped ASCII strings.
Alternatively, avoid affected characters if possible.
Tested with PowerShell 7.2: it appears Unicode and other special characters are converted successfully, and the indentation also appears improved.