Powershell: Modify key value pair in JSON file - json

How do I modify a Key Value Pair in a JSON File with powershell?
We are trying to modify Database Connection, sometimes it can be two levels nested deep, sometimes it can be three levels deep.
Currently we are switching servers in multiple JSON files, so we can test in different server environments.
We are trying to utilize this answer: Add new key value pair to JSON file in powershell.
"JWTToken": {
"SecretKey": "Security Key For Generate Token",
"Issuer": "ABC Company"
},
"AllowedHosts": "*",
"ModulesConfiguration": {
"AppModules": [ "ABC Modules" ]
},
"ConnectionStrings": {
"DatabaseConnection": "Server=testserver,1433;Database=TestDatabase;User Id=code-developer;password=xyz;Trusted_Connection=False;MultipleActiveResultSets=true;",
"TableStorageConnection": "etc",
"BlobStorageConnection": "etc"
},

Once you convert the JSON string to an object with PowerShell, it's not really a problem to change the properties. The main issue you are going to face here is that your string is currently invalid JSON for .NET, or at least it won't be expecting it in the current format. We can fix that, though.
Here is your current JSON.
"JWTToken": {
"SecretKey": "Security Key For Generate Token",
"Issuer": "ABC Company"
},
"AllowedHosts": "*",
"ModulesConfiguration": {
"AppModules": [ "ABC Modules" ]
},
"ConnectionStrings": {
"DatabaseConnection": "Server=testserver,1433;Database=TestDatabase;User Id=code-developer;password=xyz;Trusted_Connection=False;MultipleActiveResultSets=true;",
"TableStorageConnection": "etc",
"BlobStorageConnection": "etc"
},
There may be other issues for PowerShell's JSON parser in your application.config file, but these two are immediately noticeable to me:
Unnecessary trailing commas
No definitive opening { and closing }
How Can We Fix This?
We can use simple string concatenation to add { and } where necessary.
$RawText = Get-Content -Path .\path_to\application.config -Raw
$RawText = "{ " + $RawText + " }"
To avoid parsing issues with trailing commas when parsing the JSON with ConvertFrom-Json, we need to remove them via regex. My proposed approach is to identify a trailing comma by whether a closing } or ] comes directly after it, allowing for any whitespace (\s) before the closing bracket appears. So we have a regex that looks like this:
"\,(?=\s*?[\}\]])".
We could then use that with -replace in PowerShell. Of course we will replace them with an empty string.
$FormattedText = $RawText -replace "\,(?=\s*?[\}\]])",""
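As a quick sanity check, here's that replace run against a made-up fragment with trailing commas (not your actual config):

```powershell
$sample = '{ "a": [ 1, 2, ], "b": { "c": 3, }, }'
$sample -replace "\,(?=\s*?[\}\]])", ""
# Only the commas directly before ] or } are stripped;
# the separator comma between 1 and 2 is untouched.
```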
From here we convert to JSON.
$JsonObj = $FormattedText | ConvertFrom-Json
We can now change your database string by setting a property.
$JsonObj.ConnectionStrings.DatabaseConnection = "your new string"
We use ConvertTo-Json to convert the array back to a Json string.
$JsonString = $JsonObj | ConvertTo-Json
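One thing to watch (a general ConvertTo-Json caveat, not specific to this config): the cmdlet only serializes 2 levels deep by default, so deeper nesting gets flattened to strings unless you raise -Depth:

```powershell
# -Depth defaults to 2; nested objects beyond that are stringified.
# Raising it is harmless for shallow files and essential for deep ones.
$JsonString = $JsonObj | ConvertTo-Json -Depth 10
```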
It's not important to restore the trailing commas (they aren't valid JSON anyway), but the first { and last } need removing from your file before we commit it back with Set-Content.
# Remove the first { and trim white space. Second TrimStart() clears the space.
$JsonString = $JsonString.TrimStart("{").TrimStart()
# Repeat this but for the final } and use TrimEnd().
$JsonString = $JsonString.TrimEnd("}").TrimEnd()
# Write back to file.
$JsonString | Set-Content -Path .\path_to\application.config -Force
Your config file should be written back more or less as you found it. I will try to think of a regex to fix the appearance of the formatting; it shouldn't error, it just doesn't look great. Hope that helps.
EDIT
Here is a function to fix the unsightly appearance of the text in the file.
function Restore-Formatting {
Param (
[parameter(Mandatory=$true,ValueFromPipeline=$true)][string]$InputObject
)
$JsonArray = $InputObject -split "\n"
$Tab = 0
$Output = @()
foreach ($Line in $JsonArray) {
if ($Line -match "{" -or $Line -match "\[") {
$Output += (" " * $Tab) + $Line.TrimStart()
$Tab += 4
}
elseif ($Line -match "^\s+}" -or $Line -match "^\s+\]") {
$Tab -= 4
$Output += (" " * $Tab) + $Line.TrimStart()
}
else {
$Output += (" " * $Tab) + $Line.TrimStart()
}
}
$Output
}
TL;DR Script:
$RawText = Get-Content -Path .\path_to\application.config -Raw
$RawText = "{ " + $RawText + " }"
$FormattedText = $RawText -replace "\,(?=\s*?[\}\]])",""
$JsonObj = $FormattedText | ConvertFrom-Json
$JsonObj.ConnectionStrings.DatabaseConnection = "your new string"
$JsonString = $JsonObj | ConvertTo-Json
$JsonString = $JsonString.TrimStart("{").TrimStart()
$JsonString = $JsonString.TrimEnd("}").TrimEnd()
$JsonString | Restore-Formatting | Set-Content -Path .\path_to\application.config -NoNewLine -Force

Related

Extract multiline regex from extra large files in Powershell

I have extra large log file in CSV format which includes JSON formatted data inside. What I'm trying to do is extract JSON parts from the data and store it in a separate file.
The real problem is that the file size is almost 70Gb which causes some interesting problems to tackle.
The file size makes it impossible to read the whole file in one chunk. With Powershell's Get-Content combined with -ReadCount and Foreach-Object I can take smaller chunks and run regex pattern over them, chunk by chunk.
$Path = <pathToFile>
$outPath = <pathToOutput>
Out-File -Encoding utf8 -FilePath $outPath
$JsonRegex = "(?smi)\{.*?\}"
Get-Content -Path $Path -ReadCount 100000 | Foreach-Object {
( "$_" | Select-String -Pattern $JsonRegex -AllMatches | Foreach-Object { $_.Matches } | Foreach-Object { $_.Value } ) | Add-Content $outPath
}
But what happens here is that every 100k lines the ReadCount boundary lands in the middle of a JSON object, thus skipping said object and continuing from the next one.
Here is an example of what this log data looks like. It includes some columns on the first row and then JSON formatted data, which is not consistent, so I cannot use any fixed ReadCount value to avoid landing in the middle of a JSON object.
"5","5","9/10/2019 12:00:46 AM","2","some","data","removed","comment","{
"message": "comment",
"level": "Information",
"logType": "User",
"timeStamp": "2019-09-10T03:00:46.5573047+03:00",
"fingerprint": "some",
}","11"
"5","5","9/10/2019 12:00:46 AM","2","some","data","removed","comment","{
"message": "comment",
"level": "Information",
"logType": "User",
"timeStamp": "2019-09-10T03:00:46.5672713+03:00",
"fingerprint": "some",
"windowsIdentity": "LOCAL\\WinID",
"machineName": "TK-141",
"processVersion": "1.0.71",
"jobId": "24a8",
"machineId": 11
}","11"
Is there any way to accomplish this without missing any data rows from the gigantous logfile?
Use a switch statement with the -Regex and -File parameters to efficiently (by PowerShell standards) read the file line by line and keep state across multiple lines.
For efficient writing to a file, use a .NET API, namely a System.IO.StreamWriter instance.
The following code assumes:
Each JSON string spans multiple lines and is non-nested.
On a given line, an opening { / closing } unambiguously marks the start / end of a (multi-line) JSON string.
# Input file path
$path = '...'
# Output file path
# Important: specify a *full* path
$outFileStream = [System.IO.StreamWriter] "$PWD/out.txt"
$json = ''
switch -Regex -File $path {
'\{.*' { $json = $Matches[0]; continue }
'.*\}' {
$json += "`n" + $Matches[0]
$outFileStream.WriteLine($json)
$json = ''
continue
}
default { if ($json) { $json += "`n" + $_ } }
}
$outFileStream.Close()
If you can further assume that no part of the JSON string follows the opening { / precedes the closing } on the same line, as your sample data suggest, you can simplify (and speed up) the switch statement:
$json = ''
switch -Regex -File $path {
'\{$' { $json ='{'; continue }
'^\}' { $outFileStream.WriteLine(($json + "`n}")); $json = ''; continue }
default { if ($json) { $json += "`n" + $_ } }
}
$outFileStream.Close()
Doug Maurer had a solution attempt involving a System.Text.StringBuilder instance, so as to optimize the iterative concatenation of the parts making up each JSON string.
However, at least with an input file crafted from many repetitions of the sample data, I saw only a small performance gain in my informal tests.
For the sake of completeness, here's the System.Text.StringBuilder solution:
$json = [System.Text.StringBuilder]::new(512) # tweak the buffer size as needed
switch -Regex -File $path {
'\{$' { $null = $json.Append('{'); continue }
'^\}' { $outFileStream.WriteLine($json.Append("`n}").ToString()); $null = $json.Clear(); continue }
default { if ($json.Length) { $null = $json.Append("`n").Append($_) } }
}
$outFileStream.Close()

How to expand environmental variables in JSON into valid JSON notation by PowerShell

This question relates to my other question and to this issue.
When I try to read a configuration from a JSON file which contains an environmental variable, e.g. %USERPROFILE%\\source, the obvious choice [System.Environment]::ExpandEnvironmentVariables($jsonString) expands the JSON string into invalid JSON notation, e.g. "C:\Users\JohnDoe". The \U is not a valid JSON escape sequence.
The question is how to overcome this problem (with some clean code).
If you are performing the replacement on the entire $jsonString then, yes,
$jsonString -replace '\\', '\\' will replace too many backslashes.
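To see why, run the blanket replacement over a made-up string that already contains valid escapes:

```powershell
# The \" escapes are already valid JSON; doubling every backslash corrupts them
$jsonString = '{ "Quote": "Hello \"World\"", "Path": "%USERPROFILE%" }'
$jsonString -replace '\\', '\\'
# The \" sequences become \\" and the JSON no longer parses.
```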
You could then do this instead:
$match = ([regex] '(%([^%]+)%)').Match($jsonString)
while ($match.Success) {
$search = $match.Groups[1].Value
$replace = [environment]::GetEnvironmentVariable($match.Groups[2].Value) -replace '\\', '\\'
$jsonString = $jsonString -replace $search, $replace
$match = $match.NextMatch()
}
This will only replace the matched %SomeVar% environment variables with what they expand to, and doubles all backslashes in the expanded value.
Test using some bogus JSON:
$jsonString = @"
{
"Name": "%USERNAME%",
"UserPath": "%USERPROFILE%\\source"
"WinDir": "%SystemRoot%"
"InetPath": "%SystemDrive%\\inetpub\\wwwroot",
}
"@
$match = ([regex] '(%([^%]+)%)').Match($jsonString)
while ($match.Success) {
$search = $match.Groups[1].Value
$replace = [environment]::GetEnvironmentVariable($match.Groups[2].Value) -replace '\\', '\\'
Write-Host $search
write-host $replace
$jsonString = $jsonString -replace $search, $replace
$match = $match.NextMatch()
}
$jsonString
Output:
{
"Name": "KUTlime",
"UserPath": "C:\\Users\\KUTlime\\source"
"WinDir": "C:\\WINDOWS"
"InetPath": "C:\\inetpub\\wwwroot",
}
It is in general a bad practice to peek and poke into serialized object strings (e.g. JSON and XML) using string methods like Replace. Instead, it is better to deserialize the string to an object, make your changes, and serialize it to a string again. This way you prevent pitfalls such as incorrectly replacing a backslash that has a special meaning (e.g. an escape for a double quote or a Unicode character).
As it might be a hassle to recursively crawl through a complex object to expand all the strings, I have written a small function for this:
function Expand-String {
[CmdletBinding()] param(
[Parameter(ValueFromPipeLine = $True)]$Object
)
Process {
if ($Object -is [string]) { $Object = [System.Environment]::ExpandEnvironmentVariables($Object) }
elseif ($Object -is [Collections.IDictionary]) {
Foreach ($Key in @($Object.get_Keys())) { $Object[$Key] = Expand-String $Object[$Key] }
}
elseif ($Object -is [Collections.IEnumerable]) {
for ($i = 0; $i -lt $Object.Count; $i++) { $Object[$i] = Expand-String $Object[$i] }
}
else {
Foreach ($Name in ($Object.PSObject.Properties | Where-Object { $_.MemberType -eq 'NoteProperty' }).Name) {
$Object.$Name = Expand-String $Object.$Name
}
}
$Object
}
}
$Json = @'
{
"Environment Variables": {
"Name": "%USERNAME%",
"UserPath": "%USERPROFILE%\\source",
"WinDir": "%SystemRoot%",
"InetPath": "%SystemDrive%\\inetpub\\wwwroot"
},
"Special Characters": {
"Quote": "Hello \"World\"",
"Unicode": ["\u1F60A", "\u1F44D"]
}
}
'@
$Object = $Json | ConvertFrom-Json | Expand-String
$Object.'Environment Variables'
Name UserPath WinDir InetPath
---- -------- ------ --------
KUTlime C:\Users\KUTlime\source C:\WINDOWS C:\inetpub\wwwroot
$Json = $Object | ConvertTo-Json -Depth 9
$Json
{
"Environment Variables": {
"Name": "KUTlime",
"UserPath": "C:\\Users\\KUTlime\\source",
"WinDir": "C:\\WINDOWS",
"InetPath": "C:\\inetpub\\wwwroot"
},
"Special Characters": {
"Quote": "Hello \"World\"",
"Unicode": [
"ὠA",
"ὄD"
]
}
}

Convert and format XML to JSON on Azure Storage Account in Powershell

So I'm trying to convert XML files in an Azure Storage Container to JSON in the same container.
This way I'm able to read the information into an Azure SQL Database via an Azure Datafactory.
I'd like to stay clear from using Logic apps if able.
The JSON files need to be formatted.
And all this through the use of PowerShell scripting.
What I've got so far, after some searching on the interwebs and shamelessly copying and pasting PowerShell code:
#Connect-AzAccount
# Helper function that converts a *simple* XML document to a nested hashtable
# with ordered keys.
function ConvertFrom-Xml {
param([parameter(Mandatory, ValueFromPipeline)] [System.Xml.XmlNode] $node)
process {
if ($node.DocumentElement) { $node = $node.DocumentElement }
$oht = [ordered] @{}
$name = $node.Name
if ($node.FirstChild -is [system.xml.xmltext]) {
$oht.$name = $node.FirstChild.InnerText
} else {
$oht.$name = New-Object System.Collections.ArrayList
foreach ($child in $node.ChildNodes) {
$null = $oht.$name.Add((ConvertFrom-Xml $child))
}
}
$oht
}
}
function Format-Json
{
<#
.SYNOPSIS
Prettifies JSON output.
.DESCRIPTION
Reformats a JSON string so the output looks better than what ConvertTo-Json outputs.
.PARAMETER Json
Required: [string] The JSON text to prettify.
.PARAMETER Minify
Optional: Returns the json string compressed.
.PARAMETER Indentation
Optional: The number of spaces (1..1024) to use for indentation. Defaults to 4.
.PARAMETER AsArray
Optional: If set, the output will be in the form of a string array, otherwise a single string is output.
.EXAMPLE
$json | ConvertTo-Json | Format-Json -Indentation 2
#>
[CmdletBinding(DefaultParameterSetName = 'Prettify')]
Param(
[Parameter(Mandatory = $true, Position = 0, ValueFromPipeline = $true)]
[string]$Json,
[Parameter(ParameterSetName = 'Minify')]
[switch]$Minify,
[Parameter(ParameterSetName = 'Prettify')]
[ValidateRange(1, 1024)]
[int]$Indentation = 4,
[Parameter(ParameterSetName = 'Prettify')]
[switch]$AsArray
)
if ($PSCmdlet.ParameterSetName -eq 'Minify')
{
return ($Json | ConvertFrom-Json) | ConvertTo-Json -Depth 100 -Compress
}
# If the input JSON text has been created with ConvertTo-Json -Compress
# then we first need to reconvert it without compression
if ($Json -notmatch '\r?\n')
{
$Json = ($Json | ConvertFrom-Json) | ConvertTo-Json -Depth 100
}
$indent = 0
$regexUnlessQuoted = '(?=([^"]*"[^"]*")*[^"]*$)'
$result = $Json -split '\r?\n' |
ForEach-Object {
# If the line contains a ] or } character,
# we need to decrement the indentation level unless it is inside quotes.
if ($_ -match "[}\]]$regexUnlessQuoted")
{
$indent = [Math]::Max($indent - $Indentation, 0)
}
# Replace all colon-space combinations by ": " unless it is inside quotes.
$line = (' ' * $indent) + ($_.TrimStart() -replace ":\s+$regexUnlessQuoted", ': ')
# If the line contains a [ or { character,
# we need to increment the indentation level unless it is inside quotes.
if ($_ -match "[\{\[]$regexUnlessQuoted")
{
$indent += $Indentation
}
$line
}
if ($AsArray) { return $result }
return $result -Join [Environment]::NewLine
}
# Storage account details
$resourceGroup = "insert resource group here"
$storageAccountName = "insert storage account name here"
$container = "insert container here"
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccountName).Value[0]
$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccountName
# Creating Storage context for Source, destination and log storage accounts
#$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$context = New-AzStorageContext -ConnectionString "insert connection string here"
$blob_list = Get-AzStorageBlob -Container $container -Context $context
foreach($blob_iterator in $blob_list){
[XML](Get-AzStorageBlobContent $blob_iterator.name -Container $container -Context $context) | ConvertFrom-Xml | ConvertTo-Json -Depth 11 | Format-Json | Set-Content ($blob_iterator.name + '.json')
}
Output =
Cannot convert value "Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob" to type "System.Xml.XmlDocument". Error: "The specified node cannot
be inserted as the valid child of this node, because the specified node is the wrong type."
At C:\Users\.....\Convert XML to JSON.ps1:116 char:6
+ [XML](Get-AzStorageBlobContent $blob_iterator.name -Container $c ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvalidCastToXmlDocument
When I run the code the script asks me if I want to download the xml file to a local folder on my laptop.
This is not what I want, I want the conversion to be done in Azure on the Storage container.
And I think that I'm adding ".json" to the .xml file name.
So the output would become something like filename.xml.json instead of just filename.json
What's going wrong here?
And how can it be fixed?
Thank you in advance for your help.

Add square brackets while convert it to json in powershell

I am taking the parameters from a template JSON file, reading data from a CSV file, and then converting it back to JSON.
TemplateJSON:
{
"adhocUARs":[
""
],
"rolefullpath": ""
}
CSV File:
RoleFullPath,ResourceName
FolderName\ABCD,ABCD User Account Resource
I am getting the output in this format (without the square brackets):
NEWJSON:
{
"adhocUARs":"ABCD User Account Resource",
"rolefullpath": "xyz"
}
But I expect the output to be in the following format (with the square brackets):
NEWJSON:
{
"adhocUARs":["ABCD User Account Resource"]
"rolefullpath": "xyz"
}
Code used:
$TemplateJSON = Convertfrom-json ([IO.File]::ReadAllText("TemplateJSON.json"))
$RoleFile = Import-CSV "CSVFile.csv" -Delimiter ','
[int]$Row = 0
Foreach ($Line in $RoleFile)
{
$Row = $Row + 1
$NewJSON = $TemplateJSON
$NewJSON.adhocUARs = $Line.ResourceName
$NewJSON.roleFullPath= $Line.RoleFullPath
$RolePath = "D:\\DummyFolder\"
$JSONPath = $RolePath + "patch.json"
Convertto-JSON $NewJSON | Out-file -Encoding "UTF8" $JSONPath
}
What am I missing?
So in JSON, [ ] denotes an array.
Currently you have $NewJSON.adhocUARs as a single string value.
A simple solution would be :
$NewJSON.adhocUARs = @($Line.ResourceName)
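A standalone sketch (object contents made up) showing the effect on serialization:

```powershell
$NewJSON = [pscustomobject]@{ adhocUARs = ''; rolefullpath = '' }
$NewJSON.adhocUARs = @('ABCD User Account Resource')  # force an array
$NewJSON | ConvertTo-Json
# adhocUARs is now emitted with square brackets: ["ABCD User Account Resource"]
```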

Convert to JSON with comments from PowerShell

I have a very simple json and this code works for him:
function Get-CustomHeaders() {
return Get-Content -Raw -Path $JsonName | ConvertFrom-Json
}
However, if my JSON has any comments // wololo it breaks. Would it be too hard to make this parser accept comments?
The solution in the other answer only removes // comments if they are at the beginning of a line (with or without leading spaces), and doesn't remove /* multiline comments */.
This code removes all kinds of // and /* multiline comments */:
$configFile = (Get-Content path-to-jsonc-file -raw)
# Keep reading, for an improvement
# $configFile = $configFile -replace '(?m)\s*//.*?$' -replace '(?ms)/\*.*?\*/'
As @Jiří Herník indicates in his answer, this expression doesn't take into account strings with comment-like sequences inside them, for example "url": "http://mydomian.com". To handle this case:
$configFile = $configFile -replace '(?m)(?<=^([^"]|"[^"]*")*)//.*' -replace '(?ms)/\*.*?\*/'
for example removing the comments in this file:
{
// https://github.com/serilog/serilog-settings-configuration
"Serilog": {
"MinimumLevel": "Error", // Verbose, Debug, Information, Warning, Error or Fatal
"WriteTo": [
{
"Name": "File",
"Args": {
"path": "D:\\temp\\MyService\\log.txt",
"rollingInterval": "Day",
"outputTemplate": "{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} [{Level:u3}] ({App}) ({Environment}) {Message:lj}{NewLine}{Exception}"
}
},
{/*
"Name": "Seq",*/
"Args": {
"serverUrl": "http://localhost:5341"
}
}
]
}
}
results in:
{
"Serilog": {
"MinimumLevel": "Error",
"WriteTo": [
{
"Name": "File",
"Args": {
"path": "D:\\temp\\MyService\\log.txt",
"rollingInterval": "Day",
"outputTemplate": "{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} [{Level:u3}] ({App}) ({Environment}) {Message:lj}{NewLine}{Exception}"
}
} ,
{
"Args": {
"serverUrl": "http://localhost:5341"
}
}
]
}
}
Remove comment lines from your input before the conversion:
(Get-Content $JsonName) -replace '^\s*//.*' | Out-String | ConvertFrom-Json
Here you have an example which can't be handled correctly by the previous answers:
{
"url":"http://something" // note the double slash in URL
}
so here is a regex that also solves this problem.
$configFile = $configFile -replace '(?m)(?<=^([^"]|"[^"]*")*)//.*' -replace '(?ms)/\*.*?\*/'
IMPORTANT NOTE:
Powershell 6.0+ can load JSON with comments in it.
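So if you can require PowerShell 6+, no preprocessing is needed at all; for example (sample JSON made up):

```powershell
# Works in PowerShell 6.0+; Windows PowerShell 5.1 fails to parse this
$obj = @'
{
    // single-line comment
    "url": "http://something" /* block comment */
}
'@ | ConvertFrom-Json
$obj.url
```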
A simpler pattern that catches all combinations of strings, escapes and comments is:
$configFile = $configFile -replace '("(\\.|[^\\"])*")|/\*[\S\s]*?\*/|//.*', '$1';
This assumes the file is valid, with no unclosed strings or comments. Invalid files are beyond the scope of this question.
The first part ("(\\.|[^\\"])*") matches full strings and skips any escaped characters, including \\ and \". This is captured so it can be placed back in the replacement string.
The second part /\*[\S\s]*?\*/ matches multiline comments. It uses [\S\s] instead of ., so linebreaks are also matched. It is a combination of non-whitespace characters (\S) and whitespace characters (\s). The *? is a lazy repetition, so it will prefer to match as little as possible, so it won't skip over any closing */.
The last part //.* matches single line comments. The . won't match any linebreak, so it will only match until the end of the line.
When a string is matched, it is captured into slot 1. When a comment is matched, nothing is captured. The replacement is with whatever is in slot 1 ($1). The result is that strings are matched but preserved, but comments are removed.
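For instance, applying it to the problematic URL line from earlier plus a block comment (test string made up):

```powershell
$configFile = @'
{
    "url": "http://something", // comment after a tricky string
    /* block
       comment */
    "path": "C:\\temp"
}
'@
$configFile -replace '("(\\.|[^\\"])*")|/\*[\S\s]*?\*/|//.*', '$1'
# Both comments are removed; the // inside the quoted URL is preserved.
```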
I wrote a function that takes any comments found and puts them back into the JSON file afterwards.
It also allows reading from and writing to the JSON file.
There are explanatory comments within. Tested in PowerShell 5.1 and 7.
# Helper Function
# Write the contents of argument content to a file.
# Will create the file if it does not exist.
Function Write-ToFile {
Param ([Parameter(Mandatory=$true, Position=0)] [string] $path,[Parameter(Mandatory=$true, Position=1)] [string] $content)
[System.IO.File]::WriteAllText($path, $content)
}
Function Invoke-ReadWriteJSON {
<#
.SYNOPSIS
Reads and writes properties from a JSON file.
.DESCRIPTION
This will allow JSON files to have comments, either multi-line or single line
comments are supported.
If the file does not exist or is empty then the default file contents are
written to it.
.NOTES
Author: Ste
Date Created: 2021.05.01
Tested with PowerShell 5.1 and 7.1.
Posted here: https://stackoverflow.com/questions/51066978/convert-to-json-with-comments-from-powershell
.BUGS: NA
.TODO: NA
.PARAMETER filePath
The file path of the JSON file.
.PARAMETER Mode
This parameter is either Read or Write.
.PARAMETER Property
The property of the JSON object.
.PARAMETER newValue
The new property of the JSON object.
.INPUTS
None. You cannot pipe objects to Add-Extension.
.OUTPUTS
Writes to or reads a file using the filePath parameter.
.EXAMPLE (Write the property "Prop 1" with the value "Get in you machine!" to a file)
PS> Invoke-ReadWriteJSON -filePath $jsonFilePath "Write" "Prop 1" "Get in you machine!"
.EXAMPLE (Read a property from a file)
PS> Invoke-ReadWriteJSON -filePath $jsonFilePath "Read" "Prop 2"
PS> temp
#>
Param
(
[Parameter(Mandatory = $true, HelpMessage = 'The file path of the JSON file.')]
[String]$filePath,
[Parameter(Mandatory = $true, HelpMessage = 'This parameter is either Read or Write.')]
[String]$Mode,
[Parameter(Mandatory = $true, HelpMessage = 'The property of the JSON object.')]
[String]$Property,
[Parameter(Mandatory = $false, HelpMessage = 'The new property of the JSON object.')]
[String]$newValue
)
# If there is a file then set its content else set the content variable to empty.
if (Test-Path -LiteralPath $filePath) {
$contents = Get-Content -LiteralPath $filePath
$contents = $contents -replace '\s*' # Replace any whitespaces so that the length can be checked.
}
else {
$contents = ''
}
# if the file does not exist or the contents are empty
if ((Test-Path -LiteralPath $filePath) -eq $false -or $contents.length -eq 0) {
Write-ToFile $filePath $jsonSettingFileDefaultContents
}
# This will allow single and multiline comments in the json file.
# Regex for removing comments: https://stackoverflow.com/a/59264162/8262102
$jsonContents = (Get-Content -LiteralPath $filePath -Raw) -replace '(?m)(?<=^([^"]|"[^"]*")*)//.*' -replace '(?ms)/\*.*?\*/' | Out-String | ConvertFrom-Json
# Grab the comments that will be used late on.
$jsonComments = (Get-Content -LiteralPath $filePath -Raw) -replace '(?s)\s*\{.*\}\s*'
# Read the property.
if ($Mode -eq "Read") {return $jsonContents.$Property}
# Write the property.
if ($Mode -eq "Write") {
$jsonContents.$Property = $newValue
$jsonContents | ConvertTo-Json -depth 32 | set-content $filePath
# Trims any whitespace from the beginning and end of contents.
Set-content $filePath ((Get-Content -LiteralPath $filePath -Raw) -replace '(?s)^\s*|\s*$')
}
# If there are comments then this section will add them back in. Important to
# read contents with -Raw switch here.
if ($jsonComments.length -gt 0) {
$jsonNewcontents = (Get-Content -LiteralPath $filePath -Raw) -replace '(?m)(?<=^([^"]|"[^"]*")*)//.*' -replace '(?ms)/\*.*?\*/'
# Trims any whitespace from the beginning and end of contents.
Set-content $filePath (("$jsonComments`n" + $jsonNewcontents) -replace '(?s)^\s*|\s*$')
}
}
$deskTopFolder = [Environment]::GetFolderPath("DesktopDirectory")
$jsonFilePath = "$deskTopFolder\color-dialog-settings.json"
$jsonSettingFileDefaultContents = @'
// Some comments go here.
// Some comments go here.
// Some comments go here.
{
"Prop 1": "temp",
"Prop 2": "temp"
}
'@
# Write the JSON property.
# Invoke-ReadWriteJSON -filePath $jsonFilePath "Write" "Prop 1" "Get in you machine!"
# Read the JSON property.
Invoke-ReadWriteJSON -filePath $jsonFilePath "Read" "Prop 2"
# PS> temp