How to treat strings in ConvertFrom-Json as literals?

I'm importing a set of configuration values from a file in JSON format using the following:
$configFileContent = Get-Content -Path run.config | ConvertFrom-Json
This produces a result that, among other things, contains the following (the contents of the variable $configFileContent):
{
  "config-values": {
    "path": "..\temp-path"
  }
}
Next, I try to access the value of path from that config as follows:
$conf = $configFileContent.'config-values'
$tempPath = $conf.'path'
...but this fails because the characters \t in ..\temp-path are interpreted as an escape sequence representing a tab instead. This becomes clear when printing the contents of $conf, which is now:
path
----
.. emp-path
As you can see, the value of Path is .. <tab> emp-path instead of ..\temp-path, as intended. Obviously this causes trouble later when I'm trying to use the variable $tempPath as an actual path.
How can I make PowerShell interpret this as intended - i.e., treat the strings as literals here?

I understand this may not be the answer you are looking for, but the quick fix is to use a double backslash in your JSON file. This is a common workaround in other languages as well, since a lone backslash is not valid inside a JSON string:
{
  "config-values": {
    "path": "..\\temp-path"
  }
}
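If you cannot change how the file is generated, another option is to escape the backslashes yourself before parsing. A minimal sketch, assuming run.config contains no legitimate JSON escape sequences (such as \" or \n) that the replacement would double by mistake:
# Read the raw text, double every backslash, then parse.
$raw = Get-Content -Path run.config -Raw
$escaped = $raw -replace '\\', '\\'
$configFileContent = $escaped | ConvertFrom-Json
$configFileContent.'config-values'.path   # ..\temp-path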

Related

Is there a difference in the string object created by Out-String and .ToString()?

I'm parsing some text from a JSON file, and based on the text I want to do different things:
foreach ($jsonText in $jsonFile.row[0]) {
    $stringA = $jsonText.ToString()
    $stringB = $jsonText | Out-String
    switch ($stringA) {
        'A' { 'do things' }
        'B' { 'do other things' }
        'C' { 'do somethings' }
    }
}
Piping through Out-String does not produce a string that works in the switch statement, so I am wondering: is there a difference between these two strings?
The fundamental difference is that Out-String uses PowerShell's rich for-display output-formatting system for non-string, non-primitive objects, whereas .ToString() simply delegates the stringification to the .NET type at hand (which, unless overridden by the type, reports the full type name).
In other words: Out-String by default creates a single, multi-line string with the same richly formatted, for-the-human-observer representation you would see in the console.
Adding -Stream sends this representation line by line through the pipeline (resulting in an array of lines if captured in a variable). Given that representations of complex objects are multi-line and in tabular form share a header, a single line does not correspond to a single input object.
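A quick illustration of the difference, using Get-PSDrive as an arbitrary command with formatted output:
# Without -Stream: a single, multi-line string.
$single = Get-PSDrive | Out-String
$single.GetType().Name   # String
# With -Stream: an array of strings, one per formatted line.
$lines = Get-PSDrive | Out-String -Stream
$lines.GetType().Name    # Object[]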
As of PowerShell 7.2, an unfortunate aspect of Out-String (without -Stream) is that it appends a trailing newline to whatever the output-formatting system reports. This also applies to strings given as input, which PowerShell's formatting system otherwise represents as-is; similarly, .NET primitive types and a few single-value-only types are represented by their .ToString() return values.
GitHub issue #14444 discusses this problematic behavior.
# Unfortunately, these equivalences are true.
# With a single input object, adding -Stream would avoid the trailing newline.
('foo'.ToString() + [Environment]::NewLine) -eq ('foo' | Out-String)
((42).ToString() + [Environment]::NewLine) -eq (42 | Out-String)
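Conversely, with -Stream a single input object comes through without the trailing newline:
('foo' | Out-String -Stream) -eq 'foo'   # True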
Uses of Out-String
Use it if you explicitly want a string representation of the rich, for-display representation that PowerShell's formatting system produces.
Combined with -Stream, this can serve as a quick-and-dirty way to search through the display output of a command via Select-String; PowerShell even comes with a function, oss, that wraps Out-String -Stream, so that you can do something like:
# Quick-and-dirty search through the formatted representations
# of all defined drives, without having to worry about property names.
Get-PSDrive | oss | Select-String \\server1
Regrettably, Select-String doesn't search through the formatted representations by default (in which case you could omit the oss call), even though that would make sense (it actually searches through the typically useless .ToString() representations).
GitHub issue #10726 proposes changing the default behavior.
With external programs, you can use it to join their output lines - which PowerShell invariably interprets as text (strings) - into a single, multi-line string.
E.g., to get the output from tzutil /l as a single, multi-line string:
$output = tzutil /l | Out-String
Unfortunately, this is again hampered by the unexpected addition of a trailing newline, as discussed in GitHub issue #14444; workarounds:
Trim the trailing newline, in the simplest case with .Trim():
$output = (tzutil /l | Out-String).Trim()
Use a -join operation instead:
$output = (tzutil /l) -join [Environment]::NewLine

PowerShell removing escape chars on conversion

I have some automatically generated JSON files I need to modify using PowerShell. However, when I use ConvertFrom-Json, I'm losing characters in some cases.
I tried using
ForEach-Object {
    [System.Text.RegularExpressions.Regex]::Unescape($_)
}
to handle the escaped characters, but no luck.
An example of a string getting modified:
<?xml version=\"1.0\" encoding=\"UTF-16\"?><ExchangeRates>
is getting transformed to
<?xml version="1.0" encoding="UTF-16"?><ExchangeRates>
Losing the backslashes.
How would I get around this without transforming unintended parts of the file?
I redid the testing in a clean environment and found that I had something enforcing UTF-8 encoding when I loaded the content into a JSON object in PowerShell. In this case that caused the characters to be converted into escape characters, which resulted in them being replaced by nothing.
tl;dr: UTF-8 encoding when doing the ConvertFrom-Json was causing the problem.
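For reference, the unescaping itself is expected ConvertFrom-Json behavior, and ConvertTo-Json reinstates the escapes when you serialize again. A minimal round-trip sketch:
$json = '{ "xml": "<?xml version=\"1.0\" encoding=\"UTF-16\"?><ExchangeRates>" }'
$obj = $json | ConvertFrom-Json
$obj.xml                # <?xml version="1.0" encoding="UTF-16"?><ExchangeRates>
$obj | ConvertTo-Json   # the backslash escapes reappear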

Interpolating a JSON string removes JSON quotations

I have the following two lines of code:
json_str = _cases.to_json
path += " #{USER} #{PASS} #{json_str}"
When I use the debugger, I noticed that json_str appears to be formatted as JSON:
"[["FMCE","Wiltone","Wiltone","04/10/2018","Marriage + - DOM"]]"
However, when I interpolate it into another string, the quotes are removed:
"node superuser 123456 [["FMCE","Wiltone","Wiltone","04/10/2018","Marriage + - DOM"]]"
Why does string interpolation remove the quotes from the JSON string, and how can I resolve this?
I did find one solution to the problem, which was manually escaping the string:
json_str = _cases.to_json.gsub('"','\"')
path += " #{USER} #{PASS} \"#{json_str}\""
So basically I escape the double quotes generated in the to_json call. Then I manually add two escaped quotes around the interpolated variable. This will produce a desired result:
node superuser 123456 "[[\"FMCE\",\"Wiltone\",\"Wiltone\",\"04/10/2018\",\"Marriage + - DOM\"]]"
Notice how the outer quotes around the collection are not escaped, but the strings inside the collection are escaped. That will enable JavaScript to parse it with JSON.parse.
It is important to note that in this part:
json_str = _cases.to_json.gsub('"','\"')
it is adding a LITERAL backslash, not an escape sequence.
But in this part:
path += " #{USER} #{PASS} \"#{json_str}\""
The \" wrapping the interpolated variable is an escape sequence and NOT a literal backslash.
Why do you think the first and last quote marks are part of the string? They do not belong to the JSON format. Your program’s behavior looks correct to me.
(Or more precisely, your program seems to be doing exactly what you told it to. Whether your instructions are any good is a question I can’t answer without more context.)
It's hard to tell with the small sample, but it looks like you might be getting quotes from your debugger output. Assuming the output of .to_json is a string (it usually is), then "#{json_str}" should be exactly equal to json_str. If it isn't, that's a bug in Ruby somehow (doubtful).
If you need the quotes, you need to either add them manually or escape the string using whatever escape function is appropriate for your use case. You could use .to_json as your escape function even ("#{json_str.to_json}", for example).
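A sketch of that last suggestion, assuming _cases is a nested array like the one in the question:
require 'json'
_cases = [["FMCE", "Wiltone", "Wiltone", "04/10/2018", "Marriage + - DOM"]]
json_str = _cases.to_json
# Calling to_json on the string wraps it in quotes and escapes the embedded ones:
path = "node superuser 123456 #{json_str.to_json}"
puts path
# node superuser 123456 "[[\"FMCE\",\"Wiltone\",\"Wiltone\",\"04/10/2018\",\"Marriage + - DOM\"]]"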

TCL: file normalize gives unexpected output

I have the following line of code:
file normalize [string map {\\ /} $file]
The string map operation is there to make the line work for paths containing backslashes instead of forward slashes (as is the case on Windows).
For some values of $file (let's say it's "/path/to/my/file") I get output similar to:
/path/to/"/path/to/my/file/"
This doesn't happen for all paths but I'm unable to figure out what causes it. There are no symbolic links in the path.
Is there something I'm doing wrong, or is there an alternative to file normalize that I could try?
My Tcl version is 8.5.
UPDATE:
On further investigation I see that the string map is not making any difference. The output of file normalize itself is coming with that extra text before the desired text. Also, the extra text seems to be from a previous run of the code.
UPDATE 2: It was because of the quotation marks in the input to file normalize
Most likely the path has quotation marks in it where it shouldn't have them.
% file normalize {"/path/to/some/file"}
/path/to/"/path/to/some/file"
% file normalize \"/path/to/some/file\"
/path/to/"/path/to/some/file"
Perhaps some pathname handling code escaped special characters for some reason and left the path in a mangled state.
I would try to keep the pathname pristine and when it needs to be changed for display or other processing, make a copy of it first.
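A minimal sketch of that idea, assuming the stray quotation marks arrive embedded in the value of $file itself:
# Strip surrounding quote characters before normalizing.
set clean [string trim $file \"]
file normalize [string map {\\ /} $clean]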

Output of array as comma separated BASH

I'm trying to pull variables from an API in JSON format and then put them back together, with one variable changed, and fire them back as a PUT.
The only issue is that every value has quote marks around it and they must go back to the API separated by commas only.
An example of what it should see, with redacted information and the variables inside the **'s:
curl -skv -u redacted:redacted -H "Content-Type: application/json" -X PUT -d '{properties:{basic:{request_rules:[**"/(req) testrule","/test-body","/(req) test - Admin","test-Caching"**]}}}' https://x.x.x.x:9070/api/tm/1.0/config/active/vservers/xxx-xx
Obviously, if I fire them as a plain array I get spaces instead of commas. However, I tried outputting it as a plain string:
longstr=$(echo ${valuez[@]})
output=$(echo $longstr | sed -e 's/" /",/g')
Due to the way Bash interprets it, it seems to either handle the quotes wrong or something else. I guess it might well be the single quotes after the PUT -d as well, but I'm not sure how I can get a variable into something that is wrapped in single quotes.
If I put the raw data in manually it works, so it's either the way the variable is being sent or the single quotes. I don't get an error, and when I echo the line out it looks perfect.
Any ideas?
valuez=( "/(req) testrule" "/test-body" "/(req) test - Admin" "test-Caching" )
# Temporarily set IFS to some character which is known not to appear in the array.
oifs=$IFS
IFS=$'\014'
# Flatten the array with the * expansion giving a string containing the array's elements separated by the first character of $IFS.
d_arg="${valuez[*]}"
IFS=$oifs
# If necessary, quote or escape embedded quotation marks. (Implementation-specific, using doubled double quotes as an example.)
d_arg="${d_arg//\"/\"\"}"
# Substitute the known-to-be-absent character for the desired quote+separator+quote.
d_arg="${d_arg//$'\014'/\",\"}"
# Prepend and append quotes.
d_arg="\"$d_arg\""
# insert the prepared arg into the final string.
d_arg="{properties:{basic:{request_rules:[${d_arg}]}}}"
curl ... -d"$d_arg" ...
If you have GNU awk version 4 or above, which supports FPAT:
output=$(echo "$longstr" | awk '$1=$1' FPAT="(\"[^\"]+\")" OFS=",")
Explanation
FPAT
This is a regular expression (as a string) that tells gawk to create the fields based on text that matches the regular expression. Assigning a value to FPAT overrides the use of FS and FIELDWIDTHS for field splitting. See Splitting By Content, for more information.
If gawk is in compatibility mode (see Options), then FPAT has no special meaning, and field-splitting operations occur based exclusively on the value of FS.
valuez=( "/(req) testrule" "/test-body" "/(req) test - Admin" "test-Caching" )
csv="" sep=""
for v in "${valuez[@]}"; do csv+="$sep\"$v\""; sep=,; done
echo "$csv"
"/(req) testrule","/test-body","/(req) test - Admin","test-Caching"
If it's something you need to do repeatedly, put it into a function:
toCSV () {
    local csv sep val
    for val; do
        csv+="$sep\"$val\""
        sep=,
    done
    echo "$csv"
}
csv=$(toCSV "${valuez[@]}")
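To close the loop with the original curl call, a sketch assuming the same redacted endpoint as in the question (note that the body's keys are quoted here too, as JSON requires):
csv=$(toCSV "${valuez[@]}")
curl -skv -u redacted:redacted -H "Content-Type: application/json" \
  -X PUT -d "{\"properties\":{\"basic\":{\"request_rules\":[$csv]}}}" \
  https://x.x.x.x:9070/api/tm/1.0/config/active/vservers/xxx-xx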