I'm not sure how I'm supposed to input a list of IP addresses as a string array in Azure Automation. I get "invalid JSON primitive: 10.10.3.0" when I use JSON format:
['10.10.3.0/24', '10.10.4.0/24']
Am I supposed to escape the forward slash?
[string] $backendAddressPoolName = "backendPool",
[string[]] $backendIPAddresses,
That's it. These are the parameters to my runbook. Azure Automation won't accept
["10.10.3.0/24", "10.10.4.0/24"]
As an input for backendIPAddresses
param(
[string[]]$backendIPAddresses
)
$backendipaddresses | % { "input: $_" }
Input this exactly: ["str1","str2"] - it has to be valid JSON.
JSON validator: https://jsonlint.com/
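For anyone wondering why the single-quoted form from the question is rejected: JSON only allows double-quoted strings, and forward slashes need no escaping. A quick sketch in Python (any JSON parser behaves the same way):

```python
import json

# Single-quoted strings are not valid JSON, which is why the
# first form from the question is rejected.
try:
    json.loads("['10.10.3.0/24', '10.10.4.0/24']")
except json.JSONDecodeError:
    print("single quotes: rejected")

# Double-quoted strings parse fine; the forward slashes need no escaping.
print(json.loads('["10.10.3.0/24", "10.10.4.0/24"]'))  # ['10.10.3.0/24', '10.10.4.0/24']
```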
I have a PowerShell-based HTTP-trigger Azure Function that uses the Az module to call Get-AzVM. It gets its data from a Logic App POST. The output looks correct and its type shows as string, but the cmdlet does not like the variable. The function looks like this:
Write-Host "PowerShell HTTP trigger function processed a request."
$sub = $Request.Body.subject | Out-String
write-host "sub:" $sub
$split = $sub -split "[/]"
write-host "split:" $split
$avd = $split[8] | Out-string
Write-Host "avd" $avd
$rgName_avd = 'rg-azgroup'
Get-AzVM -Name $avd -ResourceGroupName $rgName_avd
The post input looks like:
/subscriptions/xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx/resourcegroups/rg-azgroup/providers/Microsoft.Compute/virtualMachines/myvmname-0
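As a sanity check of the indexing, here is a Python sketch splitting the same resource ID (the subscription ID is a placeholder). Note that Out-String appends a trailing newline, which may be why the exact-name lookup fails:

```python
# Split the resource ID the same way the function does; index 8 is
# the VM name. The subscription ID here is a placeholder.
resource_id = ("/subscriptions/00000000-0000-0000-0000-000000000000"
               "/resourcegroups/rg-azgroup/providers/Microsoft.Compute"
               "/virtualMachines/myvmname-0")
parts = resource_id.split("/")
print(parts[8])  # myvmname-0

# Out-String appends a trailing newline; a value like "myvmname-0\n"
# no longer matches the VM name exactly, so trimming is a safe guard.
print((parts[8] + "\n").strip())  # myvmname-0
```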
The error in the logs starts as:
[Error] ERROR: Unexpected character encountered while parsing value: <. Path '', line 0, position 0.
Exception :
    Type : Newtonsoft.Json.JsonReaderException
    TargetSite :
        Name : ParseValue
        DeclaringType : Newtonsoft.Json.JsonTextReader
        MemberType
It feels like it's an issue with the input type, but using ConvertFrom-Json does not seem to work. Any ideas what I'm doing wrong? If I hard-code the VM name or define a variable locally and use it, the command executes.
The Az module versions being used in requirements.psd1:
@{
'Az.Accounts' = '2.7.6'
'Az.Compute' = '4.26.0'
}
Here are a few workarounds you can try:
Solution 1:
You might not be passing JSON to DeserializeObject.
It looks like tmpfile in File.WriteAllText(tmpfile, ...) is a string containing a file path. JsonConvert.DeserializeObject takes a JSON value rather than a file path, so it fails trying to parse something like @"c:\temp\fooo", which is not JSON.
Solution 2:
Check whether the file containing the JSON string has a BOM (byte order mark).
Once you remove the BOM, the problem may be resolved.
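To illustrate what a BOM does to a JSON parser, here is a small Python sketch (the file content is simulated with an in-memory string):

```python
import json

# Simulate a JSON file saved with a UTF-8 BOM: the BOM character
# makes json.loads reject otherwise valid JSON.
text_with_bom = "\ufeff" + '{"name": "value"}'
try:
    json.loads(text_with_bom)
except json.JSONDecodeError:
    print("rejected: BOM present")

# Stripping the BOM (or reading the file with encoding='utf-8-sig')
# makes the same content parse fine.
print(json.loads(text_with_bom.lstrip("\ufeff")))  # {'name': 'value'}
```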
Solution 3:
This can also happen for a Web API action that binds to a string instead of an object or a JObject: the JSON is correct, but the model binder tries to obtain a string from the JSON structure and fails.
So, instead of:
[HttpPost("[action]")]
public object Search([FromBody] string data)
Try this:
[HttpPost("[action]")]
public object Search([FromBody] JObject data)
References:
Unexpected character encountered while parsing value
Unexpected character encountered while parsing value ASP.NET Core and Newtonsoft
I have a WebHook activity in Azure Data Factory pipeline but I'm not able to pass variables here.
@json('{
"body": "@{pipeline().parameters.body}",
"name": "@{variables('name')}"
}')
There is a problem with '. I've tried with \'name\' but it does not work.
The body, representing the payload to be sent to the endpoint, must be valid JSON or an expression that results in a value of type JSON. So in the WebHook activity you can just pass the JSON string rather than using the json() function again.
Check out this example:
Use any string variable:
Using the variable and parameter in JSON string:
{
"var":"#{variables('variable')}",
"param":"#{pipeline().parameters.parameter}",
"age":"23"
}
By string interpolation, the value of the variable is replaced in place.
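To make the substitution concrete, here is a conceptual Python sketch; the expression values are placeholders, not actual ADF output:

```python
import json

# Conceptual sketch of ADF string interpolation: each @{...}
# expression is replaced in place before the body is sent.
# The substitution values below are placeholders.
template = (
    '{ "var": "@{variables(\'variable\')}",'
    ' "param": "@{pipeline().parameters.parameter}",'
    ' "age": "23" }'
)
substitutions = {
    "@{variables('variable')}": "some-value",
    "@{pipeline().parameters.parameter}": "some-param",
}
body = template
for expr, value in substitutions.items():
    body = body.replace(expr, value)

# After substitution the body is valid JSON:
print(json.loads(body))  # {'var': 'some-value', 'param': 'some-param', 'age': '23'}
```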
I have a Clojure function that writes out the contents of a vector of URLs to JSON, and returns this from an API endpoint. This endpoint data is read by an Elm JSON Decoder on a frontend app. Running this Clojure function directly gives me the following:
(json/write-str ["http://www.example.com?param=value" "http://www.example2.com?param=value"])
Which returns:
"[\"http:\\/\\/www.example.com?param=value\",\"http:\\/\\/www.example2.com?param=value\"]"
Wonderful! If I input this result directly into an Elm Decoder such as this:
stringArrayDecoder : Json.Decoder.Decoder (List String)
stringArrayDecoder =
Json.Decoder.list Json.Decoder.string
It parses it happily with no error.
However, when I view the JSON response from the endpoint, it is losing some of the escaping and I get this:
["http:\/\/www.example.com?param=value","http:\/\/www.example2.com?param=value"]
Which my Elm decoder cannot read.
How do I avoid this? How can I get the fully escaped JSON values produced by my internal function through the API endpoint and into my Elm frontend Decoder?
JSON allows you to escape forward slashes (/) and other characters, to prevent sequences like </ from popping up in HTML.
write-str has an :escape-slash boolean option:
:escape-slash boolean
If true (default) the slash / is escaped as \/
Thus you can instead write
(json/write-str ["http://url.one" "http://url.two"] :escape-slash false)
=> "[\"http://url.one\",\"http://url.two\"]"
Using PowerShell in an Azure DevOps release pipeline, I am trying to convert parameters into JSON format before posting to an ASP.NET Core endpoint. One of the parameters is a datetime.
I am getting the following error for the DateTime type:
"errors":{"$.CreatedDate":["The JSON value could not be converted to System.Nullable`1[System.DateTime].
Here is the Powershell script. The variable $(RELEASE.DEPLOYMENT.STARTTIME) is a DevOps variable and outputs the date format 2020-06-15 10:00:46Z
$params = @{
Name = "Test"
CreatedDate = $(RELEASE.DEPLOYMENT.STARTTIME)
}
Invoke-WebRequest -Uri https://mynetcoreendpoint -Method POST -Body ($params | ConvertTo-Json) -ContentType "application/json" -UseBasicParsing
The json is evaluated at the endpoint side. Here is the NetCore endpoint
[HttpPost]
public ActionResult<ReleaseDTO> CreateRelease(ReleaseDTO release)
{
// Do some stuff
}
// Where ReleaseDTO has the property
public DateTime? CreatedDate { get; set; }
Make sure that the hashtable that is the basis for the to-JSON conversion contains a [datetime] instance, not just a string:
# !! This may work in PowerShell [Core, v6+] only, not in Windows PowerShell.
$params = @{
Name = "Test"
# Note the cast to [datetime] and the need to enclose the
# Azure macro in quotes.
CreatedDate = [datetime] '$(RELEASE.DEPLOYMENT.STARTTIME)'
}
That way, the JSON representation of the timestamp should deserialize properly in the endpoint's C#-based code - assuming that both the serializing code and the deserializing code use the same string-based convention for representing [datetime] instances[1]
If you're using Windows PowerShell (versions up to v5.1) and the web-service end-point uses .NET Core / Json.NET, the conventions are mismatched, so you need to create the string representation of the timestamp as required by the web-service endpoint manually:
$params = @{
Name = "Test"
# Note the cast to [datetime] and the need to enclose the
# Azure macro in quotes.
CreatedDate = ([datetime] '$(RELEASE.DEPLOYMENT.STARTTIME)').ToString('o')
}
Read on for an explanation.
Unfortunately, there are different conventions in the .NET world with respect to how timestamps are represented in JSON:
On the PowerShell side (verify with @{ dt = [datetime]::now } | ConvertTo-Json):
Windows PowerShell: ConvertTo-Json uses the convention of "\/Date(<epochTimeMs>)\/", where <epochTimeMs> is a Unix epoch timestamp in milliseconds.
Example: @{ dt = [datetime]::now } | ConvertTo-Json -Compress yields
{"dt":"\/Date(1592309341640)\/"}
PowerShell [Core, v6+], as of version 7.0: ConvertTo-Json uses Json.NET behind the scenes, which uses the ISO 8601-compatible standard roundtrip date/time string-formatting pattern ("o") that you can pass to the .ToString() method of a [datetime] instance; e.g., [datetime]::now.ToString('o') yields something like:
"2020-06-15T11:54:06.114098-04:00"
Example: @{ dt = [datetime]::now } | ConvertTo-Json -Compress
yields {"dt":"2020-06-16T08:07:50.356321-04:00"}
On the C# (.NET) side:
It seems that at least at some point in the ASP.NET world the old Windows PowerShell convention was used.
Json.NET as well as the new .NET-native System.Text.Json types use the new convention as also used in PowerShell [Core].
[1] Note that the JSON standard does not define a value type for date/time instances, so it is ultimately up to a given implementation to use string JSON values to represent timestamps, as a convention; alternatively, a numeric representation (such as ticks) is also an option, but only a string representation allows you to infer the intended data type (in the absence of schema information).
It is therefore important that both the JSON serializer as well as the JSON deserializer adhere to the same convention in order to pass timestamps properly.
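To make the two conventions concrete, here is a Python sketch converting the legacy "\/Date(&lt;epochTimeMs&gt;)\/" form (using the millisecond value from the example above) into an ISO 8601 string:

```python
import re
from datetime import datetime, timedelta, timezone

# Convert the legacy "\/Date(<epochTimeMs>)\/" representation into an
# ISO 8601 string. The millisecond value is the one from the example.
legacy = r"\/Date(1592309341640)\/"
ms = int(re.search(r"Date\((\d+)\)", legacy).group(1))

# Integer epoch math avoids floating-point rounding of the milliseconds.
dt = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(milliseconds=ms)
print(dt.isoformat())  # 2020-06-16T12:09:01.640000+00:00
```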
I am trying to pass a parameter with a space to the aws cloudformation create-stack AWS CLI command.
The issue is that my parameter value contains a space. I am using PowerShell for scripting.
Below is example of my parameter
$JsonParameter = '[{"ParameterKey":"name","ParameterValue":"John"},{"ParameterKey":"Occupation","ParameterValue":"Test Engineer"}]'| ConvertTo-Json
This returns
"[{\"ParameterKey\":\"name\",\"ParameterValue\":\"John\"},{\"ParameterKey\":\"Occupation\",\"ParameterValue\":\"Test Engineer\"}]"
cli command is
aws cloudformation create-stack --stack-name $stackName --template-url $templateUrl --capabilities $capabilityList --parameters $JsonParameter --region "us-east-1"
The error goes
Error parsing parameter '--parameters': Invalid JSON:
[{"ParameterKey":"name","ParameterValue":"John"},{"ParameterKey":"Occupation","ParameterValue":"Test
From the error, it looks like the CLI doesn't like the space in the ParameterValue.
How do I escape the space so that the CLI doesn't complain about the space in the value?
Remove | ConvertTo-Json.
Your string is already a JSON string, so you do not want to perform a conversion.
$JsonParameter = '[{"ParameterKey":"name","ParameterValue":"John"},{"ParameterKey":"Occupation","ParameterValue":"Test Engineer"}]'
Just use the string as is.
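The double encoding that ConvertTo-Json caused can be reproduced in a few lines of Python (json.dumps plays the role of ConvertTo-Json here):

```python
import json

# Serializing a string that is already JSON wraps it in a second
# layer of quoting, which is what ConvertTo-Json did above.
already_json = '[{"ParameterKey":"name","ParameterValue":"John"}]'
double_encoded = json.dumps(already_json)
print(double_encoded)  # "[{\"ParameterKey\":\"name\",\"ParameterValue\":\"John\"}]"

# The consumer then sees a JSON *string*, not a JSON *array*:
print(type(json.loads(double_encoded)).__name__)  # str
print(type(json.loads(already_json)).__name__)    # list
```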
Alternate scenario
Should you be working with a PowerShell object rather than a JSON string, you might at some point want to convert it to JSON to pass as a parameter to your aws call.
That's the moment where ConvertTo-Json would prove useful.
Take this for instance
$JsonParameter = @(
    @{
        ParameterKey = 'name'
        ParameterValue = 'John'
    },
    @{
        ParameterKey = 'Occupation'
        ParameterValue = 'Test Engineer'
    }
)
This is a PowerShell object which you might have, in a different context, built from scratch with the intent of passing it as a JSON parameter to your aws call.
Now, to get from this "array of hashtables" to a valid JSON string, you need the ConvertTo-Json cmdlet.
$JsonParameterString = $JsonParameter | Convertto-json -Compress
The resulting string is the same as the one you had initially, ready to be passed down to aws:
[{"ParameterKey":"name","ParameterValue":"John"},{"ParameterKey":"Occupation","ParameterValue":"Test Engineer"}]
If on the other hand, you had a json string and needed to edit it without fuss, you could use the ConvertFrom-Json cmdlet, then edit the resulting object as needed and convert it back to json again before passing it down.
Additional note
In my PowerShell-to-JSON example, I used the -Compress switch parameter. This is optional; it creates a compressed (single-line) JSON string instead of an expanded one.
Reference
Powershell doc - ConvertTo-Json