Azure DevOps pipeline: PowerShell split string, convert to JSON and assign to pipeline variable doesn't work

I want to convert a pipeline variable - a delimited string - to a JSON array and assign the JSON array to another pipeline variable. See my code below; the output stays empty. What am I missing here?
script:
steps:
- task: PowerShell@2
  inputs:
    targetType: inline
    script: |
      $test = "LZ-UK;LZ-ES;LZ-NL"
      $json = $test.Split(";") | ConvertTo-Json -AsArray
      Write-Host "##vso[task.setvariable variable=JsonLZ]$json"
      Write-Host "Assigned the variable"
      Write-Host "New `r`n $($JsonLZ)"
- script: |
    echo ${{ variables.JsonLZ }}
output:
Starting: PowerShell
==============================================================================
Task : PowerShell
Description : Run a PowerShell script on Linux, macOS, or Windows
Version : 2.200.0
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/utility/powershell
==============================================================================
Generating script.
========================== Starting Command Output ===========================
/usr/bin/pwsh -NoLogo -NoProfile -NonInteractive -Command . '/home/vsts/work/_temp/380b437f-74c4-4883-9d4a-7b4f3ac79266.ps1'
"LZ-UK",
"LZ-ES",
"LZ-NL"
]
Assigned the variable
New
Finishing: PowerShell

You're very close. There were a few minor issues that I spotted with your YAML/PowerShell:
You forgot the semicolon after the variable name in "##vso[task.setvariable variable=JsonLZ]$json", it should be: "##vso[task.setvariable variable=JsonLZ;]$json"
You should be using $(JsonLZ) instead of ${{ variables.JsonLZ }}. The former will be evaluated at runtime, the latter at compile-time. Here's a link to the MS Docs: Understand variable syntax
Give this a try to see a working example:
name: Stackoverflow-Example-Pipeline
trigger:
- none

variables:
  JsonLZ: 'UNSET'

stages:
- stage: StageA
  displayName: "Stage A"
  jobs:
  - job: example_job
    displayName: "Example Job"
    pool:
      vmImage: "ubuntu-latest"
    steps:
    - task: PowerShell@2
      inputs:
        targetType: inline
        script: |
          $test = "LZ-UK;LZ-ES;LZ-NL"
          $json = $test.Split(";") | ConvertTo-Json -Compress
          Write-Host "##vso[task.setvariable variable=JsonLZ;]$json"
          Write-Host "Assigned the variable"
    - script: |
        echo $(JsonLZ)
        echo ${{ variables.JsonLZ }}
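For reference, a quick local sketch of what the compressed output looks like (assuming PowerShell 7, the same host the pipeline's pwsh task uses); a single-line value matters because the setvariable logging command only picks up the first line it sees:

$test = "LZ-UK;LZ-ES;LZ-NL"
$test.Split(";") | ConvertTo-Json -Compress
# => ["LZ-UK","LZ-ES","LZ-NL"]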

Related

How to set a JSON file as a variable in Azure Pipeline and use it in subsequent task?

I have a myJson.json that looks like this:
{
  "FirewallGroupsToEnable": [
    "Remote Event Log Management",
    "Windows Remote Management",
    "Performance Logs and Alerts",
    "File and Printer Sharing",
    "Windows Management Instrumentation (WMI)"
  ],
  "MemoryStartupBytes": "3GB"
}
I'd like to serialize it as a string and then set it as a variable to be used by other tasks. If there is a better way to use this file inside a pipeline please let me know.
I'm serializing it and setting it like this:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $Configs = Get-Content -Path $(Build.SourcesDirectory)\sources\myJson.json -Raw | ConvertFrom-Json
      Write-Host "##vso[task.setvariable variable=Configs]$Configs"
In the following task, I am running a PowerShell script.
- task: PowerShell@2
  displayName: 'myTask'
  inputs:
    targetType: 'filePath'
    filePath: 'sources\myScript.ps1'
    pwsh: true
I'm using the variable in my script like this:
$env:Configs
[Configs]$envConfigs = ConvertFrom-Json -InputObject $env:Configs -ErrorAction Stop
Configs is a class that is imported at the top of the script with Using module .\Configs.psm1. I know it's being loaded, because if it weren't, the error would be about a missing type.
Configs.psm1
class Configs
{
    [string[]]$FirewallGroupsToEnable
    [string]$MemoryStartupBytes
}
This is what I get in the pipeline.
##[debug]Processed: ##vso[task.setvariable variable=Configs]#{FirewallGroupsToEnable=System.Object[]; MemoryStartupBytes=3GB}
#{FirewallGroupsToEnable=System.Object[]; MemoryStartupBytes=3GB}
Cannot convert the "#{FirewallGroupsToEnable=System.Object[]; MemoryStartupBytes=3GB}" value of type "System.String" to type "Configs".
I've always cast deserialized JSON into custom types like this and it has always worked, but right now something is wrong!
I tried removing ConvertFrom-Json when serializing the JSON (before setting it as a variable), but it doesn't serialize it correctly either. In the pipeline it looks like only the first curly brace gets through!
So, how do I serialize JSON of arbitrary depth into a pipeline variable so that it can be used by a script file in later tasks?
You could pass the JSON data via a file. Here is a YAML snippet that passes the full JSON content successfully to the next task.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $Configs = Get-Content -Path $(Build.SourcesDirectory)\wit.json
      $Configs | Out-File Test.json
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # use a dedicated name here; $Input is an automatic variable in PowerShell
      $jsonContent = Get-Content Test.json
      Write-Host $jsonContent
Here is a similar ticket about passing a JSON variable, for reference.
As it turns out, you can't Write-Host multi-line variables, even if they are strings, because of PowerShell formatting. Hence, I had to ConvertFrom-Json and then ConvertTo-Json -Compress to get my string onto one line.
$Configs= Get-Content -Path $(Build.SourcesDirectory)\sources\myJson.json -Raw | ConvertFrom-Json | ConvertTo-Json -Compress
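Putting it together, a minimal sketch of the round trip under the question's setup (the file path, the Configs class, and reading the value back through $env:Configs are taken from the question; add -Depth to ConvertTo-Json if your JSON is nested more deeply):

# Task 1 (inline): compress to a single line, then set the pipeline variable
$Configs = Get-Content -Path $(Build.SourcesDirectory)\sources\myJson.json -Raw |
    ConvertFrom-Json | ConvertTo-Json -Compress
Write-Host "##vso[task.setvariable variable=Configs]$Configs"

# Task 2 (myScript.ps1): the single-line string deserializes cleanly again
[Configs]$envConfigs = ConvertFrom-Json -InputObject $env:Configs -ErrorAction Stop
$envConfigs.FirewallGroupsToEnable    # the string array
$envConfigs.MemoryStartupBytes        # 3GB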

GitHub Actions ##vso[task.setvariable] equivalent

In Azure DevOps, I can set a pipeline variable at runtime by echoing:
##vso[task.setvariable variable=name]value
How can I do the same thing in GitHub workflows?
I'm not making a custom action, so I don't think outputs are relevant; I just want to pass a variable from one step to another. However, I might be missing something.
The following will set a value as an environment variable named environment_variable_name:
echo "{environment_variable_name}={value}" >> $GITHUB_ENV
An example of how you would use this:
steps:
- name: Set the value
  id: step_one
  run: |
    echo "action_state=yellow" >> $GITHUB_ENV
- name: Use the value
  id: step_two
  run: |
    echo "${{ env.action_state }}" # This will output 'yellow'
More on this can be found here
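If the step runs PowerShell (shell: pwsh) instead of bash, the documented equivalent is to append a name=value line to the file that GITHUB_ENV points to; a minimal sketch:

# Append "name=value" to the environment file GitHub exposes via GITHUB_ENV
"action_state=yellow" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
# A later step can then read it as $env:action_state (or ${{ env.action_state }} in YAML)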

Azure DevOps PowerShell Json to text file - Unexpected token ':' in expression or statement

I have an Azure DevOps pipeline in which I'd like to write the content of a variable that holds JSON to a text file.
Here are two tasks from the pipeline:
- task: CmdLine@2
  displayName: 'echo swagger content'
  inputs:
    script: |
      echo "print value of swaggerContent output variable set in get-swagger-from-azure.ps1"
      echo $(swaggerContent)
- task: PowerShell@2
  displayName: 'write swagger content to file'
  inputs:
    targetType: 'inline'
    script: '$(swaggerContent) | ConvertTo-Json | Out-File "$(Pipeline.Workspace)/swagger-content.json"'
The CmdLine task works OK and outputs the JSON.
However, the PowerShell task gives the following error:
At D:\a_temp\05c70744-c4cc-4322-99a0-98f55e41fbba.ps1:7 char:1
} else {
~ Unexpected token '}' in expression or statement.
CategoryInfo : ParserError: (:) [], ParseException
FullyQualifiedErrorId : UnexpectedToken
Anyone see what I'm doing wrong?
$(swaggerContent) is an Azure Pipelines variable, not a PowerShell variable. It's just a placeholder that gets replaced with the JSON text before the script runs.
So in the line
$(swaggerContent) | ConvertTo-Json | Out-File "$(Pipeline.Workspace)/swagger-content.json"
Think of what happens if you just replace $(swaggerContent) with some JSON. You get something like
{ "foo": "bar" } | ConvertTo-Json | Out-File "$(Pipeline.Workspace)/swagger-content.json"
Note that the JSON is completely unescaped. It's not a string, it's just random text inserted in the middle of the script.
Azure Pipelines treats non-secret variables as environment variables when running scripts, so you can try something along the lines of:
$env:swaggerContent | Out-File "$(Pipeline.Workspace)/swagger-content.json"
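To see why the macro version falls over, you can reproduce the difference locally (a rough sketch; the JSON value here is made up, and in a real run the agent sets the environment variable for you):

# What macro expansion effectively generates -- raw JSON pasted into the script,
# which PowerShell then tries to parse as code:
#   { "swagger": "2.0", ... } | ConvertTo-Json | Out-File "swagger-content.json"

# The environment-variable version keeps the JSON as a plain string:
$env:swaggerContent = '{ "foo": "bar" }'      # simulated locally; the agent sets this in the pipeline
$env:swaggerContent | Out-File "swagger-content.json"
Get-Content "swagger-content.json"            # => { "foo": "bar" }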

Use Json variable between two PowerShell steps

In an Azure DevOps pipeline I need to pass a JSON variable from a PowerShell script in step 1 to another PowerShell script in step 2. The double quotes in the JSON variable seem to be messing things up; escaping them also does not seem to work.
Here are the two steps:
- task: PowerShell@2
  displayName: 'Debug -> Step 1'
  inputs:
    targetType: 'inline'
    script: |
      $json = [pscustomobject]@{ prop1 = "value1"; prop2 = "value2" } | ConvertTo-Json
      Write-Host "##vso[task.setvariable variable=MYVAR]$json"
- task: PowerShell@2
  displayName: 'Debug -> Step 2'
  inputs:
    targetType: 'inline'
    script: |
      echo $env:MYVAR
This results in:
Any idea how I can pass an object (in Json) to another step?
The logging command ##vso[task.setvariable] can only accept a single-line string. You need to use -Compress to convert the JSON object to a single-line string. See the example below:
$json = [pscustomobject]@{ prop1 = "value1"; prop2 = "value2" } | ConvertTo-Json -Compress
Write-Host "##vso[task.setvariable variable=MYVAR]$json"

Azure Devops - Output Json Object from one task and Consume in another task

Say I have a release pipeline in Azure DevOps written in YAML which has two tasks: one for reading JSON from a file, and a second one for setting a key in a different JSON file using the JSON read by the first task. I have the following pipeline.yml:
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: PowerShell@2
  name: ReadMetadataJson
  inputs:
    filePath: 'MetadataReader.ps1'
    arguments: 'Run MetadataReader.ps1 -pathToMetadata metadata.json'
- task: PowerShell@2
  name: SetAppSetting
  inputs:
    filePath: 'AppSettingSetter.ps1'
    arguments: 'Run AppSettingSetter.ps1 -pathToAppSetting SomeApp/Data.json -appSettingKey testkey -appSettingValue $($(ReadMetadataJson)).testkey'
- script: echo $(ReadMetadataJson.metadata)
Below are the PowerShell scripts called from each task:
PowerShell 1
# Read from Metadata.json
param ($pathToMetadata)
echo $pathToMetadata
$metadata = Get-Content $pathToMetadata | Out-String | ConvertFrom-Json
Write-Output "Metadata Json from metadata reader ps script - $metadata"
echo "##vso[task.setvariable variable=metadata;]$metadata"
PowerShell 2
# For now just accept the values as parameters and log them
param ($pathToAppSetting, $appSettingKey, $appSettingValue)
echo "pathToAppSetting : $pathToAppSetting"
echo "appSettingKey : $appSettingKey"
echo "appSettingValue : $appSettingValue"
# Code to set in another file. I have this working, so omitting for brevity
And these are the JSON files:
Metadata.json
{
  "testkey": "TestValueFromMetadata",
  "testkey1": "TestValueFromMetadata1"
}
appSetting.json
{
  "testkey": "TestValueInAppSetting",
  "testkey1": "TestValueInAppSetting1"
}
The problem is when I want to return the JSON data as output from the first task and use it in the second task as the parameter to the second PowerShell script. Below is the pipeline result after I run it.
As can be seen, it says ReadMetadataJson.metadata: command not found. I have been following the Microsoft documentation as a reference and have searched for other articles, but all I could find was handling simple values like strings or integers, not a JSON object. What is it that I am missing or doing wrong?
You can convert your JSON object to a string (ConvertTo-Json) and pass it as a variable to the second script.
Then, in the second script, you just parse the string back into a JSON object using ConvertFrom-Json.
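A rough sketch of that approach, reusing the names from the question and assuming the second script reads the variable back from its environment rather than from the task arguments:

# MetadataReader.ps1 -- compress to a single line before setting the variable
$metadata = Get-Content $pathToMetadata -Raw | ConvertFrom-Json | ConvertTo-Json -Compress
Write-Host "##vso[task.setvariable variable=metadata]$metadata"

# AppSettingSetter.ps1 -- parse the string back into an object
$metadataObject = $env:METADATA | ConvertFrom-Json
$metadataObject.testkey    # => TestValueFromMetadata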
Apart from the method Hugo mentioned above, there is another solution that achieves what you want without any additional step.
Just add one line to your MetadataReader.ps1:
param ($pathToMetadata)
echo $pathToMetadata
$metadata = Get-Content $pathToMetadata | Out-String | ConvertFrom-Json
$metadata | Get-Member -MemberType NoteProperty | % { $o = $metadata.($_.Name); Write-Host "##vso[task.setvariable variable=$($_.Name);isOutput=true]$o" }
Then it will parse those JSON properties into corresponding output variables once the JSON file content has been read.
(I make use of the working logic of Terraform outputs here.)
You can then directly use {reference name}.{object name} to reference the corresponding JSON value:
- task: PowerShell@2
  name: ReadMetadataJson
  inputs:
    filePath: 'MetadataReader.ps1'
    arguments: 'Run MetadataReader.ps1 -pathToMetadata metadata.json'
- task: PowerShell@2
  name: SetAppSetting
  inputs:
    filePath: 'AppSettingSetter.ps1'
    arguments: 'Run AppSettingSetter.ps1 -pathToAppSetting Data.json -appSettingKey testkey -appSettingValue $(ReadMetadataJson.testkey)'
- script: echo $(ReadMetadataJson.testkey)
Note: I made changes here: -appSettingValue $(ReadMetadataJson.testkey)
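For readability, the single line added to MetadataReader.ps1 can also be written long-hand; this sketch does the same thing, just expanded:

# Emit one output variable per top-level property of the JSON object
$metadata | Get-Member -MemberType NoteProperty | ForEach-Object {
    $name  = $_.Name
    $value = $metadata.$name
    Write-Host "##vso[task.setvariable variable=$name;isOutput=true]$value"
}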