I would like to set a property in a component process so that it is available from that point on: in all subsequent steps, in the rest of the current process, and in all other processes called from there.
So, in a component process, I'm using the Deploy Process plugin to set a property value in the scope of the parent request.
Here's the illustration:
Deploy Request
  Application Process: AppProcess1
    Install Component
      component name: Comp1
      component process: Comp1-Proc1
        Step 1: Set Process Request Property
          name: PROP_1
          value: val1
          process request id: ${p:parentRequest.id}
        Step 2: Shell
          Shell Script: echo ${p:PROP_1} --> Output: <empty-string>
        Step 3: Run Component Process
          component process: Comp1-Proc2
            Step 1: Shell
              Shell Script: echo ${p:PROP_1} --> Output: val1
        Step 4: Shell
          Shell Script: echo ${p:PROP_1} --> Output: val1
The problem is that the value is not available via ${p:PROP_1} in the remaining steps of the current process (Comp1-Proc1). It only becomes readable once another component process (Comp1-Proc2) is called, where the value is available, and after returning from that process it is available in the first one as well.
Am I doing something wrong? Is this expected behavior?
I'm using an on-premise UrbanCode Deploy - version 7.0.2.3.ifix01.1022337.
I can't find anything in the official UCD documentation, nor in the plugin documentation, that explains the behavior above.
Try the Set Property step. With it you can extend the scope of the property you are setting. For example:
Application process 1
  Step 1 - shell

Deploy Request
  Application Process: AppProcess1
    Install Component
      component name: Comp1
      component process: Comp1-Proc1
        Shell 1 - (here use Set Environment Property)
    Install Component
      component name: Comp2
      component process: Comp2-Proc1
        Shell 1 - (here you can refer to it) ${p:YourEnvironmentName/YourPropertyname}
You just need to extend the scope of the property to a higher level.
Related
In Azure, I have a release pipeline with two stages. The first stage stores a value in a pipeline variable. I need to access that variable's value in the second stage's tasks. Currently, the value retrieved in the second stage is empty. It is, however, not empty when accessed within the same stage (by multiple tasks in the same stage).
I have checked out Microsoft's documentation and it seems to only show YAML.
How to access stageDependencies variables in multiple stages in Azure Release pipeline using bash
Yes, stage-to-stage dependencies are only available in YAML pipelines.
For a Classic pipeline, we need to pass the values along manually so that we can use them in the next stage.
We could use the REST API to update the variable in the Variables tab.
Steps:
1. Define a variable in the release definition's Variables tab.
2. Use the REST API (Definitions - Update) to update the value of the release definition variable in stage 1.
3. Use the updated value of the release definition variable in the second stage.
For details on using the REST API to update the value of a release definition variable, you can follow the ticket below:
How to modify Azure DevOps release definition variable from a release task?
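For illustration, here's a rough, non-production sketch of that approach from an inline Bash task in stage 1. The organization, project, definition id, and variable name are placeholders, the task has to map System.AccessToken into the SYSTEM_ACCESSTOKEN environment variable, jq must be available on the agent, and the api-version may need adjusting for your server:

# MyOrg, MyProject, the definition id (42), and MyVariable are placeholders.
ORG_URL="https://vsrm.dev.azure.com/MyOrg"
PROJECT="MyProject"
DEF_ID=42

# Fetch the full release definition, change the variable's value, and PUT it back.
definition=$(curl -s -H "Authorization: Bearer $SYSTEM_ACCESSTOKEN" \
  "$ORG_URL/$PROJECT/_apis/release/definitions/$DEF_ID?api-version=6.0")

updated=$(echo "$definition" | jq '.variables.MyVariable.value = "newValue"')

curl -s -X PUT \
  -H "Authorization: Bearer $SYSTEM_ACCESSTOKEN" \
  -H "Content-Type: application/json" \
  -d "$updated" \
  "$ORG_URL/$PROJECT/_apis/release/definitions?api-version=6.0"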
Alternatively, you could use the Azure CLI to update the variable:
az pipelines variable update --name
[--allow-override {false, true}]
[--detect {false, true}]
[--new-name]
[--org]
[--pipeline-id]
[--pipeline-name]
[--project]
[--prompt-value {false, true}]
[--secret {false, true}]
[--subscription]
[--value]
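For example, a hypothetical invocation following the synopsis above (the variable name, pipeline id, organization, and project are placeholders):

az pipelines variable update \
  --name MyVariable \
  --value "newValue" \
  --pipeline-id 123 \
  --org https://dev.azure.com/MyOrg \
  --project MyProject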
I have some Groovy code which contains "import groovy.json.JsonSlurper".
I have spent a day testing and I don't know how to load external libraries using declarative syntax.
This is my code:
pipeline {
    agent any

    import groovy.json.JsonSlurper

    stages {
        stage("test") {
            steps {
            }
        }
    }
}
I have read the Jenkins documentation, and I have tried the following, but without success:
#Grab('groovy.json.JsonSlurper')
import groovy.json.JsonSlurper
Neither import nor #Grab is recognized. Any ideas?
Thanks!
What @Daniel Majano says about the import syntax is true, but I found that the #Grab syntax behaves differently between a Pipeline script maintained directly in Jenkins and a Pipeline script from SCM.
When I placed a Grab command in the Pipeline script of a tester pipeline job, I found that it made no difference whether the Grab command was there or commented out.
However when used from a Pipeline script from SCM it would throw the following exception...
java.lang.RuntimeException: No suitable ClassLoader found for grab
I removed it from the SCM script and everything worked out in the end.
Additional Background
I'm not sure why the Grab was choking in the SCM version, but the Groovy editor clearly has some working parts, because if you define a partial Grab command it gives you validation errors pointing to the broken line, with the error: The missing attribute "module" is required in #Grab annotations.
So the script validator is aware of the Grab annotation, as it calls it, and knows that it has both a group and a module attribute. I'm using the so-called shorthand notation in this example.
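For reference, a minimal sketch of where the import can live in a declarative Jenkinsfile; JsonSlurper ships with the Groovy bundled into Jenkins, so no @Grab is needed for it (the stage name and JSON string are just examples, and a sandboxed script may still require script approval):

// Imports belong at the very top of the Jenkinsfile, outside the pipeline block.
import groovy.json.JsonSlurper

pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                // Imperative Groovy (including the imported class) goes in a script block.
                script {
                    def parsed = new JsonSlurper().parseText('{"name": "value"}')
                    echo "Parsed name: ${parsed.name}"
                }
            }
        }
    }
}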
Given a configuration value named "Data:ConnectionString" in the appsettings.json file (ASP.NET Core application), how do I override this in the build? Overriding could mean a build step that changes the value in appsettings.json before compilation, overriding the parameter when running "dotnet test", or something else.
More info:
I have an ASP.NET Core application with standard configuration in appsettings.json. I do not want any connection strings or sensitive data checked into source control.
I am building my application using Visual Studio Team Services (cloud TFS). There is a step where tests are executed, and I want these tests to run against a remote service for which I do not want to check in the credentials.
There are a number of extensions available on http://marketplace.visualstudio.com that will help you without anything complicated.
https://marketplace.visualstudio.com/items?itemName=YodLabs.VariableTasks
I like the Variable Tasks Pack that comes with:
Set Variable - Set a variable value and optionally apply a transformation to it.
Set Variables with Credential - Sets the username/password from an existing service endpoint.
Set Variable from JSON - Extracts a value from JSON using JSONPath.
Set Variable from XML - Extracts a value from XML using XPath.
Update Build Number - Allows you to change a build number.
Increment Version - Increments a semver version number.
Super easy... You can also just search for "json" or "variable" to find other options...
Most popular ways:
Use app secrets
Use the scripts section in your project.json. You have 4 events - precompile, postcompile, prepublish, postpublish (see the sketch below this list).
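For the scripts route, a rough sketch of what the relevant part of project.json might look like; the echo commands are placeholders, and this only applies to the older project.json-based tooling:

{
  "scripts": {
    "precompile": [ "echo about to compile" ],
    "postpublish": [ "echo finished publishing" ]
  }
}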
You can set the environment variable ASPNETCORE_ENVIRONMENT in the build to something like "Test". Create an appsettings file named appsettings.Test.json. Then, when you are setting up your configuration in Startup.cs, do something like...
var builder = new ConfigurationBuilder()
.SetBasePath(env.ContentRootPath)
.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);
When the environment variable is set to Test, your new appsettings file will be loaded, and it can set the connection string to whatever you want.
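For illustration, an appsettings.Test.json overriding the Data:ConnectionString key from the question might look like the sketch below. The connection string is a placeholder; the file, or just the secret inside it, could be produced by a build step so the real credentials never land in source control:

{
  "Data": {
    "ConnectionString": "Server=test-sql;Database=TestDb;User Id=testuser;Password=injected-by-build;"
  }
}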
I am implementing a custom auditing framework, logging ETL events such as start, end, error, insertrows etc.
As well as logging at a package level, I'm implementing "session logging", where a sequence of package executions (i.e. a controller package that executes several packages) is a session. In order to keep track of the session, the stored procedures always return a SessionLogID.
I was hoping I could map this result set to a project parameter, as otherwise I will have to save it to a user variable and then pass it around between packages via parameters. This would mean every single package needs a package parameter and a user variable called SessionLogID. I don't want to do this if I don't need to.
Open to other suggestions.
Thanks,
Adam
Parameters cannot change at runtime; they are set once, whereas variables can change at any time. You can set the variable once in the parent package and map that variable to the child packages using a parameter.
I have a very simple SSIS package and an absurdly simple Script task
var x = Dts.Variables["User::OrderPath"];
That fails with:
The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there.
I have a variable, OrderPath, that is scoped to the package. Any time I try to add the variable to the script's ReadOnlyVariables, it disappears whenever I execute the task.
This really shouldn't be this difficult so I assume I'm missing something monumentally basic. Any idea what that might be?
When accessing variables through the Dts.Variables collection, you do not prefix the variable name with the namespace; thus
var x = Dts.Variables["User::OrderPath"];
fails, while
var x = Dts.Variables["OrderPath"];
succeeds. (Assuming, of course, that OrderPath is added to either the ReadWriteVariables or ReadOnlyVariables task properties.)
See Using Variables in the Script Task in MSDN for another example.
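For context, a minimal sketch of a Script Task Main method that reads the variable once it is wired up this way (the logging call is just illustrative):

public void Main()
{
    // OrderPath must be listed in the task's ReadOnlyVariables property;
    // index it without the User:: prefix, as described above.
    string orderPath = Dts.Variables["OrderPath"].Value.ToString();

    // Log the value so it shows up in the execution results (illustrative only).
    bool fireAgain = true;
    Dts.Events.FireInformation(0, "Script Task", "OrderPath = " + orderPath,
        string.Empty, 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}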