I'm using the Azure PowerShell cmdlet New-AzureRmResourceGroupDeployment together with a JSON template file, and I'm feeding a bunch of parameters to the command that it uses with the JSON file. The JSON also instructs the newly created VM to download a PowerShell script from Azure storage and run it.
I need to pass some values from my Azure PowerShell script to that "VM-local" PowerShell script. For argument's sake, let's say my Azure PowerShell script has a variable $foo with a value of bar, representing "the name of a folder to be created on C:\ (so C:\bar)".
How can a script running within the VM access the value bar (by any means)? It's fine if I need to use the JSON file as a "messenger", or any other necessary trick. I don't think I can modify the "VM-local" PowerShell script between downloading it from Azure storage and subsequently running it.
If you use the Custom Script Extension in your JSON template to run the script on the VM, you can specify the entire command line for that script, and on that command line you pass parameters just as you would when running it interactively. In other words, work out the command line you would use to run the script and put that into the script extension of the template.
Take a look at this example:
https://github.com/Azure/azure-quickstart-templates/blob/f18a95e857a4caf86b4c2e77e652cec678cd524c/201-vm-winrm-windows/azuredeploy.json
Look at the "commandToExecute" property. You can see how to invoke powershell.exe with params, the script file being one of those params and then the script file itself also accepts some params through a variable.
You could also do this with DSC (very similar in JSON, but very different PowerShell), but if you already have a PowerShell script you want to use, this should work.
Is that what you needed?
You can pass variables like this:
New-AzureRmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseName + '-' + ((Get-Date).ToUniversalTime()).ToString('MMdd-HHmm')) `
    -ResourceGroupName $ResourceGroupName `
    -TemplateFile $TemplateFile `
    -TemplateParameterObject @{accountName=$AutomationAccount; moduleName=$Name; moduleURI=$ModuleURI} `
    -Force -Verbose
Note that the hashtable literal is @{...}, not #{...}, and each key must match a parameter declared in the template.
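Tying this back to the question's $foo example, the call could look like the sketch below; the folderName parameter is hypothetical and would have to be declared as a string parameter in the template, where it can then be spliced into commandToExecute:
$foo = 'bar'
New-AzureRmResourceGroupDeployment -Name 'folder-demo' `
    -ResourceGroupName $ResourceGroupName `
    -TemplateFile $TemplateFile `
    -TemplateParameterObject @{folderName=$foo}   # $foo travels into parameters('folderName')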
The command in my .sh file runs fine. The command is:
$ZEEK -C -r $i dir
i: the pcap (file) name to be processed
dir: the directory to extract into
When the command runs, the extracted files appear in the desired location, so it works pretty well. But I need that filename inside main.zeek. The question is: how can I access, from main.zeek, the filename used in the .sh file?
As I learned from here, the packet_source() function can be called in a script. But I cannot work out how to use it, because I have only just started with Zeek and I'm still getting used to its scripting language.
In my script (main.zeek), after loading the script index that contains packet_source() as a built-in function (@load base/bif/zeek.bif.zeek), how can I define a variable and use it (e.g. global filename: function packet_source(): — is that valid)?
I would be glad if you could help.
In main.zeek, declare the variable as global so it can be used in every function the script has:
global filename_s: string;
After that, packet_source() is used to access the value: its $path field holds the path of the PCAP file Zeek is reading. The call should be placed in the zeek_init() event:
event zeek_init()
    {
    # packet_source() describes the input source Zeek was started with
    local filename_source = packet_source();
    # $path is the trace file given with -r
    filename_s = filename_source$path;
    }
filename_s now holds the path of the file Zeek read, and it can be used anywhere in that script file (e.g. main.zeek).
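A minimal follow-up sketch, printing the captured path once Zeek finishes reading the trace:
event zeek_done()
    {
    print fmt("Packets were read from: %s", filename_s);
    }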
SQL Server 2019. In the SSRS Report Server web interface I can upload any file: PDF, Excel, Word, etc. I've got a lot of files I want to upload, and the web interface only lets me do one at a time. Can I upload all the files in a folder to the SSRS server using PowerShell? So far, what I've found only seems to work for SSRS files (.rdl, .rsd, etc.). Is there some other way to upload multiple non-SSRS files? Thanks!
You can use the PowerShell script below; you will need to change the folder location and the report server URL, as well as the -RsFolder reference. The script uploads all files within the folder. Please be aware that SSRS restricts some file types; the allowed extensions can be found with the following SQL query:
SELECT ConfigInfoID
,Name
,Value
FROM ReportServer.dbo.ConfigurationInfo
WHERE Name = 'AllowedResourceExtensionsForUpload'
-- PowerShell Script --
$FileLocation = "C:\Files\"
$Files = Get-ChildItem $FileLocation
foreach ($File in $Files)
{
    # Use the loop variable $File here, not the whole $Files collection
    Write-RSRestCatalogItem -Overwrite -ReportPortalUri http://ReportServer/Reports/ -Path $File.FullName -RsFolder "Files" -RestApiVersion V1.0
}
Also, I should have mentioned that you need to install a PowerShell module first:
Install-Module -Name ReportingServicesTools
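If the folder contains file types that SSRS blocks, a variation like this could skip them up front (a sketch; the allowed list is illustrative and should come from the AllowedResourceExtensionsForUpload value queried above):
$Allowed = '.pdf', '.xlsx', '.docx'
Get-ChildItem 'C:\Files\' -File |
    Where-Object { $Allowed -contains $_.Extension } |
    ForEach-Object {
        Write-RSRestCatalogItem -Overwrite -ReportPortalUri 'http://ReportServer/Reports/' `
            -Path $_.FullName -RsFolder 'Files' -RestApiVersion V1.0
    }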
I'm using Google Cloud Deployment Manager and I am trying to get external input into my template. Namely, I want to set a metadata variable on my instance (when creating the instance) but provide this value at execution time.
I've tried:
gcloud deployment-manager deployments create test-api-backend --config test-api-backend.yaml --properties 'my_value=hello'
This fails with: The properties flag should only be used when passing in a template as your config file.
I've tried:
my_value=hello gcloud deployment-manager deployments create test-api-backend --config test-api-backend.yaml
and used {{env['my_value']}} in the template, but the value isn't picked up.
I guess I could add the property in a .jinja file and rewrite this file before I run everything, but that feels like a hack. That, or my idea of passing a variable from the shell into Deployment Manager is a hack. I'm honestly not sure.
As the error message indicates, the command-line properties can only be used with a template; they are essentially meant to replace the config yaml file.
The easiest thing to do is to just rename your yaml file to a .py or .jinja file, then pass that template to the gcloud command instead of the yaml file.
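For example (assuming the renamed template is test-api-backend.py; note that --properties takes key:value pairs rather than key=value):
gcloud deployment-manager deployments create test-api-backend \
    --config test-api-backend.py \
    --properties my_value:hello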
In that new template file, add defaults for any properties you don't always pass on the command line.
For python, something like:
# context.properties holds the values passed via --properties
if 'myparam' in context.properties:
    valuetouse = context.properties['myparam']
else:
    valuetouse = mydefaultvalue
If the template uses another template, then you'll also need to create a schema file for the new top-level template, so you can do the imports there instead of in the yaml file.
See the schema file in this GitHub example:
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/blob/master/examples/v2/igm-updater/ha-service.py.schema
If you want, you can ignore all the properties and just do the imports section.
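A minimal schema for the new top-level template might look like this (the file and import names are illustrative):
# test-api-backend.py.schema
info:
  title: test-api-backend
  description: Top-level template that accepts my_value from --properties.

imports:
- path: instance.jinja          # imports move here from the old yaml config

properties:
  my_value:
    type: string
    default: hello              # used when the command line does not supply one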
I have a tcl script, drakon_gen.tcl. I am running it from another script, run.tcl, like this:
source "d:\\del 3\\drakon_editor1.22\\drakon_gen.tcl"
When I run run.tcl I get the following output:
This utility generates code from a .drn file.
Usage: tclsh8.5 drakon_gen.tcl <options>
Options:
-in <filename> The input filename.
-out <dir> The output directory. Optional.
Now I need run.tcl to supply the options listed in that output. I have tried many ways but keep getting errors. What is the right way to pass the options?
When you source a script into a tcl interpreter, you are evaluating the script file in the context of the current interpreter. If it was written to be a standalone program you may run into problems with conflicting variables and procedures in the global namespace. One way to avoid that is to investigate the use of slave interpreters (see the interp command) to provide a separate environment for the child script.
In your specific example it looks like you just need to provide some command-line arguments. These are normally provided by the argv variable, which holds a list of all the command-line arguments. If you define this list before sourcing the script, you can feed it the required command line, e.g.:
set original_argv $argv
set argv [list "--optionname" "value"]
source $additional_script_filename
set argv $original_argv
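Applied to drakon_gen.tcl, that could look like the following (the input and output paths are illustrative; some scripts also consult argc, so it is safest to keep it in sync):
set original_argv $argv
set argv [list -in "d:/diagrams/example.drn" -out "d:/generated"]
set argc [llength $argv]
source "d:\\del 3\\drakon_editor1.22\\drakon_gen.tcl"
set argv $original_argv
set argc [llength $argv]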
I have two scripts in the pre-build step of a Jenkins job: the first is a Perl script, the second a system Groovy script using the Groovy plugin. I need information from the Perl script in the Groovy script. I think the best way would be to set an environment variable, and I was wondering how that can be done.
Or is there any better way?
Thanks for your time.
The way to propagate environment variables between build steps is via the EnvInject Plugin.
Here are some previous answers that show how to do it:
How to set environment variables in Jenkins?
Jenkins : Report results of intermediate [windows batch] build steps in email body
In your case, however, it may be simpler just to write to a file in one build step and read that file in the other. To make sure you do not accidentally read a file left over from a previous build, you can incorporate BUILD_ID into the file name.
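A sketch of that handoff (the file name and variable are hypothetical; the system Groovy step below relies on the build and listener variables that the Groovy plugin binds, and assumes the workspace is directly readable from where the script runs):
// The Perl step would write something like: FOO=bar
// into vars-$ENV{BUILD_ID}.properties in the workspace.
def envVars = build.getEnvironment(listener)
def f = new File(build.workspace.toString(), "vars-${envVars['BUILD_ID']}.properties")
def props = new Properties()
f.withInputStream { props.load(it) }
println "FOO = ${props['FOO']}"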
Using the EnvInject Plugin, in the job configuration choose Inject environment variables to the build process / Evaluated Groovy script.
Depending on the setup, you can either execute an external command and capture its output, or compute the value in Groovy directly, saving it in a map of environment variables:
Example
By either getting the command result with the execute method:
return [DATE: 'date'.execute().text]
or with the Groovy equivalent, if one exists:
return [DATE: new Date()]
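The evaluated script can return several variables at once; every key in the returned map becomes an environment variable for the rest of the build (the names here are illustrative):
// Evaluated Groovy script: each map entry is injected into the build environment
return [
    DATE      : new Date().format('yyyy-MM-dd'),
    BUILD_USER: System.getProperty('user.name')
]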