I'm trying to write a PowerShell function that receives a list of files from the Get-Content cmdlet via the pipeline and processes them.
The pipeline looks like this:
get-content D:\filelist.txt | test-pipeline
For simplicity's sake, the function below should just show each line of the text file.
function test-pipeline
{
    <#
    .Synopsis
    #>
    [CmdletBinding(SupportsShouldProcess = $true)]
    Param(
        [Parameter(Mandatory = $true,
                   ValueFromPipeline = $true)]
        [array]$filelist
    )
    foreach ($item in $filelist)
    {
        $item
    }
}
My file list is a plain .txt file and looks like this:
line 1
line 2
line 3
line 4
line 5
No matter what type of parameter I pipe to the function, it never works: the $filelist variable only ever holds the last line of the text file. Can anybody help? The PowerShell version is v2.
Thanks in advance
The reason you see only your last line requires digging a bit into the nature of pipeline-able PowerShell functions. In the absence of the explicit begin/process/end blocks that Swonkie alluded to, all code in the function operates as if it were in the end block, i.e. it is as if you wrote:
function test-pipeline
{
    [CmdletBinding()]
    Param(
        [Parameter(ValueFromPipeline = $true)][array]$filelist
    )
    END {
        $filelist
    }
}
But $filelist, being a pipeline variable, holds only the current value when fed pipeline data, and the end block runs once the pipeline input is exhausted, so $filelist contains just the last value. Simply changing that end block to a process block, which runs for each value in the pipeline, gives you the desired output:
function test-pipeline
{
    [CmdletBinding()]
    Param(
        [Parameter(ValueFromPipeline = $true)][array]$filelist
    )
    PROCESS {
        $filelist
    }
}
And notice that you do not need any kind of loop there; the pipeline is already providing the "loop".
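For example, feeding it the file from the question:
PS> Get-Content D:\filelist.txt | test-pipeline
line 1
line 2
line 3
line 4
line 5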
That is just one of several ways to process pipeline data. Here's a close variant, which is a bit shorter: use filter instead of function, because a filter with no explicit blocks operates as if all its code were in the process block.
filter test-pipeline
{
    [CmdletBinding()]
    Param(
        [Parameter(ValueFromPipeline = $true)][array]$filelist
    )
    $filelist
}
To delve further into the fascinating and arcane world of writing functions for pipelining, take a look at the in-depth analysis I wrote, Down the Rabbit Hole: A Study in PowerShell Pipelines, Functions, and Parameters, published on Simple-Talk.com. Enjoy your PowerShell adventures!
Instead of function try filter:
filter Test-Pipeline {
...
}
This is basically a function with a process block (which you need in order to process pipeline objects). Alternatively, you can write a function with this block and optional begin and end blocks:
function Test-Pipeline {
    begin {
    }
    process {
        ...
    }
    end {
    }
}
More info: http://technet.microsoft.com/en-us/magazine/hh413265.aspx
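A minimal sketch of how those blocks behave (the counter is illustrative, not part of the linked article):
function Test-Pipeline {
    begin   { $count = 0 }                # runs once, before any pipeline input
    process { $_; $count++ }              # runs once per pipeline object; $_ is the current item
    end     { "Processed $count lines" }  # runs once, after all input is consumed
}

Get-Content D:\filelist.txt | Test-Pipeline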
In PowerShell I can pass a parameter if I create a script block with param syntax:
$hello = { param($name) "Hello $name"}
& $hello "World!"
>hello.ps1
Hello World!
When I try this with function syntax I get into trouble:
function hello($n) { { "Hello $n" }.GetNewClosure() }
$doit = hello
& $doit "World!"
>functionclosure.ps1
Hello
I managed to fix this by giving the parameter earlier:
function hello($n) { {"Hello $n"}.GetNewClosure() }
$doit = hello "World"
& $doit
>functionclosure2.ps1
Hello World
Is there a way to pass a parameter to a function from the & call operator line?
Is there a way to pass a parameter to a function from the & call operator line?
The call operator (also known as the invocation operator, &) is generally used to execute content in a string, which we do when there is a long file path or a path with spaces in the name, or when we are dynamically building a string to execute.
Now, in this case, using GetNewClosure() alters a function to instead return a script block as the output type. That script block must be invoked using the call operator, so this is a valid usage of it.
Back to your question then: yes, you can control the order of execution using parentheses and pass a parameter to a function which returns a closure from the call line, like this:
& (hello stephen)
However, this is pretty confusing in action, as closures maintain their own separate scope, and in more than ten years of enterprise automation projects I never saw them used. It might be more confusion than it's worth to go down this route.
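A minimal sketch of what that separate scope means (variable names are illustrative):
$n = 'Alice'
$greet = { "Hello $n" }.GetNewClosure()  # captures the current value of $n
$n = 'Bob'
& $greet  # still prints "Hello Alice"; the closure kept its own copy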
Perhaps a simpler approach might be:
function hello($name) { "WithinFunction: Hello $name" }
$do = 'hello "world"'
PS> $do
hello "world"
# invoke
PS> Invoke-Expression $do
WithinFunction: Hello world
Additional reading on closures from the person who implemented them in PowerShell: https://devblogs.microsoft.com/scripting/closures-in-powershell/
Jeroen Mostert wrote this as a comment:
You've written a function that takes a parameter and then creates a closure using that parameter. It seems like you want a function that returns a closure that takes a parameter; but that's the same as your first example, and
function hello { { param($n) "Hello $n"} }
would do that. You'd invoke that as
& (hello) "world"
On the other hand, if you want to have a function as a closure you can invoke,
${function:hello}
would do that, i.e.
function hello($n) { "Hello $n" };
$doit = ${function:hello};
& $doit "World"
I want to parse through a JSON and update specific nodes in the JSON result according to the following code snippet written in PowerShell:
foreach ($val in $getresult.elements.values)
{
    if ($val.Name -eq "Config")
    {
        $val.items.Service = $ServiceValue
    }
    if ($val.Name -eq "Analysis")
    {
        $val.items.ID = $IDValue
        $val.items.Name = $NameValue
    }
    if ($val.Name -eq "Report")
    {
        $val.items.to = $ToValue
    }
}
The final $getresult elements/nodes should be updated with $ServiceValue, $NameValue and $ToValue. How do I achieve it in logic apps?
We recently introduced the new workflow function setProperty (written @setProperty in expressions); in combination with a condition action you will be able to set the properties inside. Or use addProperty if a property with that name doesn't exist yet.
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-workflow-definition-language
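For example, inside one of the condition branches, a Compose action could build the updated object with something like this (a sketch; the action name, loop name, and property names are assumptions based on the question's JSON):
"Update_Config": {
    "type": "Compose",
    "inputs": "@setProperty(items('For_each')?['items'], 'Service', variables('ServiceValue'))"
}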
I'm currently scripting, and part of my script is an FTP file download. The code works without any problem, but I don't understand one specific part. It's the only part of the code I haven't written myself. I understand how the code works, but I don't understand how the function returns its value. You may be confused about what I mean, so let me explain it with a bit of code:
function get-ftp {
    try {
        $ftprequest = [system.net.ftpwebrequest]::Create($uri)
        $ftprequest.Credentials = New-Object system.net.networkcredential($user, $pass)
        $ftprequest.Proxy = $null
        $ftprequest.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
        $ftpresponse = $ftprequest.GetResponse()
        $reader = New-Object IO.StreamReader $ftpresponse.GetResponseStream()
        $reader.ReadToEnd()
        $reader.Close()
        $ftpresponse.Close()
    }
    catch {
        Write-Host "Error while reading filenames"
    }
}
So this is my function to get all directories from the FTP server. I call this function with this code:
$allXmlFiles = get-ftp
So after the call, my $allXmlFiles contains a string (tested with GetType() on $allXmlFiles) with all the filenames on the server. Now my question: how is the answer from the FTP server passed to this variable? There's no return in the function, so I'm quite confused how this works. I tried to take the try/catch out of the function and access the answer directly, but that didn't work. I tried to find it in $reader and in $ftpresponse, with no success.
It would be really cool if someone could explain to me what's going on here. As said, the code works, but I would like to understand why.
It's
$reader.ReadToEnd()
The StreamReader.ReadToEnd() method outputs a string, and since its result is not assigned to a variable, it becomes the function's output.
Idiomatic way would be to write it like this:
Write-Output $reader.ReadToEnd()
In PowerShell, the result of every command / statement is returned as output if you don't assign or pipe it to anything.
The return keyword only exits the current scope. You rarely use return in PowerShell.
As beatcracker mentioned, $reader.ReadToEnd() is producing the output.
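A minimal sketch of that implicit-output behavior (the function is illustrative):
function Get-Numbers {
    1          # unassigned value: emitted to the output stream
    $x = 2     # assigned: nothing is emitted
    3 * 3      # unassigned expression result: emitted
    return 4   # emits 4, then exits the current scope
    5          # never reached
}

Get-Numbers  # outputs 1, 9, 4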
I have noticed that variables inside a module function do not remain in scope after execution returns to the script. I came across Export-ModuleMember, but that didn't seem to help; perhaps I'm using it wrong.
FunctionLibrary.psm1
Function AuthorizeAPI
{
    # does a bunch of things to find $access_token
    $access_token = "aerh137heuar7fhes732"
}
Write-Host $access_token
aerh137heuar7fhes732
Export-ModuleMember -Variable access_token -Function AuthorizeAPI
Main script
Import-Module FunctionLibrary
AuthorizeAPI # call to module function to get access_token
Write-Host $access_token
# nothing returns
I know as an alternative I could just dot source a separate script and that would allow me to get the access_token but I like the idea of using modules and having all my functions therein. Is this doable? Thanks SO!
As per PetSerAl's comment, you can change the scope of your variable. Read up on scopes here. The script scope did not work for me when running from the console; global did.
$global:access_token = "aerh137heuar7fhes732"
Alternatively, you can return the value from the function and store it in a variable; no scope change needed.
Function
Function AuthorizeAPI
{
    # does a bunch of things to find $access_token
    $access_token = "aerh137heuar7fhes732"
    return $access_token
}
Main Script
Import-Module FunctionLibrary
$this_access_token = AuthorizeAPI # call to module function to get access_token
Write-Host $this_access_token
I have two environment variables. One is TF_VAR_UN and another is TF_VAR_PW. Then I have a terraform file that looks like this.
resource "google_container_cluster" "primary" {
name = "marcellus-wallace"
zone = "us-central1-a"
initial_node_count = 3
master_auth {
username = ${env.TF_VAR_UN}
password = ${env.TF_VAR_PW}
}
node_config {
oauth_scopes = [
"https://www.googleapis.com/auth/compute",
"https://www.googleapis.com/auth/devstorage.read_only",
"https://www.googleapis.com/auth/logging.write",
"https://www.googleapis.com/auth/monitoring"
]
}
}
The two values I'd like to replace with the environment variables TF_VAR_UN and TF_VAR_PW are the values username and password. I tried what is shown above, with no success, and I've toyed around with a few other things but always get syntax issues.
I would try something more like this, which seems closer to the documentation.
variable "UN" {
type = string
}
variable "PW" {
type = string
}
resource "google_container_cluster" "primary" {
name = "marcellus-wallace"
zone = "us-central1-a"
initial_node_count = 3
master_auth {
username = var.UN
password = var.PW
}
node_config {
oauth_scopes = [
"https://www.googleapis.com/auth/compute",
"https://www.googleapis.com/auth/devstorage.read_only",
"https://www.googleapis.com/auth/logging.write",
"https://www.googleapis.com/auth/monitoring"
]
}
}
With the CLI command being the following:
TF_VAR_UN=foo TF_VAR_PW=bar terraform apply
The use of interpolation syntax throws a warning with Terraform v0.12.18. You no longer need the interpolation syntax; you can just reference the variable as var.hello.
Caution: one important thing to understand from a language standpoint is that you cannot declare variables using environment variables. You can only assign values to variables that are already declared in the script. For example, let's say you have the following .tf script:
variable "hello" {
type=string
}
Now if the environment has a variable TF_VAR_hello="foobar", at runtime the variable hello will have the value "foobar". If you set the environment variable without declaring the variable, it will have no effect.
You can do the following to get this working.
Declare the variable in the Terraform configuration that you want to set via an environment variable:
variable "db_password" { type = string }
In the resource section where you want to use this variable, change it to:
"db_password":"${var.db_password}"
Export the environment variable.
export TF_VAR_db_password="##password##"
Run terraform plan or terraform apply.
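Putting those steps together, a minimal sketch (the output block is only there to prove the value arrived; in a real configuration you would use the resource attribute shown above):
# main.tf
variable "db_password" {
  type = string
}

output "db_password_length" {
  value = length(var.db_password)  # proves the env value was picked up
}

# Then, in the shell:
#   export TF_VAR_db_password="##password##"
#   terraform apply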
Use a null_resource to execute a terminal command (read an environment variable), redirect output to a file, then read the file content:
resource "null_resource" "read_environment_var_value_via_cli" {
triggers = { always_run = "${timestamp()}" }
provisioner "local-exec" {
command = "echo $TF_VAR_UN > TF_VAR_UN.txt" # add gitignore
}
}
data "local_file" "temp_file" {
depends_on = [ null_resource.read_environment_var_value_via_cli]
filename = "${path.module}/TF_VAR_UN.txt"
}
# use value as desired
resource "google_container_cluster" "primary" {
master_auth {
username = data.local_file.temp_file.content # value of $TF_VAR_UN
..
}
}
Most of the providers read environment variables through their provider schema, along the lines of DefaultFunc: schema.EnvDefaultFunc("SOME_VAR", nil). See for example:
https://github.com/terraform-providers/terraform-provider-infoblox/blob/master/infoblox/provider.go
https://github.com/terraform-providers/terraform-provider-openstack/blob/master/openstack/provider.go
...
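A hedged sketch of that pattern with the Terraform plugin SDK (the package name, provider argument, and environment variable name are illustrative):
package provider

import (
	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
)

// Provider returns a provider whose "username" argument falls back to the
// EXAMPLE_USERNAME environment variable when it is not set in the config.
func Provider() *schema.Provider {
	return &schema.Provider{
		Schema: map[string]*schema.Schema{
			"username": {
				Type:        schema.TypeString,
				Optional:    true,
				DefaultFunc: schema.EnvDefaultFunc("EXAMPLE_USERNAME", nil),
			},
		},
	}
}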
Alternatively, you can substitute the variables in the file itself using the envsubst utility in bash. Be careful not to redirect envsubst back into the file it reads from (the shell truncates the file before envsubst sees it); go through a temporary file instead:
$ envsubst < main.tf > main.tf.tmp && mv main.tf.tmp main.tf
Or using an intermediate file with variables and the final config on the output:
$ envsubst < main.txt > main.tf
Note that variables for envsubst must be exported:
$ export MYVAR=1729
The variables in the source file must be of the form: $VARIABLE or ${VARIABLE}.
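A quick sketch of the intermediate-file flow (the file names and variable are illustrative):
$ export TF_VAR_UN=foo
$ cat main.txt
username = "$TF_VAR_UN"
$ envsubst < main.txt > main.tf
$ cat main.tf
username = "foo"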
In order to use a variable, it needs to be wrapped in quotes with the interpolation syntax (required on Terraform 0.11 and earlier), for example:
username = "${var.UN}"