I've seen different posts about running functions simultaneously in parallel, but the code in the answers hasn't quite worked for me. I'm doing patching automation, and the functions I have all work and do their thing separately, but since we work with more than 200 computers, waiting for each function to finish with its batch of computers kind of defeats the purpose. I have the code in one script; in summary, it's structured like this:
define global variables
$global:varN
define sub-functions
function sub-function1()
function sub-functionN()
define main functions
function InstallGoogleFunction($global:varN)
{
$var1
$result1 = sub-function1
$resultN = sub-functionN
}
function InstallVLCFunction($global:varN)
{
"similar code as above"
}
function InstallAppFunction($global:varN)
{
"similar code as above"
}
The functions all install a different app/software and write output to a file. The only thing is I cannot seem to run all the installation functions without waiting for the first one to finish. I tried Start-Job, and it executed and displayed a table-like output, but when I verified the computers, none had anything running in Task Manager. Is there a way PowerShell can run these installation functions at the same time? If I have to resort to one-by-one, I will call the functions ordered by the least time taken or the fewest computers they need to install on, but I just wanted someone to explain whether this can be done.
You can use multithreading to achieve this. The example below shows how you can trigger tasks on multiple machines using multithreading in PowerShell.
Replace the list of machines in $Computers with your machines and give it a try. The example gets the disk details on the given machines:
# This is your script you want to execute
$wmidiskblock =
{
Param($ComputerName = "LocalHost")
Get-WmiObject -ComputerName $ComputerName -Class win32_logicaldisk | Where-Object {$_.drivetype -eq 3}
}
# List of servers
$Computers = @("Machine1", "Machine2", "Machine3")
#Start all jobs in parallel
ForEach($Computer in $Computers)
{
Write-Host $Computer
Start-Job -scriptblock $wmidiskblock -ArgumentList $Computer
}
Get-Job | Wait-Job
$out = Get-Job | Receive-Job
$out | Export-Csv 'c:\temp\wmi.csv'
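If you adapt this pattern to your install functions, note that background jobs run in a separate process and do not inherit functions defined in your session, which is likely why your Start-Job attempt showed a table of jobs but nothing appeared in Task Manager on the targets. A rough sketch (the script path is a placeholder; it assumes your install functions live in a file you can dot-source):
$installBlock =
{
    Param($ComputerName)
    # Jobs don't see functions from the parent session; re-load them here
    . 'C:\Scripts\InstallFunctions.ps1'
    InstallGoogleFunction $ComputerName   # your function, adapted to take the target computer
}
ForEach ($Computer in $Computers)
{
    Start-Job -ScriptBlock $installBlock -ArgumentList $Computer
}
Get-Job | Wait-Job | Receive-Job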
I found this script on this site (thanks, Nick!):
$files = gci $srcpath
foreach ($srcfile in $files) {
    # Build destination file path as a FileInfo, so that .Exists and the
    # timestamp properties below actually work (they don't on a plain [string])
    $dstfile = [System.IO.FileInfo](Join-Path $dstpath $srcfile.Name)
    # Copy the file (-WhatIf is a dry run; remove it to really copy)
    cp $srcfile.FullName $dstfile.FullName -whatif
    # Make sure file was copied and exists before copying over properties/attributes
    if ($dstfile.Exists) {
        # $dstfile.CreationTime = $srcfile.CreationTime
        # $dstfile.LastAccessTime = $srcfile.LastAccessTime
        $dstfile.LastWriteTime = $srcfile.LastWriteTime
        # $dstfile.Attributes = $srcfile.Attributes
        # $dstfile.SetAccessControl($srcfile.GetAccessControl())
    }
}
I want to turn this into a function that recurses through directories, copying the timestamps from the source (a cloud folder) to its equivalent at the destination.
Now, I have tried calling the function with multiple variables, trying different methods such as:
$src = $args[0]
$dst = $args[1]
Get-Timestamp $src, $dst
but it either uses the default folder the script is running from, or fails when it tries to list the contents after combining the two variables together.
Even setting up the function like so:
[CmdletBinding(DefaultParameterSetName='Srcpath')]
param (
[Parameter(Mandatory = $true,
ParameterSetName = 'Srcpath',
Position = 0)]
[string[]]$Srcpath,
# [Parameter(Mandatory = $true,
# ParameterSetName = 'DSTpath',
# Position = 1)]
[string]$DSTpath
)
$PSCmdlet.ParameterSetName
is not producing the expected result.
The script works on its own, doing one folder at a time, but not when doing subfolders.
Suggestions would be greatly appreciated.
From the PowerShell ISE or VSCode, select a simple or advanced function snippet.
PowerShell ISE - use CTRL+J, type function, hit Enter and you get this:
function MyFunction ($param1, $param2)
{
}
VSCode - use CTRL+ALT+J, type function, hit Enter and you get this:
function FunctionName {
param (
OptionalParameters
)
}
Put your code below the param block. Now, neither of those function names is best practice; names should be verb-noun, as documented here:
About Functions | MSDocs
Function Names
You can assign any name to a function, but functions
that you share with others should follow the naming rules that have
been established for all PowerShell commands.
Functions names should consist of a verb-noun pair in which the verb
identifies the action that the function performs and the noun
identifies the item on which the cmdlet performs its action.
Functions should use the standard verbs that have been approved for
all PowerShell commands. These verbs help us to keep our command names
simple, consistent, and easy for users to understand.
For more information about the standard PowerShell verbs, see Approved
Verbs in the Microsoft Docs.
You are not getting a recursive search because you are not telling it to do one; that is what the -Recurse switch of the Get-ChildItem cmdlet is for.
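Note also that Get-Timestamp $src, $dst passes a single two-element array to the first parameter; call it as Get-Timestamp $src $dst (no comma) instead. Putting it together, a minimal sketch (untested; Copy-FileTimestamp and the usage paths are hypothetical names, adjust to your environment):
function Copy-FileTimestamp {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true, Position = 0)]
        [string]$SrcPath,
        [Parameter(Mandatory = $true, Position = 1)]
        [string]$DstPath
    )
    # -Recurse walks the subfolders; -File skips the directories themselves
    Get-ChildItem -Path $SrcPath -Recurse -File | ForEach-Object {
        # Rebuild each file's path under the destination root
        $target = $_.FullName.Replace($SrcPath, $DstPath)
        if (Test-Path -LiteralPath $target) {
            (Get-Item -LiteralPath $target).LastWriteTime = $_.LastWriteTime
        }
    }
}
# Usage: Copy-FileTimestamp 'C:\CloudFolder' 'D:\Backup'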
# Get specifics for a module, cmdlet, or function
(Get-Command -Name Get-ChildItem).Parameters
(Get-Command -Name Get-ChildItem).Parameters.Keys
# Results
<#
Path
LiteralPath
Filter
Include
Exclude
Recurse
Depth
Force
Name
Verbose
Debug
ErrorAction
WarningAction
InformationAction
ErrorVariable
WarningVariable
InformationVariable
OutVariable
OutBuffer
PipelineVariable
UseTransaction
Attributes
Directory
File
Hidden
ReadOnly
System
#>
Get-Help -Name Get-ChildItem -Examples
# Results
<#
Example 3: Get child items in the current directory and subdirectories
Get-ChildItem -Path C:\Test\*.txt -Recurse -Force
Directory: C:\Test\Logs\Adirectory
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 2/12/2019 16:16 20 Afile4.txt
-a-h-- 2/12/2019 15:52 22 hiddenfile.txt
-a---- 2/13/2019 13:26 20 LogFile4.txt
Directory: C:\Test\Logs\Backup
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 2/12/2019 16:16 20 ATextFile.txt
-a---- 2/12/2019 15:50 20 LogFile3.txt
#>
Get-Help -Name Get-ChildItem -Full
Get-Help -Name Get-ChildItem -Online
Lastly, aliases/shorthand names are for interactive throw-away code, not for production/shared scripts, as discussed here:
• Best Practices for aliases
https://devblogs.microsoft.com/scripting/best-practice-for-using-aliases-in-powershell-scripts
https://devblogs.microsoft.com/scripting/using-powershell-aliases-best-practices
Why worry about aliases in the first place? ... There are two things
at work when it comes to a script. The first is that no alias is
guaranteed to exist —even aliases that are created by Windows
PowerShell. ...
... and if you create custom ones, you can step on one already configured on a host.
Finally, PSScriptAnalyzer, VSCode, etc., flag all known aliases until you expand them to their full names. Custom aliases you need to expand yourself; never assume folks know them or care to. PowerShell is verbose for a reason (good/bad/indifferent is all opinion): it is easy to read for the most inexperienced, self-documenting, and easier to maintain. Make sure to read the available PowerShell best-practice references, and remember: you write code for those who will use it, follow you, or maintain it.
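For instance, PSScriptAnalyzer (assuming you have the module installed) will flag the aliases used in the earlier snippet:
# The file path is a placeholder; gci and cp are the aliases it flags
Invoke-ScriptAnalyzer -Path .\Copy-Timestamps.ps1 -IncludeRule PSAvoidUsingCmdletAliases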
I need help with running different functions at the same time with the same arguments.
I have a PowerShell script that is built like this:
$ObjectsArray = @(Object1, Object2, Object3)
function function1($arg) {
do something...
}
function function2($arg) {
do something...
}
function function3($arg) {
do something...
}
foreach ($Object in $ObjectsArray) {
function1 -arg $Object.Name
function2 -arg $Object.Name
function3 -arg $Object.Name
}
In my script I have many functions and I want to optimize the code.
Is there any way to run all of these functions in one go, maybe by matching their names with a regex? All the functions take the same arguments.
Thanks!!
Short answer: yes, it's possible.
Longer answer: you will need to separate the various executions into PowerShell jobs. This is a sort of multithreading, but I don't know enough to tell you whether it's actually peeling threads from the (virtual) core(s).
Here is how you call an individual job:
PS C:\temp\StackWork\csvimport> start-job -ScriptBlock {Get-ChildItem c:\ | select Mode,Name | ft -w -auto}
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
9 Job9 BackgroundJob Running True localhost Get-ChildItem c:\ | se...
There you see the output is not the results of the command, but the properties of the job itself. That 'State' field is how you check if the job is still running or completed.
Then this is how you get the resulting output of the job:
PS C:\temp\StackWork\csvimport> receive-job 9
Mode Name
---- ----
d----- inetpub
d----- PerfLogs
d-r--- Program Files
d-r--- Program Files (x86)
d----- Python27
d----- Quarantine
d----- Tech
d----- Temp
d-r--- Users
d----- Windows
Here is how you get the info on a running job:
PS C:\temp\StackWork\csvimport> get-job -Id 9
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
9 Job9 BackgroundJob Completed False localhost
Expanding this really depends on what you need to see in the output and what you need to trigger as a next action. In your example it's only really 3 parallel runs, but as you scale up you may need to track running jobs and set limits so some complete before new ones start. A good rule of thumb I've always heard is 2 x (number of cores - 1) running threads.
All of that is very specific to your needs, but hopefully this helps with the basis of implementation.
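For instance, a rough throttling sketch along those lines ($inputList and $work are placeholders for your own items and script block):
$maxJobs = 6   # e.g. 2 x (cores - 1); tune for your hardware
foreach ($item in $inputList) {
    # Wait for a slot to free up before starting the next job
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Milliseconds 500
    }
    Start-Job -ScriptBlock $work -ArgumentList $item
}
Get-Job | Wait-Job | Receive-Job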
In case you want to avoid the repetition of explicitly enumerating individual function calls that take the same arguments:
# Get function-info objects for the target functions by name pattern
# (wildcard expression), as an array.
# Note: Alternatively, you can store the names explicitly in an array:
# $funcsToCall = 'function1', 'function2', 'function3'
$funcsToCall = Get-Item function:function?
foreach ($object in $ObjectsArray) {
# Loop over all functions in the array and call each.
foreach ($func in $funcsToCall) {
# Use & (call operator) to call a function
# by function-info object or name.
& $func -arg $object.Name
}
}
This will still execute the functions sequentially, however.
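If you also want the calls to run in parallel rather than sequentially, one option on PowerShell 7+ is ForEach-Object -Parallel. A minimal sketch, assuming function1..function3 live in a file you can dot-source (the path below is a placeholder), since parallel runspaces do not inherit functions from the caller:
$ObjectsArray | ForEach-Object -Parallel {
    # Re-load the function definitions inside this runspace
    . 'C:\path\functions.ps1'
    foreach ($func in (Get-Item function:function?)) {
        & $func -arg $_.Name
    }
}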
When working on large SSIS projects containing several packages, the packages can start to get a bit messy with variables that were created but never used or have been made redundant by other changes. I need something that will scan several SSIS packages and list all unused variables.
I have managed to answer my own question by employing some PowerShell. The script below uses XPath to get the variable names and then uses a regex to count occurrences; if a name occurs only once, it must have been defined but never used.
The only caveat is that if you use variable names that are words naturally present in a dtsx file, the script will not pick them up. I probably need to expand my script to only do a regex search on specific nodes in the package.
$results = @()
Get-ChildItem -Filter *.dtsx |
% {
$xml = [xml](Get-Content $_)
$Package = $_.Name
$ns = [System.Xml.XmlNamespaceManager]($xml.NameTable)
$ns.AddNamespace("DTS", "www.microsoft.com/SqlServer/Dts")
$var_list = @($xml.SelectNodes("//DTS:Variable/DTS:Property[@DTS:Name = 'ObjectName']", $ns) | % {$_.'#text'})
$var_list | ? {@([Regex]::Matches($xml.InnerXml, "\b$($_)\b")).Count -eq 1} |
% { $results += New-Object PSObject -Property @{
Package = $Package
Name = $_}
}
}
$results
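If you want to keep a record of the results, you can export them (the path is a placeholder):
$results | Export-Csv -Path 'C:\temp\unused_variables.csv' -NoTypeInformation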
This is a great question, as I have the same concern with a few rather large SSIS packages. Unfortunately, external to the SSIS packages, there isn't any feature available that provides this function. See the discussion in the attached link:
CodePlex
But by opening an SSIS package, you can determine the variable usage by applying the steps outlined in the following link:
find-ssis-variable-dependencies-with-bi-xpress
Hope this helps.
I've put my functions in a separate file and I call the file with:
$workingdir = Split-Path $MyInvocation.MyCommand.Path -Parent
. "$workingdir\serverscan-functions.ps1"
But if I call a function like
my-function
what will the scope of the variables within "my-function" be? Should I still use $script:variable to make a variable exist outside the function, or does dot-sourcing the file take care of that as well?
Hope I don't confuse anyone with my question... I've tried to make it as understandable as possible, but I'm still learning all the basic concepts, so I find it hard to explain.
When you dot-source code, it behaves as if that code were still in the original script. The scopes are the same as if it was all in one file.
C:\functions.ps1 code:
$myVariable = "Test"
function Test-DotSource {
$script:thisIsAvailableInFunctions = "foo"
$thisIsAvailableOnlyInThisFunction = "bar"
}
main.ps1 code
$script:thisIsAvailableInFunctions = ""
. C:\functions.ps1
# Call the function to set values.
Test-DotSource
$script:thisIsAvailableInFunctions -eq "foo"
# Outputs True because of the script: scope modifier
$thisIsAvailableOnlyInThisFunction -eq "bar"
# Outputs False because it's undefined in this scope.
$myVariable -eq "Test"
# Outputs true because it's in the same scope due to dot sourcing.
In order to achieve what you want, you'll probably need to create a module. In the module, export the functions using Export-ModuleMember, and as long as you don't explicitly export any variables as module members, you should be fine.
Once you've created the module, import it using the Import-Module cmdlet.
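A minimal sketch of that module approach (the file name ServerScan.psm1 and the function are hypothetical):
# ServerScan.psm1
$script:scanCount = 0   # module-scoped; invisible to callers

function Invoke-ServerScan {
    param([string]$ComputerName)
    $script:scanCount++
    "Scanning $ComputerName (scan #$($script:scanCount))"
}

# Export only the function; the variable stays internal to the module
Export-ModuleMember -Function Invoke-ServerScan
Then import it instead of dot-sourcing:
Import-Module "$workingdir\ServerScan.psm1"
Invoke-ServerScan -ComputerName 'Server01'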
My 2 cents:
Usually (after a past Andy Arismendi answer! God bless you, man!) I save all my scripts in a $pwd folder (added to the system path environment variable). Then I can call them from the console without dot-sourcing, and no script variables poison the console after a script finishes its job.
If you cannot turn your functions into simple scripts (sometimes it happens), I agree with Trevor's answer: create a module and import it in $profile.
I wrote several PowerShell scripts which deploy software for a client. I used Write-Host to output a lot of information so that the progress of the deploy can be watched; they call these from one of their deploy applications, using Start-Transcript to capture the output.
However, they also need to be able to call some of these scripts from another application which can only capture output from stdout. This means that Write-Host won't work there, since it outputs only to the console/host and doesn't get directed to stdout (correct?).
My thought was that I could change the code to use Write-Output instead, except that this causes another problem: since I use functions, and since functions in PowerShell "return" to the caller everything that goes to the output stream, that would likely screw up any of my code that retrieves output from a function.
Is there a way to direct output to stdout from a function without it going to the calling code as the output of the function itself? Here is an example of the problem:
function Test-Output ([int]$number) {
Write-Output "This is a string"
return $number
}
[int]$someNumber = Test-Output 10
$someNumber
If you run the code above you'll see an error, because PowerShell is trying to assign "This is a string" to the integer $someNumber. If you change the variable to a string, then it will capture the full output of the function ("This is a string 10") and assign it to the variable.
Thanks for any suggestions that you can give!
Function output and stdout are the same thing, so the calling code is going to see anything written to the stdout (success output) stream. In this case I would suggest using the Write-Progress cmdlet to report progress to the end user and leave the actual function output alone.
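Applied to the example from the question, a small sketch (the activity text is made up):
function Test-Output ([int]$number) {
    # Write-Progress renders in the host's progress area, not the output
    # stream, so the caller never captures it
    Write-Progress -Activity 'Deploying' -Status 'This is a string'
    return $number
}
[int]$someNumber = Test-Output 10   # now cleanly assigns 10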
Try this. Write-Host writes to the host (console) rather than to the success output stream, so it is not captured by the caller; only the sum ends up in $result.
function Add-Numbers {
param (
[double] $FirstNumber,
[double] $SecondNumber
)
Write-Host "Hello World"
return ($FirstNumber + $SecondNumber)
}
$result = Add-Numbers 1 2
#Write-Host "Result is $result"