PowerShell IntelliSense on parameters [duplicate] - function

The code below is part of a switch and it's working fine, but the problem is that I need to rename my file to 15... Is it possible to change it so that when I start it, it waits for me to select a file with the Tab key? Something like when you type Import-Csv in a PowerShell console and press Tab, it shows all possible paths and files.
$names = Import-Csv 15.csv -Header Givenname,Surname -Delimiter ";"
Write-Host "Rename your csv file to '15' and put it in same folder with this script" -ForegroundColor Cyan
pause
foreach ($Name in $Names) {
    $FirstFilter = $Name.Givenname
    $SecondFilter = $Name.Surname
    Get-ADUser -Filter {GivenName -like $FirstFilter -and Surname -like $SecondFilter} |
        select Enabled, SamAccountName, DistinguishedName,
            @{n="ou";e={($_.DistinguishedName -split ",*..=")[2]}} |
        Export-Csv .\sam.csv -NoTypeInformation -Append
}

So you want IntelliSense in your script. Ambitious move. Most people would settle for the file browser dialog box. Anyway, I am going to have to refer you to smarter men than me. I was thinking the ValidateSet attribute would serve your purpose, but I realized that the traditional param block is not enough. So I looked up DynamicParam and this is what I found. This should work for you:
https://blogs.technet.microsoft.com/pstips/2014/06/09/dynamic-validateset-in-a-dynamic-parameter/
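To give a flavor of what that article describes, here is a minimal, hedged sketch of a dynamic parameter whose ValidateSet is rebuilt at call time from the *.csv files sitting next to the script; the function name, parameter name, and CSV headers are illustrative assumptions, not taken from the article:
# A sketch only: 'Import-NameList' and 'FileName' are hypothetical names.
function Import-NameList {
    [CmdletBinding()]
    param()
    DynamicParam {
        # Candidate file names; $PSScriptRoot assumes this lives in a script file.
        $csvNames = (Get-ChildItem -Path $PSScriptRoot -Filter *.csv).Name

        $attribute = New-Object System.Management.Automation.ParameterAttribute
        $attribute.Mandatory = $true

        $attributes = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
        $attributes.Add($attribute)
        $attributes.Add((New-Object System.Management.Automation.ValidateSetAttribute($csvNames)))

        $parameter  = New-Object System.Management.Automation.RuntimeDefinedParameter('FileName', [string], $attributes)
        $dictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
        $dictionary.Add('FileName', $parameter)
        $dictionary
    }
    process {
        # Dynamic parameters arrive via $PSBoundParameters rather than a named variable.
        Import-Csv -Path (Join-Path $PSScriptRoot $PSBoundParameters['FileName']) -Header Givenname, Surname -Delimiter ';'
    }
}
With that in place, Import-NameList -FileName <Tab> cycles through only the CSV files found next to the script.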

The simplest solution is to make your script accept the target file as an argument, by declaring a parameter:
param(
    # Declare a mandatory parameter to which the file path of the CSV
    # file to import must be passed as an argument on invocation.
    [Parameter(Mandatory)]
    [string] $FilePath
)
$names = Import-Csv $FilePath -Header Givenname,Surname -Delimiter ";"
foreach ($Name in $Names) {
    $FirstFilter = $Name.Givenname
    $SecondFilter = $Name.Surname
    Get-ADUser -Filter {GivenName -like $FirstFilter -and Surname -like $SecondFilter} |
        select Enabled, SamAccountName, DistinguishedName,
            @{n="ou";e={($_.DistinguishedName -split ",*..=")[2]}} |
        Export-Csv .\sam.csv -NoTypeInformation -Append
}
If you invoke your script without a file path, you will be prompted for it; let's assume your script is located in the current dir. and its name is someScript.ps1:
./someScript # invocation with no argument prompts for a value for $FilePath
Unfortunately, such an automatic prompt is not user-friendly and offers no tab completion.
However, on the command line PowerShell's tab completion defaults to completing file and directory names in the current location, so that:
./someScript <press tab here>
cycles through all files and directories in the current folder.
You can even type a wildcard expression and tab-complete that, if you don't know the full filename or don't want to type it in full:
./someScript *.csv<press tab here>
This will cycle through all *.csv files in the current dir. only.
If you want to go even further and customize tab completion to only cycle through *.csv files, you can use an [ArgumentCompleter({ ... })] attribute (PSv5+):
param(
    [Parameter(Mandatory)]
    # Implement custom tab-completion based on only the *.csv files in the current dir.
    [ArgumentCompleter({
        param($cmd, $param, $wordToComplete)
        Get-ChildItem -Name "$wordToComplete*.csv"
    })]
    [string] $FilePath
)
# ...
Now,
./someScript <tab>
will cycle only through the *.csv files in the current directory, if any.
Caveat: As of PowerShell 7.0, tab-completing an argument for which the ArgumentCompleter script block returns no matches (in this case, with no *.csv files present) unexpectedly falls back to the default file- and directory-name completion - see this GitHub issue.
Similarly,
./someScript 1<tab>
will cycle only through the *.csv files in the current directory whose name starts with 1, if any.
As an alternative to using an attribute as part of a script's / function's definition, you can use the PSv5+ Register-ArgumentCompleter cmdlet to attach tab completions to the parameters of any command, i.e., including preexisting ones.
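For instance, here is a rough sketch (reusing the script name and -FilePath parameter assumed in the example above) of how the same *.csv completion could be attached externally:
# A sketch only: './someScript.ps1' and 'FilePath' are the names assumed from the example above.
Register-ArgumentCompleter -CommandName 'someScript.ps1' -ParameterName 'FilePath' -ScriptBlock {
    param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameters)
    Get-ChildItem -Name "$wordToComplete*.csv" | ForEach-Object {
        # Wrap each file name in a CompletionResult so it displays cleanly in menus.
        [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
    }
}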
In PSv4- you have two (cumbersome) options for custom tab completion:
Use a dynamic parameter with a dynamically constructed [ValidateSet()] attribute - see the link in Rohin Sidharth's answer.
Customize the tabexpansion2 (PSv3, PSv4) / tabexpansion (PSv1, PSv2) function, but be sure not to accidentally replace existing functionality.

Below is my example.ps1 file that I use to write one-off scripts. In your case I think you can get what you want with it. For example (no pun intended), you could call this script by typing
C:\PathToYourScripts\example.ps1 [tab]
where [tab] represents pressing the Tab key. PowerShell IntelliSense will kick in and offer autocompletion for file names. If your .csv file is not in the current directory, you can easily use PowerShell IntelliSense to help you find it:
C:\PathToYourScripts\example.ps1 C:\PathToCsvFiles[tab]
and PowerShell will autocomplete. Would-be downvoters might notice that PowerShell autocomplete is definitely NOT a complete file picker, but this seems to fulfill the intent of the question. Here's the sample.
<#
.NOTES
    this is an example script
.SYNOPSIS
    this is an example script
.DESCRIPTION
    this is an example script
.EXAMPLE
    this is an example script
.LINK
    https://my/_git/GitDrive
#>
[CmdletBinding(SupportsShouldProcess=$True, ConfirmImpact="Low")]
param (
    [string] $fileName
)
Begin {
}
Process {
    if ($PSCmdlet.ShouldProcess("Simulated execution to process $($fileName): Omit -Whatif to process ")) {
        Write-Information -Message "Processing $fileName" -InformationAction Continue
    }
}
End {
}
If you want to get autocomplete help for multiple parameters, just type in the parameter name(s) and press [tab] after each one. Note that leaving the parameters blank will not break the script, but you can either extend this to mark the parameters as required or just fail with a helpful message. That seems a bit beyond the original question, so I'll stop here.

Related

Error trying to get data from JSON file in PowerShell [duplicate]

The txt file is just a bunch of UNC paths. I am trying to get a list of UNC paths from this text file into another text file after Test-Path validates them. It shows the validated paths on screen, but the text file does not populate.
$cfgs = Get-Content .\cfgpath.txt
$cfgs | % {
    if (Test-Path $_) { write-host "$_" | Out-File -FilePath c:\temp\1.txt -Append }
}
To complement Zam's helpful answer with background information:
Write-Host writes to the host[1] (typically, the console aka terminal), which bypasses PowerShell's success output stream and therefore sends nothing through the pipeline.
See the bottom section of this answer for when Write-Host is appropriate; in short: you should generally only use it for display-only output.
Write-Output is the appropriate cmdlet for producing data output, but it is rarely necessary, because you can rely on PowerShell's convenient implicit output behavior, as shown in Zam's answer and explained in this answer.
Also, your command will perform much better if you simply pipe the % (ForEach-Object) command's output as a whole to a single Out-File call, rather than calling Out-File -Append for each input path.
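For illustration, a single-Out-File variant of the original loop might look like this (a sketch, reusing the paths from the question):
# One Out-File call receives everything the ForEach-Object block emits.
Get-Content .\cfgpath.txt | ForEach-Object {
    if (Test-Path $_) { $_ }   # implicit output instead of Write-Host
} | Out-File -FilePath c:\temp\1.txt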
Instead of using % with conditional explicit output, you can more elegantly implement your command with the Where-Object cmdlet:
Get-Content .\cfgpath.txt |
    Where-Object { Test-Path $_ } |
    Out-File -FilePath c:\temp\1.txt
Also note that for saving strings to a file it is more efficient to use Set-Content instead of Out-File, though in Windows PowerShell the two cmdlets' default output character encodings differ (no longer a concern in PowerShell [Core] 6+, which consistently defaults to BOM-less UTF-8); see this answer for when to choose which cmdlet.
By contrast, Out-File and > (its effective alias) use PowerShell's formatting system to write for-display representations of any non-string input objects to the output file, the same way that output renders to the display by default.
In other words: To save objects to a file in a way that is suitable for later programmatic processing, you need to use a structured file format, such as CSV (Export-Csv) or JSON (ConvertTo-Json, combined with Set-Content).
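As a quick, hedged illustration (the output file names here are just examples):
# Structured output suitable for later programmatic processing:
Get-ChildItem -File |
    Select-Object Name, Length, LastWriteTime |
    Export-Csv -Path .\files.csv -NoTypeInformation   # CSV

Get-ChildItem -File |
    Select-Object Name, Length, LastWriteTime |
    ConvertTo-Json |
    Set-Content -Path .\files.json                    # JSON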
[1] In PowerShell 5.0 and higher, Write-Host now writes to a new stream, the information stream (number 6), which by default prints to the host. See about_Redirection.
Therefore, a 6> redirection now technically does allow you to send Write-Host output through the pipeline (though doing so is not a good idea) or capture / redirect it; e.g.,
Write-Host hi 6>&1 | % { "[$_]" }. Note that the type of the objects output by this redirection is System.Management.Automation.InformationRecord.
Write-Host only writes to the console. I believe what you want there is Write-Output.
$cfgs = Get-Content .\cfgpath.txt
$cfgs | % {
    if (Test-Path $_) { write-output "$_" | Out-File -FilePath c:\temp\1.txt -Append }
}
Additionally, you can just omit the Write-Output and that works too.
$cfgs = Get-Content .\cfgpath.txt
$cfgs | % {
    if (Test-Path $_) { "$_" | Out-File -FilePath c:\temp\1.txt -Append }
}

Generate an HTML file with variables in shell (Automator)

Basically, I have this "workflow" that I find myself doing frequently and would love to automate:
create a folder with a new name in a specific folder (the path doesn't change)
create an index.html file in that folder
edit the index.html with 2 key variables (A web title and an https: link)
run a script
Here's how far I've gotten in Automator:
Ask for new folder name
Save as variable
Ask for web title name
Save as variable
Ask for link
Save as variable
Run shell script to cd to the right folder and "touch index.html"
Now I'm stuck. How would I edit the index.html using the two other variables mentioned? Is there a way to edit or "replace" the file's contents while using Automator variables?
TIA!
Try adding the following to 'Run Shell Script' in the Automator workflow:
for var in "$@"
do
    echo "$var" >> /path/to/index.html
done
and then setting "Pass input:" above the "Run Shell Script" module to: 'as arguments'
What this loop does is run the commands between do and done once for every variable you passed in from your Automator workflow. Alternatively, you can replace for var in "$@" with just for var, since a for loop without an in clause iterates over the positional parameters automatically.
> and >> are bash shell operators. >> appends to a file, or creates the file if it doesn't exist. > overwrites the file if it exists, or creates it if it doesn't. You may remove the touch command, unless you wish to create an empty file whether or not any variables are supplied.
If you need to differentiate between your variables, you don't even need a for loop, and can simply run:
echo $1 >> /path/to/index.html
echo $2 >> "/path to/index.html" # *or* /path\ to/index.html
# ^ if the directory of the file contains spaces
echo "The third supplied variable is: ${3}" >> /path/to/index.html
# ^ if you wish to add additional text to the variable
and so on, following the order in which you set your automator variables. Just make sure "Pass input:" is still set to 'as arguments'.

How do I turn this into a function?

I found this script on this site <thanks Nick!>
$files = gci $srcpath
foreach ($srcfile in $files) {
    # Build destination file path
    $dstfile = [string]($dstpath, '\', $srcfile.name -join '')
    # Copy the file
    cp $srcfile.FullName $dstfile.FullName -whatif
    # Make sure file was copied and exists before copying over properties/attributes
    if ($dstfile.Exists) {
        # $dstfile.CreationTime = $srcfile.CreationTime
        # $dstfile.LastAccessTime = $srcfile.LastAccessTime
        $dstfile.LastWriteTime = $srcfile.LastWriteTime
        # $dstfile.Attributes = $srcfile.Attributes
        # $dstfile.SetAccessControl($srcfile.GetAccessControl())
    }
}
I want to turn this into a function so that it recurses through directories, copying the timestamps from the source (a cloud folder) to its equivalent at the destination.
Now, I have tried calling the function using multiple variables, trying different methods such as:
$src = $args[0]
$dst = $args[1]
Get-Timestamp $src, $dst
It either uses the default folder that the script is running from or fails when it tries to list the contents after combining the two variables together.
even setting up the function like so
[CmdletBinding(DefaultParamerterSetName='Srcpath')]
param (
    [Parameter(Mandatory = $true,
        ParameterSetName = 'Srcpath',
        Position = 0)]
    [string[]]$Srcpath,
    # [Parameter(Mandatory = $true,
    #     ParameterSetName = 'DSTpath',
    #     Position = 1)]
    [string]$DSTpath
)
$PSCmdlet.ParameterSetName
is not producing the expected result.
This will work on its own, doing one folder at a time. But not when doing subfolders.
Suggestions would be greatly appreciated.
From the PowerShell ISE or VSCode select a simple or advanced function snippet.
PowerShell ISE - use CTRL+J, type function, hit Enter, and you get this:
function MyFunction ($param1, $param2)
{
}
VSCode - use CTRL+ALT+J, type function, hit Enter, and you get this:
function FunctionName {
    param (
        OptionalParameters
    )
}
Put your code below the param block. Now, neither of those function names follows best practice; they should be verb-noun, as documented here:
About Functions | MSDocs
Function Names
You can assign any name to a function, but functions
that you share with others should follow the naming rules that have
been established for all PowerShell commands.
Function names should consist of a verb-noun pair in which the verb
identifies the action that the function performs and the noun
identifies the item on which the cmdlet performs its action.
Functions should use the standard verbs that have been approved for
all PowerShell commands. These verbs help us to keep our command names
simple, consistent, and easy for users to understand.
For more information about the standard PowerShell verbs, see Approved
Verbs in the Microsoft Docs.
You are not getting a recursive search because you are not asking for one; that is what the -Recurse parameter of the Get-ChildItem cmdlet is for. A rough sketch of how the pieces could fit together is shown below, followed by ways to explore Get-ChildItem itself.
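This sketch wraps the timestamp-copying logic in a verb-noun function that walks subfolders and stamps already-copied files at the destination; the function name, parameter names, and relative-path handling are illustrative assumptions, not the poster's final design:
# A sketch only: 'Copy-FileTimestamp' and its parameters are hypothetical names.
function Copy-FileTimestamp {
    [CmdletBinding(SupportsShouldProcess = $true)]
    param (
        [Parameter(Mandatory = $true, Position = 0)]
        [string] $SourcePath,

        [Parameter(Mandatory = $true, Position = 1)]
        [string] $DestinationPath
    )

    # -Recurse walks subfolders; -File limits the results to files.
    Get-ChildItem -Path $SourcePath -Recurse -File | ForEach-Object {
        # Rebuild the file's path relative to the source root, then under the destination root.
        $relative = $_.FullName.Substring($SourcePath.Length).TrimStart('\')
        $target   = Join-Path $DestinationPath $relative

        if (Test-Path -LiteralPath $target) {
            if ($PSCmdlet.ShouldProcess($target, "Copy LastWriteTime from $($_.FullName)")) {
                # Copy the source file's last-write time onto the existing destination file.
                (Get-Item -LiteralPath $target).LastWriteTime = $_.LastWriteTime
            }
        }
    }
}

# Example call (add -WhatIf to preview without changing anything):
# Copy-FileTimestamp -SourcePath 'C:\CloudFolder' -DestinationPath 'D:\Mirror' -WhatIf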
# Get specifics for a module, cmdlet, or function
(Get-Command -Name Get-ChildItem).Parameters
(Get-Command -Name Get-ChildItem).Parameters.Keys
# Results
<#
Path
LiteralPath
Filter
Include
Exclude
Recurse
Depth
Force
Name
Verbose
Debug
ErrorAction
WarningAction
InformationAction
ErrorVariable
WarningVariable
InformationVariable
OutVariable
OutBuffer
PipelineVariable
UseTransaction
Attributes
Directory
File
Hidden
ReadOnly
System
#>
Get-help -Name Get-ChildItem -Examples
# Results
<#
Example 3: Get child items in the current directory and subdirectories
Get-ChildItem -Path C:\Test\*.txt -Recurse -Force
Directory: C:\Test\Logs\Adirectory
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 2/12/2019 16:16 20 Afile4.txt
-a-h-- 2/12/2019 15:52 22 hiddenfile.txt
-a---- 2/13/2019 13:26 20 LogFile4.txt
Directory: C:\Test\Logs\Backup
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 2/12/2019 16:16 20 ATextFile.txt
-a---- 2/12/2019 15:50 20 LogFile3.txt
#>
Get-help -Name Get-ChildItem -Full
Get-help -Name Get-ChildItem -Online
Lastly, aliases/shorthand names are for interactive throw-away code, not in production/shared scripts. As discussed here:
• Best Practices for aliases
https://devblogs.microsoft.com/scripting/best-practice-for-using-aliases-in-powershell-scripts
https://devblogs.microsoft.com/scripting/using-powershell-aliases-best-practices
Why worry about aliases in the first place? ... There are two things
at work when it comes to a script. The first is that no alias is
guaranteed to exist —even aliases that are created by Windows
PowerShell. ...
... and if you create custom ones, you can step on one already configured on a host.
Finally, PSScriptAnalyzer, VSCode, etc., flag all known aliases as errors until you expand them to their full names; custom aliases you need to expand yourself. Never assume folks know them or care to. PowerShell is verbose for a reason (whether that is good, bad, or indifferent is a matter of opinion): it is easy to read for the most inexperienced, self-documenting, and easier to maintain. Make sure to read the available PowerShell best-practice references, and remember that you write code for those who will use it, follow you, or maintain it.

Powershell file URL / file filtering

I am trying to generate an HTML page with a file index. This approach worked seamlessly:
$htmlout = Get-ChildItem -Path "$SearchPath" -Filter "$fileType" -Recurse |
    Select @{Name="Link";Expression={("<a rel=" + $_.FullName + " href=file:///" + $_.FullName + ">$_</a>")}}
The Link column had file names only (e.g. test.txt) and displayed the file content when clicking on it. Then we got an additional requirement to skip old files. The script is now:
$htmlout = Get-ChildItem -Path "$SearchPath" -Recurse -include ("$fileType") |
    Where-Object {$_.LastWriteTime -ge "01/01/2014"} |
    Select @{Name="Link";Expression={("<a rel=" + $_.FullName + " href=file:///" + $_.FullName + ">$_</a>")}}
It still works, but the Link column now displays the entire file path plus the file name (e.g. \\fileserver\folder1\folder2\test.txt).
Adding >$_.Name< does not work here.
I am trying to understand why the same URL line behaves differently after the filter change.
Background
As far as I can tell, there is a discrepancy with the interaction between the .ToString() method and the DefaultDisplayProperty of objects returned by Get-ChildItem.
The behavior manifests when both of the following conditions are true:
- The -filter parameter is being used.
- The value of the -Path parameter resolves to a single directory, whether or not -Recurse is used.
Under the above circumstances, the .ToString() method implemented by PowerShell uses the Name property as default, rather than FullName as is the case in all other scenarios.
My guess is that this inconsistency is due to the underlying object types returned by the FileSystem provider when -Filter is used, rather than the objects PowerShell returns when it handles the search/filter itself (as is the case with -Include).
Observation
When you wrap your $_ pipeline object variable in double quotes, PowerShell's type-conversion implicitly calls the .ToString() method and you get the resulting name variation.
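A quick, hedged way to observe this yourself (assuming a C:\Temp folder containing at least one .txt file; the exact behavior may vary by PowerShell version):
# -Filter against a single directory: ToString() yields just the name.
(Get-ChildItem -Path C:\Temp -Filter *.txt | Select-Object -First 1).ToString()

# -Include (PowerShell does the filtering): ToString() yields the full path.
(Get-ChildItem -Path C:\Temp\* -Include *.txt | Select-Object -First 1).ToString()

# Double-quoting a FileInfo object implicitly calls ToString():
Get-ChildItem -Path C:\Temp -Filter *.txt | ForEach-Object { "$_" }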
Solution
To correct your issue, you could simply use -Filter in both code examples and get the desired output; however, that is prone to cause problems sooner or later.
The more appropriate way to negate the problem is to properly use a PowerShell sub-expression within the double-quoted strings.
To create a sub-expression, simply wrap the desired code like so: $( ). This creates a separation between which characters are code and which are part of the string, in your case allowing you to use the member-access operator (.). The method also alleviates the need to do string concatenation with the + operator.
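A tiny, hedged illustration of the difference (the file name here is hypothetical):
$file = Get-Item .\test.txt    # hypothetical file
"Name is $file.Name"           # expands $file via ToString(), then appends the literal text '.Name'
"Name is $($file.Name)"        # the sub-expression is evaluated first -> 'Name is test.txt'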
Solution Code:
$HTMLOut = Get-ChildItem -Path $SearchPath -Recurse -Include $FileType |
    Where-Object {$_.LastWriteTime -ge "01/01/2014"} |
    Select @{Name="Link";Expression={("<a rel=$($_.FullName) href=file:///$($_.FullName)>$($_.Name)</a>")}}

SSIS - find unused variables in several packages

When working on large SSIS projects containing several packages, the packages can start to get a bit messy with variables that were created but never used or have been made redundant by other changes. I need something that will scan several SSIS packages and list all unused variables.
I have managed to answer my own question by employing some PowerShell. The script below uses XPath to get the variable names and then uses a regex to count the occurrences of each name; if a name occurs only once, it must be because it was defined but never used.
The only caveat is that if you use variable names that are words that would naturally be present in a dtsx file, the script will not pick them up. I probably need to expand the script to only do a regex search on specific nodes in the package.
$results = @()
Get-ChildItem -Filter *.dtsx |
    % {
        $xml = [xml](Get-Content $_)
        $Package = $_.Name
        $ns = [System.Xml.XmlNamespaceManager]($xml.NameTable)
        $ns.AddNamespace("DTS", "www.microsoft.com/SqlServer/Dts")
        $var_list = @($xml.SelectNodes("//DTS:Variable/DTS:Property[@DTS:Name = 'ObjectName']", $ns) | % {$_.'#text'})
        $var_list | ? {@([Regex]::Matches($xml.InnerXml, "\b$($_)\b")).Count -eq 1} |
            % { $results += New-Object PSObject -Property @{
                    Package = $Package
                    Name    = $_ }
            }
    }
$results
This is a great question, as I have the same concern with a few rather large SSIS packages. Unfortunately, external to the SSIS packages, there isn't any feature available that will provide this functionality. See the discussions in the attached link:
CodePlex
But by opening an SSIS package, you can determine the variable usage by applying the steps outlined in the following link:
find-ssis-variable-dependencies-with-bi-xpress
Hope this helps.