When using PowerShell's dynamic parameters, can I suppress errors?
Specifically the error being:
f foo
Search-FrequentDirectory : Cannot validate argument on parameter 'dirSearch'. The argument "foo" does not belong to the set
"bin,omega,ehiller,psmodules,deploy,gh.riotgames.com,build-go,vim74,cmder,dzr,vimfiles,src,openssh,git" specified by the ValidateSet attribute. Supply an argument
that is in the set and then try the command again.
At line:1 char:3
+ f foo
+ ~~~
+ CategoryInfo : InvalidData: (:) [Search-FrequentDirectory], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationError,Search-FrequentDirectory
Dynamic parameters being:
DynamicParam {
$dirSearch = new-object -Type System.Collections.ObjectModel.Collection[System.Attribute]
# [parameter(mandatory=...,
# ...
# )]
$dirSearchParamAttribute = new-object System.Management.Automation.ParameterAttribute
$dirSearchParamAttribute.Mandatory = $true
$dirSearchParamAttribute.Position = 1
$dirSearchParamAttribute.HelpMessage = "Enter one or more module names, separated by commas"
$dirSearch.Add($dirSearchParamAttribute)
# [ValidateSet(...)]
$dirPossibles = @()
$historyFile = (Get-PSReadlineOption).HistorySavePath
# directory separating character for the OS; \ (escaped to \\) for Windows (as in C:\Users\); / for Linux (as in /var/www/);
# a catch-all would be \\\/ ; but this invalidates the whitespace escape character that may be used mid-directory.
$dirSep = "\\"
# Group[1] = Directory , Group[length-1] = lowest folder
$regex = "^[[:blank:]]*cd ([a-zA-Z\~:]+([$dirSep][^$dirSep]+)*[$dirSep]([^$dirSep]+)[$dirSep]?)$"
# original: ^[[:blank:]]*cd [a-zA-Z\~:\\\/]+([^\\\/]+[\\\/]?)*[\\\/]([^\\\/]+)[\/\\]?$
# test for historyFile existence
if( -not (Test-Path $historyFile )){
Write-Warning "File $historyFile not found, unable to load command history. Exiting.";
return 1;
}
$historyLines = Get-Content $historyFile
# create a hash table, format of ;;; [directory path] = [lowest directory]
$searchHistory = @{}
# create a hash table for the count (number of times the command has been run)
$searchCount = @{}
ForEach ( $line in $historyLines ) {
if( $line -match $regex ){
try {
# since the matches index can change, and a hashtable.count is not a valid way to find the index...
# I need this to figure out the highest integer index
$lowestDirectory = $matches[($matches.keys | sort -Descending | Select-Object -First 1)]
$fullPath = $matches[1]
if($searchHistory.keys -notcontains $matches[1]){
$searchHistory.Add($matches[1],$lowestDirectory)
}
$searchCount[$fullPath] = 1
} catch {
$searchCount[$fullPath]++
}
}
}
# this helps with hashtables
# https://www.simple-talk.com/sysadmin/powershell/powershell-one-liners-collections-hashtables-arrays-and-strings/
$dirPossibles = ( $searchHistory.values | Select -Unique )
$modulesValidated_SetAttribute = New-Object -type System.Management.Automation.ValidateSetAttribute($dirPossibles)
$dirSearch.Add($modulesValidated_SetAttribute)
# Remaining boilerplate
$dirSearchDefinition = new-object -Type System.Management.Automation.RuntimeDefinedParameter("dirSearch", [String[]], $dirSearch)
$paramDictionary = new-object -Type System.Management.Automation.RuntimeDefinedParameterDictionary
$paramDictionary.Add("dirSearch", $dirSearchDefinition)
return $paramDictionary
}
The function works: when I'm in the set, everything is great. When I miskey or whatever, I get a rather unpleasant, non-user-friendly error, which I would like to style.
Is there a way to do this, that is, to suppress the error? I tried try/catch and it was a no-go, and I haven't been able to find much else on error suppression in dynamic parameters.

I found a way to do it, though I'm not sure I really recommend its use, and it has the downside of duplicated code. Maybe there is a better way to solve this, but I did this more as an exercise to see whether it could be done.
Code: Get-DynamicParamTestCustom.ps1
<#
Reference: http://blog.enowsoftware.com/solutions-engine/bid/185867/Powershell-Upping-your-Parameter-Validation-Game-with-Dynamic-Parameters-Part-II
#>
[CmdletBinding()]
param (
)
DynamicParam {
function New-ValidationDynamicParam {
[CmdletBinding()]
[OutputType('System.Management.Automation.RuntimeDefinedParameter')]
param (
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[string]$Name,
[ValidateNotNullOrEmpty()]
[Parameter(Mandatory)]
[array]$ValidateSetOptions,
[Parameter()]
[switch]$Mandatory = $false,
[Parameter()]
[string]$ParameterSetName = '__AllParameterSets',
[Parameter()]
[switch]$ValueFromPipeline = $false,
[Parameter()]
[switch]$ValueFromPipelineByPropertyName = $false
)
$AttribColl = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
$ParamAttrib = New-Object System.Management.Automation.ParameterAttribute
$ParamAttrib.Mandatory = $Mandatory.IsPresent
$ParamAttrib.ParameterSetName = $ParameterSetName
$ParamAttrib.ValueFromPipeline = $ValueFromPipeline.IsPresent
$ParamAttrib.ValueFromPipelineByPropertyName = $ValueFromPipelineByPropertyName.IsPresent
$AttribColl.Add($ParamAttrib)
# Use the function's own bound parameters, not the caller's $Param variable
$AttribColl.Add((New-Object System.Management.Automation.ValidateSetAttribute($ValidateSetOptions)))
$RuntimeParam = New-Object System.Management.Automation.RuntimeDefinedParameter($Name, [string], $AttribColl)
$RuntimeParam
}
function Get-ValidValues
{
# get list of valid values
$validValues = @()
$validValues += 'a'
$validValues += 'b'
$validValues += 'c'
$validValues += $global:dynamic1Value
$validValues
}
# coerce the current passed value into our list, and we detect later
# to customize message
# the heart of this problem is getting the param from the Call Stack
# and stashing it away (a hack, but it IS a solution).
$line = (Get-PSCallStack | Select -First 1 | Select *).InvocationInfo.Line
# parse this for the command line arg
# TODO: make this more robust
$null = $line -match "-Dynamic1 (.*?)(\s+|$)"
$global:dynamic1Value = $Matches[1]
$ParamOptions = @(
@{
'Name' = 'Dynamic1';
'Mandatory' = $true;
'ValidateSetOptions' = Get-ValidValues
}
)
$RuntimeParamDic = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
foreach ($Param in $ParamOptions)
{
$RuntimeParam = New-ValidationDynamicParam @Param
$RuntimeParamDic.Add($Param.Name, $RuntimeParam)
}
return $RuntimeParamDic
}
begin
{
$PsBoundParameters.GetEnumerator() | foreach { New-Variable -Name $_.Key -Value $_.Value -ea 'SilentlyContinue'}
}
process
{
# not sure how else to write this because the function needs to be inside
# DynamicParam{} block for its usage, and here for 'process' usage.
function Get-ValidValuesReal
{
# get list of valid values
$validValues = @()
$validValues += 'a'
$validValues += 'b'
$validValues += 'c'
$validValues
}
function foo
{
}
Write-Output "global:dynamic1Value is: '$($global:dynamic1Value)'."
Write-Output "Dynamic1 is: '$($Dynamic1)'."
$realValues = Get-ValidValuesReal
if ($global:dynamic1Value -notin $realValues)
{
Write-Error "Hey, '$global:dynamic1Value' is not allowed."
}
else
{
Write-Output "Dynamic1 is: '$($Dynamic1)' and is cool."
}
}
end {}
Test Cases
.\Get-DynamicParamTestCustom.ps1 -Dynamic1 t
.\Get-DynamicParamTestCustom.ps1 -Dynamic1 test
.\Get-DynamicParamTestCustom.ps1 -Dynamic1 test -Verbose
.\Get-DynamicParamTestCustom.ps1 -Dynamic1 a
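As a further note, if the goal is just a friendlier failure, a different tactic is to drop ValidateSet and attach an ArgumentCompleterAttribute instead (a sketch only, with an illustrative value list; requires PowerShell 5.0+): tab completion still offers the set, but validation, and therefore the error text, moves into the function body where it can be styled freely.
function Search-FrequentDirectory {
    [CmdletBinding()]
    param()
    DynamicParam {
        $dirSearch = New-Object -Type System.Collections.ObjectModel.Collection[System.Attribute]
        $paramAttr = New-Object System.Management.Automation.ParameterAttribute
        $paramAttr.Mandatory = $true
        $paramAttr.Position = 1
        $dirSearch.Add($paramAttr)
        # Completer instead of validator: suggests values but never throws on a miss.
        $completer = {
            param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameters)
            'bin', 'src', 'vimfiles' -like "$wordToComplete*"   # illustrative set
        }
        $dirSearch.Add((New-Object System.Management.Automation.ArgumentCompleterAttribute($completer)))
        $paramDef = New-Object System.Management.Automation.RuntimeDefinedParameter('dirSearch', [string[]], $dirSearch)
        $paramDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
        $paramDictionary.Add('dirSearch', $paramDef)
        return $paramDictionary
    }
    process {
        $dirPossibles = 'bin', 'src', 'vimfiles'   # would be rebuilt from history in practice
        foreach ($dir in $PSBoundParameters['dirSearch']) {
            if ($dirPossibles -notcontains $dir) {
                Write-Warning "'$dir' is not in the directory history."   # the styled, friendly error
                return
            }
        }
        # ...actual directory search goes here...
    }
}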

Related

What is a good way to read data from CSV and convert it to JSON?

I am trying to read the data from a CSV file which has 2,200,000 records using PowerShell and store each record in a JSON file, but this takes almost 12 hours.
Sample CSV data (not included here): we only care about the first column's values.
Code:
function Read-IPData
{
$dbFilePath = Get-ChildItem -Path $rootDir -Filter "IP2*.CSV" | ForEach-Object{ $_.FullName }
Write-Host "file path - $dbFilePath"
Write-Host "Reading..."
$data = Get-Content -Path $dbFilePath | Select-Object -Skip 1
Write-Host "Reading data finished"
$count = $data.Count
Write-host "Total $count records found"
return $data
}
function Convert-NumberToIP
{
param(
[Parameter(Mandatory=$true)][string]$number
)
try
{
$w = [int64]($number/16777216)%256
$x = [int64]($number/65536)%256
$y = [int64]($number/256)%256
$z = [int64]$number%256
$ipAddress = "$w.$x.$y.$z"
Write-Host "IP Address - $ipAddress"
return $ipAddress
}
catch
{
Write-Host "$_"
continue
}
}
Write-Host "Getting IP Addresses from $dbFileName"
$data = Read-IPData
Write-Host "Checking whether output.json file exist, if not create"
$outputFile = Join-Path -Path $rootDir -ChildPath "output.json"
if(!(Test-Path $outputFile))
{
Write-Host "$outputFile does not exist, creating..."
New-Item -Path $outputFile -type "file"
}
foreach($item in $data)
{
$row = $item -split ","
$ipNumber = $row[0].trim('"')
Write-Host "Converting $ipNumber to ipaddress"
$toIpAddress = Convert-NumberToIP -number $ipNumber
Write-Host "Preparing document JSON"
$object = [PSCustomObject]@{
"ip-address" = $toIpAddress
"is-vpn" = "true"
"@timestamp" = (Get-Date).ToString("o")
}
$document = $object | ConvertTo-Json -Compress -Depth 100
Write-Host "Adding document - $document"
Add-Content -Path $outputFile $document
}
Could you please help optimize the code, or is there a better way to do it? Or is there a way to use something like multi-threading?
Here is a possible optimization:
function Get-IPDataPath
{
$dbFilePath = Get-ChildItem -Path $rootDir -Filter "IP2*.CSV" | ForEach-Object FullName | Select-Object -First 1
Write-Host "file path - $dbFilePath"
$dbFilePath # implicit output
}
function Convert-NumberToIP
{
param(
[Parameter(Mandatory=$true)][string]$number
)
[Int64] $numberInt = 0
if( [Int64]::TryParse( $number, [ref] $numberInt ) ) {
if( ($numberInt -ge 0) -and ($numberInt -le 0xFFFFFFFFl) ) {
# Convert to IP address like '192.168.23.42'
([IPAddress] $numberInt).ToString()
}
}
# In case TryParse() returns $false or the number is out of range for an IPv4 address,
# the output of this function will be empty, which converts to $false in a boolean context.
}
$dbFilePath = Get-IPDataPath
$outputFile = Join-Path -Path $rootDir -ChildPath "output.json"
Write-Host "Converting CSV file $dbFilePath to $outputFile"
$object = [PSCustomObject]@{
'ip-address' = ''
'is-vpn' = 'true'
'@timestamp' = ''
}
# Enclose foreach loop in a script block to be able to pipe its output to Set-Content
& {
foreach( $item in [Linq.Enumerable]::Skip( [IO.File]::ReadLines( $dbFilePath ), 1 ) )
{
$row = $item -split ','
$ipNumber = $row[0].trim('"')
if( $ip = Convert-NumberToIP -number $ipNumber )
{
$object.'ip-address' = $ip
$object.'@timestamp' = (Get-Date).ToString('o')
# Implicit output
$object | ConvertTo-Json -Compress -Depth 100
}
}
} | Set-Content -Path $outputFile
Remarks for improving performance:
Avoid Get-Content; especially for line-by-line processing, it tends to be slow. A much faster alternative is the File.ReadLines method. To skip the header line, use the Linq.Enumerable.Skip() method. (A quick way to compare the two is sketched after this list.)
There is no need to read the whole CSV into memory first. Using ReadLines in a foreach loop does lazy enumeration, i.e. it reads only one line per loop iteration. This works because it returns an enumerator instead of a collection of lines.
Avoid try and catch if exceptions occur often, because the "exceptional" code path is very slow. Instead use Int64.TryParse() which returns a boolean indicating successful conversion.
Instead of "manually" converting the IP number to bytes, use the IPAddress class which has a constructor that takes an integer number. Use its method .GetAddressBytes() to get an array of bytes in network (big-endian) order. Finally use the PowerShell -join operator to create a string of the expected format.
Don't allocate a [pscustomobject] for each row, which has some overhead. Create it once before the loop and inside the loop only assign the values.
Avoid Write-Host (or any output to the console) within inner loops.
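To put rough numbers on the Get-Content remark, a quick comparison along these lines can be run (the path is illustrative; timings will vary by machine):
# Rough timing of the two read strategies over the same file.
$csvPath = 'C:\data\IP2LOCATION.CSV'
(Measure-Command { Get-Content -Path $csvPath | ForEach-Object { } }).TotalSeconds
(Measure-Command { foreach( $line in [IO.File]::ReadLines( $csvPath ) ) { } }).TotalSeconds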
Unrelated to performance:
I've removed the New-Item call to create the output file, which isn't necessary because Set-Content automatically creates the file if it doesn't exist.
Note that the output is in NDJSON format, where each line is a complete JSON document. In case you actually want this to be a regular JSON file, enclose the output in [ ] and insert a comma , between the rows.
Modified processing loop to write a regular JSON file instead of NDJSON file:
& {
'[' # begin array
$first = $true
foreach( $item in [Linq.Enumerable]::Skip( [IO.File]::ReadLines( $dbFilePath ), 1 ) )
{
$row = $item -split ','
$ipNumber = $row[0].trim('"')
if( $ip = Convert-NumberToIP -number $ipNumber )
{
$object.'ip-address' = $ip
$object.'@timestamp' = (Get-Date).ToString('o')
$row = $object | ConvertTo-Json -Compress -Depth 100
# write array element delimiter if necessary
if( $first ) { $row; $first = $false } else { ",$row" }
}
}
']' # end array
} | Set-Content -Path $outputFile
You can optimize the function Convert-NumberToIP like below:
function Convert-NumberToIP {
param(
[Parameter(Mandatory=$true)][uint32]$number
)
# either do the math yourself like this:
# $w = ($number -shr 24) -band 255
# $x = ($number -shr 16) -band 255
# $y = ($number -shr 8) -band 255
# $z = $number -band 255
# '{0}.{1}.{2}.{3}' -f $w, $x, $y, $z # output the dotted IP string
# or use .Net:
$n = ([IPAddress]$number).GetAddressBytes()
[array]::Reverse($n)
([IPAddress]$n).IPAddressToString
}
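As a quick sanity check: Convert-NumberToIP 3232235777 should return '192.168.1.1', since 3232235777 is 0xC0A80101 read in network byte order.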

Convert and format XML to JSON on Azure Storage Account in Powershell

So I'm trying to convert XML files in an Azure Storage container to JSON in the same container.
This way I'm able to read the information into an Azure SQL Database via an Azure Data Factory.
I'd like to steer clear of using Logic Apps if possible.
The JSON files need to be formatted.
And all this through the use of PowerShell scripting.
What I've got so far, after some searching on the interwebs and shamelessly copying and pasting PowerShell code:
#Connect-AzAccount
# Helper function that converts a *simple* XML document to a nested hashtable
# with ordered keys.
function ConvertFrom-Xml {
param([parameter(Mandatory, ValueFromPipeline)] [System.Xml.XmlNode] $node)
process {
if ($node.DocumentElement) { $node = $node.DocumentElement }
$oht = [ordered] @{}
$name = $node.Name
if ($node.FirstChild -is [system.xml.xmltext]) {
$oht.$name = $node.FirstChild.InnerText
} else {
$oht.$name = New-Object System.Collections.ArrayList
foreach ($child in $node.ChildNodes) {
$null = $oht.$name.Add((ConvertFrom-Xml $child))
}
}
$oht
}
}
function Format-Json
{
<#
.SYNOPSIS
Prettifies JSON output.
.DESCRIPTION
Reformats a JSON string so the output looks better than what ConvertTo-Json outputs.
.PARAMETER Json
Required: [string] The JSON text to prettify.
.PARAMETER Minify
Optional: Returns the json string compressed.
.PARAMETER Indentation
Optional: The number of spaces (1..1024) to use for indentation. Defaults to 4.
.PARAMETER AsArray
Optional: If set, the output will be in the form of a string array, otherwise a single string is output.
.EXAMPLE
$json | ConvertTo-Json | Format-Json -Indentation 2
#>
[CmdletBinding(DefaultParameterSetName = 'Prettify')]
Param(
[Parameter(Mandatory = $true, Position = 0, ValueFromPipeline = $true)]
[string]$Json,
[Parameter(ParameterSetName = 'Minify')]
[switch]$Minify,
[Parameter(ParameterSetName = 'Prettify')]
[ValidateRange(1, 1024)]
[int]$Indentation = 4,
[Parameter(ParameterSetName = 'Prettify')]
[switch]$AsArray
)
if ($PSCmdlet.ParameterSetName -eq 'Minify')
{
return ($Json | ConvertFrom-Json) | ConvertTo-Json -Depth 100 -Compress
}
# If the input JSON text has been created with ConvertTo-Json -Compress
# then we first need to reconvert it without compression
if ($Json -notmatch '\r?\n')
{
$Json = ($Json | ConvertFrom-Json) | ConvertTo-Json -Depth 100
}
$indent = 0
$regexUnlessQuoted = '(?=([^"]*"[^"]*")*[^"]*$)'
$result = $Json -split '\r?\n' |
ForEach-Object {
# If the line contains a ] or } character,
# we need to decrement the indentation level unless it is inside quotes.
if ($_ -match "[}\]]$regexUnlessQuoted")
{
$indent = [Math]::Max($indent - $Indentation, 0)
}
# Replace all colon-space combinations by ": " unless it is inside quotes.
$line = (' ' * $indent) + ($_.TrimStart() -replace ":\s+$regexUnlessQuoted", ': ')
# If the line contains a [ or { character,
# we need to increment the indentation level unless it is inside quotes.
if ($_ -match "[\{\[]$regexUnlessQuoted")
{
$indent += $Indentation
}
$line
}
if ($AsArray) { return $result }
return $result -Join [Environment]::NewLine
}
# Storage account details
$resourceGroup = "insert resource group here"
$storageAccountName = "insert storage account name here"
$container = "insert container here"
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccountName).Value[0]
$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccountName
# Creating Storage context for Source, destination and log storage accounts
#$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$context = New-AzStorageContext -ConnectionString "insert connection string here"
$blob_list = Get-AzStorageBlob -Container $container -Context $context
foreach($blob_iterator in $blob_list){
[XML](Get-AzStorageBlobContent $blob_iterator.name -Container $container -Context $context) | ConvertFrom-Xml | ConvertTo-Json -Depth 11 | Format-Json | Set-Content ($blob_iterator.name + '.json')
}
Output =
Cannot convert value "Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob" to type "System.Xml.XmlDocument". Error: "The specified node cannot
be inserted as the valid child of this node, because the specified node is the wrong type."
At C:\Users\.....\Convert XML to JSON.ps1:116 char:6
+ [XML](Get-AzStorageBlobContent $blob_iterator.name -Container $c ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvalidCastToXmlDocument
When I run the code the script asks me if I want to download the xml file to a local folder on my laptop.
This is not what I want, I want the conversion to be done in Azure on the Storage container.
And I think that I'm adding ".json" to the .xml file name.
So the output would become something like filename.xml.json instead of just filename.json
What's going wrong here?
And how can it be fixed?
Thank you in advance for your help.
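Your diagnosis is on track: Get-AzStorageBlobContent outputs an AzureStorageBlob object and downloads the blob to a local file, which is why the [XML] cast fails and why you are prompted about a local folder. One way around both problems, sketched under the assumption that these are block blobs (DownloadText/UploadText live on the storage SDK object the Az cmdlets expose as ICloudBlob), is to convert in memory and upload the result with a swapped extension:
foreach ($blob_iterator in $blob_list) {
    # Read the XML text directly from the blob, in memory (block blobs only).
    [xml]$xmlDoc = $blob_iterator.ICloudBlob.DownloadText()
    $json = $xmlDoc | ConvertFrom-Xml | ConvertTo-Json -Depth 11 | Format-Json
    # Replace the extension instead of appending, so foo.xml becomes foo.json.
    $jsonName = [IO.Path]::ChangeExtension($blob_iterator.Name, '.json')
    # Upload the result back to the same container as a new blob.
    $blob_iterator.ICloudBlob.Container.GetBlockBlobReference($jsonName).UploadText($json)
}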

PowerShell scripts work when run directly but not when called by another

I have a long list of simple jobs I would like to somewhat automate. It's simple stuff, grab or post info via API and build some reports, nothing fancy.
I decided to build a master script which directs out to a variety of other scripts, each handling its own job. Each of those little scripts references functions from a Utility script I built, which holds the functions common to all the other simple job scripts.
Each of the scripts works perfectly when I run it directly; however, when I try to run them via the master script, which routes to them, they all fail.
One example is that in many cases I need to fetch data from an API but get capped at 1000 returned objects when I need 10k+. To solve this, I built a function which recursively calls itself until there is no more data left to collect. Again, this works when called by itself but not from the master script; for some reason it bails out after the first pass (it should run 10+ times in this case) and then returns nothing.
I am thinking maybe this has something to do with how I am scoping the functions/variables, but I'm not sure. I have tried Global, Local, and Script scope, but none seem to work. Here's some of the code...
*Master Director Script runs script based on user input*
...
&$choice_hash[$action].script_path
$ScriptDirectory = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
. "$ScriptDirectory\Utilities.psm1"
$user_data = $null
$env_choice = $null
$csv_output_path = $null
$collated_user_data = [System.Collections.ArrayList]@()
function selectEnv {
$global:env_choice = Read-Host @"
> Select an Environment: [Prod] or [Dev]
Your Choice
"@
if ($env_choice -ne 'Prod' -and $env_choice -ne 'Dev') {
consoleCmt $env_choice
consoleCmt 'Invalid Choice. Try again...'
selectEnv
} else {
if ($env_choice -eq 'Prod') {
$global:csv_output_path = '\\etoprod\******\Exports\Report_Users_Prod.csv'
} else {
$global:csv_output_path = '\\etoprod\******\Exports\Report_Users_Dev.csv'
}
$global:user_data = process_data $env_choice 'api/xm/1/people?embed=roles&limit=1000'
}
}
function processUsersData {
foreach($user in $user_data) {
$user_roles = ''
$role_divider = ','
for($i = 0; $i -lt $user.roles.data.length; $i++) {
# Only append a comma if there are more; otherwise leave blank for CSV delineation
if ($i -eq $user.roles.data.length - 1) {
$role_divider = ''
}
$user_roles += $user.roles.data[$i].name + $role_divider
}
# Build ordered hash table with above data
$sanatized_user = [pscustomobject][ordered]@{id = $user.targetName; firstName = $user.firstName; lastName = $user.lastName; siteName = $user.site.name; roles = $user_roles }
# Shovel into storage array used for building the CSV
$global:collated_user_data += $sanatized_user
}
}
notice 'Initiating Groups Report Script'
selectEnv
processUsersData
exportCsv $collated_user_data $csv_output_path
Utility Script (relevant functions being called)
$res = $null
$content = @()
...
function process_data($env, $url) {
fetch_data $env $url
foreach($i in $res.data) {
$global:content += $i
}
if($res.links.next) {
fetch_more $env $res.links.next
}
return $content # Should return the full collection of data, but fails after one pass
}
function fetch_data($env, $url) {
$base = generateEnvBase $env
$path = "$base/$url"
$req = Invoke-WebRequest -Credential $cred -Uri $path -Method GET
$global:res = ConvertFrom-Json $req
}
function fetch_more($env, $url) {
$base = generateEnvBase $env
$path = "$base$url"
$req = Invoke-WebRequest -Credential $cred -Uri $path -Method GET
$res = ConvertFrom-Json $req
foreach($i in $res.data) {
$global:content += $i
}
if($res.links.next) {
fetch_more $env $res.links.next
}
}
Sorry in advance if I have not followed procedure or etiquette, I'm new here.
This should work if you declare all variables in Main.ps1 that are needed by the functions. You could also use the "Script" scope when creating a new variable inside a function that you want to use outside the function. For example, $Script:Var = "Stuff" created inside a function will be available to the whole script.
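A minimal illustration of that Script-scope behavior:
function Set-Choice {
    # Script scope: survives after the function returns, without polluting Global.
    $Script:env_choice = 'Prod'
}
Set-Choice
Write-Output $Script:env_choice   # prints 'Prod'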
Directory Structure
C:\Script\Root
| Main.ps1
\---Utilities
fetch_data.ps1
fetch_more.ps1
processUsersData.ps1
process_data.ps1
selectEnv.ps1
Main.ps1
#---[ Initialization ]---#
# Strings
[String]$RootPath = $PSScriptRoot
[String]$UtilPath = "$($RootPath)\Utilities"
[String]$env_choice = $null
[String]$csv_output_path = $null
# Arrays
[Array]$user_data = @()
[Array]$content = @()
[Array]$collated_user_data = @()
[Array]$res = @()
#---[ Source in Utilities ]---#
# Get the scripts
$Utilities = Get-ChildItem -Path "$UtilPath" -File | Where-Object {$_.Extension -eq ".ps1"}
# Source in each one
foreach ($Item in $Utilities) {
. $Item.FullName
}
#---[ Select an Environment ]---#
# Get the User's choice
$env_choice = selectEnv
# Process the choice
switch ($env_choice) {
Prod {
$csv_output_path = '\\etoprod\******\Exports\Report_Users_Prod.csv'
$user_data = process_data 'Prod' 'api/xm/1/people?embed=roles&limit=1000'
}
Dev {
$csv_output_path = '\\etoprod\******\Exports\Report_Users_Dev.csv'
$user_data = process_data 'Dev' 'api/xm/1/people?embed=roles&limit=1000'
}
Test {
Write-Output "Test is not an option. Choose wisely."
exit 1
}
Default {
Write-Output "Unknown Environment Choice."
exit 1
}
}
#---[ Process Users and Export ]---#
processUsersData
exportCsv $collated_user_data $csv_output_path
selectEnv.ps1
function selectEnv {
$Title = "Environment:"
$Info = "Please choose an environment"
# Options
$Prod = New-Object System.Management.Automation.Host.ChoiceDescription '&Prod', 'Production environment'
$Dev = New-Object System.Management.Automation.Host.ChoiceDescription '&Dev', 'Development environment'
$Test = New-Object System.Management.Automation.Host.ChoiceDescription '&Test', 'Testing environment'
$Options = [System.Management.Automation.Host.ChoiceDescription[]]($Prod, $Dev, $Test)
$Default = 0
# Prompt the user
$Choice = $host.UI.PromptForChoice($Title , $Info , $Options, $Default)
$Result = $Options[$Choice].Label -Replace '&',''
return $Result
}
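On the recursive fetch itself: a sketch (not part of the original answer) of process_data rewritten to accumulate locally and return its results, which removes the need for global state entirely; generateEnvBase and $cred are assumed from the question's utility script:
function process_data($env, $url) {
    $content = @()                        # local accumulator instead of $global:content
    $base = generateEnvBase $env          # assumed from the question's utility script
    $next = "$base/$url"
    while ($next) {
        $req = Invoke-WebRequest -Credential $cred -Uri $next -Method GET
        $res = ConvertFrom-Json $req
        $content += $res.data
        # Follow the API's pagination link until none is returned.
        $next = if ($res.links.next) { "$base$($res.links.next)" } else { $null }
    }
    return $content
}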

Use ping output to create a CSV

I have written the following code that uses the ping command to ping multiple computers. I would like to capture the following values from it:
Reply from / Request timed out etc.
Actual IP Address of the remote host
This function is working fine; however, when I try to get its returned values into a CSV, I am unable to do so.
# Declaration of the function name and expected parameters
Function Ping-Check{
[cmdletbinding()]
PARAM (
[Parameter(Mandatory=$True,
ValueFromPipeline=$True,
ValueFromPipelineByPropertyName=$True,
HelpMessage='Enter the name of the remote host.')]
[String]$ObjCompName
)
Begin{
# Setup the Process startup info
$pinfo = New-Object System.Diagnostics.ProcessStartInfo
$pinfo.FileName = "ping.exe"
$pinfo.Arguments = " -a " + $ObjCompName + " -n 2"
$pinfo.UseShellExecute = $false
$pinfo.CreateNoWindow = $true
$pinfo.RedirectStandardOutput = $true
$pinfo.RedirectStandardError = $true
}
Process{
$p = New-Object System.Diagnostics.Process
$p.StartInfo = $pinfo
$p.Start() | Out-Null
$p.WaitForExit()
# Redirect the Output
$stdout = $p.StandardOutput.ReadToEnd()
$stderr = $p.StandardError.ReadToEnd()
}
End{
Write-Output "stdout: $stdout"
Write-Output "stderr: $stderr"
Write-Output "exit code: $($p.ExitCode)"
}
}
There is a Test-Connection cmdlet available in PowerShell which returns an array of objects with properties you can use. If you want to stick with your attempt, be aware that you have to parse the output yourself.
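A minimal sketch of the Test-Connection route (host names illustrative; Address, IPV4Address, and ResponseTime are among the properties surfaced in Windows PowerShell 5.1):
$hosts = 'server01', 'server02'
$results = foreach ($h in $hosts) {
    try {
        # -ErrorAction Stop turns an unreachable host into a catchable exception.
        Test-Connection -ComputerName $h -Count 2 -ErrorAction Stop |
            Select-Object Address, IPV4Address, ResponseTime
    }
    catch {
        [pscustomobject]@{ Address = $h; IPV4Address = $null; ResponseTime = 'Request timed out' }
    }
}
$results | Export-Csv -Path .\ping_results.csv -NoTypeInformation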

PowerShell 2.0 - loop through records in a CSV, validate FTP logins, and update the CSV with confirmation

Current issue: probably stupid simple, but I have been spinning my wheels for a while. I have 600+ FTP accounts to validate, checking whether they can be logged on to; the Verified field is to be updated in the CSV.
CSV Sample:
Connection,Name,Path,Password,Phonetic,Client Name,Client ID,Verified
FTP,BFGftp,d:\data\ftpaccounts\BFGftp,8Wu8Agec8$,(Eight - WHISKEY - uniform - Eight - ALPHA - golf - echo - charlie - Eight - Dollar),(Nine - papa - ECHO - foxtrot - ROMEO - Two - kilo - echo - NOVEMBER - Two),,
FTP,bookitftp,d:\data\ftpaccounts\flipitftp,i3439flip12##,,Flipitftp,30342,
Missing statement block after 'else' keyword.
At C:\Users\mgoeres\Desktop\B-SWIFT FTP TEST2.ps1:118 char:10
+ else <<<< ($i++){ $GetFTP }
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : MissingStatementBlockAfterElse
FTP ACCOUNT VALIDATOR
Function Get-FtpDirectory($Directory) {
#Example
#$Server = "ftp://ftp.example.com/"
#$User = "anonymous@example.com"
#$Pass = "anonymous@anonymous.com"
$Server = "ftp://ftp.localhost.com/"
$User = $FTPLogon
$Pass = $FTPPass
# Credentials
$FTPRequest = [System.Net.FtpWebRequest]::Create("$($Server)$($Directory)")
$FTPRequest.Credentials = New-Object System.Net.NetworkCredential($User,$Pass)
$FTPRequest.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
# Don't want binary; Keep-Alive unnecessary.
$FTPRequest.UseBinary = $False
$FTPRequest.KeepAlive = $False
$FTPResponse = $FTPRequest.GetResponse()
$ResponseStream = $FTPResponse.GetResponseStream()
# Create a nice Array of the detailed directory listing
$StreamReader = New-Object System.IO.Streamreader $ResponseStream
$DirListing = (($StreamReader.ReadToEnd()) -split [Environment]::NewLine)
$StreamReader.Close()
# Remove first two elements ( . and .. ) and last element (\n)
$DirListing = $DirListing[2..($DirListing.Length-2)]
# Close the FTP connection so only one is open at a time
$FTPResponse.Close()
# This array will hold the final result
$FileTree = @()
# Loop through the listings
foreach ($CurLine in $DirListing) {
# Split line into space separated array
$LineTok = ($CurLine -split '\ +')
# Get the filename (can even contain spaces)
$CurFile = $LineTok[8..($LineTok.Length-1)]
# Figure out if it's a directory. Super hax.
$DirBool = $LineTok[0].StartsWith("d")
# Determine what to do next (file or dir?)
If ($DirBool) {
# Recursively traverse sub-directories
$FileTree += ,(Get-FtpDirectory "$($Directory)$($CurFile)/")
}
Else {
# Add the output to the file tree
$FileTree += ,"$($Directory)$($CurFile)"
}
}
if (!$FileTree -eq $null) {
$Verified = "Y"
Return $FileTree
}
else {
$Verified = "Failed"}
}
# Update data record field "Verified" in CSV '$data' file with status value[Y,Failed,Other]
# Here is where the import of the FTP logon ID and password should start
#Column Names [Connection,Name,Path,Password,Phonetic,Client Name,Client ID,Verified]
#Variables
$AllName = @()
$AllPassword = @()
$AllVerified = @()
$data = @()
$FTPLogon = @()
$FTPPass = @()
$response = @()
$GetFTP = Get-FtpDirectory
# Import CSV File "FTP_Account_List_with_Password_and_Path.csv"
$data = Import-Csv C:\Users\mgoeres\Desktop\FTP_Account_List_with_Password_and_Path.csv | where-object {$_.Connection -eq "FTP"}
# This import allows key columns to be referenced as variables.
#Import-Csv C:\Users\mgoeres\Desktop\FTP_Account_List_with_Password_and_Path.csv | ForEach-Object {
#$AllName += $_.Name
#$AllPassword += $_.Password
#$AllVerified += $_.Verified}
# Simple test to see if CSV was imported into '$data' and column headers are working as variables
$data|select-object Name,Connection,Verified
Write-Host "CSV File Opened and Loaded"
## Select (next)record from '$data' and store correlating values in '$UserName' and '$Password'
# .EXAMPLE '$data[0].Password' <= First records Password value.
# .EXAMPLE '$data[1].Name' <= Second records Name value.
for ($i=0; $i -lt $data.count; $i++){
$FTPLogon = $data[$i].Name
$FTPPass = $data[$i].Password
$Verified = $data[$i].Verified }
if ($Verifed -eq $null){ $GetFTP }
else ($i++){ $GetFTP }
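As for the parser error quoted at the top: in PowerShell, else takes only a script block; a second condition must go on elseif. A minimal sketch of the valid shape (note that $Verifed in the if line is also a misspelling of $Verified, so that test will always see $null):
if ($Verified -eq $null) { $GetFTP }
elseif ($i -lt $data.count) { $GetFTP }   # a condition belongs on elseif, not else
else { $GetFTP }                          # else takes a bare script block only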