Importing a CDATA string literal from a JSON array into PowerShell - json

So I have an XML value extractor for a large rule database.
I stored the values in a JSON file, with a section for each file and the values keyed by their names.
However, some of the values are CDATA snippets.
I can pull the CDATA out as a string literal and store it in the JSON, but I cannot seem to find a way to get PowerShell to let me put it back into the XML:
Cannot set "Value" because only strings can be used as values to set XmlNode properties.
At C:\Users\vagrant\Desktop\RuleSetter.ps1:24 char:21
+ $rule.value = $ruleFileSet. ($singleRuleXmlFile.directory.nam ...
+
+ CategoryInfo : NotSpecified: (:) [], SetValueException
+ FullyQualifiedErrorId : XmlNodeSetShouldBeAString
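For context, that error comes from PowerShell's XML property adapter, which only accepts plain strings on the right-hand side of such an assignment. A minimal repro, using made-up element names rather than anything from the real rule files:

[xml]$doc = '<Rule name="Demo"><Value>old</Value></Rule>'
$doc.Rule.Value = [pscustomobject]@{ foo = 1 }   # fails: only strings can be used as values to set XmlNode properties
$doc.Rule.Value = 'new text'                     # succeeds: a plain string is accepted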
The getter looks like this:
$directories = Get-ChildItem -dir -path C:\inetpub\wwwroot\
foreach($instName in $directories){
    if($instName -notlike 'client' -and $instName -notlike 'bin' -and $instName -notlike 'aspnet_client' -and $instName -notlike 'AccessLibertyLogs'){
        $obj = [hashtable]@{}
        $head = [hashtable]@{Institution = $instName}
        $obj.Add('head',$head)
        $files = Get-ChildItem C:\inetpub\wwwroot\$instName\filepath -Include Rules.XML -Recurse
        $files |
        ForEach-Object {
            [xml]$temp = Get-Content -path $_.FullName
            [string]$title = $_.Directory.name
            [hashtable]$output = @{}
            foreach($rule in $temp.RuleCollection.Rules.Rule){
                $t = ""
                if($rule.value -isnot [string]){
                    $t = $rule.value.innerXml
                }else{
                    $t = $rule.value
                }
                $output.Add($rule.name,$t)
            }
            $obj.Add($title,$output)
        }
        $transfer = New-Object -TypeName PSObject -Property $obj
        $location = "C:\Users\vagrant\desktop\json\" + $instName + ".Json"
        $transfer | ConvertTo-Json -Compress | Out-file -FilePath $location
    }
}
And the setter like this:
Param(
    [string]$location,
    [string]$ruleConfigFileLocation)
$allInstConfigFiles = Get-ChildItem -Path C:\Users\vagrant\Desktop\json\
foreach($singleInstFile in $allInstConfigFiles) {
    $instaName = $singleInstFile.BaseName
    $allRuleXmlFileList = Get-ChildItem C:\inetpub\wwwroot\$instaName\Liberty\Applications\Origination\Configuration -Include Rules.XML -Recurse
    ForEach($singleRuleXmlFile in $allRuleXmlFileList) {
        [xml]$singleRuleXmlFileContents = Get-Content -path $singleRuleXmlFile.FullName
        $singleInstFileContent = Get-Content $singleInstFile.FullName | ConvertFrom-Json
        foreach($ruleFileSet in $singleInstFileContent){
            foreach($rule in $singleRuleXmlFileContents.RuleCollection.Rules.Rule){
                #write-host $ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name)
                if($ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name) -as [xml])
                {
                    $rule.value = $ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name).OuterXml
                    write-host 'skipped'
                }else{
                    write-host 'not skipped'
                    $rule.value = $ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name)
                }
            }
        }
    }
}
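One way around the error for the CDATA case is to go through the underlying XmlDocument API instead of assigning through the property adapter. A rough sketch, assuming each Rule has a Value child element holding the CDATA and that the JSON value is the raw text to wrap (the element name passed to SelectSingleNode is case-sensitive, so adjust it to match the actual file):

$jsonValue = $ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name)
$valueNode = $rule.SelectSingleNode('Value')                      # the <Value> element itself, not the adapted string
$valueNode.InnerXml = ''                                          # clear whatever is currently inside it
$cdata = $singleRuleXmlFileContents.CreateCDataSection($jsonValue)
[void]$valueNode.AppendChild($cdata)                              # InnerXml is now <![CDATA[...]]>
$singleRuleXmlFileContents.Save($singleRuleXmlFile.FullName)      # persist the rewritten XML

Setting $valueNode.InnerXml directly to a "<![CDATA[...]]>" string also works, as long as the text itself never contains "]]>".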

Related

How to call a function within a foreach parallel loop

Good evening.
I'm trying to use parallelism for the first time, but I don't understand how to call a function from inside the parallel foreach loop.
I get a series of this error: Cannot bind argument to parameter 'Path' because it is null.
This is what I've done so far:
$FolderPath = "C:\myfolder\"
function AppendLog ($client) {
$so = New-CimSessionOption -Protocol 'DCOM'
$s = New-CimSession -ComputerName $client -SessionOption $so
Add-Content -Path (join-path $folderpath "LOGS.txt") -Value ( (get-date -Format "[yyyy.MM.dd HH:mm:ss]").tostring() + $client + " -PING: OK")
$arch = Get-CimInstance –query "select * from win32_operatingsystem" -CimSession $s | select -expandproperty osarchitecture
Add-Content -Path (join-path $folderpath "LOGS.txt") -Value ( (get-date -Format "[yyyy.MM.dd HH:mm:ss]").tostring() + $client + " -ARCH:" + $arch )
$lastboot = Get-CimInstance –query "select * from win32_operatingsystem" -CimSession $s | select -expandproperty lastbootuptime
Add-Content -Path (join-path $folderpath "LOGS.txt") -Value ( (get-date -Format "[yyyy.MM.dd HH:mm:ss]").tostring() + $client + " -BOOT:" + $lastboot )
}
$funcDef = $function:AppendLog.ToString()
$clients = get-content -path (join-path $folderPath "client.txt")
$clients | ForEach-Object -parallel {
if (test-connection $_ -count 2 -Quiet)
{
$function:AppendLog = $using:funcDef
AppendLog ($_)
}
} -throttlelimit 3
Could you explain to me how to pass my path?
The error you're getting is most likely coming from your function; it is being thrown by Join-Path:
PS /> Join-Path $null 'Logs.txt'
Join-Path : Cannot bind argument to parameter 'Path' because it is null.
At line:1 char:11
+ Join-Path $null 'Logs.txt'
+ ~~~~~
+ CategoryInfo : InvalidData: (:) [Join-Path], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.JoinPathCommand
The reason is that $FolderPath doesn't exist in the scope of your parallel loop; $folderpath needs to be replaced with $using:folderpath (see the sketch below for one way to wire that up).
As a side note, appending to the same file from parallel executions doesn't seem like a good idea.
Last edit: I understand if this is meant to test how ForEach-Object -Parallel works, but again, if the cmdlet already supports remote querying / remote execution against multiple hosts at the same time, let the cmdlet handle that for you; it is more efficient.
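If you do want to keep your original function, a minimal sketch of that fix is to pass the folder in explicitly and pull the caller's value into the runspace with $using: (the extra $logFolder parameter is illustrative, not part of the original post):

function AppendLog ($client, $logFolder) {
    # ... same body as before, but using $logFolder instead of $folderpath ...
    Add-Content -Path (Join-Path $logFolder "LOGS.txt") -Value "$client -PING: OK"
}
$funcDef = $function:AppendLog.ToString()

$clients | ForEach-Object -Parallel {
    if (Test-Connection $_ -Count 2 -Quiet) {
        $function:AppendLog = $using:funcDef
        AppendLog $_ $using:FolderPath    # $using: brings the caller's variable into the parallel runspace
    }
} -ThrottleLimit 3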
As for the code, this is what I would use with what you already have:
$FolderPath = "C:\myfolder\"
$sessionOption = New-CimSessionOption -Protocol 'DCOM'
$clients = Get-Content -Path (Join-Path $FolderPath -ChildPath "Client.txt")

$results = $clients | ForEach-Object -Parallel {
    $out = @{
        Time         = [datetime]::Now.ToString('[yyyy.MM.dd HH:mm:ss]')
        ComputerName = $_
    }
    if ($ping = Test-Connection $_ -Count 2 -Quiet)
    {
        $session = New-CimSession -ComputerName $_ -SessionOption $using:sessionOption
        $OSInfo = Get-CimInstance -CimSession $session -ClassName win32_operatingSystem
        Remove-CimSession $session
    }
    $out.Ping = $ping
    $out.Arch = $OSInfo.OSArchitecture
    $out.LastBoot = $OSInfo.LastBootUpTime
    [pscustomobject]$out
} -ThrottleLimit 3
$results | Export-Csv (Join-Path $FolderPath "LOGS.csv") -NoTypeInformation
This will output an object like this below:
Time                  ComputerName   Ping  Arch   LastBoot
----                  ------------   ----  ----   --------
[2021.06.19 20:06:00] ComputerName01 True  64-bit 6/16/2021 11:47:16 AM
[2021.06.19 20:07:00] ComputerName02 False
[2021.06.19 20:08:00] ComputerName03 True  64-bit 6/13/2021 11:47:16 AM
[2021.06.19 20:09:00] ComputerName04 True  64-bit 6/14/2021 11:47:16 AM
[2021.06.19 20:10:00] ComputerName05 True  64-bit 6/15/2021 11:47:16 AM
This can be exported nicely to a CSV instead of a text file.

How to extract json data from the Url using powershell script

I want to get the data from the URL, filter it accordingly, and export that data into a CSV file.
This is my code; the $url variable contains the URL:
$defs = (Invoke-RestMethod -Uri ($url) -Method Get -UseDefaultCredentials)
$json = ConvertTo-Json $defs
$json | Out-File D:\My_scripts\Final.json
$a = Get-Content D:\My_scripts\Final.json -Raw | Out-String | ConvertFrom-Json
$arrOutput = @()
foreach($l2 in $a)
{
    foreach($l3 in $l2.value)
    {
        #Write-Host $l3.id
        $arr = @{}
        $arr.id = $l3.id
        $arr.name = $l3.name
        $outarray = New-Object Psobject -Property $arr
        $arrOutput += $outarray
    }
}
$arrOutput | Export-Csv -Path D:\My_scripts\Final.csv -Delimiter "," -NoTypeInformation
But I am not able to get the values.
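A likely culprit, assuming the response contains nested objects, is that ConvertTo-Json truncates at its default depth of 2, so the round trip through the file flattens the nested value objects into strings. A minimal sketch that skips the intermediate file entirely (the id/name properties and the value collection are taken from the question; the exact shape of the response is an assumption):

$defs = Invoke-RestMethod -Uri $url -Method Get -UseDefaultCredentials

# Invoke-RestMethod already returns objects, so there is no need to serialize to JSON and back.
# If the file is still needed, use ConvertTo-Json -Depth 10 (or higher) to avoid the default depth of 2.
$defs |
    ForEach-Object { $_.value } |
    Select-Object id, name |
    Export-Csv -Path D:\My_scripts\Final.csv -Delimiter "," -NoTypeInformation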

Powershell Parse Swagger Json

The following works to parse a Swagger JSON into resource, method, and HTTP type, but the $path.Definition part is handled awkwardly: how can I get $path.Definition as an array instead of a string that I have to parse for the "@{" symbol?
$json = Get-Content -Path "$PSScriptRoot/Test/example_swagger.json" | ConvertFrom-Json
$paths = Get-Member -InputObject $json.paths -MemberType NoteProperty
$result = ""
foreach($path in $paths) {
    $elements = $path.Name.Substring(5).split("/") -join ","
    $httpmethods = $path.Definition.Substring($path.Definition.IndexOf("@{"))
    if ($httpmethods.Contains("get")) {
        $result += $elements + ", GET" + "`n"
    }
    if ($httpmethods.Contains("post")) {
        $result += $elements + ", POST" + "`n" #same methodnames different http methods
    }
}
$result
As detailed in my answer to Iterating through a JSON file PowerShell, the output of ConvertFrom-Json is hard to iterate over. This makes "for each key in object" and "keys of object not known ahead of time" kinds of situations more difficult to handle, but not impossible.
You need a helper function:
# helper to turn PSCustomObject into a list of key/value pairs
function Get-ObjectMember {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True, ValueFromPipeline=$True)]
        [PSCustomObject]$obj
    )
    $obj | Get-Member -MemberType NoteProperty | ForEach-Object {
        $key = $_.Name
        [PSCustomObject]@{Key = $key; Value = $obj."$key"}
    }
}
with this, the approach gets a whole lot simpler:
$swagger = Get-Content -Path "example_swagger.json" -Encoding UTF8 -Raw | ConvertFrom-Json
$swagger.paths | Get-ObjectMember | ForEach-Object {
    [pscustomobject]@{
        path    = $_.Key
        methods = $_.Value | Get-ObjectMember | ForEach-Object Key
    }
}
Applied to the default Swagger file from https://editor.swagger.io/ as a sample, this is what gets printed:
path                      methods
----                      -------
/pet                      {post, put}
/pet/findByStatus         get
/pet/findByTags           get
/pet/{petId}              {delete, get, post}
/pet/{petId}/uploadImage  post
/store/inventory          get
/store/order              post
/store/order/{orderId}    {delete, get}
/user                     post
/user/createWithArray     post
/user/createWithList      post
/user/login               get
/user/logout              get
/user/{username}          {delete, get, put}
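As a side note, on PowerShell 3 and later you can also enumerate the properties directly via .PSObject.Properties instead of the Get-Member helper; a minimal sketch of the same traversal, under the same assumptions about the file:

$swagger = Get-Content -Path "example_swagger.json" -Encoding UTF8 -Raw | ConvertFrom-Json
$swagger.paths.PSObject.Properties | ForEach-Object {
    [pscustomobject]@{
        path    = $_.Name
        methods = $_.Value.PSObject.Properties.Name   # the HTTP verbs defined under this path
    }
}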

How to apply colors in powershell output

Requirement:
I am a beginner in PowerShell. The script below reports which services are in a started or stopped state, but my requirement is to see this output with a 'Sky Blue' background, with running services highlighted in green and stopped services in red. How do I achieve it?
Help on this is highly appreciated.
$Result = @()
foreach($server in Get-Content C:\PowerSQL\List.txt)
{
    $Services = gwmi win32_service -computername $server | where {$_.Name -like '*SQL*'}
    if(!(Test-Connection -Cn $server -BufferSize 16 -Count 1 -ea 0 -quiet))
    {"Problem still exists in connecting to $server"}
    ELSE {
        $services | ForEach {
            If ($_)
            { $Result += New-Object PSObject -Property @{
                    'Host Name'            = $_.Systemname
                    'Service Display Name' = $_.Displayname
                    'Service Name'         = $_.Name
                    'Start Mode'           = $_.Startmode
                    'Service Account Name' = $_.Startname
                    'State'                = $_.State
                    'Status'               = $_.Status
                }
            }
        }
    }
}
$Result | ConvertTo-HTML | Out-File C:\PowerSQL\service.htm
See my answer to a similar question to this one.
Communary.ConsoleExtensions [link] might help you
Invoke-ColorizedFileListing C:\Windows -m *.dmp
The above command will colorise file types and highlight dump files.
To save a color output, you would have to save to a format that preserves color, like RTF or HTML; a plain-text (.txt) file only stores text.
The code below will save your output as an html file.
$time = (Get-Date).AddYears(-2)
Get-ChildItem -Recurse | Where-Object {$_.LastWriteTime -lt $time} |
    Select Directory,Name,LastWriteTime |
    ConvertTo-Html -Title "Services" -Body "<H2>The result of Get-ChildItem</H2> " -Property Directory,Name,LastWriteTime |
    ForEach-Object {
        if ($_ -like '<tr><td>*') {
            $_ -replace '^(.*?)(<td>.*?</td>)<td>(.*?)</td>(.*)','$1$2<td><font color="green">$3</font></td>$4'
        } else {
            $_
        }
    } | Set-Content "$env:TEMP\ColorDirList.html" -Force
The line:
if ($_ -like '<tr><td>*') {
...checks for line in the html output that is a table row.
The line:
$_ -replace '^(.*?)(<td>.*?</td>)<td>(.*?)</td>(.*)','$1$2<td><font color="green">$3</font></td>$4'
...uses a RegEx to replace the 2nd table cell contents with a font tag with the color green. This is a very simple RegEx search & replace that will only color the 2nd column.
And here's another implementation of console-only coloring, based on this link:
$linestocolor = @(
    'CSName     Version  OSArchitecture'
    '------     -------  --------------'
    'BENDER     6.1.7601 64-bit        '
    'LEELA      6.1.7601 64-bit        '
    'FRY        6.1.7600 64-bit        '
    'FARNSWORTH 6.1.7601 32-bit        '
)
# http://www.bgreco.net/powershell/format-color/
function Format-Color {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline=$true,Mandatory=$true)]
        $ToColorize
        , [hashtable]$Colors=@{}
        , [switch]$SimpleMatch
        , [switch]$FullLine
    )
    Process {
        $lines = ($ToColorize | Out-String).Trim() -replace "`r", "" -split "`n"
        foreach($line in $lines) {
            $color = ''
            foreach($pattern in $Colors.Keys){
                if (!$SimpleMatch -and !$FullLine -and $line -match "([\s\S]*?)($pattern)([\s\S]*)") { $color = $Colors[$pattern] }
                elseif (!$SimpleMatch -and $line -match $pattern) { $color = $Colors[$pattern] }
                elseif ($SimpleMatch -and $line -like $pattern) { $color = $Colors[$pattern] }
            }
            if ($color -eq '') { Write-Host $line }
            elseif ($FullLine -or $SimpleMatch) { Write-Host $line -ForegroundColor $color }
            else {
                Write-Host $Matches[1] -NoNewline
                Write-Host $Matches[2] -NoNewline -ForegroundColor $color
                Write-Host $Matches[3]
            }
        }
    }
}

$linestocolor | Format-Color -Colors @{'6.1.7600' = 'Red'; '32-bit' = 'Green'}

# doesn't work...
# (Get-ChildItem | Format-Table -AutoSize) | Format-Color -Colors @{'sql' = 'Red'; '08/07/2016' = 'Green'}
# does work...
Format-Color -ToColorize (Get-ChildItem | Format-Table -AutoSize) -Colors @{'sql' = 'Red'; '08/07/2016' = 'Green'}

return
EDIT: to answer the OP's request
$Result = @()
foreach($server in Get-Content C:\PowerSQL\List.txt)
{
    $Services = gwmi win32_service -computername $server | where {$_.Name -like '*SQL*'}
    if(!(Test-Connection -Cn $server -BufferSize 16 -Count 1 -ea 0 -quiet))
    {"Problem still exists in connecting to $server"}
    else {
        $services | ForEach {
            If ($_)
            { $Result += New-Object PSObject -Property @{
                    HostName           = $_.Systemname
                    ServiceDisplayName = $_.Displayname
                    ServiceName        = $_.Name
                    StartMode          = $_.Startmode
                    ServiceAccountName = $_.Startname
                    State              = $_.State
                    Status             = $_.Status
                }
            }
        }
    }
}
$Result | ConvertTo-HTML `
    -Title "Services" `
    -Body "<H2>The result of gwmi win32_service</H2> " `
    -Property HostName,ServiceDisplayName,ServiceName,StartMode,ServiceAccountName,State,Status |
    ForEach-Object {
        if ($_ -like '<tr><td>*') {
            switch ($_) {
                { $_ -like '*<td>Stopped</td>*' } {$color='red'}
                { $_ -like '*<td>Running</td>*' } {$color='green'}
                Default {$color='white'}
            }
            $_.Replace('<tr>', "<tr bgcolor=`"$color`">")
        } else {
            $_
        }
    } | Set-Content C:\PowerSQL\service.htm -Force
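If the overall 'Sky Blue' page background from the question is also wanted, one option (not covered above) is to pass a small CSS block through ConvertTo-Html's -Head parameter; note that -Title is ignored when -Head is used, and the hex value chosen for sky blue here is an assumption:

$head = @"
<style>
    body   { background-color: #87CEEB; }   /* sky-blue page background */
    table  { border-collapse: collapse; }
    td, th { border: 1px solid black; padding: 4px; }
</style>
"@
# then: $Result | ConvertTo-HTML -Head $head -Property HostName,ServiceDisplayName,ServiceName,StartMode,ServiceAccountName,State,Status | ...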

Powershell script not recognising Functions

I've written the following PS script to delete log files from specific server paths. I'm a novice to PS but I'm getting some errors with a few of the functions that I have written in this script:
#* FileName: FileCleaner.ps1
#Clear the screen
Clear
#Read XML Config File to get settings
[xml]$configfile = Get-Content "C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell Script\FileCleaner.config.xml"
#Declare and set variables from Config values
$hostServer = $configfile.Settings.HostServer
$dirs = @($configfile.Settings.DirectoryName.Split(",").Trim())
$scanSubDirectories = $configfile.Settings.ScanSubDirectories
$deleteAllFiles = $configfile.Settings.deleteAllFiles
$fileTypesToDelete = @($configfile.Settings.FileTypesToDelete.Split(";").Trim())
$liveSiteLogs = $configfile.Settings.LiveSiteLogs
$fileExclusions = @($configfile.Settings.FileExclusions.Split(";").Trim())
$retentionPeriod = $configfile.Settings.RetentionPeriod
$AICLogs = $configfile.Settings.AICLogs
$AICLogsRententionPeriod = $configfile.Settings.AICLogsRententionPeriod
$fileCleanerLogs = $configfile.Settings.FileCleanerLogs
$fileCleanerLogsRententionPeriod = $configfile.Settings.FileCleanerLogsRententionPeriod
#Setup FileCleaner output success logfiles
$successLogfile = $configfile.Settings.SuccessOutputLogfile
$dirName = [io.path]::GetDirectoryName($successLogfile)
$filename = [io.path]::GetFileNameWithoutExtension($successLogfile)
$ext = [io.path]::GetExtension($successLogfile)
$successLogfile = "$dirName\$filename$(get-date -Format yyyy-MM-dd)$ext"
#Setup FileCleaner output error logfiles
$errorLogfile = $configfile.Settings.ErrorOutputLogfile
$dirName = [io.path]::GetDirectoryName($errorLogfile)
$filename = [io.path]::GetFileNameWithoutExtension($errorLogfile)
$ext = [io.path]::GetExtension($errorLogfile)
$errorLogfile = "$dirName\$filename$(get-date -Format yyyy-MM-dd)$ext"
#Setup Retention Period
$LastWrite = (Get-Date).AddDays(-$retentionPeriod)#.ToString("d")
$AICLastWrite = (Get-Date).AddDays(-$AICLogsRententionPeriod)#.ToString("d")
$fileCleanerLastWrite = (Get-Date).AddDays(-$fileCleanerLogsRententionPeriod)
#EMAIL SETTINGS
$smtpServer = $configfile.Settings.SMTPServer
$emailFrom = $configfile.Settings.EmailFrom
$emailTo = $configfile.Settings.EmailTo
$emailSubject = $configfile.Settings.EmailSubject
#Update the email subject to display the Host Server value
$emailSubject = $emailSubject -replace "HostServer", $hostServer
$countUnaccessibleUNCPaths = 0
#Check Logfiles exists, if not create them
if(!(Test-Path -Path $successLogfile))
{
New-Item -Path $successLogfile -ItemType file
}
if(!(Test-Path -Path $errorLogfile))
{
New-Item -Path $errorLogfile -ItemType file
}
foreach ($dir in $dirs)
{
#needs a check to determine if server/the UNC Path is accessible. If it fails to connect, it needs to move on to the next UNC share but a flag needs to
#be generate to alert us to investigate why the UNC share was not accessible during the job run.
If(Test-Path -Path $dir)
{
#write to output logfile Directory info
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $dir"
$Msg | out-file $successLogfile
If ($scanSubDirectories -eq "True")
{
If ($deleteAllFiles -eq "True")
{
#ScanSubDirectories and delete all files older than the $retentionPeriod, include Sub-Directories / also forces the deletion of any hidden files
$logFiles = Get-ChildItem -Path $dir -Force -Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$LastWrite" }
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
Else
{
#"ScanSubDirectories but only delete specified file types."
$logFiles = Get-Childitem $dir -Include $fileTypesToDelete[0],$fileTypesToDelete[1],$fileTypesToDelete[2], $liveSiteLogs -Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where {$_.LastWriteTime -le "$LastWrite"}
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
}
Else
{
#Only delete files in top level Directory
If ($deleteAllFiles -eq "True")
{
$logFiles = Get-ChildItem -Path $dir -Force -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$LastWrite" }
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
Else
{
$logFiles = Get-Childitem $dir -Include $fileTypesToDelete[0],$fileTypesToDelete[1],$fileTypesToDelete[2], $liveSiteLogs -Exclude $fileExclusions[0],$fileExclusions[1] | Where {$_.LastWriteTime -le "$LastWrite"}
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
}
}
Else
{
$countUnaccessibleUNCPaths++
#server/the UNC Path is unaccessible
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") Unable to access $dir."
$Msg | out-file $errorLogfile -append
}
# Call the function to Delete the AIC XML Logfiles
DeleteAICXMLLogs $dir
}
#If any of the directories were unaccessible send an email to alert the team
if($countUnaccessibleUNCPaths -gt 0)
{
# Call the function to send the email
SendEmail $emailSubject $emailFrom $emailTo
}
#Only keep 2 weeks worth of the FileCleaner App logs for reference purposes
If(Test-Path -Path $fileCleanerLogs)
{
#write to output logfile Directory info
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $fileCleanerLogs"
$Msg | out-file $successLogfile
$fileCleanerLogs = Get-Childitem $fileCleanerLogs -Recurse | Where {$_.LastWriteTime -le "$fileCleanerLastWrite"}
DeleteLogFiles($fileCleanerLogs)
#foreach($fileCleanerLog in $fileCleanerLogs)
#{
# if($fileCleanerLog -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $fileCleanerLog")"
# $Msg | out-file $successLogfile -append
# Remove-Item $fileCleanerLog.FullName -Force
# }
#}
}
Function DeleteLogFiles($logFiles)
{
foreach($logFile in $logFiles)
{
if($logFile -ne $null)
{
$Msg = Write-Output "$("Deleting File $logFile")"
$Msg | out-file $successLogfile -append
Remove-Item $logFile.FullName -Force
}
}
}
Function DeleteAICXMLLogs($dir)
{
#Split the UNC path $dir to retrieve the server value
$parentpath = "\\" + [string]::join("\",$dir.Split("\")[2])
#test access to the \\server\D$\DebugXML path
If(Test-Path -Path $parentpath$AICLogs)
{
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $parentpath$AICLogs"
$Msg | out-file $successLogfile
#Concantenate server value to $AICLogs to delete all xml logs in \\server\D$\DebugXML with a retention period of 30Days
$XMLlogFiles = Get-ChildItem -Path $parentpath$AICLogs -Force -Include $fileTypesToDelete[3]-Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$AICLastWrite" }
#get each file and add the filename to be deleted to the successLogfile before deleting the file
DeleteLogFiles($XMLlogFiles)
#foreach($XMLlogFile in $XMLlogFiles)
#{
# if($XMLlogFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $XMLlogFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $XMLlogFile.FullName -Force
# }
#}
}
Else
{
$Msg = Write-Output "$("$parentpath$AICLogs does not exist.")"
$Msg | out-file $successLogfile -append
}
}
Function SendEmail($emailSubject, $emailFrom, $emailTo)
{
$MailMessage = New-Object System.Net.Mail.MailMessage
$SMTPClient = New-Object System.Net.Mail.smtpClient
$SMTPClient.host = $smtpServer
$Recipient = New-Object System.Net.Mail.MailAddress($emailTo, "Recipient")
$Sender = New-Object System.Net.Mail.MailAddress($emailFrom, "Sender")
$MailMessage.Sender = $Sender
$MailMessage.From = $Sender
$MailMessage.Subject = $emailSubject
$MailMessage.Body = @"
This email was generated because the FileCleaner script was unable to access some UNC Paths, please refer to $errorLogfile for more information.
Please inform the Team if you plan to resolve this.
This is an automated email please do not respond.
"#
$SMTPClient.Send($MailMessage)
}
When debugging, I'm getting these errors:
DeleteAICXMLLogs : The term 'DeleteAICXMLLogs' is not recognized as
the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify
that the path is correct and try again. At
C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell
Script\FileCleaner.ps1:158 char:5
+ DeleteAICXMLLogs $dir
+ ~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (DeleteAICXMLLogs:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
SendEmail : The term 'SendEmail' is not recognized as the name of a
cmdlet, function, script file, or operable program. Check the spelling
of the name, or if a path was included, verify that the path is
correct and try again. At C:\Users\pmcma\Documents\Projects\Replace
FileCleaner with PowerShell Script\FileCleaner.ps1:164 char:5
+ SendEmail $emailSubject $emailFrom $emailTo
+ ~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (SendEmail:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
DeleteLogFiles : The term 'DeleteLogFiles' is not recognized as the
name of a cmdlet, function, script file, or operable program. Check
the spelling of the name, or if a path was included, verify that the
path is correct and try again. At
C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell
Script\FileCleaner.ps1:175 char:5
+ DeleteLogFiles($fileCleanerLogs)
+ ~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (DeleteLogFiles:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
I don't see anything wrong with how I'm declaring the functions or calling them. Any ideas why this script is failing?
PowerShell scripts are read from top to bottom, so you can't reference a function before it has been defined; that is most probably why you are receiving these errors.
Try adding your function definition blocks above the point where you call them.
Alternatively, you can give a function global scope by prefacing its name with the global: scope modifier, like this:
function global:test ($x, $y)
{
$x * $y
}
I've had this happen as well. Try placing the functions before the business logic. This is a script, not compiled code, so the functions haven't been declared yet at the point where you call them.
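A minimal sketch of that reordering, with the function bodies elided rather than repeated from the script above:

# 1. Define the functions first
Function DeleteLogFiles($logFiles) {
    # ... body unchanged from the original script ...
}
Function DeleteAICXMLLogs($dir) {
    # ... body unchanged from the original script ...
}
Function SendEmail($emailSubject, $emailFrom, $emailTo) {
    # ... body unchanged from the original script ...
}

# 2. Then run the main logic, which can now resolve the calls
foreach ($dir in $dirs)
{
    # ... existing directory handling ...
    DeleteAICXMLLogs $dir
}
if($countUnaccessibleUNCPaths -gt 0)
{
    SendEmail $emailSubject $emailFrom $emailTo
}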