I am using PowerShell to compare the file count and size (per file extension) in 2 separate directories.
$User = $env:username
$pwd = pwd
clear
write-host "`n"
write-host "`n"
write-host "`n"
write "The current user is: $User"
write-host "`n"
write "The current path is: $pwd"
write-host "`n"
write-host "`n"
write-host "`n"
write "We need to know the following information:"
write "`n"
write "`n"
$UserDesktopPath = Read-Host "New PC User Desktop Path" # This should be the new PC Desktop Path
$UserDocumentPath = Read-Host "New PC User Document Path" # This should be the new PC Document Path
$USBDesktopPathServer = Read-Host "USB User Desktop Path" # This should be the USB User Desktop Path
$USBDocumentPathServer = Read-Host "USB User Document Path" # This should be the USB User Document Path
clear
write-host "`n"
write-host "`n"
write-host "`n"
write "This is the results for your Desktop Folder Paths:"
write-host "`n"
$folder_new = Get-ChildItem -Recurse -path "$USBDesktopPathServer" # Recurses the New PC Desktop
$folder_old = Get-ChildItem -Recurse -path "$UserDesktopPath" # Recurses the USB Backup Desktop
Compare-Object -ReferenceObject "$folder_new" -DifferenceObject "$folder_old" # Compares the two folders for the path to identify discrepancies
write-host "`n"
write "This is the results for your Documents Folder Paths:"
write-host "`n"
write-host "`n"
write-host "`n"
$folder_new1 = Get-ChildItem -Recurse -path "$UserDocumentPath" # Recurses the New PC Documents
$folder_old1 = Get-ChildItem -Recurse -path "$USBDocumentPathServer" # Recurses the USB Backup Documents
Compare-Object -ReferenceObject "$folder_new1" -DifferenceObject "$folder_old1" # Compares the two folders for the path to identify discrepancies
write-host "`n"
write-host "`n"
write-host "`n"
write "Now we shall compare file sizes of your Documents:"
write-host "`n"
write-host "`n"
write-host "`n"
write-host "`n"
function doc{
$DirectoryDocuments = "$USBDocumentPathServer", "$UserDocumentPath"
foreach ($Directory in $DirectoryDocuments) {
Get-ChildItem -Path $Directory -Recurse |
Where-Object {-not $_.PSIsContainer} |
Tee-Object -Variable Files |
Group-Object -Property Extension |
Select-Object -Property @{
n = "Directory"
e = {$Directory}
},
@{
n = "Extension"
e = { $_.Name -replace '^\.' }
},
@{
n = "Size (MB)"
e={ [math]::Round( ( ( $_.Group | Measure-Object Length -Sum ).Sum / 1MB ), 2 ) }
},
Count
$Files |
Measure-Object -Sum -Property Length |
Select-Object -Property @{
n = 'Extension'
e = { 'Total' }
},
@{
n = 'Size (MB)'
e = { [math]::Round( ( $_.Sum / 1MB ), 2 ) }
},
Count
}
}
When using the ISE and calling dtop I get the correct return:
PS C:\Users\Michael Nancarrow> dtop
Directory Extension Size (MB) Count
--------- --------- --------- -----
D:\Deployment Kit\Test\Desktop2 txt 0 1
Total 0 1
D:\Deployment Kit\Test\Desktop1 txt 0 11
Total 0 11
Yet when it is run inside the script, it does not return any value. I have also attempted to capture the output in a variable and call write $tst, which does the same thing (writes null).
Furthermore, I have removed the { so it does not run as a function, and then it operates without an issue. My concern is that perhaps the -Path value cannot be parsed at the same time as the input is read - meaning: when I call dtop from the ISE, it already has the $Directory variable stored in memory.
Are there any obvious errors here? I am rather new to PowerShell and am unsure where the mistake lies.
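One way to test that hypothesis - purely a sketch on my part, not a confirmed fix - is to pass the directories into the function as a parameter instead of relying on variables already sitting in the ISE session. The function name Get-ExtensionSummary below is made up; the grouping logic is the same as in doc:
function Get-ExtensionSummary {
    param([string[]]$Path) # hypothetical parameter replacing the session variables
    foreach ($Directory in $Path) {
        Get-ChildItem -Path $Directory -Recurse |
            Where-Object { -not $_.PSIsContainer } |
            Group-Object -Property Extension |
            Select-Object -Property @{ n = 'Directory'; e = { $Directory } },
                                    @{ n = 'Extension'; e = { $_.Name -replace '^\.' } },
                                    @{ n = 'Size (MB)'; e = { [math]::Round((($_.Group | Measure-Object Length -Sum).Sum / 1MB), 2) } },
                                    Count
    }
}
# Called with the values gathered by Read-Host earlier in the script:
Get-ExtensionSummary -Path $USBDocumentPathServer, $UserDocumentPath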
I am trying to create a program that will copy files from our backup directory to an archive directory using PowerShell. I have two criteria for this program to run smoothly. One is that we have files from both the current year and past years, so only this year's files must be copied over. The other is that we have to make sure we are not copying over files with the same file name, in case the data in an existing file was accidentally modified. When this code is not in a function, it works; but in a function, it gives me errors that it "cannot find the path" of the folder I am copying from and the folder I am pasting the files to. I am going to use this for more than sixty locations, so it would be better if I didn't have to rewrite the code sixty times. I thought about using Robocopy, but I am still getting the same issue of files not being copied over.
Function Copy-Data {
param (
[system.object]$copyFolder,
[system.object]$pasteFolder,
[int]$currentYear,
[int]$lastYear,
[int]$nextYear)
$copyItem = Get-ChildItem -Path $copyFolder
$pasteItem = Get-ChildItem -Path $pasteFolder
$copyCount = $copyItem.count
for ($i = 0; $i -lt $copyCount; $i++)
{
$copyName = $copyItem.Name
$testPath = Test-Path "$pasteFolder$copyName"
if ($copyItem[$i].LastWriteTime -gt $firstDate -and $copyItem[$i].LastWriteTime -lt $lastDate)
{
if ($testPath -eq $false)
{
Copy-Item -Path $copyFolder$copyName -Destination $pasteFolder
#Robocopy "$copyFolder$copyItem[$i]" "$pasteFolder"
Write-Host $pasteFolder$copyName
}
}
}
}
$currentYear = Get-Date -Format "yyyy"
$lastYear = [int]$currentYear - 1
$nextYear = [int]$currentYear + 1
$firstDate = "12/31/$lastYear"
$lastDate = "01/01/$nextYear"
$copyFolder = "\\fileshare\test\copy\"
$pasteFolder = "\\fileshare\test\$currentYear\paste\"
Copy-Data ($copyFolder, $pasteFolder, $currentYear, $lastYear, $nextYear)
I feel like you are making this more complicated than it needs to be.
Core code can be something like:
Get-ChildItem -Path $copyFolder |
ForEach-Object{
If( !(Test-Path -Path (Join-Path $pasteFolder $_.Name) ) )
{
Copy-Item $_.FullName -Destination $pasteFolder
}
}
You can use just the destination folder path; you don't have to give the full path of the destination file.
As a function it may look something like:
Function CopyFoldercontents
{
Param(
[Parameter( Mandatory = $true, Position = 0)]
[String]$copyFolder,
[Parameter( Mandatory = $true, Position = 1)]
[String]$pasteFolder
) # End Parameter Block.
Get-ChildItem -Path $copyFolder |
ForEach-Object{
If( !(Test-Path -Path (Join-Path $pasteFolder $_.Name) ) )
{
Copy-Item $_.FullName -Destination $pasteFolder
}
}
} # End Function CopyFolderContents
This could be made more robust, though; it depends on what direction you want to take it.
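For instance - as a hedged sketch, not part of the original answer - the question's current-year requirement could be layered on with a LastWriteTime filter. The function name Copy-CurrentYearData is made up; the parameter names mirror the question's variables:
Function Copy-CurrentYearData {
    Param(
        [Parameter(Mandatory = $true)][string]$copyFolder,
        [Parameter(Mandatory = $true)][string]$pasteFolder
    )
    $year = (Get-Date).Year
    Get-ChildItem -Path $copyFolder -File |
        Where-Object { $_.LastWriteTime.Year -eq $year } |   # only this year's files
        ForEach-Object {
            # skip files that already exist in the destination
            If ( !(Test-Path -Path (Join-Path $pasteFolder $_.Name)) )
            {
                Copy-Item -Path $_.FullName -Destination $pasteFolder
            }
        }
} # End Function Copy-CurrentYearData
Copy-CurrentYearData -copyFolder $copyFolder -pasteFolder $pasteFolder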
Continuing from my comment.
You could just do this...
(validate what you are after on both sides, then construct the final function)
Function Start-FolderMirror
{
[CmdletBinding()]
[Alias('mir')]
Param
(
[string]$SourcePath = (Read-Host -Prompt 'Enter a source path'),
[string]$DestinationPath = (Read-Host -Prompt 'Enter a destination path')
)
$SourceFiles = (Get-ChildItem -Path $SourcePath -File).FullName
$DestinationFiles = (Get-ChildItem -Path $DestinationPath -File).FullName
Compare-Object -ReferenceObject $SourceFiles -DifferenceObject $DestinationFiles -IncludeEqual |
Select-Object -First 5
}
Start-FolderMirror -SourcePath 'd:\temp' -DestinationPath 'D:\temp\TestFiles'
# Results
<#
InputObject SideIndicator
----------- -------------
D:\temp\TestFiles\abc - Copy - Copy.bat =>
D:\temp\TestFiles\abc - Copy.bat =>
D:\temp\TestFiles\abc.bat =>
D:\temp\(MSINFO32) command-line tool switches.pdf <=
D:\temp\23694d1213305764-revision-number-in-excel-book1.xls <=
#>
Function Start-FolderMirror
{
[CmdletBinding(SupportsShouldProcess)]
[Alias('mir')]
Param
(
[string]$SourcePath = (Read-Host -Prompt 'Enter a source path'),
[string]$DestinationPath = (Read-Host -Prompt 'Enter a destination path')
)
$SourceFiles = (Get-ChildItem -Path $SourcePath -File).FullName
$DestinationFiles = (Get-ChildItem -Path $DestinationPath -File).FullName
Compare-Object -ReferenceObject $SourceFiles -DifferenceObject $DestinationFiles -IncludeEqual |
Select-Object -First 5 |
Where-Object -Property SideIndicator -Match '<='
}
Start-FolderMirror -SourcePath 'd:\temp' -DestinationPath 'D:\temp\TestFiles' -WhatIf
# Results
<#
InputObject SideIndicator
----------- -------------
D:\temp\(MSINFO32) command-line tool switches.pdf <=
D:\temp\23694d1213305764-revision-number-in-excel-book1.xls <=
#>
Function Start-FolderMirror
{
[CmdletBinding(SupportsShouldProcess)]
[Alias('mir')]
Param
(
[string]$SourcePath = (Read-Host -Prompt 'Enter a source path'),
[string]$DestinationPath = (Read-Host -Prompt 'Enter a destination path')
)
$SourceFiles = (Get-ChildItem -Path $SourcePath -File).FullName
$DestinationFiles = (Get-ChildItem -Path $DestinationPath -File).FullName
Compare-Object -ReferenceObject $SourceFiles -DifferenceObject $DestinationFiles -IncludeEqual |
Select-Object -First 5 |
Where-Object -Property SideIndicator -Match '<=' |
ForEach {Copy-Item -Path $PSItem.InputObject -Destination $DestinationPath}
}
Start-FolderMirror -SourcePath 'd:\temp' -DestinationPath 'D:\temp\TestFiles' -WhatIf
# Results
<#
What if: Performing the operation "Copy File" on target "Item: D:\temp\(MSINFO32) command-line tool switches.pdf Destination: D:\temp\TestFiles\(MSINFO32) command-line tool switches.pdf".
What if: Performing the operation "Copy File" on target "Item: D:\temp\23694d1213305764-revision-number-in-excel-book1.xls Destination: D:\temp\TestFiles\23694d1213305764-revision-number-in-excel-book1.xls".
#>
I'm trying to get what permissions files and folders have and export to a csv file. I can get the info to display on screen, but when I try to export it the resulting csv file is empty.
The code:
function Test-IsWritable(){
<#
.Synopsis
Command tests if a file is present and writable.
.Description
Command to test if a file is writeable. Returns true if file can be opened for write access.
.Example
Test-IsWritable -path $foo
Test if file $foo is accessible for write access.
.Example
$bar | Test-IsWritable
Test if each file object in $bar is accessible for write access.
.Parameter Path
Psobject containing the path or object of the file to test for write access.
#>
[CmdletBinding()]
param([Parameter(Mandatory=$true,ValueFromPipeline=$true)][psobject]$path)
process{
Write-Host "Test if file $path is writeable"
if (Test-Path -Path $path -PathType Any){
$target = Get-Item $path -Force
try{
$writestream = $target.Openwrite()
$writestream.Close() | Out-Null
Remove-Variable -Name writestream
Write-Host "File is writable" -ForegroundColor DarkGreen
Write-Output $true
}
catch{
Write-Host "File is not writable" -ForegroundColor DarkRed
Write-Output $false
}
Remove-Variable -Name target
}
else{
Write-Host "File $path does not exist or is a directory" -ForegroundColor Red
Write-Output $false
}
}
}
write-host "WARNING: If checking deep folders (where the full path is longer than 248 characters) please " -foregroundcolor Yellow -NoNewline
Write-Host "MAP THE DRIVE " -ForegroundColor Red -NoNewline
Write-Host "in order to keep the names as short as possible" -ForegroundColor Yellow
$basefolder = Read-Host -Prompt 'What is the folder or files you want to get permissions of?'
write-host "WARNING: if permissions.csv already exists, it will be overwritten!" -foregroundcolor Yellow
Write-Host 'Export results to CSV? (y/n): ' -ForegroundColor Magenta -NoNewline
$export = Read-Host
if ($export -like "y")
{
Write-Host "Name the file (ex: permissions.csv): " -ForegroundColor Magenta -NoNewline
$FileName = Read-Host
$Outfile = "$PSScriptRoot\$FileName"
write-host "Will write results to $PSScriptRoot\$FileName" -ForegroundColor Green
}
else
{
write-host "User did not type 'y', continuing" -ForegroundColor DarkYellow
}
$files = get-childitem $basefolder -recurse -File
Write-Host $files
Write-Host "=========================" -ForegroundColor Black
#$subfiles = Get-ChildItem $folders -Recurse -File
#Write-Host $folders
#Write-Host "=========================" -ForegroundColor Black
#Write-Host $subfiles
$results = foreach($folder in $files) {
New-Object psobject -Property @{
File = $folder;
Access = "$basefolder\$folder" | Test-IsWritable
}
Write-Host $folder
}
#$subresults = foreach($subfile in $subfiles) {
# New-Object psobject -Property @{
# File = $subfiles;
# Access = $subfile | Test-IsWritable;
# }
#}
Write-Host $results
Write-Host "Finished combo loop, exporting..." -ForegroundColor Green
$results | Export-Csv $Outfile -NoTypeInformation -Delimiter ";"
Write-Host "Converting delimited CSV to Column Excel Spreadsheet"
$outputXLSX = $PSScriptRoot + "\$Filename.xlsx"
$excel = New-Object -ComObject excel.application
$workbook = $excel.Workbooks.Add(1)
$worksheet = $workbook.worksheets.Item(1)
$TxtConnector = ("TEXT;" + $Outfile)
$Connector = $worksheet.QueryTables.add($TxtConnector,$worksheet.Range("A1"))
$query = $worksheet.QueryTables.item($Connector.name)
$query.TextFileOtherDelimiter = ';'
$query.TextFileParseType = 1
$query.TextFileColumnDataTypes = ,2 * $worksheet.Cells.Columns.Count
$query.AdjustColumnWidth = 1
$query.Refresh()
$query.Delete()
$Workbook.SaveAs($outputXLSX,51)
$excel.Quit()
Remove-Item $Outfile
Write-Host "See $PSScriptRoot\$Filename.xlsx for results" -ForegroundColor Green
UPDATE: Mostly working, strange output though:
Z:\testfolder\file1.txt
Z:\testfolder\file1.txt
Z:\testfolder\file1.txt
Z:\testfolder\file1.txt
Z:\testfolder\file1.txt
Z:\testfolder\file1.txt
Z:\testfolder\file1.txt
Z:\testfolder\file2.txt
Z:\testfolder\file2.txt
Z:\testfolder\file2.txt
Z:\testfolder\file2.txt
Z:\testfolder\file2.txt
Z:\testfolder\file2.txt
Z:\testfolder\file2.txt
Z:\testfolder\file3.rar
Z:\testfolder\file3.rar
Z:\testfolder\file3.rar
Z:\testfolder\file3.rar
Z:\testfolder\file3.rar
Z:\testfolder\file3.rar
Z:\testfolder\file3.rar
The specified path, file name, or both are too
long. The fully qualified file name must be less than 260 characters,
and the directory name must be less than 248 characters.
In the next column:
FileAccess
FullControl
FullControl
FullControl
Modify, Synchronize
ReadAndExecute, Synchronize
Modify, Synchronize
Modify, Synchronize
FullControl
FullControl
FullControl
Modify, Synchronize
...
The specified path, file name, or both are too long. The fully
qualified file name must be less than 260 characters, and the
directory name must be less than 248 characters.
I'm not sure why it's showing multiple rows for the same file; I'd like to have one row per file with its true file access.
Remove Write-Host before using Export-Csv. Write-Host consumes the data from the pipeline and only outputs it on screen.
#(...)
$i = 0
$results = foreach($acl in $acls) {
$folder = (Convert-Path $acl.pspath)
Write-Progress -Activity "Getting Security" -Status "checking $folder" -PercentComplete ($i / $folders.Count * 100)
foreach($access in $acl.GetAccessRules($true, $true, [System.Security.Principal.SecurityIdentifier])) {
New-Object psobject -Property @{
Folder = $folder;
User = $acl.Owner;
Group=$acl.Group;
Mode = $access.AccessControlType;
FileAccess = $access.FileSystemRights;
}
}
$i++
}
Write-Host "Reached End, exporting..." -ForegroundColor Green
$results | Export-Csv $Outfile -NoTypeInformation -Delimiter ";"
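To make the pipeline point concrete, here is a minimal demonstration (not part of the original answer; the file name listing.csv is just an example):
# Write-Host writes straight to the console; nothing travels down the pipeline,
# so the assignment below ends up empty.
$fromHost = Get-ChildItem | Write-Host

# Emitting the objects (implicitly, or via Write-Output) keeps them in the
# pipeline, so they can be captured and exported.
$fromPipeline = Get-ChildItem | Write-Output
$fromPipeline | Export-Csv .\listing.csv -NoTypeInformation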
I have a long-winded script that gets a user from a CSV file and matches that user against a CSV filename from a specific directory.
The user is matched against a CSV file named in this format: <data><samaccountname><text>.csv
The aim here is to get an AD user from a list, then scan a folder of CSV files and match them against the user, and from there restore the user's AD attributes.
The issue here is that the output is always the last user, twice. I have REM'd out the export at the end so I can see what is on screen first.
Clear-Host
#Get username from users list and match against CSV file name.
$FDate = (get-date).ToString("yyyMMdd")
$Project = "<FolderPath>" #Project name used to setup folders and for reports etc
$ProjectRoot = "<path>\" # Backup folder
$RestorePath = $ProjectRoot + $Project #combined path for restoring
$UsersListFile = $ProjectRoot + '\Userlist.csv' #Userlist
$Results = @{} # Storage for all csv files
$PSObject = New-Object psobject
$Report = @() #For Export-CSV
$Results = gci $RestorePath -Filter '*.csv'
$i = 0
foreach ($File in $Results) {
$i += 1
Write-Host 'Number of passes - '$i
Write-Host 'Current file processing - '$file.Name -for Green
foreach ($User in (import-csv $UsersListFile)) {
$SAM = $User.SamAccountName
Write-Host 'Current User processing - '$SAM -ForegroundColor Magenta
if ($file.Name -match $SAM) {
Write-host "Filename and user $SAM match " -for Yellow
$Row= New-Object psobject
$ROW | Add-Member -type NoteProperty -name Name -value $SAM -force
$Report += $Row
foreach ($Attrib in (import-csv $restorepath\$file)) {
#Write-host 'Attributes in file - ' $attrib.samaccountname $Attrib.mail -for Yellow
#Use this to restore AD User data
}
} else {
Write-Host "No match" -ForegroundColor Red
}
}
}
#$Report | Export-Csv $RestorePath'\Test.csv' -NoTypeInformation -Force
$Report | Sort-Object Name
I updated the script to move New-Object psobject up to the $Row assignment, so a new object is created each time rather than overwriting the previous entry.
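A minimal illustration of that fix (my own sketch, with made-up sample names 'alice' and 'bob'):
# Reusing one object: every element of $Report references the same psobject,
# so the report repeats whatever value was set last.
$Report = @()
$Row = New-Object psobject
foreach ($SAM in 'alice', 'bob') {
    $Row | Add-Member -Type NoteProperty -Name Name -Value $SAM -Force
    $Report += $Row
}
$Report   # Name: bob, bob

# Creating the object inside the loop gives each row its own instance.
$Report = @()
foreach ($SAM in 'alice', 'bob') {
    $Row = New-Object psobject
    $Row | Add-Member -Type NoteProperty -Name Name -Value $SAM -Force
    $Report += $Row
}
$Report   # Name: alice, bob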
I've written the following PS script to delete log files from specific server paths. I'm a novice to PS but I'm getting some errors with a few of the functions that I have written in this script:
#* FileName: FileCleaner.ps1
#Clear the screen
Clear
#Read XML Config File to get settings
[xml]$configfile = Get-Content "C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell Script\FileCleaner.config.xml"
#Declare and set variables from Config values
$hostServer = $configfile.Settings.HostServer
$dirs = @($configfile.Settings.DirectoryName.Split(",").Trim())
$scanSubDirectories = $configfile.Settings.ScanSubDirectories
$deleteAllFiles = $configfile.Settings.deleteAllFiles
$fileTypesToDelete = @($configfile.Settings.FileTypesToDelete.Split(";").Trim())
$liveSiteLogs = $configfile.Settings.LiveSiteLogs
$fileExclusions = @($configfile.Settings.FileExclusions.Split(";").Trim())
$retentionPeriod = $configfile.Settings.RetentionPeriod
$AICLogs = $configfile.Settings.AICLogs
$AICLogsRententionPeriod = $configfile.Settings.AICLogsRententionPeriod
$fileCleanerLogs = $configfile.Settings.FileCleanerLogs
$fileCleanerLogsRententionPeriod = $configfile.Settings.FileCleanerLogsRententionPeriod
#Setup FileCleaner output success logfiles
$successLogfile = $configfile.Settings.SuccessOutputLogfile
$dirName = [io.path]::GetDirectoryName($successLogfile)
$filename = [io.path]::GetFileNameWithoutExtension($successLogfile)
$ext = [io.path]::GetExtension($successLogfile)
$successLogfile = "$dirName\$filename$(get-date -Format yyyy-MM-dd)$ext"
#Setup FileCleaner output error logfiles
$errorLogfile = $configfile.Settings.ErrorOutputLogfile
$dirName = [io.path]::GetDirectoryName($errorLogfile)
$filename = [io.path]::GetFileNameWithoutExtension($errorLogfile)
$ext = [io.path]::GetExtension($errorLogfile)
$errorLogfile = "$dirName\$filename$(get-date -Format yyyy-MM-dd)$ext"
#Setup Retention Period
$LastWrite = (Get-Date).AddDays(-$retentionPeriod)#.ToString("d")
$AICLastWrite = (Get-Date).AddDays(-$AICLogsRententionPeriod)#.ToString("d")
$fileCleanerLastWrite = (Get-Date).AddDays(-$fileCleanerLogsRententionPeriod)
#EMAIL SETTINGS
$smtpServer = $configfile.Settings.SMTPServer
$emailFrom = $configfile.Settings.EmailFrom
$emailTo = $configfile.Settings.EmailTo
$emailSubject = $configfile.Settings.EmailSubject
#Update the email subject to display the Host Server value
$emailSubject = $emailSubject -replace "HostServer", $hostServer
$countUnaccessibleUNCPaths = 0
#Check Logfiles exists, if not create them
if(!(Test-Path -Path $successLogfile))
{
New-Item -Path $successLogfile -ItemType file
}
if(!(Test-Path -Path $errorLogfile))
{
New-Item -Path $errorLogfile -ItemType file
}
foreach ($dir in $dirs)
{
#needs a check to determine if server/the UNC Path is accessible. If it fails to connect, it needs to move on to the next UNC share but a flag needs to
#be generate to alert us to investigate why the UNC share was not accessible during the job run.
If(Test-Path -Path $dir)
{
#write to output logfile Directory info
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $dir"
$Msg | out-file $successLogfile
If ($scanSubDirectories -eq "True")
{
If ($deleteAllFiles -eq "True")
{
#ScanSubDirectories and delete all files older than the $retentionPeriod, include Sub-Directories / also forces the deletion of any hidden files
$logFiles = Get-ChildItem -Path $dir -Force -Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$LastWrite" }
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
Else
{
#"ScanSubDirectories but only delete specified file types."
$logFiles = Get-Childitem $dir -Include $fileTypesToDelete[0],$fileTypesToDelete[1],$fileTypesToDelete[2], $liveSiteLogs -Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where {$_.LastWriteTime -le "$LastWrite"}
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
}
Else
{
#Only delete files in top level Directory
If ($deleteAllFiles -eq "True")
{
$logFiles = Get-ChildItem -Path $dir -Force -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$LastWrite" }
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
Else
{
$logFiles = Get-Childitem $dir -Include $fileTypesToDelete[0],$fileTypesToDelete[1],$fileTypesToDelete[2], $liveSiteLogs -Exclude $fileExclusions[0],$fileExclusions[1] | Where {$_.LastWriteTime -le "$LastWrite"}
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
}
}
Else
{
$countUnaccessibleUNCPaths++
#server/the UNC Path is unaccessible
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") Unable to access $dir."
$Msg | out-file $errorLogfile -append
}
# Call the function to Delete the AIC XML Logfiles
DeleteAICXMLLogs $dir
}
#If any of the directories were inaccessible send an email to alert the team
if($countUnaccessibleUNCPaths -gt 0)
{
# Call the function to send the email
SendEmail $emailSubject $emailFrom $emailTo
}
#Only keep 2 weeks worth of the FileCleaner App logs for reference purposes
If(Test-Path -Path $fileCleanerLogs)
{
#write to output logfile Directory info
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $fileCleanerLogs"
$Msg | out-file $successLogfile
$fileCleanerLogs = Get-Childitem $fileCleanerLogs -Recurse | Where {$_.LastWriteTime -le "$fileCleanerLastWrite"}
DeleteLogFiles($fileCleanerLogs)
#foreach($fileCleanerLog in $fileCleanerLogs)
#{
# if($fileCleanerLog -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $fileCleanerLog")"
# $Msg | out-file $successLogfile -append
# Remove-Item $fileCleanerLog.FullName -Force
# }
#}
}
Function DeleteLogFiles($logFiles)
{
foreach($logFile in $logFiles)
{
if($logFile -ne $null)
{
$Msg = Write-Output "$("Deleting File $logFile")"
$Msg | out-file $successLogfile -append
Remove-Item $logFile.FullName -Force
}
}
}
Function DeleteAICXMLLogs($dir)
{
#Split the UNC path $dir to retrieve the server value
$parentpath = "\\" + [string]::join("\",$dir.Split("\")[2])
#test access to the \\server\D$\DebugXML path
If(Test-Path -Path $parentpath$AICLogs)
{
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $parentpath$AICLogs"
$Msg | out-file $successLogfile
#Concantenate server value to $AICLogs to delete all xml logs in \\server\D$\DebugXML with a retention period of 30Days
$XMLlogFiles = Get-ChildItem -Path $parentpath$AICLogs -Force -Include $fileTypesToDelete[3]-Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$AICLastWrite" }
#get each file and add the filename to be deleted to the successLogfile before deleting the file
DeleteLogFiles($XMLlogFiles)
#foreach($XMLlogFile in $XMLlogFiles)
#{
# if($XMLlogFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $XMLlogFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $XMLlogFile.FullName -Force
# }
#}
}
Else
{
$Msg = Write-Output "$("$parentpath$AICLogs does not exist.")"
$Msg | out-file $successLogfile -append
}
}
Function SendEmail($emailSubject, $emailFrom, $emailTo)
{
$MailMessage = New-Object System.Net.Mail.MailMessage
$SMTPClient = New-Object System.Net.Mail.smtpClient
$SMTPClient.host = $smtpServer
$Recipient = New-Object System.Net.Mail.MailAddress($emailTo, "Recipient")
$Sender = New-Object System.Net.Mail.MailAddress($emailFrom, "Sender")
$MailMessage.Sender = $Sender
$MailMessage.From = $Sender
$MailMessage.Subject = $emailSubject
$MailMessage.Body = @"
This email was generated because the FileCleaner script was unable to access some UNC Paths, please refer to $errorLogfile for more information.
Please inform the Team if you plan to resolve this.
This is an automated email please do not respond.
"#
$SMTPClient.Send($MailMessage)
}
When debugging I'm getting these errors:
DeleteAICXMLLogs : The term 'DeleteAICXMLLogs' is not recognized as
the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify
that the path is correct and try again. At
C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell
Script\FileCleaner.ps1:158 char:5
+ DeleteAICXMLLogs $dir
+ ~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (DeleteAICXMLLogs:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
SendEmail : The term 'SendEmail' is not recognized as the name of a
cmdlet, function, script file, or operable program. Check the spelling
of the name, or if a path was included, verify that the path is
correct and try again. At C:\Users\pmcma\Documents\Projects\Replace
FileCleaner with PowerShell Script\FileCleaner.ps1:164 char:5
+ SendEmail $emailSubject $emailFrom $emailTo
+ ~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (SendEmail:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
DeleteLogFiles : The term 'DeleteLogFiles' is not recognized as the
name of a cmdlet, function, script file, or operable program. Check
the spelling of the name, or if a path was included, verify that the
path is correct and try again. At
C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell
Script\FileCleaner.ps1:175 char:5
+ DeleteLogFiles($fileCleanerLogs)
+ ~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (DeleteLogFiles:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
I don't see anything wrong with how I'm declaring the functions or calling them. Any ideas why this script is failing?
PowerShell scripts are read from top to bottom, so you can't reference a function before it has been defined; that is most probably why you are receiving these errors.
Try adding your function definition blocks above the point where you call them.
Alternatively, you can give a function global scope. Just preface the function name with the global: scope modifier, like this:
function global:test ($x, $y)
{
$x * $y
}
I've had this happen as well. Try placing the functions before the business logic. This is a script, not compiled code, so a function has to be declared before the point where you call it.
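As a minimal sketch of the suggested restructuring (the body below stands in for the script's own DeleteLogFiles; the path C:\Temp\Logs is just an example):
# Define the functions first...
Function DeleteLogFiles($logFiles)
{
    foreach($logFile in $logFiles)
    {
        if($logFile -ne $null)
        {
            Remove-Item $logFile.FullName -Force
        }
    }
}
# ...then run the business logic that calls them.
$logFiles = Get-ChildItem -Path "C:\Temp\Logs" -File
DeleteLogFiles($logFiles)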
My script keeps bugging me with the following exception
copy-item : Cannot find drive. A drive with the name 'F' does not exist.
At C:\Program Files (x86)\CA\ARCserve Backup\Templates\RB_Pre_Process.ps1:58 char:1
+ copy-item -Path $drive -Destination $DST_DRIVE -Recurse -ErrorAction Stop
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (F:String) [Copy-Item], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.CopyItemCommand
This is what my script looks like. I am mounting an ISO image on drive F:, and I have added a "Start-Sleep -s 5" command so I can verify the image gets mounted, which it does!
$BACKUP_PATH = "E:\00_BACKUP_DATA"
$DR_PATH = "E:\01_DR_DATA"
$ISO_IMAGE = "C:\Program Files (x86)\CA\ARCserve Backup\Templates\Winpe_x64.iso"
$DST_DRIVE = "E:\"
try {
New-EventLog -LogName Application -Source "RB_Pre_Process.ps1" -ErrorAction Stop
} catch [System.InvalidOperationException] {
Write-host $_
}
try {
Write-Host "Preparing RDX cartridge..."
# Query for disk object
$disk_number = (Get-Disk | Where-Object -Property FriendlyName -like "TANDBERG RDX*").Number
# Remove partitions
Get-Disk $disk_number | Clear-Disk -RemoveData -Confirm:$false | Out-Null
# Create new partition
New-Partition -DiskNumber $disk_number -UseMaximumSize | Out-Null
# Format partition
Format-Volume -DriveLetter E -FileSystem NTFS -NewFileSystemLabel "RDX_TAPE" -Confirm:$false | Out-Null
# Set partition as active
Set-Partition -DriveLetter E -IsActive:$true | Out-Null
} catch {
Write-Host $_
Write-EventLog -LogName Application -Source $MyInvocation.MyCommand.Name -EventID 2 -Message $_
}
try {
Write-Host "Creating folder structure..."
new-item -itemtype directory -Path $BACKUP_PATH -ErrorAction stop | Out-Null
new-item -itemtype directory -path $DR_PATH -ErrorAction stop | Out-Null
} catch {
Write-Host $_
Write-EventLog -LogName Application -Source $MyInvocation.MyCommand.Name -EventID 2 -Message $_
}
try {
Write-Host "Mounting ISO image..."
$image = Mount-DiskImage -ImagePath $ISO_IMAGE -PassThru -ErrorAction Stop
} catch [ParameterBindingException] {
Write-Host $_
Write-EventLog -LogName Application -Source $MyInvocation.MyCommand.Name -EventId 2 -Message $_
}
$drive = ($image | Get-Volume).DriveLetter
$drive += ":\*"
Start-Sleep -s 5
try {
Write-Host "Copying ISO content..."
copy-item -Path $drive -Destination $DST_DRIVE -Recurse -ErrorAction Stop
} catch {
Write-Host $_
Write-EventLog -LogName Application -Source $MyInvocation.MyCommand.Name -EventId 2 -Message $_
}
try {
Write-Host "Unmounting ISO image..."
Dismount-DiskImage -ImagePath $ISO_IMAGE -ErrorAction Stop
} catch [System.Exception] {
Write-Host $_
Write-EventLog -LogName Application -Source $MyInvocation.MyCommand.Name -EventId 2 -Message $_
}
So, what's going wrong here? Sometimes it works, sometimes not...
I "solved" the issue... my script is working perfectly fine when it's getting started directly from the PowerShell prompt instead of the PowerShell ISE... So the IDE is the culprit.
It seems the mounted image can't be reached in PowerShell; I think it's a limitation of the provider. A possible workaround is issuing a CMD command. You could replace
copy-item -Path $drive -Destination $DST_DRIVE -Recurse -ErrorAction Stop
with
& cmd /C "copy F:\* G:\dest\"
This is just an example; you may need to do further work to copy recursively. You could use xcopy or robocopy, which can handle a recursive copy - see the sketch below.
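A hedged sketch of the robocopy variant, assuming F: is the mounted ISO and E:\ the destination, as in the original script:
# /E copies subdirectories, including empty ones.
& robocopy 'F:\' 'E:\' /E

# Robocopy exit codes below 8 indicate success (0 = nothing to copy, 1 = files
# copied); 8 or above means a failure occurred.
if ($LASTEXITCODE -ge 8) {
    Write-EventLog -LogName Application -Source "RB_Pre_Process.ps1" -EventId 2 -Message "robocopy failed with exit code $LASTEXITCODE"
}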