SMO restore of SQL database doesn't overwrite - sql-server-2008

I'm trying to restore a database from a backup file using SMO. If the database does not already exist then it works fine. However, if the database already exists then I get no errors, but the database is not overwritten.
The "restore" process still takes just as long, so it looks like it's working and doing a restore, but in the end the database has not changed.
I'm doing this in Powershell using SMO. The code is a bit long, but I've included it below. You'll notice that I do set $restore.ReplaceDatabase = $true. Also, I use a try-catch block and report on any errors (I hope), but none are returned.
Any obvious mistakes? Is it possible that I'm not reporting some error and it's being hidden from me?
Thanks for any help or advice that you can give!
function Invoke-SqlRestore {
    param(
        [string]$backup_file_name,
        [string]$server_name,
        [string]$database_name,
        [switch]$norecovery
    )
    # Get a new connection to the server
    [Microsoft.SqlServer.Management.Smo.Server]$server = New-SMOconnection -server_name $server_name
    Write-Host "Starting restore to $database_name on $server_name."
    Try {
        $backup_device = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($backup_file_name, "File")
        # Get local paths to the Database and Log file locations
        If ($server.Settings.DefaultFile.Length -eq 0) { $database_path = $server.Information.MasterDBPath }
        Else { $database_path = $server.Settings.DefaultFile }
        If ($server.Settings.DefaultLog.Length -eq 0) { $database_log_path = $server.Information.MasterDBLogPath }
        Else { $database_log_path = $server.Settings.DefaultLog }
        # Load up the Restore object settings
        $restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
        $restore.Action = 'Database'
        $restore.Database = $database_name
        $restore.ReplaceDatabase = $true
        $restore.NoRecovery = $norecovery.IsPresent
        $restore.Devices.Add($backup_device)
        # Get information from the backup file
        $restore_details = $restore.ReadBackupHeader($server)
        $data_files = $restore.ReadFileList($server)
        # Relocate every file in the backup to the server's default locations
        ForEach ($data_row in $data_files) {
            $logical_name = $data_row.LogicalName
            $physical_name = Get-FileName -path $data_row.PhysicalName
            $restore_data = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
            $restore_data.LogicalFileName = $logical_name
            if ($data_row.Type -eq "D") {
                # Restore Data file
                $restore_data.PhysicalFileName = $database_path + "\" + $physical_name
            }
            Else {
                # Restore Log file
                $restore_data.PhysicalFileName = $database_log_path + "\" + $physical_name
            }
            [Void]$restore.RelocateFiles.Add($restore_data)
        }
        $restore.SqlRestore($server)
        # If there are two backup sets, assume the next is a Log
        if ($restore_details.Rows.Count -gt 1) {
            $restore.Action = [Microsoft.SqlServer.Management.Smo.RestoreActionType]::Log
            $restore.FileNumber = 2
            $restore.SqlRestore($server)
        }
    }
    Catch {
        $ex = $_.Exception
        Write-Output $ex.Message
        # Walk the whole InnerException chain so no nested message is skipped
        $inner = $ex.InnerException
        while ($inner) {
            Write-Output $inner.Message
            $inner = $inner.InnerException
        }
        Throw $ex
    }
    Finally {
        $server.ConnectionContext.Disconnect()
    }
    Write-Host "Restore ended without any errors."
}
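For reference, a typical invocation might look like this (the server name, database name, and backup path are placeholders, not values from the question):

# Restore MyDb on SQL01 from a full backup, leaving the database recovered
Invoke-SqlRestore -backup_file_name "D:\Backups\MyDb.bak" -server_name "SQL01" -database_name "MyDb"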

I'm having the same problem: I'm trying to restore the database from a backup taken from the same server, but with a different name.
I have profiled the restore process and it doesn't add the 'WITH MOVE' clauses with the different file names. This is why it will restore the database when the database doesn't exist, but fail when it does.
There is an issue with the .PhysicalFileName property.
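A quick way to check this without Profiler is SMO's own Script() method, which returns the T-SQL a restore would run without executing it. A minimal diagnostic sketch, assuming the $restore and $server objects from the question's function:

# Print the RESTORE statement SMO has built, so you can confirm whether
# the MOVE clauses for the relocated files are actually present.
$restore.Script($server) | ForEach-Object { Write-Output $_ }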

I was doing the SMO restore and was running into errors. The only way I found to diagnose the problem was to run SQL Profiler during the execution of my PowerShell script.
This showed me the actual T-SQL that was being executed. I then copied this into a query window and tried to execute it. That revealed the actual errors: in my case, my database had multiple data files that needed to be relocated.
The attached script works for databases that have only one data file.
Param
(
    [Parameter(Mandatory=$True)][string]$sqlServerName,
    [Parameter(Mandatory=$True)][string]$backupFile,
    [Parameter(Mandatory=$True)][string]$newDBName
)
# Load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
# Create sql server object
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $sqlServerName
# Copy database locally if backup file is on a network share
Write-Host "Loaded assemblies"
$backupDirectory = $server.Settings.BackupDirectory
Write-Host "Backup Directory:" $backupDirectory
$fullBackupFile = $backupDirectory + "\" + $backupFile
Write-Host "Copy DB from: " $fullBackupFile
# Create restore object and specify its settings
$smoRestore = new-object("Microsoft.SqlServer.Management.Smo.Restore")
$smoRestore.Database = $newDBName
$smoRestore.NoRecovery = $false;
$smoRestore.ReplaceDatabase = $true;
$smoRestore.Action = "Database"
Write-Host "New Database name:" $newDBName
# Create location to restore from
$backupDevice = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($fullBackupFile, "File")
$smoRestore.Devices.Add($backupDevice)
# Give empty string a nice name
$empty = ""
# Specify new data file (mdf)
$smoRestoreDataFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$defaultData = $server.DefaultFile
if (($defaultData -eq $null) -or ($defaultData -eq $empty))
{
    $defaultData = $server.MasterDBPath
}
Write-Host "defaultData:" $defaultData
$smoRestoreDataFile.PhysicalFileName = Join-Path -Path $defaultData -ChildPath ($newDBName + "_Data.mdf")
Write-Host "smoRestoreDataFile.PhysicalFileName:" $smoRestoreDataFile.PhysicalFileName
# Specify new log file (ldf)
$smoRestoreLogFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$defaultLog = $server.DefaultLog
if (($defaultLog -eq $null) -or ($defaultLog -eq $empty))
{
    $defaultLog = $server.MasterDBLogPath
}
$smoRestoreLogFile.PhysicalFileName = Join-Path -Path $defaultLog -ChildPath ($newDBName + "_Log.ldf")
Write-Host "smoRestoreLogFile:" $smoRestoreLogFile.PhysicalFileName
# Get the file list from backup file
$dbFileList = $smoRestore.ReadFileList($server)
# The logical file names should be the logical filename stored in the backup media
$smoRestoreDataFile.LogicalFileName = $dbFileList.Select("Type = 'D'")[0].LogicalName
$smoRestoreLogFile.LogicalFileName = $dbFileList.Select("Type = 'L'")[0].LogicalName
# Add the new data and log files to relocate to
$smoRestore.RelocateFiles.Add($smoRestoreDataFile)
$smoRestore.RelocateFiles.Add($smoRestoreLogFile)
# Restore the database
$smoRestore.SqlRestore($server)
"Database restore completed successfully"

Just as when you do this from T-SQL, if something is using the database, that will block the restore. Whenever I'm tasked with restoring a database, I like to take it offline (with rollback immediate) first, which kills any connections to the db. You may have to bring it back online before restoring; I don't remember whether restore is smart enough to realise that the files you're overwriting belong to the database you're restoring. Hope this helps.
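In SMO terms, killing the connections before the restore might look like this (a hedged sketch, assuming the $server, $database_name, and $restore objects from the question's function; KillAllProcesses drops every session attached to the named database):

# Drop every connection to the target database before calling SqlRestore,
# so open sessions cannot block the overwrite.
if ($server.Databases[$database_name]) {
    $server.KillAllProcesses($database_name)
}
$restore.SqlRestore($server)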

Related

What is the ItemType for SSRS Catalog Type 14?

SQL Server 2019. I manually uploaded an xlsx file to the server and it was placed in an "Excel Workbooks" group. When I look in the reportserver catalog table, I see the Type value is 14. I'm trying to write a PowerShell script to upload a bunch of xlsx files instead of having to do it manually, one by one. I need to know the ItemType. I can use "Resource", but that doesn't upload it to the "Excel Workbooks" group; it uploads it to the "Resources" group. I did list everything in my report server and see the TypeName for the xlsx I manually uploaded is "ExcelWorkbook", but that is not a valid ItemType. Any suggestions? Below is the PowerShell I'm using (I'm still new to PowerShell). Thanks!
$WebServiceUrl = "http://xxxx"
$ReportFolder = "PDF_Reports2"
$SourceDirectory = $PSScriptRoot
$Overwrite = $true
$SSRSProxy = New-WebServiceProxy -Uri "$WebServiceUrl/ReportServer/ReportService2010.asmx?WSDL" -UseDefaultCredential
# LIST ITEMS IN SERVER
#$SSRSProxy.ListChildren("/",$true)
$type = $SSRSProxy.GetType().Namespace
$datatype = ($type + '.Property')
$Property = New-Object ($datatype)
$Property.Name = "MimeType"
$Property.Value = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
$SourceDirectory = "c:\tmp\SSRS\"
$ItemType = "Resource" # Resource works, but it gets put in the Resources group; I want it in the Excel Workbooks group.
$ReportFolder = "/PDF_Reports2"
ForEach ($rdlfile in Get-ChildItem $SourceDirectory -Filter "*.xlsx" | Where-Object { $_.Attributes -ne "Directory" })
{
    $byteArray = [System.IO.File]::ReadAllBytes($rdlfile.FullName)
    write-host $rdlfile.FullName
    $Warnings = @()
    $SSRSProxy.CreateCatalogItem($ItemType, $rdlfile, $ReportFolder, $Overwrite, $byteArray, $Property, [ref]$Warnings)
    $Warnings.Length
}
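Not a full answer, but the ReportService2010 endpoint exposes ListItemTypes(), which returns the ItemType strings the server will accept for CreateCatalogItem. A small diagnostic sketch using the $SSRSProxy object from the question:

# List the ItemType values this report server accepts; if "ExcelWorkbook"
# is not in the list, it cannot be passed to CreateCatalogItem.
$SSRSProxy.ListItemTypes()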

Powershell Export-CSV from MySQL database reader fails mid-export

I'm a bit new to PowerShell, and I've got a new requirement to get data out of a MySQL database and into an Oracle one. The strategy I chose was to output to a CSV and then import the CSV into Oracle.
I wanted to get a progress bar for the export from MySQL into CSV, so I used the data reader to achieve this. It works, and begins to export, but somewhere during the export (around record 5,000 of 4.5mil -- not consistent) it will throw an error:
Exception calling "Read" with "0" argument(s): "Fatal error encountered during data read." Exception calling "Close" with "0" argument(s): "Timeout in IO operation" Method invocation failed because [System.Management.Automation.PSObject] does not contain a method named 'op_Addition'. Exception calling "ExecuteReader" with "0" argument(s): "The CommandText property has not been properly initialized."
Applicable code block is below. I'm not sure what I'm doing wrong here, and would appreciate any feedback possible. I've been pulling my hair out on this for days.
Notes: $tableObj is a custom object with a few string fields to hold table name and SQL values. Not showing those SQL queries here, but they work.
Write-Host "[INFO]: Gathering data from MySQL select statement..."
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection
$conn.ConnectionString = $MySQLConnectionString
$conn.Open()
#
# Get Count of records in table
#
$countCmd = New-Object MySql.Data.MySqlClient.MySqlCommand($tableObj.SqlCount, $conn)
$recordCount = 0
try {
    $recordCount = $countCmd.ExecuteScalar()
} Catch {
    Write-Host "[ERROR]: (" $tableObj.Table ") Error getting Count."
    Write-Host "---" $_.Exception.Message
    Exit
}
$recordCountString = $recordCount.ToString('N0')
Write-Host "[INFO]: Count for table '" $tableObj.Table "' is " $recordCountString
#
# Compose the command
#
$cmd = New-Object MySql.Data.MySqlClient.MySqlCommand($tableObj.SqlExportInit, $conn)
#
# Write to CSV using DataReader
#
Write-Host "[INFO]: Data gathered into memory. Writing data to CSV file '" $tableObj.OutFile "'"
$counter = 0 # Tracks items selected
$reader=$cmd.ExecuteReader()
$dataRows = @()
# Read all rows into a hash table
while ($reader.Read())
{
    $counter++
    $percent = ($counter/$recordCount)*100
    $percentString = [math]::Round($percent,3)
    $counterString = $counter.ToString('N0')
    Write-Progress -Activity '[INFO]: CSV Export In Progress' -Status "$percentString% Complete" -CurrentOperation "($($counterString) of $($recordCountString))" -PercentComplete $percent
    $row = @{}
    for ($i = 0; $i -lt $reader.FieldCount; $i++)
    {
        $row[$reader.GetName($i)] = $reader.GetValue($i)
    }
    # Convert hashtable into an array of PSObjects
    $dataRows += New-Object psobject -Property $row
}
$conn.Close()
$dataRows | Export-Csv $tableObj.OutFile -NoTypeInformation
EDIT: It didn't work, but I also added this to my connection string: defaultcommandtimeout=600;connectiontimeout=25, per "MySQL timeout in powershell".
Using @Carl Ardiente's thinking, the query is timing out, and you have to set the timeout to something insane to fully execute. You simply have to set the timeout value for your session before you start getting data.
Write-Host "[INFO]: Gathering data from MySQL select statement..."
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection
$conn.ConnectionString = $MySQLConnectionString
$conn.Open()
# Set timeout on MySql
$cmd = New-Object MySql.Data.MySqlClient.MySqlCommand("set net_write_timeout=99999; set net_read_timeout=99999", $conn)
$cmd.ExecuteNonQuery()
#
# Get Count of records in table
#
...Etc....
Not that I've found the root cause, but none of the connection string changes worked. Manually setting the timeout didn't seem to help either. It seemed to be caused by too many rows being returned, so I broke up the function to run in batches and append to a CSV as it goes. This gets rid of the IO / timeout error.
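A minimal sketch of that batching approach (assuming the $reader and $tableObj objects from the question; Export-Csv -Append requires PowerShell 3.0+). Buffering into a List also avoids the quadratic cost of growing an array with +=:

# Flush every $batchSize rows to the CSV instead of holding all 4.5M
# rows in memory before a single Export-Csv at the end.
$batchSize = 10000
$buffer = New-Object System.Collections.Generic.List[object]
while ($reader.Read()) {
    $row = @{}
    for ($i = 0; $i -lt $reader.FieldCount; $i++) {
        $row[$reader.GetName($i)] = $reader.GetValue($i)
    }
    $buffer.Add([pscustomobject]$row)
    if ($buffer.Count -ge $batchSize) {
        $buffer | Export-Csv $tableObj.OutFile -NoTypeInformation -Append
        $buffer.Clear()
    }
}
# Write whatever is left after the last full batch
if ($buffer.Count -gt 0) {
    $buffer | Export-Csv $tableObj.OutFile -NoTypeInformation -Append
}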

Uploading a CSV File to MySQL Server

I am using a PowerShell script to upload a CSV file to my database. The end goal is to download the most recent CSV file uploaded to our FTP (which I have completed), and then upload this CSV file to our database. These CSV files are always in the same format, and I have created the database to match this format.
I am using a script I found online as a sort of outline to help me, however it still doesn't seem to be working. Below is the script and I am hoping someone can help me figure out either a better way to complete this objective, or what I am doing wrong.
# Database variables
$sqlserver = "****"
$database = "****"
$table = "****"
$user = "****"
$pass = "****!"
# CSV variables
$csvfile = "C:\Users\Lucy\Documents\FTPFiles\vc_report_20171211.csv"
$csvdelimiter = ","
$FirstRowColumnNames = $true
################### No need to modify anything below ###################
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
# 50k worked fastest and kept memory usage to a minimum
$batchsize = 50000
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;User Id=$user;Password=$pass;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable, and autogenerate the columns.
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($FirstRowColumnNames -eq $true) { $null = $reader.readLine() }
foreach ($column in $columns) {
    $null = $datatable.Columns.Add()
}
# Read in the data, line by line, not column by column
while (($line = $reader.ReadLine()) -ne $null) {
    $null = $datatable.Rows.Add($line.Split($csvdelimiter))
    # Import and empty the datatable before it starts taking up too much RAM, but
    # after it has enough rows to make the import efficient.
    $i++
    if (($i % $batchsize) -eq 0) {
        $bulkcopy.WriteToServer($datatable)
        Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
        $datatable.Clear()
    }
}
# Add in all the remaining rows since the last clear
if ($datatable.Rows.Count -gt 0) {
    $bulkcopy.WriteToServer($datatable)
    $datatable.Clear()
}
# Clean Up
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
Write-Host "Script complete. $i rows have been inserted into the database."
Write-Host "Total Elapsed Time: $($elapsed.Elapsed.ToString())"
$i = 0;
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
Our database is hosted by SiteGround, so I need to connect to it remotely. At first I was getting the following error:
Exception calling "WriteToServer" with "1" argument(s): "A network-related or
instance-specific error occurred while establishing a connection to SQL Server.
The server was not found or was not accessible. Verify that the instance name is
correct and that SQL Server is configured to allow remote connections. (provider:
Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)"
At C:\Users\Lucy\Documents\FTPFiles\upload.ps1:61 char:5
+ $bulkcopy.WriteToServer($datatable)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
Then I added the port 3306 to the end of the server name: $sqlserver = "servername,3306". When I did that, I now get the error:
Exception calling "WriteToServer" with "1" argument(s): "A network-related or
instance-specific error occurred while establishing a connection to SQL Server.
The server was not found or was not accessible. Verify that the instance name is
correct and that SQL Server is configured to allow remote connections. (provider:
Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)"
At C:\Users\Lucy\Documents\FTPFiles\upload.ps1:63 char:5
+ $bulkcopy.WriteToServer($datatable)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
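A hedged observation: port 3306 is MySQL's port, and SqlBulkCopy (System.Data.SqlClient) only speaks the SQL Server TDS protocol, which would explain why no connection can be established. A sketch of the same bulk load using MySQL Connector/NET instead (the DLL path and version are assumptions; the server must permit LOAD DATA LOCAL INFILE):

# Load the MySQL ADO.NET connector (adjust the path to your installed version)
Add-Type -Path "C:\Program Files (x86)\MySQL\MySQL Connector Net 6.9.9\Assemblies\v4.5\MySql.Data.dll"
# Use the plain host name here; the ",3306" suffix is SQL Server syntax
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection("server=$sqlserver;port=3306;uid=$user;pwd=$pass;database=$database")
$conn.Open()
# MySqlBulkLoader wraps LOAD DATA [LOCAL] INFILE, MySQL's bulk-import path
$loader = New-Object MySql.Data.MySqlClient.MySqlBulkLoader($conn)
$loader.TableName = $table
$loader.FileName = $csvfile
$loader.FieldTerminator = $csvdelimiter
$loader.FieldQuotationCharacter = '"'
$loader.NumberOfLinesToSkip = 1   # skip the CSV header row
$loader.Local = $true             # the file lives on the client, not the server
$rows = $loader.Load()
Write-Host "$rows rows inserted."
$conn.Close()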

Getting error while backing up SQL database using powershell

I am trying to back up a SQL database, but a strange thing is taking place: a few databases I can back up with PowerShell, but for a few I am getting an error:
Exception calling "SqlBackup" with "1" argument(s): "Backup failed for Server
'Server1'. "
At C:\_Scripts\defaultbackup.ps1:40 char:1
+ $smoBackup.SqlBackup($server)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : FailedOperationException
$dbToBackup = "test"
#clear screen
cls
#load assemblies
#note need to load SqlServer.SmoExtended to use SMO backup in SQL Server 2008
#otherwise may get this error
#Cannot find type [Microsoft.SqlServer.Management.Smo.Backup]: make sure
#the assembly containing this type is loaded.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
#create a new server object
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)"
$backupDirectory = $server.Settings.BackupDirectory
"Default Backup Directory: " + $backupDirectory
$db = $server.Databases[$dbToBackup]
$dbName = $db.Name
$timestamp = Get-Date -format yyyyMMddHHmmss
$smoBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
#BackupActionType specifies the type of backup.
#Options are Database, Files, Log
#This belongs in Microsoft.SqlServer.SmoExtended assembly
$smoBackup.Action = "Database"
$smoBackup.BackupSetDescription = "Full Backup of " + $dbName
$smoBackup.BackupSetName = $dbName + " Backup"
$smoBackup.Database = $dbName
$smoBackup.MediaDescription = "Disk"
$smoBackup.Devices.AddDevice($backupDirectory + "\" + $dbName + "_" + $timestamp + ".bak", "File")
$smoBackup.SqlBackup($server)
#let's confirm: list all backup files
$directory = Get-ChildItem $backupDirectory
#list only files that end in .bak, assuming this is your convention for all backup files
$backupFilesList = $directory | where {$_.extension -eq ".bak"}
$backupFilesList | Format-Table Name, LastWriteTime
Replace $server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)" with:
$srv = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)"
$conContext = $srv.ConnectionContext
$conContext.LoginSecure = $True
$conContext.ConnectTimeout = 0
$server = new-object Microsoft.SqlServer.Management.Smo.Server($conContext)
Since you can take a backup of some databases while a few are failing, the script itself seems fine. Why not add exception handling to trap the exact error?
You may want to modify your code like below:
Try
{
    $smoBackup.SqlBackup($server)
}
Catch
{
    Write-Output $_.Exception.InnerException
}
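The InnerException can itself be nested; a slightly fuller variant (a sketch, using the same $smoBackup and $server as above) walks the whole chain so the underlying SQL error text surfaces:

Try
{
    $smoBackup.SqlBackup($server)
}
Catch
{
    # SMO often wraps the real error several levels deep; print every message
    $ex = $_.Exception
    while ($ex)
    {
        Write-Output $ex.Message
        $ex = $ex.InnerException
    }
}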
I know this is an old thread, but it was never fully answered. It did, though, help me resolve my problem (which is on a later version, 14.0.1000.169, of SQL Server).
My database backup is done via a PowerShell script on the Task Scheduler. It would randomly fail. The same happened when stepping through in the PowerShell ISE; it failed with the above error...
Exception calling "SqlBackup" with "1" argument(s): "Backup failed for Server
'ABC'."
When backed up manually through 'Tasks' in SSMS (v18), it works every time.
In reading through the properties of the ServerConnection, the StatementTimeout property defaults to 600 seconds (10 minutes). My backup script, once the DB backup has completed, zips the backup and then FTPs the zip to another server. Looking at the timestamps on the zips, they were roughly 10 minutes from the backup initiating.
So I added the StatementTimeout line to my code as follows, doubling the timeout to 20 minutes:
$mySrvConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$mySrvConn.ServerInstance = $serverName
$mySrvConn.LoginSecure = $false
$mySrvConn.Login = $userName
$mySrvConn.Password = $userPwd
$mySrvConn.StatementTimeout = 1200
$server = new-object Microsoft.SqlServer.Management.SMO.Server($mySrvConn)
This has resolved the problem. Some backups were completing within 10 minutes; others must not have quite completed.

Changing NTFS security on user with fullcontrol to modify

I have thousands of folders where I need to change users with FullControl access to Modify access. The following is a list of what I have:
A script that changes NTFS perms:
$acl = Get-Acl "G:\Folder"
$acl | Format-List
$acl.GetAccessRules($true, $true, [System.Security.Principal.NTAccount])
#second $true on following line turns on inheritance, $False turns off
$acl.SetAccessRuleProtection($True, $True)
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("Administrators","FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("My-ServerTeam","FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("Users","Read", "ContainerInherit, ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl "G:\Folder" $acl
Get-Acl "G:\Folder" | Format-List
A text file with the directories and users that need to be changed from FullControl to Modify.
I can always create a variable for the path and/or username and create a ForEach loop, but I'm not sure how to change the users that exist in the ACL of each folder to Modify while keeping the Admin accounts at FullControl. Any help would be appreciated.
Went another route and got what I needed. I'm not surprised no one tried to help me on this one... it was tough. I'll post the scripts for the next person who has this issue.
There are two scripts. The first I obtained from the internet and altered a bit. The second script launches the first with the parameters required to automate it.
First Script Named SetFolderPermission.ps1:
param ([string]$Path, [string]$Access, [string]$Permission = ("Modify"), [switch]$help)
function GetHelp() {
    $HelpText = @"
DESCRIPTION:
NAME: SetFolderPermission.ps1
Sets FolderPermissions for User on a Folder.
Creates folder if not exist.
PARAMETERS:
-Path Folder to Create or Modify (Required)
-Access User who should have access (Required)
-Permission Specify Permission for User, Default set to Modify (Optional)
-help Prints the HelpFile (Optional)
SYNTAX:
./SetFolderPermission.ps1 -Path C:\Folder\NewFolder -Access Domain\UserName -Permission FullControl
Creates the folder C:\Folder\NewFolder if it doesn't exist.
Sets Full Control for Domain\UserName
./SetFolderPermission.ps1 -Path C:\Folder\NewFolder -Access Domain\UserName
Creates the folder C:\Folder\NewFolder if it doesn't exist.
Sets Modify (Default Value) for Domain\UserName
./SetFolderPermission.ps1 -help
Displays the help topic for the script
Below Are Available Values for -Permission
"@
    $HelpText
    [system.enum]::getnames([System.Security.AccessControl.FileSystemRights])
}
<#
function CreateFolder ([string]$Path) {
    # Check if the folder Exists
    if (Test-Path $Path) {
        Write-Host "Folder: $Path Already Exists" -ForeGroundColor Yellow
    } else {
        Write-Host "Creating $Path" -Foregroundcolor Green
        New-Item -Path $Path -type directory | Out-Null
    }
}
#>
function SetAcl ([string]$Path, [string]$Access, [string]$Permission) {
    # Get ACL on Folder
    $GetACL = Get-Acl $Path
    # Set up AccessRule
    $Allinherit = [system.security.accesscontrol.InheritanceFlags]"ContainerInherit, ObjectInherit"
    $Allpropagation = [system.security.accesscontrol.PropagationFlags]"None"
    $AccessRule = New-Object system.security.AccessControl.FileSystemAccessRule($Access, $Permission, $AllInherit, $Allpropagation, "Allow")
    # Check if Access Already Exists
    if ($GetACL.Access | Where {$_.IdentityReference -eq $Access}) {
        Write-Host "Modifying Permissions For: $Access on directory: $Path" -ForeGroundColor Yellow
        $AccessModification = New-Object system.security.AccessControl.AccessControlModification
        $AccessModification.value__ = 2
        $Modification = $False
        $GetACL.ModifyAccessRule($AccessModification, $AccessRule, [ref]$Modification) | Out-Null
    } else {
        Write-Host "Adding Permission: $Permission For: $Access"
        $GetACL.AddAccessRule($AccessRule)
    }
    Set-Acl -aclobject $GetACL -Path $Path
    Write-Host "Permission: $Permission Set For: $Access on directory: $Path" -ForeGroundColor Green
}
if ($help) { GetHelp }
if ($Access -AND $Permission) {
    SetAcl $Path $Access $Permission
}
The next script calls the first script and adds the needed parameters, reading a CSV that contains two columns: the folders and the usernames that have FullControl.
$path = "C:\Scripts\scandata\TwoColumnCSVwithPathandUserwithFullControl.csv"
$csv = Import-csv -path $path
foreach($line in $csv){
$userN = $line.IdentityReference
$PathN = $line.Path
$dir = "$PathN"
$DomUser = "$userN"
$Perm = "Modify"
$scriptPath = "C:\Scripts\SetFolderPermission.ps1"
$argumentList1 = '-Path'
$argumentList2 = "$dir"
$argumentList3 = '-Access'
$argumentList4 = "$DomUser"
$argumentList5 = '-Permission'
$argumentList6 = "$Perm"
Invoke-Expression "$scriptPath $argumentList1 $argumentList2 $argumentList3 $argumentList4 $argumentList5 $argumentList6"