Uploading a CSV File to MySQL Server

I am using a PowerShell script to upload a CSV file to my database. The end goal is to download the most recent CSV file uploaded to our FTP (which I have completed), and then upload this CSV file to our database. These CSV files are always in the same format, and I have created the database to match this format.
I am using a script I found online as a sort of outline to help me; however, it still doesn't seem to be working. Below is the script, and I am hoping someone can help me figure out either a better way to complete this objective or what I am doing wrong.
# Database variables
$sqlserver = "****"
$database = "****"
$table = "****"
$user = "****"
$pass = "****!"
# CSV variables
$csvfile = "C:\Users\Lucy\Documents\FTPFiles\vc_report_20171211.csv"
$csvdelimiter = ","
$FirstRowColumnNames = $true
################### No need to modify anything below ###################
Write-Host "Script started..."
$elapsed = [System.Diagnostics.Stopwatch]::StartNew()
[void][Reflection.Assembly]::LoadWithPartialName("System.Data")
[void][Reflection.Assembly]::LoadWithPartialName("System.Data.SqlClient")
# 50k worked fastest and kept memory usage to a minimum
$batchsize = 50000
# Build the sqlbulkcopy connection, and set the timeout to infinite
$connectionstring = "Data Source=$sqlserver;User Id=$user;Password=$pass;Initial Catalog=$database;"
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring, [System.Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkcopy.DestinationTableName = $table
$bulkcopy.bulkcopyTimeout = 0
$bulkcopy.batchsize = $batchsize
# Create the datatable, and autogenerate the columns.
$datatable = New-Object System.Data.DataTable
# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
if ($FirstRowColumnNames -eq $true) { $null = $reader.readLine() }
foreach ($column in $columns) {
$null = $datatable.Columns.Add()
}
# Read in the data, line by line, not column by column
while (($line = $reader.ReadLine()) -ne $null) {
$null = $datatable.Rows.Add($line.Split($csvdelimiter))
# Import and empty the datatable before it starts taking up too much RAM, but
# after it has enough rows to make the import efficient.
$i++;
if (($i % $batchsize) -eq 0) {
$bulkcopy.WriteToServer($datatable)
Write-Host "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
$datatable.Clear()
}
}
# Add in all the remaining rows since the last clear
if ($datatable.Rows.Count -gt 0) {
$bulkcopy.WriteToServer($datatable)
$datatable.Clear()
}
# Clean Up
$reader.Close(); $reader.Dispose()
$bulkcopy.Close(); $bulkcopy.Dispose()
$datatable.Dispose()
Write-Host "Script complete. $i rows have been inserted into the database."
Write-Host "Total Elapsed Time: $($elapsed.Elapsed.ToString())"
$i = 0;
# Sometimes the Garbage Collector takes too long to clear the huge datatable.
[System.GC]::Collect()
Our database is hosted by SiteGround, so I need to connect to it remotely. At first I was getting the following error:
Exception calling "WriteToServer" with "1" argument(s): "A network-related or
instance-specific error occurred while establishing a connection to SQL Server.
The server was not found or was not accessible. Verify that the instance name is
correct and that SQL Server is configured to allow remote connections. (provider:
Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)"
At C:\Users\Lucy\Documents\FTPFiles\upload.ps1:61 char:5
+ $bulkcopy.WriteToServer($datatable)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
Then I added the port 3306 to the end: $sqlserver = "servername,3306". When I did that, I now get the error:
Exception calling "WriteToServer" with "1" argument(s): "A network-related or
instance-specific error occurred while establishing a connection to SQL Server.
The server was not found or was not accessible. Verify that the instance name is
correct and that SQL Server is configured to allow remote connections. (provider:
Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)"
At C:\Users\Lucy\Documents\FTPFiles\upload.ps1:63 char:5
+ $bulkcopy.WriteToServer($datatable)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
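Note that System.Data.SqlClient and SqlBulkCopy speak only to Microsoft SQL Server; they cannot reach a MySQL database (which is what SiteGround hosts, hence port 3306), so no connection-string change will fix this script as written. Below is a minimal sketch of the same bulk load against MySQL, assuming MySQL Connector/NET (MySql.Data.dll) is installed and the CSV columns match the table; the assembly path is an assumption for your machine. If the connection still fails, remote access for your IP may also need to be enabled on the host's side.
# Load MySQL Connector/NET (adjust the path/version to your install - this one is an assumption)
Add-Type -Path "C:\Program Files (x86)\MySQL\MySQL Connector Net 6.9.9\Assemblies\v4.5\MySql.Data.dll"
# Newer Connector/NET versions may also need AllowLoadLocalInfile=true in the connection string
$connectionstring = "Server=$sqlserver;Port=3306;Database=$database;Uid=$user;Pwd=$pass;"
$connection = New-Object MySql.Data.MySqlClient.MySqlConnection($connectionstring)
$connection.Open()
# MySqlBulkLoader wraps MySQL's LOAD DATA [LOCAL] INFILE
$loader = New-Object MySql.Data.MySqlClient.MySqlBulkLoader($connection)
$loader.TableName = $table
$loader.FileName = $csvfile
$loader.FieldTerminator = $csvdelimiter
$loader.LineTerminator = "`n"
$loader.NumberOfLinesToSkip = 1   # skip the CSV header row
$loader.Local = $true             # the file lives on the client, not the server
$rows = $loader.Load()            # returns the number of rows inserted
Write-Host "$rows rows have been inserted into the database."
$connection.Close()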

Related

Powershell Export-CSV from MySQL database reader fails mid-export

I'm a bit new to PowerShell, and I've got a new requirement to get data out of a MySQL database and into an Oracle one. The strategy I chose was to output to a CSV and then import the CSV into Oracle.
I wanted to get a progress bar for the export from MySQL into CSV, so I used the data reader to achieve this. It works, and begins to export, but somewhere during the export (around record 5,000 of 4.5mil -- not consistent) it will throw an error:
Exception calling "Read" with "0" argument(s): "Fatal error encountered during data read." Exception calling "Close" with "0" argument(s): "Timeout in IO operation" Method invocation failed because [System.Management.Automation.PSObject] does not contain a method named 'op_Addition'. Exception calling "ExecuteReader" with "0" argument(s): "The CommandText property has not been properly initialized."
Applicable code block is below. I'm not sure what I'm doing wrong here, and would appreciate any feedback possible. I've been pulling my hair out on this for days.
Notes: $tableObj is a custom object with a few string fields to hold table name and SQL values. Not showing those SQL queries here, but they work.
Write-Host "[INFO]: Gathering data from MySQL select statement..."
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection
$conn.ConnectionString = $MySQLConnectionString
$conn.Open()
#
# Get Count of records in table
#
$countCmd = New-Object MySql.Data.MySqlClient.MySqlCommand($tableObj.SqlCount, $conn)
$recordCount = 0
try{
$recordCount = $countCmd.ExecuteScalar()
} Catch {
Write-Host "[ERROR]: (" $tableObj.Table ") Error getting Count."
Write-Host "---" $_.Exception.Message
Exit
}
$recordCountString = $recordCount.ToString('N0')
Write-Host "[INFO]: Count for table '" $tableObj.Table "' is " $recordCountString
#
# Compose the command
#
$cmd = New-Object MySql.Data.MySqlClient.MySqlCommand($tableObj.SqlExportInit, $conn)
#
# Write to CSV using DataReader
#
Write-Host "[INFO]: Data gathered into memory. Writing data to CSV file '" $tableObj.OutFile "'"
$counter = 0 # Tracks items selected
$reader=$cmd.ExecuteReader()
$dataRows = @()
# Read all rows into a hash table
while ($reader.Read())
{
$counter++
$percent = ($counter/$recordCount)*100
$percentString = [math]::Round($percent,3)
$counterString = $counter.ToString('N0')
Write-Progress -Activity '[INFO]: CSV Export In Progress' -Status "$percentString% Complete" -CurrentOperation "($($counterString) of $($recordCountString))" -PercentComplete $percent
$row = @{}
for ($i = 0; $i -lt $reader.FieldCount; $i++)
{
$row[$reader.GetName($i)] = $reader.GetValue($i)
}
# Convert hashtable into an array of PSObjects
$dataRows += New-Object psobject -Property $row
}
$conn.Close()
$dataRows | Export-Csv $tableObj.OutFile -NoTypeInformation
EDIT: It didn't work, but I also added defaultcommandtimeout=600;connectiontimeout=25 to my connection string, per MySQL timeout in powershell.
Using @Carl Ardiente's thinking, the query is timing out, and you have to set the timeout to something insane to fully execute. You simply have to set the timeout value for your session before you start getting data.
Write-Host "[INFO]: Gathering data from MySQL select statement..."
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection
$conn.ConnectionString = $MySQLConnectionString
$conn.Open()
# Set timeout on MySql
$cmd = New-Object MySql.Data.MySqlClient.MySqlCommand("set net_write_timeout=99999; set net_read_timeout=99999", $conn)
$cmd.ExecuteNonQuery()
#
# Get Count of records in table
#
...Etc....
Not that I've found the solution, but none of the connection-string changes worked. Manually setting the timeout didn't seem to help either. It seemed to be caused by too many rows returned, so I broke the function up to run in batches, appending to the CSV as it goes. This gets rid of the IO / timeout error.
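For reference, here is a sketch of that batching approach (the batch size is illustrative): collect rows into a List, flush each full batch to the CSV with Export-Csv -Append (PowerShell 3.0+, and it creates the file if missing), then clear the list. Memory stays flat, and dropping the += on a plain array also removes the likely source of the op_Addition error.
$batchSize = 10000
$batch = New-Object System.Collections.Generic.List[object]
while ($reader.Read()) {
    $row = @{}
    for ($i = 0; $i -lt $reader.FieldCount; $i++) {
        $row[$reader.GetName($i)] = $reader.GetValue($i)
    }
    $batch.Add([pscustomobject]$row)
    if ($batch.Count -ge $batchSize) {
        # Flush the batch to disk and release the memory it held
        $batch | Export-Csv $tableObj.OutFile -NoTypeInformation -Append
        $batch.Clear()
    }
}
# Write whatever is left over after the last full batch
if ($batch.Count -gt 0) {
    $batch | Export-Csv $tableObj.OutFile -NoTypeInformation -Append
}
$conn.Close()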

Getting error while backing up SQL database using powershell

I am trying to back up SQL databases, but a strange thing is taking place: a few databases I can back up with PowerShell, but for a few I am getting the error
Exception calling "SqlBackup" with "1" argument(s): "Backup failed for Server
'Server1'. "
At C:\_Scripts\defaultbackup.ps1:40 char:1
+ $smoBackup.SqlBackup($server)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : FailedOperationException
$dbToBackup = "test"
#clear screen
cls
#load assemblies
#note need to load SqlServer.SmoExtended to use SMO backup in SQL Server 2008
#otherwise may get this error
#Cannot find type [Microsoft.SqlServer.Management.Smo.Backup]: make sure
#the assembly containing this type is loaded.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") |
Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo")
| Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-
Null
#create a new server object
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)"
$backupDirectory = $server.Settings.BackupDirectory
"Default Backup Directory: " + $backupDirectory
$db = $server.Databases[$dbToBackup]
$dbName = $db.Name
$timestamp = Get-Date -format yyyyMMddHHmmss
$smoBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
#BackupActionType specifies the type of backup.
#Options are Database, Files, Log
#This belongs in Microsoft.SqlServer.SmoExtended assembly
$smoBackup.Action = "Database"
$smoBackup.BackupSetDescription = "Full Backup of " + $dbName
$smoBackup.BackupSetName = $dbName + " Backup"
$smoBackup.Database = $dbName
$smoBackup.MediaDescription = "Disk"
$smoBackup.Devices.AddDevice($backupDirectory + "\" + $dbName + "_" + $timestamp + ".bak", "File")
$smoBackup.SqlBackup($server)
#let's confirm: list all backup files
$directory = Get-ChildItem $backupDirectory
#list only files that end in .bak, assuming this is your convention for all backup files
$backupFilesList = $directory | where {$_.extension -eq ".bak"}
$backupFilesList | Format-Table Name, LastWriteTime
Replace $server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)" with
$srv = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "(local)"
$conContext = $srv.ConnectionContext
$conContext.LoginSecure = $True
$conContext.ConnectTimeout = 0
$server = new-object Microsoft.SqlServer.Management.Smo.Server($conContext)
Since you can take a backup of a few databases while a few are failing, it seems like the script is fine. Why not add exception handling to trap the exact error?
You may want to modify your code like below:
Try
{
    $smoBackup.SqlBackup($server)
}
Catch
{
    Write-Output $_.Exception.InnerException
}
I know this is an old thread, and not fully answered, but it did help me resolve my problem (which is on a later version, 14.0.1000.169, of SQL Server).
My database backup is done via a PowerShell script on the task scheduler. It would randomly fail; the same happened when stepping through in the PowerShell ISE, where it failed with the above error...
Exception calling "SqlBackup" with "1" argument(s): "Backup failed for Server
'ABC'."
When backed up manually through 'Tasks' in SSMS (v18), it works every time.
In reading through the properties of the ServerConnection, the StatementTimeout property defaults to 600 seconds (10 minutes). My backup script, once the DB backup has completed, zips the backup and then FTPs the zip to another server. Looking at the timestamps on the zips, they were roughly 10 minutes from the backup initiating.
So I added the StatementTimeout line to my code as follows, doubling the timeout to 20 minutes:
$mySrvConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$mySrvConn.ServerInstance = $serverName
$mySrvConn.LoginSecure = $false
$mySrvConn.Login = $userName
$mySrvConn.Password = $userPwd
$mySrvConn.StatementTimeout = 1200
$server = new-object Microsoft.SqlServer.Management.SMO.Server($mySrvConn)
This has resolved the problem. Some backups were completing within 10 minutes, others must have not quite completed.

Powershell ftp download failed

I'm having issues downloading files from an FTP server in PowerShell. This script tries to set up the connection, search for some files (I got this part right), and then download them to the working directory. Something is going wrong and I don't know why; please help!
Here's the code:
#IP address of DNS of the target % protocol
$protocol="ftp"
$target = "XXXX"
$connectionString = $protocol+"://"+$target
#Method to connect
$Request = [System.Net.WebRequest]::Create($connectionString)
$Request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
# Set Credentials "username",password
$username = "XXXXXXX"
$password = "XXXXXX"
# Read Username/password
$Request.Credentials = New-Object System.Net.NetworkCredential $username,$password
$Response = $Request.GetResponse()
$ResponseStream = $Response.GetResponseStream()
# Select Pattern to search
$pattern = "CCS"
# Set directory for download Files
$directory = [IO.Directory]::GetCurrentDirectory()
# Read and display the text in the file
$Reader = new-object System.Io.StreamReader $Responsestream
$files = ($Reader.ReadToEnd()) -split "`n" | Select-String "$pattern" | foreach { $_.ToString().Split(" ")[28] }
$uri = (New-Object System.Uri($connectionString+"/"+$file))
$download = New-Object System.Net.WebRequestMethods+Ftp
foreach ($file in $files) {
$destinationFile = $directory+"\"+$file
$sourceFile = $uri.OriginalString
$download.DownloadFile($sourceFile, $destinationFile)
}
# Close Reader and Response objects
$Reader.Close()
$Response.Close()
When I run it I got this output:
Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
At C:\CRIF\BatchScripts\FTPCHECK\01.FTP_Check.ps1:44 char:5
+ $download.DownloadFile($sourceFile, $destinationFile)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : WebException
I'm running this on Powershell 3.0 (Windows Server 2012). Please help!
Details about the problem are hidden in the inner exception of this generic exception. You should dig a bit deeper into the error to find out what the real problem is.
Since PowerShell errors are stored in $error, you could, immediately after getting the error, run the following command to check out the inner exception of the last error:
$error[0].Exception.InnerException
To get the most out of error messages, you could use functions people have written, such as Resolve-Error.
If you want your script in this case to always display a better error message, you could use a try catch block to catch the error and display it better. Something like this:
try {
$download.DownloadFile($sourceFile, $destinationFile)
}
catch [System.Net.WebException] {
if ($_.Exception.InnerException) {
Write-Error $_.Exception.InnerException.Message
} else {
Write-Error $_.Exception.Message
}
}
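For what it's worth, the "WebClient request" wording in that exception suggests $download should be a System.Net.WebClient (WebRequestMethods+Ftp has no DownloadFile method), and $uri needs to be rebuilt for each $file inside the loop rather than once before it. A sketch under those assumptions:
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential($username, $password)
foreach ($file in $files) {
    $sourceFile = $connectionString + "/" + $file   # ftp://host/filename, rebuilt per file
    $destinationFile = Join-Path $directory $file
    # DownloadFile throws a WebException on bad paths/credentials, so keep the try/catch above
    $client.DownloadFile($sourceFile, $destinationFile)
}
$client.Dispose()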

Cannot convert value of type "Microsoft.SqlServer.Management.Smo.Server" to type "Microsoft.SqlServer.Management.Smo.Server"

I'm trying to use SMO to restore a database via Powershell, however when I try to define and use a server object it gives me the following error:
Cannot convert argument "srv", with value: "[MJNHNX4]", for "SqlRestore" to type "Microsoft.SqlServer.Management.Smo.Server": "Cannot convert the "[MJNHNX4]" value of type
"Microsoft.SqlServer.Management.Smo.Server" to type "Microsoft.SqlServer.Management.Smo.Server"."
At line:38 char:1
+ $smoRestore.SqlRestore($server)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodArgumentConversionInvalidCastArgument
Here is my code in full (there's not a lot to it):
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
#clear screen
cls
#get backup file
$backupFile = "D:\databases\Perfmon.bak"
$servername = "MJNHNX4"
$server = New-Object Microsoft.SqlServer.Management.Smo.Server($servername)
$backupDevice = New-Object ("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($backupFile, "File")
$smoRestore = new-object("Microsoft.SqlServer.Management.Smo.Restore")
#settings for restore
$smoRestore.NoRecovery = $false;
$smoRestore.ReplaceDatabase = $true;
$smoRestore.Action = "Database"
#show every 10% progress
$smoRestore.PercentCompleteNotification = 10;
$smoRestore.Devices.Add($backupDevice)
#read db name from the backup file's backup header
$smoRestoreDetails = $smoRestore.ReadBackupHeader($server)
#display database name
"Database Name from Backup Header : " + $smoRestoreDetails.Rows[0]["DatabaseName"]
$smoRestore.Database = $smoRestoreDetails.Rows[0]["DatabaseName"]
#restore
$smoRestore.SqlRestore($server)
"Done"
The error occurs regardless of what I try to pass to Microsoft.SqlServer.Management.Smo.Server and I'm really not sure why it would be giving me that particular error. I've read through the TechNet articles on the Server Constructor and I really have no idea what's going on there. Any ideas?
After some further testing and reading about the SMO syntax, it looks like I don't need to define the $server as a new object. Instead, just passing the name of the server to $server works fine.
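In other words, a sketch of that change (PowerShell coerces a plain server-name string into a fresh Smo.Server, via its name constructor, when the SMO methods expect a Server object):
$server = "MJNHNX4"   # plain string instead of New-Object Microsoft.SqlServer.Management.Smo.Server
$smoRestoreDetails = $smoRestore.ReadBackupHeader($server)
$smoRestore.SqlRestore($server)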

SMO restore of SQL database doesn't overwrite

I'm trying to restore a database from a backup file using SMO. If the database does not already exist then it works fine. However, if the database already exists then I get no errors, but the database is not overwritten.
The "restore" process still takes just as long, so it looks like it's working and doing a restore, but in the end the database has not changed.
I'm doing this in Powershell using SMO. The code is a bit long, but I've included it below. You'll notice that I do set $restore.ReplaceDatabase = $true. Also, I use a try-catch block and report on any errors (I hope), but none are returned.
Any obvious mistakes? Is it possible that I'm not reporting some error and it's being hidden from me?
Thanks for any help or advice that you can give!
function Invoke-SqlRestore {
param(
[string]$backup_file_name,
[string]$server_name,
[string]$database_name,
[switch]$norecovery=$false
)
# Get a new connection to the server
[Microsoft.SqlServer.Management.Smo.Server]$server = New-SMOconnection -server_name $server_name
Write-Host "Starting restore to $database_name on $server_name."
Try {
$backup_device = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($backup_file_name, "File")
# Get local paths to the Database and Log file locations
If ($server.Settings.DefaultFile.Length -eq 0) {$database_path = $server.Information.MasterDBPath }
Else { $database_path = $server.Settings.DefaultFile}
If ($server.Settings.DefaultLog.Length -eq 0 ) {$database_log_path = $server.Information.MasterDBLogPath }
Else { $database_log_path = $server.Settings.DefaultLog}
# Load up the Restore object settings
$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Action = 'Database'
$restore.Database = $database_name
$restore.ReplaceDatabase = $true
if ($norecovery.IsPresent) { $restore.NoRecovery = $true }
Else { $restore.Norecovery = $false }
$restore.Devices.Add($backup_device)
# Get information from the backup file
$restore_details = $restore.ReadBackupHeader($server)
$data_files = $restore.ReadFileList($server)
# Restore all backup files
ForEach ($data_row in $data_files) {
$logical_name = $data_row.LogicalName
$physical_name = Get-FileName -path $data_row.PhysicalName
$restore_data = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$restore_data.LogicalFileName = $logical_name
if ($data_row.Type -eq "D") {
# Restore Data file
$restore_data.PhysicalFileName = $database_path + "\" + $physical_name
}
Else {
# Restore Log file
$restore_data.PhysicalFileName = $database_log_path + "\" + $physical_name
}
[Void]$restore.RelocateFiles.Add($restore_data)
}
$restore.SqlRestore($server)
# If there are two files, assume the next is a Log
if ($restore_details.Rows.Count -gt 1) {
$restore.Action = [Microsoft.SqlServer.Management.Smo.RestoreActionType]::Log
$restore.FileNumber = 2
$restore.SqlRestore($server)
}
}
Catch {
$ex = $_.Exception
Write-Output $ex.message
$ex = $ex.InnerException
while ($ex.InnerException) {
Write-Output $ex.InnerException.message
$ex = $ex.InnerException
}
Throw $ex
}
Finally {
$server.ConnectionContext.Disconnect()
}
Write-Host "Restore ended without any errors."
}
I'm having the same problem: I'm trying to restore the database from a backup taken from the same server, but with a different name.
I have profiled the restore process, and it doesn't add the WITH MOVE clause for the different file names. This is why it will restore the database when the database doesn't exist, but fail when it does.
There is an issue with the .PhysicalFileName property.
I was doing the SMO restore and was running into errors. The only way I found to diagnose the problem was to run SQL Profiler during the execution of my PowerShell script.
This showed me the actual T-SQL that was being executed. I then copied this into a query and tried to execute it. This showed me the actual errors: in my case, it was that my database had multiple data files that needed to be relocated.
The attached script works for databases that have only one data file.
Param
(
[Parameter(Mandatory=$True)][string]$sqlServerName,
[Parameter(Mandatory=$True)][string]$backupFile,
[Parameter(Mandatory=$True)][string]$newDBName
)
# Load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
# Create sql server object
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $sqlServerName
# Copy database locally if backup file is on a network share
Write-Host "Loaded assemblies"
$backupDirectory = $server.Settings.BackupDirectory
Write-Host "Backup Directory:" $backupDirectory
$fullBackupFile = $backupDirectory + "\" + $backupFile
Write-Host "Copy DB from: " $fullBackupFile
# Create restore object and specify its settings
$smoRestore = new-object("Microsoft.SqlServer.Management.Smo.Restore")
$smoRestore.Database = $newDBName
$smoRestore.NoRecovery = $false;
$smoRestore.ReplaceDatabase = $true;
$smoRestore.Action = "Database"
Write-Host "New Database name:" $newDBName
# Create location to restore from
$backupDevice = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($fullBackupFile, "File")
$smoRestore.Devices.Add($backupDevice)
# Give empty string a nice name
$empty = ""
# Specify new data file (mdf)
$smoRestoreDataFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$defaultData = $server.DefaultFile
if (($defaultData -eq $null) -or ($defaultData -eq $empty))
{
$defaultData = $server.MasterDBPath
}
Write-Host "defaultData:" $defaultData
$smoRestoreDataFile.PhysicalFileName = Join-Path -Path $defaultData -ChildPath ($newDBName + "_Data.mdf")
Write-Host "smoRestoreDataFile.PhysicalFileName:" $smoRestoreDataFile.PhysicalFileName
# Specify new log file (ldf)
$smoRestoreLogFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$defaultLog = $server.DefaultLog
if (($defaultLog -eq $null) -or ($defaultLog -eq $empty))
{
$defaultLog = $server.MasterDBLogPath
}
$smoRestoreLogFile.PhysicalFileName = Join-Path -Path $defaultLog -ChildPath ($newDBName + "_Log.ldf")
Write-Host "smoRestoreLogFile:" $smoRestoreLogFile.PhysicalFileName
# Get the file list from backup file
$dbFileList = $smoRestore.ReadFileList($server)
# The logical file names should be the logical filename stored in the backup media
$smoRestoreDataFile.LogicalFileName = $dbFileList.Select("Type = 'D'")[0].LogicalName
$smoRestoreLogFile.LogicalFileName = $dbFileList.Select("Type = 'L'")[0].LogicalName
# Add the new data and log files to relocate to
$smoRestore.RelocateFiles.Add($smoRestoreDataFile)
$smoRestore.RelocateFiles.Add($smoRestoreLogFile)
# Restore the database
$smoRestore.SqlRestore($server)
"Database restore completed successfully"
Just like if you do this from T-SQL, if there is something using the database, then that'll block the restore. Whenever I'm tasked with restoring a database, I like to take it offline (with rollback immediate) first. That kills any connections to the db. You may have to set it back online first; I don't remember if restore is smart enough to realise that the files that you're overwriting belong to the database you're restoring or not. Hope this helps.
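That "kick everyone off first" step can be scripted through SMO as well; a minimal sketch, assuming $server is the connected Smo.Server object from the function above and $database_name names the target (KillAllProcesses drops every connection to that database):
# Drop all connections to the target database before restoring (skip if it doesn't exist yet)
if ($server.Databases[$database_name]) {
    $server.KillAllProcesses($database_name)
}
$restore.SqlRestore($server)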