Is there a PowerShell command to list all SQL instances on my system? (MS SQL 2008)
Just another way of doing it... this can be a little quicker than SQLPS for getting a quick answer.
(get-itemproperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances
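If you need the same list from a remote machine, a quick variant using PowerShell remoting (a sketch; assumes WinRM is enabled, and SERVER01 is a placeholder):
Invoke-Command -ComputerName SERVER01 -ScriptBlock {
    (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances
}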
I found that (for me at least) none of the above returned my SQL Express instance. I have 5 named instances: 4 full-fat SQL Server and 1 SQL Express. The 4 full-fat instances are picked up by the answers above; the SQL Express one isn't. So I did a little digging around the internet and came across an article by James Kehr which lists information about all SQL Server instances on a machine. I used that code as the basis for the function below.
# get all sql instances, defaults to local machine, '.'
Function Get-SqlInstances {
    Param($ServerName = '.')
    $localInstances = @()
    [array]$captions = gwmi win32_service -computerName $ServerName | ?{$_.Name -match "mssql*" -and $_.PathName -match "sqlservr.exe"} | %{$_.Caption}
    foreach ($caption in $captions) {
        if ($caption -eq "MSSQLSERVER") {
            $localInstances += "MSSQLSERVER"
        } else {
            # named instances are reported as "SQL Server (INSTANCENAME)", so pull the name out of the parentheses
            $temp = $caption | %{$_.split(" ")[-1]} | %{$_.trimStart("(")} | %{$_.trimEnd(")")}
            $localInstances += "$ServerName\$temp"
        }
    }
    $localInstances
}
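Usage looks like this (SERVER01 is a placeholder):
# local machine
Get-SqlInstances
# remote machine
Get-SqlInstances -ServerName 'SERVER01'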
Import the PowerShell SQL Server extensions:
Import-Module SqlServer
Then run these commands:
Set-Location SQLSERVER:\SQL\localhost
Get-ChildItem
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement") | out-null
$mach = '.'
$m = New-Object ('Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer') $mach
$m.ServerInstances
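$m.ServerInstances returns full objects; to show just the instance names, something like this should work:
$m.ServerInstances | Select-Object -ExpandProperty Name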
$a = "MyComputerName"
[System.Data.Sql.SqlDataSourceEnumerator]::Instance.GetDataSources() | ? { $_.servername -eq $a}
Aaron's method returns a more reliable response.
Read here about Instance.GetDataSources()
The System.Data.Sql namespace contains classes that support SQL Server-specific functionality.
By using the System.Data.Sql namespace you can get all MSSQL instances on a machine with this command in Windows PowerShell:
[System.Data.Sql.SqlDataSourceEnumerator]::Instance.GetDataSources()
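The call returns a DataTable whose rows have ServerName, InstanceName, IsClustered and Version columns, so you can format or filter the output like any other objects. For example (note this enumerator relies on the SQL Browser service, so the list may be incomplete):
[System.Data.Sql.SqlDataSourceEnumerator]::Instance.GetDataSources() |
    Format-Table ServerName, InstanceName, Version -AutoSize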
This function returns all the installed instances, with version details, as a list of objects:
function ListSQLInstances {
    $listinstances = New-Object System.Collections.ArrayList
    $installedInstances = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server').InstalledInstances
    foreach ($i in $installedInstances) {
        $instancefullname = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\SQL').$i
        $productversion = (Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$instancefullname\Setup").Version
        # anchor the patterns so e.g. '15.0.2080.9' isn't matched by a bare '8' or '9'
        $majorversion = switch -Regex ($productversion) {
            '^8\.'   { 'SQL2000' }
            '^9\.'   { 'SQL2005' }
            '^10\.0' { 'SQL2008' }
            '^10\.5' { 'SQL2008 R2' }
            '^11\.'  { 'SQL2012' }
            '^12\.'  { 'SQL2014' }
            '^13\.'  { 'SQL2016' }
            '^14\.'  { 'SQL2017' }
            '^15\.'  { 'SQL2019' }
            default  { 'Unknown' }
        }
        $instance = [PSCustomObject]@{
            Instance             = $i
            InstanceNameFullName = $instancefullname
            Edition              = (Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\$instancefullname\Setup").Edition
            ProductVersion       = $productversion
            MajorVersion         = $majorversion
        }
        # Add() returns the new index; cast to [void] so it doesn't pollute the function's output
        [void]$listinstances.Add($instance)
    }
    return $listinstances
}
$instances = ListSQLInstances
foreach ($instance in $instances) {
Write-Host $instance.Instance
}
I have a working script that inserts data like computer name, date, IP address and two other values, which my PowerShell script reads from a txt file.
I don't know why it makes a double insert to the MySQL server, where the first insert is good and the second doesn't have the data from the txt file.
The script I've posted below successfully inserts data into my database, however it seems to insert two separate rows: one with the appropriate data in the End1 and End2 columns and one with incorrect data.
Code to capture and insert data:
$txt = "app_log.txt"
function find_nr {
    # with -ReadCount 1000, each pipeline item is a block of lines and -match filters them
    get-content $txt -ReadCount 1000 |
    foreach { $_ -match "P3#X" }
}
$string = find_nr
$separator = "\;"
function seperate {
    $string.Split($separator,6)
}
function nr_lines {
    $i = 99901; seperate | % {$i++;"$($i-1) `t $_"}
}
function find {
    $line = $args[0] | Select-String -Pattern "99905" -CaseSensitive
    ($line.line.split(' ') | Where-Object {$_.Trim() -ne ''})[1]
}
function liczba {
    $result = nr_lines
    find $result
}
liczba
function find_sn {
    get-content $txt -ReadCount 1000 |
    foreach { $_ -match "P2#X" }
}
$string_sn = find_sn
$separator_sn = "\/"
function seperate_sn {
    $string_sn.Split($separator_sn,20)
}
function sn_lines {
    $i = 55501; seperate_sn | % {$i++;"$($i-1) `t $_"}
}
# note: this redefines find_sn from above (the grep version has already been used to fill $string_sn)
function find_sn {
    $line_sn = $args[0] | Select-String -Pattern "55518" -CaseSensitive
    ($line_sn.line.split(' ') | Where-Object {$_.Trim() -ne ''})[1]
}
function sn {
    $result2 = sn_lines
    find_sn $result2
}
sn
[System.Reflection.Assembly]::LoadWithPartialName("MySql.Data")
$name = $env:COMPUTERNAME
$ipv4 = Test-Connection -ComputerName (hostname) -Count 1 | foreach { $_.ipv4address }
$time = (Get-Date).ToString('yyyy-MM-dd HH:mm:ss')
$end1 = liczba
$end2 = sn
Start-Sleep -s 5
[string]$sMySQLUserName = 'user'
[string]$sMySQLPW = 'pass'
[string]$sMySQLDB = 'db'
[string]$sMySQLHost = '1.0.0.0'
[string]$sConnectionString = "server="+$sMySQLHost+";port=3306;uid=" + $sMySQLUserName + ";pwd=" + $sMySQLPW + ";database="+$sMySQLDB+";SslMode=none"
$oConnection = New-Object MySql.Data.MySqlClient.MySqlConnection($sConnectionString)
$Error.Clear()
try
{
$oConnection.Open()
}
catch
{
write-warning ("DB error: $sMySQLDB na ip: $sMySQLHost. Error: "+$Error[0].ToString())
}
$oMYSQLCommand = New-Object MySql.Data.MySqlClient.MySqlCommand
$oMYSQLCommand.CommandText="
INSERT into `db.scanner` (name,ipv4,date,raports,serialnumber) VALUES('$name','$ipv4','$time','$end1','$end2')"
$oMYSQLCommand.Connection=$oConnection
$oMYSQLCommand.ExecuteNonQuery()
$oConnection.Close()
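Side note: the values above are interpolated directly into the SQL string. A parameterized command avoids quoting surprises; a minimal sketch, assuming the same open $oConnection and the scanner table:
$oMYSQLCommand = New-Object MySql.Data.MySqlClient.MySqlCommand
$oMYSQLCommand.Connection = $oConnection
$oMYSQLCommand.CommandText = "INSERT INTO scanner (name,ipv4,date,raports,serialnumber) VALUES (@name,@ipv4,@date,@end1,@end2)"
# AddWithValue returns the parameter object; cast to [void] to keep the output clean
[void]$oMYSQLCommand.Parameters.AddWithValue("@name", $name)
[void]$oMYSQLCommand.Parameters.AddWithValue("@ipv4", "$ipv4")
[void]$oMYSQLCommand.Parameters.AddWithValue("@date", $time)
[void]$oMYSQLCommand.Parameters.AddWithValue("@end1", $end1)
[void]$oMYSQLCommand.Parameters.AddWithValue("@end2", $end2)
[void]$oMYSQLCommand.ExecuteNonQuery()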
The script runs every third day of the week at user login.
This is how it looks in the MySQL db:
ID NAME IP DATE END1 END2
239 S02C-KASA1 1.0.0.1 2021-09-02 08:15:37 0
238 S02C-KASA1 1.0.0.1 2021-09-02 08:15:37 1476 CAO1802176616FC
From the timestamps it appears the inserts are happening at the same time. Does anyone have an idea what is causing this?
Edit:
This is the app_log.txt data:
Send : P23#sae\
Resp.: ...P2#X0;1;0;1;1;2;20;9;29/23.00/08.00/05.00/00.00/100.00/101.00/101.00/0/0./0./0./0./0./0./0./29930676.49/BDS13287171EF\
Resp.: .
Send : P24#sa9\
Resp.: ..P3#X2015;2;10;1771;59;0;249.00/0.00/0.00/0.00/0.00/0.00/0.00/80\
Resp.: .
find_nr takes this line: P3#X2015;2;10;1771;59;0;249.00/0.00/0.00/0.00/0.00/0.00/0.00/80\
find_sn takes this line:
P2#X0;1;0;1;1;2;20;9;29/23.00/08.00/05.00/00.00/100.00/101.00/101.00/0/0./0./0./0./0./0./0./29930676.49/BDS13287171EF\
function Enforce-MFA($exclude){
    Connect-MsolService
    $excludedUsers = 'admin','admin2','admin3','admin4' + $exclude
    $excluded = ($excludedUsers | ForEach-Object { [regex]::Escape($_) }) -join '|'
    $st = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
    $st.RelyingParty = "*"
    $st.State = "Enforced"
    $sta = @($st)
    $array = (Get-MsolUser | Where-Object { $_.DisplayName -notmatch $excluded }).UserPrincipalName
    ForEach ($user in $array)
    {
        Set-MsolUser -UserPrincipalName $user -StrongAuthenticationRequirements $sta
        Write-Host "Complete"
    }
}
The general idea is to grab a list of objects, exclude certain ones, and enforce MFA for the remaining objects. This script seemed to work without any issue last week, but this week I'm getting no data from the array variable. I was working on a lot of different changes and I'm thinking I may have messed something up in the process, but I'm just not seeing it. What did I mess up, or what am I not seeing?
You forgot to add -Credential $cred to the Connect-MsolService.
You should create that connection first and take it out of the function.
function Enforce-MFA {
    $excludeTheseUsers = 'admin', 'user1', 'user2' # etc.
    # for using the regex `-notmatch` operator later, you need to combine the entries with the regex OR sign ('|'),
    # but you need to make sure to escape special characters some names may contain
    $excludes = ($excludeTheseUsers | ForEach-Object { [regex]::Escape($_) }) -join '|'
    # create the StrongAuthenticationRequirement object just once, to use on all users
    $st = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
    $st.RelyingParty = "*"
    $st.State = "Enabled"
    $sta = @($st)
    # get an array of UserPrincipalNames
    $array = (Get-MsolUser | Where-Object { $_.DisplayName -notmatch $excludes }).UserPrincipalName
    foreach ($user in $array) {
        Set-MsolUser -UserPrincipalName $user -StrongAuthenticationRequirements $sta
    }
    Write-Host "Enforcing MFA Complete"
}
# ask for credentials to make the connection
$cred = Get-Credential -Message 'Please enter your credentials to connect to Azure Active Directory'
Connect-MsolService -Credential $cred
As for your loop, try something like this:
# enter an endless loop
while($true) {
$var = Read-Host -Prompt "Enter the corresponding number: 1: Enforce 2: Enable 3: Disable 4: Exit"
switch($var){
1 { Enforce-MFA }
2 { Enable-MFA }
3 { Disable-MFA }
4 { exit }
default{ "Please choose either 1, 2 ,3 or 4" }
}
}
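Enable-MFA and Disable-MFA are assumed to be defined elsewhere; for completeness, a hedged sketch of what a Disable-MFA counterpart could look like (passing an empty array clears the requirement):
function Disable-MFA {
    # an empty StrongAuthenticationRequirements array removes the MFA requirement
    $array = (Get-MsolUser).UserPrincipalName
    foreach ($user in $array) {
        Set-MsolUser -UserPrincipalName $user -StrongAuthenticationRequirements @()
    }
    Write-Host "Disabling MFA Complete"
}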
I have a long list of simple jobs I would like to somewhat automate. It's simple stuff, grab or post info via API and build some reports, nothing fancy.
I decided to build a master script which directs out to a variety of other scripts, each handling its own job. Each of those little scripts references functions from a Utility script I built, which holds the functions common to all the other simple job scripts.
Each of the scripts works perfectly when I run it directly; however, when I try to run them via the master script, which routes to them, they all fail.
One example: in many cases I need to fetch data from an API but get capped at 1000 returned objects when I need 10k+. To solve this, I built a function which recursively calls itself until there is no more data left to collect. Again, this works when called by itself but not from the master script; for some reason it bails out after the first run (it should run 10+ times in this case) and then returns nothing.
I am thinking maybe this has something to do with how I am scoping the functions/variables? Not sure. I have tried scoping to Global, Local & Script but none seem to work. Here's some of the code...
*Master Director Script runs script based on user input*
...
&$choice_hash[$action].script_path
$ScriptDirectory = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
. "$ScriptDirectory\Utilities.psm1"
$user_data = $null
$env_choice = $null
$csv_output_path = $null
$collated_user_data = [System.Collections.ArrayList]@()
function selectEnv {
$global:env_choice = Read-Host @"
> Select an Environment: [Prod] or [Dev]
Your Choice
"#
if ($env_choice -ne 'Prod' -and $env_choice -ne 'Dev') {
consoleCmt $env_choice
consoleCmt 'Invalid Choice. Try again...'
selectEnv
} else {
if ($env_choice -eq 'Prod') {
$global:csv_output_path = '\\etoprod\******\Exports\Report_Users_Prod.csv'
} else {
$global:csv_output_path = '\\etoprod\******\Exports\Report_Users_Dev.csv'
}
$global:user_data = process_data $env_choice 'api/xm/1/people?embed=roles&limit=1000'
}
}
function processUsersData {
foreach($user in $user_data) {
$user_roles = ''
$role_divider = ','
for($i = 0; $i -lt $user.roles.data.length; $i++) {
# Only append a comma if there are more, otherwise leave blank for CSV delineation
if ($i -eq $user.roles.data.length - 1) {
$role_divider = ''
}
$user_roles += $user.roles.data[$i].name + $role_divider
}
# Build ordered hash table with above data
$sanatized_user = [pscustomobject][ordered]@{id = $user.targetName; firstName = $user.firstName; lastName = $user.lastName; siteName = $user.site.name; roles = $user_roles }
# Shovel into storage array used for building the CSV
$global:collated_user_data += $sanatized_user
}
}
notice 'Initiating Groups Report Script'
selectEnv
processUsersData
exportCsv $collated_user_data $csv_output_path
Utility Script (relevant functions being called)
$res = $null
$content = @()
...
function process_data($env, $url) {
fetch_data $env $url
foreach($i in $res.data) {
$global:content += $i
}
if($res.links.next) {
fetch_more $env $res.links.next
}
return $content # Should return the full collection of data, but fails after one pass
}
function fetch_data($env, $url) {
$base = generateEnvBase $env
$path = "$base/$url"
$req = Invoke-WebRequest -Credential $cred -Uri $path -Method GET
$global:res = ConvertFrom-Json $req
}
function fetch_more($env, $url) {
$base = generateEnvBase $env
$path = "$base$url"
$req = Invoke-WebRequest -Credential $cred -Uri $path -Method GET
$res = ConvertFrom-Json $req
foreach($i in $res.data) {
$global:content += $i
}
if($res.links.next) {
fetch_more $env $res.links.next
}
}
Sorry in advance if I have not followed procedure or etiquette, I'm new here.
This should work if you declare all the variables the functions need in Main.ps1. You could also use the "Script" scope when creating a new variable inside a function that you want to use outside the function. For example, $Script:Var = "Stuff" created inside a function will be available to the whole script.
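A tiny demonstration of that scope behavior:
function Set-Stuff {
    # without the Script: qualifier, this assignment would be lost when the function returns
    $Script:Var = 'Stuff'
}
Set-Stuff
$Var   # prints 'Stuff' at script level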
Directory Structure
C:\Script\Root
| Main.ps1
\---Utilities
fetch_data.ps1
fetch_more.ps1
processUsersData.ps1
process_data.ps1
selectEnv.ps1
Main.ps1
#---[ Initialization ]---#
# Strings
[String]$RootPath = $PSScriptRoot
[String]$UtilPath = "$($RootPath)\Utilities"
[String]$env_choice = $null
[String]$csv_output_path = $null
# Arrays
[Array]$user_data = @()
[Array]$content = @()
[Array]$collated_user_data = @()
[Array]$res = @()
#---[ Source in Utilities ]---#
# Get the scripts
$Utilities = Get-ChildItem -Path "$UtilPath" -File | Where-Object {$_.Extension -eq ".ps1"}
# Source in each one
foreach ($Item in $Utilities) {
. $Item.FullName
}
#---[ Select an Environment ]---#
# Get the User's choice
$env_choice = selectEnv
# Process the choice
switch ($env_choice) {
Prod {
$csv_output_path = '\\etoprod\******\Exports\Report_Users_Prod.csv'
$user_data = process_data 'Prod' 'api/xm/1/people?embed=roles&limit=1000'
}
Dev {
$csv_output_path = '\\etoprod\******\Exports\Report_Users_Dev.csv'
$user_data = process_data 'Dev' 'api/xm/1/people?embed=roles&limit=1000'
}
Test {
Write-Output "Test is not an option. Choose wisely."
exit 1
}
Default {
Write-Output "Unknown Environment Choice."
exit 1
}
}
#---[ Process Users and Export ]---#
processUsersData
exportCsv $collated_user_data $csv_output_path
selectEnv.ps1
function selectEnv {
$Title = "Environment:"
$Info = "Please choose an environment"
# Options
$Prod = New-Object System.Management.Automation.Host.ChoiceDescription '&Prod', 'Production environment'
$Dev = New-Object System.Management.Automation.Host.ChoiceDescription '&Dev', 'Development environment'
$Test = New-Object System.Management.Automation.Host.ChoiceDescription '&Test', 'Testing environment'
$Options = [System.Management.Automation.Host.ChoiceDescription[]]($Prod, $Dev, $Test)
$Default = 0
# Prompt the user
$Choice = $host.UI.PromptForChoice($Title , $Info , $Options, $Default)
$Result = $Options[$Choice].Label -Replace '&',''
return $Result
}
I'm trying to restore a database from a backup file using SMO. If the database does not already exist then it works fine. However, if the database already exists then I get no errors, but the database is not overwritten.
The "restore" process still takes just as long, so it looks like it's working and doing a restore, but in the end the database has not changed.
I'm doing this in PowerShell using SMO. The code is a bit long, but I've included it below. You'll notice that I do set $restore.ReplaceDatabase = $true. Also, I use a try-catch block and report on any errors (I hope), but none are returned.
Any obvious mistakes? Is it possible that I'm not reporting some error and it's being hidden from me?
Thanks for any help or advice that you can give!
function Invoke-SqlRestore {
param(
[string]$backup_file_name,
[string]$server_name,
[string]$database_name,
[switch]$norecovery=$false
)
# Get a new connection to the server
[Microsoft.SqlServer.Management.Smo.Server]$server = New-SMOconnection -server_name $server_name
Write-Host "Starting restore to $database_name on $server_name."
Try {
$backup_device = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($backup_file_name, "File")
# Get local paths to the Database and Log file locations
If ($server.Settings.DefaultFile.Length -eq 0) {$database_path = $server.Information.MasterDBPath }
Else { $database_path = $server.Settings.DefaultFile}
If ($server.Settings.DefaultLog.Length -eq 0 ) {$database_log_path = $server.Information.MasterDBLogPath }
Else { $database_log_path = $server.Settings.DefaultLog}
# Load up the Restore object settings
$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Action = 'Database'
$restore.Database = $database_name
$restore.ReplaceDatabase = $true
if ($norecovery.IsPresent) { $restore.NoRecovery = $true }
Else { $restore.Norecovery = $false }
$restore.Devices.Add($backup_device)
# Get information from the backup file
$restore_details = $restore.ReadBackupHeader($server)
$data_files = $restore.ReadFileList($server)
# Restore all backup files
ForEach ($data_row in $data_files) {
$logical_name = $data_row.LogicalName
$physical_name = Get-FileName -path $data_row.PhysicalName
$restore_data = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$restore_data.LogicalFileName = $logical_name
if ($data_row.Type -eq "D") {
# Restore Data file
$restore_data.PhysicalFileName = $database_path + "\" + $physical_name
}
Else {
# Restore Log file
$restore_data.PhysicalFileName = $database_log_path + "\" + $physical_name
}
[Void]$restore.RelocateFiles.Add($restore_data)
}
$restore.SqlRestore($server)
# If there are two files, assume the next is a Log
if ($restore_details.Rows.Count -gt 1) {
$restore.Action = [Microsoft.SqlServer.Management.Smo.RestoreActionType]::Log
$restore.FileNumber = 2
$restore.SqlRestore($server)
}
}
Catch {
$ex = $_.Exception
Write-Output $ex.message
$ex = $ex.InnerException
while ($ex.InnerException) {
Write-Output $ex.InnerException.message
$ex = $ex.InnerException
}
Throw $ex
}
Finally {
$server.ConnectionContext.Disconnect()
}
Write-Host "Restore ended without any errors."
}
I'm having the same problem: I'm trying to restore the database from a backup taken from the same server, but with a different name.
I profiled the restore process and it doesn't add the WITH MOVE clause with the different file names. This is why it will restore the database when the database doesn't exist, but fail when it does.
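For reference, the generated T-SQL needs MOVE clauses for a restore-with-rename to work, roughly like this (all names below are placeholders):
Invoke-Sqlcmd -ServerInstance 'SERVER01' -Query @"
RESTORE DATABASE [NewName]
FROM DISK = N'C:\Backups\Source.bak'
WITH MOVE N'Source_Data' TO N'C:\Data\NewName_Data.mdf',
     MOVE N'Source_Log' TO N'C:\Data\NewName_Log.ldf',
     REPLACE
"@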
There is an issue with the .PhysicalFileName property.
I was doing the SMO restore and was running into errors. The only way I found to diagnose the problem was to run SQL Profiler during the execution of my PowerShell script.
This showed me the actual T-SQL that was being executed. I then copied this into a query and tried to execute it, which revealed the actual errors: in my case, my database had multiple data files that needed to be relocated.
The attached script works for databases that have only one data file.
Param
(
[Parameter(Mandatory=$True)][string]$sqlServerName,
[Parameter(Mandatory=$True)][string]$backupFile,
[Parameter(Mandatory=$True)][string]$newDBName
)
# Load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
# Create sql server object
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $sqlServerName
# Copy database locally if backup file is on a network share
Write-Host "Loaded assemblies"
$backupDirectory = $server.Settings.BackupDirectory
Write-Host "Backup Directory:" $backupDirectory
$fullBackupFile = $backupDirectory + "\" + $backupFile
Write-Host "Copy DB from: " $fullBackupFile
# Create restore object and specify its settings
$smoRestore = new-object("Microsoft.SqlServer.Management.Smo.Restore")
$smoRestore.Database = $newDBName
$smoRestore.NoRecovery = $false;
$smoRestore.ReplaceDatabase = $true;
$smoRestore.Action = "Database"
Write-Host "New Database name:" $newDBName
# Create location to restore from
$backupDevice = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($fullBackupFile, "File")
$smoRestore.Devices.Add($backupDevice)
# Give empty string a nice name
$empty = ""
# Specify new data file (mdf)
$smoRestoreDataFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$defaultData = $server.DefaultFile
if (($defaultData -eq $null) -or ($defaultData -eq $empty))
{
$defaultData = $server.MasterDBPath
}
Write-Host "defaultData:" $defaultData
$smoRestoreDataFile.PhysicalFileName = Join-Path -Path $defaultData -ChildPath ($newDBName + "_Data.mdf")
Write-Host "smoRestoreDataFile.PhysicalFileName:" $smoRestoreDataFile.PhysicalFileName
# Specify new log file (ldf)
$smoRestoreLogFile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile")
$defaultLog = $server.DefaultLog
if (($defaultLog -eq $null) -or ($defaultLog -eq $empty))
{
$defaultLog = $server.MasterDBLogPath
}
$smoRestoreLogFile.PhysicalFileName = Join-Path -Path $defaultLog -ChildPath ($newDBName + "_Log.ldf")
Write-Host "smoRestoreLogFile:" $smoRestoreLogFile.PhysicalFileName
# Get the file list from backup file
$dbFileList = $smoRestore.ReadFileList($server)
# The logical file names should be the logical filename stored in the backup media
$smoRestoreDataFile.LogicalFileName = $dbFileList.Select("Type = 'D'")[0].LogicalName
$smoRestoreLogFile.LogicalFileName = $dbFileList.Select("Type = 'L'")[0].LogicalName
# Add the new data and log files to relocate to
$smoRestore.RelocateFiles.Add($smoRestoreDataFile)
$smoRestore.RelocateFiles.Add($smoRestoreLogFile)
# Restore the database
$smoRestore.SqlRestore($server)
"Database restore completed successfully"
Just like when you do this from T-SQL, if something is using the database, that will block the restore. Whenever I'm tasked with restoring a database, I like to take it offline (with ROLLBACK IMMEDIATE) first; that kills any connections to the db. You may have to set it back online before restoring; I don't remember whether RESTORE is smart enough to realise that the files you're overwriting belong to the database you're restoring. Hope this helps.
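A hedged sketch of that idea applied inside the asker's function, just before the restore call ($server and $database_name come from the surrounding code):
# drop any open connections so the restore isn't blocked
$server.KillAllProcesses($database_name)
$restore.SqlRestore($server)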
I have the following PowerShell code:
function Get-SmoConnection
{
param
([string] $serverName = "", [int] $connectionTimeout = 0)
if($serverName.Length -eq 0)
{
$serverConnection = New-Object `
Microsoft.SqlServer.Management.Common.ServerConnection
}
else
{
$serverConnection = New-Object `
Microsoft.SqlServer.Management.Common.ServerConnection($serverName)
}
if($connectionTimeout -ne 0)
{
$serverConnection.ConnectTimeout = $connectionTimeout
}
try
{
$serverConnection.Connect()
$serverConnection
}
catch [system.Management.Automation.MethodInvocationException]
{
$null
}
}
$connection = get-smoconnection "ServerName" 2
if($connection -ne $null)
{
Write-Host $connection.ServerInstance
Write-Host $connection.ConnectTimeout
}
else
{
Write-Host "Connection could not be established"
}
It seems to work, except for the part that attempts to set the SMO connection timeout. If the connection is successful, I can verify that ServerConnection.ConnectTimeout is set to 2 (seconds), but when I supply a bogus name for the SQL Server instance, it still attempts to connect for ~15 seconds (which I believe is the default timeout).
Does anyone have experience with setting SMO connection timeout? Thank you in advance.
I can't seem to reproduce the behavior you are seeing. If I recreate your function as a script rather than a function, the ConnectTimeout property seems to work regardless of whether the server name parameter is bogus or not:
Measure-Command {./get-smoconnection.ps1 'Z03\sq2k8' 2}
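For reference, get-smoconnection.ps1 here is presumably just the function body lifted into a script, roughly:
param([string]$serverName = '', [int]$connectionTimeout = 0)
if ($serverName.Length -eq 0) {
    $serverConnection = New-Object Microsoft.SqlServer.Management.Common.ServerConnection
}
else {
    $serverConnection = New-Object Microsoft.SqlServer.Management.Common.ServerConnection($serverName)
}
if ($connectionTimeout -ne 0) {
    $serverConnection.ConnectTimeout = $connectionTimeout
}
try {
    $serverConnection.Connect()
    $serverConnection
}
catch [System.Management.Automation.MethodInvocationException] {
    $null
}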