I was contemplating creating a tool to load varbinary(max) fields in a SQL Server 2008 database with files selected from an open file dialog. Is there any such tool or equivalent that I could use? My SQL Server instance is in a hosted environment where I don't have physical access to the server, so loading the files with T-SQL is not an option.
How about PowerShell?
# from: http://sev17.com/2010/05/t-sql-tuesday-006-blobs-filestream-and-powershell/
#
$server = "superfly\sqlexpress"
$database = "Yak"
$query = "INSERT dbo.FileStore VALUES (#FileData, #FileName)"
$filepath = "d:\yak.txt"
$FileName = get-childitem $filepath | select -ExpandProperty Name
$connection=new-object System.Data.SqlClient.SQLConnection
$connection.ConnectionString="Server={0};Database={1};Integrated Security=True" -f $server,$database
$command=new-object system.Data.SqlClient.SqlCommand($query,$connection)
$command.CommandTimeout=120
$connection.Open()
$fs = new-object System.IO.FileStream($filePath,[System.IO.FileMode]'Open',[System.IO.FileAccess]'Read')
$buffer = new-object byte[] -ArgumentList $fs.Length
$fs.Read($buffer, 0, $buffer.Length)
$fs.Close()
$command.Parameters.Add("#FileData", [System.Data.SqlDbType]"VarBinary", $buffer.Length)
$command.Parameters["#FileData"].Value = $buffer
$command.Parameters.Add("#FileName", [System.Data.SqlDbType]"NChar", 50)
$command.Parameters["#FileName"].Value = $FileName
$command.ExecuteNonQuery()
$connection.Close()
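Since the original goal was to pick files from an open file dialog, you could bolt one onto the front of that script. A minimal sketch, assuming you are running in an interactive desktop session where Windows Forms is available:
# Prompt for a file instead of hard-coding $filepath
Add-Type -AssemblyName System.Windows.Forms
$dialog = New-Object System.Windows.Forms.OpenFileDialog
$dialog.Filter = "All files (*.*)|*.*"
if ($dialog.ShowDialog() -eq [System.Windows.Forms.DialogResult]::OK) {
    $filepath = $dialog.FileName
}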
I need to import data from a SQL database hosted on an Azure virtual machine, convert the query results into a JSON document, and store it in Azure Data Lake Storage. I am using PowerShell to create the JSON document.
I've hit a roadblock on how to import the JSON documents into Data Lake Store and how to automate the import.
$InstanceName = "SQLDB\TST"
$connectionString = "Server=$InstanceName;Database=dbadb;Integrated Security=True;"
$query = "SELECT * FROM Employee"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
$table = new-object "System.Data.DataTable"
$table.Load($result)
$table | select $table.Columns.ColumnName | ConvertTo-Json | Set-Content "C:\JsonDocs\result.json"
$connection.Close()
If you want to store a JSON file in Azure Data Lake Store with PowerShell, you can use the PowerShell command Import-AzDataLakeStoreItem.
For example:
Connect-AzAccount
Import-AzDataLakeStoreItem -AccountName $dataLakeStorageGen1Name `
    -Path "your JSON file path" `
    -Destination "the path in Data Lake Store"
For more details, please refer to the documentation.
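For instance, a filled-in sketch that uploads the result.json produced above (the account name and destination path are hypothetical):
Connect-AzAccount
$dataLakeStorageGen1Name = "mydatalakestore"   # hypothetical account name
Import-AzDataLakeStoreItem -AccountName $dataLakeStorageGen1Name `
    -Path "C:\JsonDocs\result.json" `
    -Destination "/exports/result.json"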
Can I export all Azure Active Directory users to a MySQL database via a runbook in Azure Automation?
Does anyone have example code that exports, say, only the first name and last name of each user?
Yes you can. Your code would look something like this.
######## Connect to Azure AD ##################
# Get Azure Run As Connection Name
$connectionName = "AzureRunAsConnection"
# Get the Service Principal connection details for the Connection name
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
# Logging in to Azure AD with Service Principal
"Logging in to Azure AD..."
Connect-AzureAD -TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
######## Create CSV of all users ##################
$users = Get-AzureADUser -All $true
$userList = @()
foreach ($user in $users) {
    $userProperties = [ordered] @{
        GivenName = $user.GivenName
        Surname   = $user.Surname
    }
    $u = New-Object PSObject -Property $userProperties
    $userList += $u
}
$userList | Export-Csv .\users.csv -NoTypeInformation
$csvFullPath = Resolve-Path .\users.csv
######## Import CSV data into MySql ##################
[system.reflection.assembly]::LoadWithPartialName("MySql.Data")
$mysqlConn = New-Object -TypeName MySql.Data.MySqlClient.MySqlConnection
# Recent versions of the connector may also require AllowLoadLocalInfile=true
# in the connection string before LOAD DATA LOCAL INFILE is permitted.
$mysqlConn.ConnectionString = "SERVER=localhost;DATABASE=loadfiletest;UID=root;PWD=pwd"
$mysqlConn.Open()
$MysqlQuery = New-Object -TypeName MySql.Data.MySqlClient.MySqlCommand
$MysqlQuery.Connection = $mysqlConn
$MysqlQuery.CommandText = "LOAD DATA LOCAL INFILE '$csvFullPath' INTO TABLE loadfiletest.testtable FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '""' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (GivenName, Surname)"
$MysqlQuery.ExecuteNonQuery()
$mysqlConn.Close()
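One caveat for running this as an Azure Automation runbook: the MySql.Data assembly is not present in the sandbox by default, so LoadWithPartialName may silently return nothing. A minimal sketch of loading it explicitly, assuming you have uploaded MySql.Data.dll to the Automation account inside a custom module (the path below is hypothetical):
# Hypothetical path; assumes MySql.Data.dll was uploaded in a custom module
Add-Type -Path "C:\Modules\User\MySqlData\MySql.Data.dll"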
You should read this if you are trying to find out how to transform SQL Server data into JSON and save it as a .json text file.
Question:
Can someone tell me what's wrong with this code? My goal is to read data from a SQL Server table, convert it to JSON and then save the result as a JSON text file. The code runs but the resulting .json file just has:
{
"FieldCount": 11
},
{
repeated over and over again and nothing more.
My code:
$instance = "localhost\SQLEXPRESS"
$connectionString = "Server=$Instance; Database=myDB;Integrated Security=True;"
$query = "Select * from myTable"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
$result | ConvertTo-Json | Out-File "file.json"
$connection.Close()
Update:
Will award the answer to postanote, as technically they answered my original question (although I will caveat and say I have not tried it).
However, I would recommend either Mike's answer or what I eventually ended up going with: BCP.
bcp "select * from myTable FOR JSON AUTO" queryout "C:\filepath\testsml.json" -c -S ".\SQLEXPRESS" -d myDBName -T
Note that FOR JSON AUTO will automatically come up with a JSON schema for you, whereas FOR JSON PATH allows you to customize it.
You have to install BCP first:
https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-2017
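For the customized shape, a FOR JSON PATH variant of the same bcp call might look like the following; the column aliases and ROOT name are made up for illustration:
bcp "SELECT id AS [employee.id], name AS [employee.name] FROM myTable FOR JSON PATH, ROOT('employees')" queryout "C:\filepath\custom.json" -c -S ".\SQLEXPRESS" -d myDBName -T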
If you are using SQL Server 2016 or later (including Express), you should be able to do it on the database side using the FOR JSON clause. Try something like:
$instance = "localhost\SQLEXPRESS"
$connectionString = "Server=$Instance; Database=myDB;Integrated Security=True;"
$query = "Select * from myTable FOR JSON AUTO"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$command.ExecuteScalar() | Out-File "file.json"
$connection.Close()
Try something like this …
### Exporting SQL Server table to JSON
Clear-Host
#--Establishing connection to SQL Server --#
$InstanceName = "."
$connectionString = "Server=$InstanceName;Database=msdb;Integrated Security=True;"
#--Main Query --#
$query = "SELECT * FROM sysjobs"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
$table = new-object "System.Data.DataTable"
$table.Load($result)
#--Exporting data to the screen --#
$table | select $table.Columns.ColumnName | ConvertTo-Json
$connection.Close()
# Results
{
"job_id": "5126aca3-1003-481c-ab36-60b45a7ee757",
"originating_server_id": 0,
"name": "syspolicy_purge_history",
"enabled": 1,
"description": "No description available.",
"start_step_id": 1,
"category_id": 0,
"owner_sid": [
1
],
"notify_level_eventlog": 0,
"notify_level_email": 0,
"notify_level_netsend": 0,
"notify_level_page": 0,
"notify_email_operator_id": 0,
"notify_netsend_operator_id": 0,
"notify_page_operator_id": 0,
"delete_level": 0,
"date_created": "\/Date(1542859767703)\/",
"date_modified": "\/Date(1542859767870)\/",
"version_number": 5
}
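To write that JSON to a file rather than the screen, swap the last pipeline stage for Set-Content (the output path is just an example):
$table | select $table.Columns.ColumnName | ConvertTo-Json | Set-Content "C:\JsonDocs\result.json"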
The "rub" here is that the SQL command FOR JSON AUTO even with execute scalar, will truncate JSON output, and outputting to a variable with VARCHAR(max) will still truncate. Using SQL 2016 LocalDB bundled with Visual Studio if that matters.
SSRS gives you the ability to export a report into the original RDL format: http://sql-articles.com/articles/general/download-export-rdl-files-from-report-server/
What I am wondering is if there is a way to export all reports (via a command-line interface that I could write) or some tool into the original RDL format which can then be zipped up, etc.
Thank you for your time.
I've not tested this but it appears to do what you need.
https://gallery.technet.microsoft.com/scriptcenter/SSRS-all-RDL-files-from-00488104
I've created a PowerShell script to do this. You have to connect to the SQL Server instance that hosts the SSRS database. The script compresses all the files into a zip file.
Add-Type -AssemblyName "System.IO.Compression.Filesystem"
$dataSource = "SQLSERVER"
$user = "sa"
$pass = "sqlpassword"
$database = "ReportServer"
$connectionString = "Server=$dataSource;uid=$user; pwd=$pass;Database=$database;Integrated Security=False;"
$tempfolder = "$env:TEMP\Reports"
$zipfile = $PSScriptRoot + '\reports.zip'
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$allreports = $connection.CreateCommand()
$allreports.CommandText = "SELECT ItemID, Path, CASE WHEN Type = 2 THEN '.rdl' ELSE '.rds' END AS Ext FROM Catalog WHERE Type IN(2,5)"
$result = $allreports.ExecuteReader()
$reportable = new-object "System.Data.DataTable"
$reportable.Load($result)
[int]$objects = $reportable.Rows.Count
foreach ($report in $reportable) {
    $cmd = $connection.CreateCommand()
    $cmd.CommandText = "SELECT CAST(CAST(Content AS VARBINARY(MAX)) AS XML) FROM Catalog WHERE ItemID = '" + $report[0] + "'"
    $xmldata = [string]$cmd.ExecuteScalar()
    $filename = $tempfolder + $report["Path"].Replace('/', '\') + $report["Ext"]
    New-Item $filename -Force | Out-Null
    Set-Content -Path ($filename) -Value $xmldata -Force
    Write-Host "$($objects.ToString()).$($report["Path"])"
    $objects -= 1
}
Write-Host "Compressing to zip file..."
if (Test-Path $zipfile) {
    Remove-Item $zipfile
}
[IO.Compression.ZipFile]::CreateFromDirectory($tempfolder, $zipfile)
Write-Host "Removing temporary data"
Remove-Item -LiteralPath $tempfolder -Force -Recurse
Invoke-Item $zipfile
I have a PowerShell script that backs up my MySQL DBs each night using mysqldump. This all works fine, but I would like to extend the script to update a reporting DB (db1) from the backup of the prod DB (db2). I have written the following test script, but it does not work. I have a feeling the problem is the reading of the SQL file into CommandText, but I am not sure how to debug it.
[system.reflection.assembly]::LoadWithPartialName("MySql.Data")
$mysql_server = "localhost"
$mysql_user = "root"
$mysql_password = "password"
write-host "Create coonection to db1"
# Connect to MySQL database 'db1'
$cn = New-Object -TypeName MySql.Data.MySqlClient.MySqlConnection
$cn.ConnectionString = "SERVER=$mysql_server;DATABASE=db1;UID=$mysql_user;PWD=$mysql_password"
$cn.Open()
write-host "Running backup script against db1"
# Run Update Script MySQL
$cm = New-Object -TypeName MySql.Data.MySqlClient.MySqlCommand
$sql = Get-Content C:\db2.sql
$cm.Connection = $cn
$cm.CommandText = $sql
$cm.ExecuteReader()
write-host "Closing Connection"
$cn.Close()
Any assistance would be appreciated. Thanks.
This line:
$sql = Get-Content C:\db2.sql
Returns an array of strings. When that array gets assigned to something expecting a single string, PowerShell concatenates it into one string using the contents of the $OFS (output field separator) variable. If $OFS isn't set, the default separator is a single space. Try this instead and see if it works:
$sql = Get-Content C:\db2.sql
...
$OFS = "`r`n"
$cm.CommandText = "$sql"
Or if you're on PowerShell 2.0:
$sql = (Get-Content C:\db2.sql) -join "`r`n"
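Alternatively, if your version of the connector includes it, MySql.Data ships a MySqlScript helper that handles multi-statement dump files (including DELIMITER directives), which a plain MySqlCommand will not. A rough sketch, reusing the $cn connection from the question:
[system.reflection.assembly]::LoadWithPartialName("MySql.Data") | Out-Null
$sql = (Get-Content C:\db2.sql) -join "`r`n"
$script = New-Object MySql.Data.MySqlClient.MySqlScript($cn, $sql)
$script.Execute()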