How to use a powershell custom object variable in a MySQL query

I'm trying to make a script which takes data from several places in our network and centralizes it in one database. At the moment I'm trying to take data from AD and put it in my database, but I get some weird output.
function Set-ODBC-Data{
param(
[string]$query=$(throw 'query is required.')
)
$cmd = new-object System.Data.Odbc.OdbcCommand($query,$DBConnection)
$cmd.ExecuteNonQuery()
}
$DBConnection = $null
$DBConnected = $FALSE
try{
$DBConnection = New-Object System.Data.Odbc.OdbcConnection
$DBConnection.ConnectionString = "Driver={MySQL ODBC 8.0 Unicode Driver};Server=127.0.0.1;Database=pcinventory;User=uSR;Password=PpPwWwDdD;Port=3306"
$DBConnection.Open()
$DBConnected = $TRUE
Write-Host "Connected to the MySQL database."
}
catch{
Write-Host "Unable to connect to the database..."
}
$ADEMEA = "ADSERVER.SERVER.WORK"
$addata = Get-ADComputer -filter * -property Name,CanonicalName,LastLogonDate,IPv4Address,OperatingSystem,OperatingSystemVersion -Server $ADEMEA | Select-Object Name,CanonicalName,LastLogonDate,IPv4Address,OperatingSystem,OperatingSystemVersion
ForEach($aditem in $addata){
Set-ODBC-Data -query "INSERT INTO ad VALUES( '$aditem.Name', '','','','','' )"
}
The result in my database looks something like this:

This happens because $aditem is a custom PowerShell object, and the SQL insert query doesn't quite know how to handle it: inside the double-quoted query string, PowerShell expands the whole object (rendered as a hashtable-like dump of the object's attributes and attribute values) and then appends the literal text .Name, so that entire blob ends up in the column.
As for the fix, the proper one is to use parameterized queries.
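A minimal sketch of that with System.Data.Odbc, built directly against $DBConnection (ODBC parameters are positional ? placeholders; the six-column layout of the ad table is assumed from the question's insert, and values are passed as plain strings for simplicity):
# Sketch only: parameter names below are just labels, ODBC binds them by position
$cmd = New-Object System.Data.Odbc.OdbcCommand("INSERT INTO ad VALUES( ?, ?, ?, ?, ?, ? )", $DBConnection)
[void]$cmd.Parameters.AddWithValue("@name", [string]$aditem.Name)
[void]$cmd.Parameters.AddWithValue("@canonical", [string]$aditem.CanonicalName)
[void]$cmd.Parameters.AddWithValue("@lastlogon", [string]$aditem.LastLogonDate)
[void]$cmd.Parameters.AddWithValue("@ip", [string]$aditem.IPv4Address)
[void]$cmd.Parameters.AddWithValue("@os", [string]$aditem.OperatingSystem)
[void]$cmd.Parameters.AddWithValue("@osver", [string]$aditem.OperatingSystemVersion)
[void]$cmd.ExecuteNonQuery()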
As for a quick and dirty work-around that leaves SQL injection wide open: build the insert string in a few parts. Using string formatting with {} placeholders and the -f operator makes it quite simple. Like so,
$q = "INSERT INTO ad VALUES( '{0}', '{1}', '{2}' )" -f $aditem.name, "more", "stuff"
write-host "Query: $q" # For debugging purposes
Set-ODBC-Data -query $q
The problem with quick and dirty is, as mentioned, SQL injection. Consider what happens if the input is
$aditem.name, "more", "'); drop database pcinventory; --"
If the syntax is about right and permissions are adequate, it will execute the insertion. Right after that, it will drop your pcinventory database. So don't be tempted to use the fast approach, unless you are sure about what you are doing.

Related

How to add a SQL update statement to powershell after a foreach loop

I'm a noob with PowerShell and have a question.
I have a working PS1 script that runs a SQL query to get an ID parameter, then creates a PDF from a Crystal Report in another program based on that returned ID, then loops through the returned parameters and repeats.
What I need is to add a SQL update statement to the PS1 script that updates a table based on the same ID returned from the SQL query, after the PDF is created, either before moving on to the next ID or after the script has created all the PDFs based on the IDs.
This is what I need to add to the script.
Update DB2.dbo.table2
SET DB2.dbo.table2.Field_01 = 1
FROM DB2.dbo.table2
WHERE (DB2.dbo.Table2.ID = {ID})
The PS1 script that works looks like this.
$serverName="MAIN"
$databaseName="DB1"
$sqlCommand="select ID from ID_Query"
$connectionString = "Data Source=$serverName; " +
"Integrated Security=SSPI; " +
"Initial Catalog=$databaseName"
$connection = new-object system.data.SqlClient.SQLConnection($connectionString)
$command = new-object system.data.sqlclient.sqlcommand($sqlCommand,$connection)
$connection.Open()
$adapter = New-Object System.Data.sqlclient.sqlDataAdapter $command
$dataset = New-Object System.Data.DataSet
$adapter.Fill($dataSet) | Out-Null
$connection.Close()
foreach ($Row in $dataSet.Tables[0].Rows)
{
$commandLine= -join(@'
"C:\Program Files (x86)\CR_Program\Program_name\CR_Printer.exe" -report="C:\Crystal_report.rpt" -exportformat=PDF -exportfile="c:\test\{Parameters.ID}.pdf" -parameters"
'@,$($Row[0]),@'
"
'@)
cmd /c $commandLine
}
I'm hoping to mark a column field_01 to 1 so the script does not create another Crystal report PDF for the same ID; once the ID is marked to 1, the query that runs won't see it.
Or maybe there's a better way to do this.
Thanks.
You can add a timestamp to the filename along with the first field of your DataTable (query results), supposing it's an ID:
# After $connection.Close()
$crPrinter = "C:\Program Files (x86)\CR_Program\Program_name\CR_Printer.exe"
$report = "-report='C:\Crystal_report.rpt'"
foreach ($Row in $dataSet.Tables[0].Rows)
{
$now = [DateTime]::Now.ToString("yyMMddHHmmss")
$outputFile = "-exportfile='c:\test\Report_id_$($Row[0])_$now.pdf' -parameters"
$commandLine= [string]::Format("{0} {1} {2}", $crPrinter, $report, $outputFile)
cmd /c $commandLine
Start-Sleep 1 # Sleeps the script to offset a second
}
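The UPDATE the question asked about can be run the same way, once per ID, with a parameterized SqlCommand. A hedged sketch (reusing the question's table and column names and the existing $connectionString; DB2 is assumed reachable via the same server and integrated security):
# Sketch: flag each ID in DB2.dbo.table2 after its PDF has been produced
$updConnection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$updConnection.Open()
$updCommand = New-Object System.Data.SqlClient.SqlCommand("UPDATE DB2.dbo.table2 SET Field_01 = 1 WHERE ID = @id", $updConnection)
[void]$updCommand.Parameters.Add("@id", [System.Data.SqlDbType]::Int)
foreach ($Row in $dataSet.Tables[0].Rows)
{
    # ... create the PDF for $Row[0] as above ...
    $updCommand.Parameters["@id"].Value = $Row[0]
    [void]$updCommand.ExecuteNonQuery()
}
$updConnection.Close()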

Powershell Export-CSV from MySQL database reader fails mid-export

I'm a bit new to PowerShell, and I've got a new requirement to get Data out of a MySQL database and into an Oracle one. The strategy I chose was to output to a CSV and then import the CSV into Oracle.
I wanted to get a progress bar for the export from MySQL into CSV, so I used the data reader to achieve this. It works, and begins to export, but somewhere during the export (around record 5,000 of 4.5mil -- not consistent) it will throw an error:
Exception calling "Read" with "0" argument(s): "Fatal error encountered during data read."
Exception calling "Close" with "0" argument(s): "Timeout in IO operation"
Method invocation failed because [System.Management.Automation.PSObject] does not contain a method named 'op_Addition'.
Exception calling "ExecuteReader" with "0" argument(s): "The CommandText property has not been properly initialized."
Applicable code block is below. I'm not sure what I'm doing wrong here, and would appreciate any feedback possible. I've been pulling my hair out on this for days.
Notes: $tableObj is a custom object with a few string fields to hold table name and SQL values. Not showing those SQL queries here, but they work.
Write-Host "[INFO]: Gathering data from MySQL select statement..."
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection
$conn.ConnectionString = $MySQLConnectionString
$conn.Open()
#
# Get Count of records in table
#
$countCmd = New-Object MySql.Data.MySqlClient.MySqlCommand($tableObj.SqlCount, $conn)
$recordCount = 0
try{
$recordCount = $countCmd.ExecuteScalar()
} Catch {
Write-Host "[ERROR]: (" $tableObj.Table ") Error getting Count."
Write-Host "---" $_.Exception.Message
Exit
}
$recordCountString = $recordCount.ToString('N0')
Write-Host "[INFO]: Count for table '" $tableObj.Table "' is " $recordCountString
#
# Compose the command
#
$cmd = New-Object MySql.Data.MySqlClient.MySqlCommand($tableObj.SqlExportInit, $conn)
#
# Write to CSV using DataReader
#
Write-Host "[INFO]: Data gathered into memory. Writing data to CSV file '" $tableObj.OutFile "'"
$counter = 0 # Tracks items selected
$reader=$cmd.ExecuteReader()
$dataRows = @()
# Read all rows into a hash table
while ($reader.Read())
{
$counter++
$percent = ($counter/$recordCount)*100
$percentString = [math]::Round($percent,3)
$counterString = $counter.ToString('N0')
Write-Progress -Activity '[INFO]: CSV Export In Progress' -Status "$percentString% Complete" -CurrentOperation "($($counterString) of $($recordCountString))" -PercentComplete $percent
$row = @{}
for ($i = 0; $i -lt $reader.FieldCount; $i++)
{
$row[$reader.GetName($i)] = $reader.GetValue($i)
}
# Convert hashtable into an array of PSObjects
$dataRows += New-Object psobject -Property $row
}
$conn.Close()
$dataRows | Export-Csv $tableObj.OutFile -NoTypeInformation
EDIT: It didn't work, but I also added this to my connection string: defaultcommandtimeout=600;connectiontimeout=25, per "MySQL timeout in powershell".
Following @Carl Ardiente's thinking: the query is timing out, and you have to set the timeout to something huge for it to fully execute. You simply have to set the timeout values for your session before you start getting data.
Write-Host "[INFO]: Gathering data from MySQL select statement..."
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection
$conn.ConnectionString = $MySQLConnectionString
$conn.Open()
# Set timeout on MySql
$cmd = New-Object MySql.Data.MySqlClient.MySqlCommand("set net_write_timeout=99999; set net_read_timeout=99999", $conn)
$cmd.ExecuteNonQuery()
#
# Get Count of records in table
#
...Etc....
Not that I've found the root cause, but none of the connection string changes worked, and manually setting the timeout didn't seem to help either. It seemed to be caused by too many rows being returned, so I broke the function up to run in batches and append to the CSV as it goes. This gets rid of the IO / timeout error.
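A rough sketch of that batching idea (assuming the export SELECT can be paged with LIMIT/OFFSET, and that Export-Csv -Append is available, i.e. PowerShell 3.0 or later):
# Sketch: page through the table and append each batch to the CSV,
# so the full result set is never held in memory at once
$batchSize = 50000
$offset = 0
while ($offset -lt $recordCount)
{
    $pagedSql = "$($tableObj.SqlExportInit) LIMIT $batchSize OFFSET $offset"
    $cmd = New-Object MySql.Data.MySqlClient.MySqlCommand($pagedSql, $conn)
    $reader = $cmd.ExecuteReader()
    $batch = New-Object System.Collections.Generic.List[object]
    while ($reader.Read())
    {
        $row = @{}
        for ($i = 0; $i -lt $reader.FieldCount; $i++) { $row[$reader.GetName($i)] = $reader.GetValue($i) }
        $batch.Add([pscustomobject]$row)
    }
    $reader.Close()
    $batch | Export-Csv $tableObj.OutFile -NoTypeInformation -Append
    $offset += $batchSize
}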

Reuse parameterized (prepared) SQL Query

I coded an ActiveDirectory logging system a couple of years ago...
It never got past beta status, but it's still in use...
I got an issue reported and found out what is happening...
There are several fields in such an ActiveDirectory event which are user input, so I have to validate them! -- of course I didn't...
So after the first user got the brilliant idea to use single quotes in a specific folder name, it crashed my scripts - easy injection possible...
So I'd like to make an update using prepared statements, like I'm using in PHP and others.
Now this is a PowerShell script... I'd like to do something like this:
$MySQL-OBJ.CommandText = "INSERT INTO ``table-name`` (i1,i2,i3) VALUES (@k1,@k2,@k3)"
$MySQL-OBJ.Parameters.AddWithValue("@k1","value 1")
$MySQL-OBJ.Parameters.AddWithValue("@k2","value 2")
$MySQL-OBJ.Parameters.AddWithValue("@k3","value 3")
$MySQL-OBJ.ExecuteNonQuery()
This would work fine - 1 times.
My Script runs endless as a Service and loops all within a while($true) loop.
PowerShell complains that the param is already set...
Exception calling "AddWithValue" with "2" argument(s): "Parameter
'@k1' has already been defined."
How can I reset this "bind" without closing the database connection?
I'd like to leave the connection open, because the script is faster without closing and opening the connection every time an event is fired (10+ / sec).
Example Code
(shortend and not tested)
##start
function db_prepare(){
$MySqlConnection = New-Object MySql.Data.MySqlClient.MySqlConnection
$MySqlConnection.ConnectionString = "server=$MySQLServerName;user id=$Username;password=$Password;database=$MySQLDatenbankName;pooling=false"
$MySqlConnection.Open()
$MySqlCommand = New-Object MySql.Data.MySqlClient.MySqlCommand
$MySqlCommand.Connection = $MySqlConnection
$MySqlCommand.CommandText = "INSERT INTO ``whatever`` (col1,col2...) VALUES (@va1,@va2...)"
}
while($true){
if($MySqlConnection.State -eq 'closed'){ db_prepare }
## do the event reading and data formating stuff
## bild some variables to set as sql param values
$MySQLCommand.Parameters.AddWithValue("#va1",$variable_for_1)
$MySQLCommand.Parameters.AddWithValue("#va2",$variable_for_2)
.
.
.
Try{ $MySqlCommand.ExecuteNonQuery() | Out-Null }
Catch{ <# error handling #> }
}
Change your logic so that the db_prepare() method initializes a MySql connection and a MySql command with parameters. Set the parameter values for the pre-declared parameter names inside the loop. Like so,
function db_prepare(){
# ...
# Add named parameters
$MySQLCommand.Parameters.Add("#val1", <datatype>)
$MySQLCommand.Parameters.Add("#val2", <datatype>)
}
while($true) {
# ...
# Set values for the named parameters
$MySQLCommand.Parameters.SetParameter("#val1", <value>)
$MySQLCommand.Parameters.SetParameter("#val2", <value>)
$MySqlCommand.ExecuteNonQuery()
# ...
}
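Concretely, with placeholder table and column names and assuming the values are strings, the reuse pattern could look like this sketch:
# Sketch: parameters are added once with a type; only their .Value changes per event
$MySqlCommand.CommandText = "INSERT INTO logtable (col1,col2) VALUES (@va1,@va2)"
[void]$MySqlCommand.Parameters.Add("@va1", [MySql.Data.MySqlClient.MySqlDbType]::VarChar)
[void]$MySqlCommand.Parameters.Add("@va2", [MySql.Data.MySqlClient.MySqlDbType]::VarChar)
while($true){
    # ... build $variable_for_1 / $variable_for_2 from the event ...
    $MySqlCommand.Parameters["@va1"].Value = $variable_for_1
    $MySqlCommand.Parameters["@va2"].Value = $variable_for_2
    Try{ $MySqlCommand.ExecuteNonQuery() | Out-Null }
    Catch{ <# error handling #> }
}
Another option is to keep AddWithValue but call $MySqlCommand.Parameters.Clear() after each ExecuteNonQuery; that also avoids the "already been defined" error without closing the connection.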

Memory leak when inserting into MySQL with Powershell v4

I'm using powershell v4 on W2K12 R2 (fully patched) to insert a large number (100+ million) of records into a MySQL database. I've run into a bit of a problem where memory usage keeps growing and growing despite aggressively removing variables and garbage collecting. Note that the memory usage is growing on the box that I'm running the script on - not the DB server.
The insertion speed is good and the job runs fine. However, I have a memory leak and have been beating my head against a wall for a week trying to figure out why. I know from testing that the memory accumulates when calling the MySQL portion of the script and not anywhere else.
I've noticed that after every insertion that the memory grows from anywhere between 1MB and 15MB.
Here is the basic flow of the process (code at the bottom).
-records are being added to an array until there are 1,000 records in the array
-once there are a thousand records, they are inserted, as a batch, into the DB
-the array is then emptied using the .clear() method (I've verified that 0 records remain in array).
-I've tried aggressively garbage collecting after every insert (no luck there).
-also tried removing variables and then garbage collecting. Still no luck.
The code below is simplified for the sake of brevity. But, it shows how I'm iterating over the records and doing the insert:
$reader = [IO.File]::OpenText($filetoread)
$lineCount = 1
while ($reader.Peek() -ge 0) {
if($lineCount -ge 1000 -or $reader.Peek() -lt 0) {
insert_into_db
$lineCount = 0
}
$lineCount++
}
$reader.Close()
$reader.Dispose()
One call to establish the connection:
[void][system.reflection.Assembly]::LoadFrom("C:\Program Files (x86)\MySQL\MySQL Connector Net 6.8.3\Assemblies\v4.5\MySql.Data.dll")
$connection = New-Object MySql.Data.MySqlClient.MySqlConnection($connectionString)
And here is the call to MySQL to do the actual inserts for each 1,000 records:
function insert_into_db {
$command = $connection.CreateCommand() # Create command object
$command.CommandText = $query # Load query into object
$script:RowsInserted = $command.ExecuteNonQuery() # Execute command
$command.Dispose() # Dispose of command object
$command = $null
$query = $null
}
If anyone has any ideas or suggestions I'm all ears!
Thanks,
Jeremy
My initial conclusion about the problem being related to the PowerShell -join operator appears to be wrong.
Here is what I was doing. Note that I'm adding each line to a list, which I will un-roll later when I form my SQL. (On a side note, adding items to a list tends to be more performant than concatenating strings.)
$dataForInsertion = New-Object System.Collections.Generic.List[String]
$reader = [IO.File]::OpenText($filetoread)
$lineCount = 1
while ($reader.Peek() -ge 0) {
$line = $reader.Readline()
$dataForInsertion.add($line)
if($lineCount -ge 1000 -or $reader.Peek() -lt 0) {
insert_into_db -insertthis $dataForInsertion
$lineCount = 0
}
$lineCount++
}
$reader.Close()
$reader.Dispose()
Calling the insert function:
sql_query -query "SET autocommit=0;INSERT INTO ``$table`` ($columns) VALUES $($dataForInsertion -join ',');COMMIT;"
The improved insert function now looks like this:
function insert_into_db {
$command.CommandText = $query # Load query into object
$script:RowsInserted = $command.ExecuteNonQuery() # Execute command
$command.Dispose() # Dispose of command object
$query = $null
}
So, it turns out my initial conclusion about the source of the problem was wrong: the PowerShell -join operator had nothing to do with the issue.
In my SQL insert function I was repeatedly calling $connection.CreateCommand() on every insert. Once I moved that into the function that handles setting up the connection (which is only called once, or when needed), the memory leak disappeared.
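For illustration, a hedged sketch of what that hoisting might look like (the function name and script-scope variables here are assumptions, not the original script):
# Sketch: the command object is created once, next to the connection, and reused
function open_connection {
    [void][System.Reflection.Assembly]::LoadFrom("C:\Program Files (x86)\MySQL\MySQL Connector Net 6.8.3\Assemblies\v4.5\MySql.Data.dll")
    $script:connection = New-Object MySql.Data.MySqlClient.MySqlConnection($connectionString)
    $script:connection.Open()
    $script:command = $script:connection.CreateCommand()   # created once, not per batch
}
function insert_into_db {
    param([string]$query)
    $script:command.CommandText = $query
    $script:RowsInserted = $script:command.ExecuteNonQuery()
}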

Run a SQL Script Against MySQL using Powershell

I have a Powershell script that backs up my MySQL DBs each night using mysqldump. This all works fine, but I would like to extend the script to update a reporting db (db1) from the backup of the prod db (db2). I have written the following test script, but it does not work. I have a feeling the problem is the reading of the SQL file into the CommandText, but I am not sure how to debug.
[system.reflection.assembly]::LoadWithPartialName("MySql.Data")
$mysql_server = "localhost"
$mysql_user = "root"
$mysql_password = "password"
write-host "Create coonection to db1"
# Connect to MySQL database 'db1'
$cn = New-Object -TypeName MySql.Data.MySqlClient.MySqlConnection
$cn.ConnectionString = "SERVER=$mysql_server;DATABASE=db1;UID=$mysql_user;PWD=$mysql_password"
$cn.Open()
write-host "Running backup script against db1"
# Run Update Script MySQL
$cm = New-Object -TypeName MySql.Data.MySqlClient.MySqlCommand
$sql = Get-Content C:\db2.sql
$cm.Connection = $cn
$cm.CommandText = $sql
$cm.ExecuteReader()
write-host "Closing Connection"
$cn.Close()
Any assistance would be appreciated. Thanks.
This line:
$sql = Get-Content C:\db2.sql
Returns an array of strings. When that gets assigned to something expecting a string then PowerShell will concatenate the array of strings into a single string using the contents of the $OFS (output field separator) variable. If this variable isn't set, the default separator is a single space. Try this instead and see if it works:
$sql = Get-Content C:\db2.sql
...
$OFS = "`r`n"
$cm.CommandText = "$sql"
Or if you're on PowerShell 2.0:
$sql = (Get-Content C:\db2.sql) -join "`r`n"
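As a side note (not part of the original answer), on PowerShell 3.0 and later Get-Content has a -Raw switch that returns the whole file as a single string, which sidesteps the joining entirely:
$cm.CommandText = Get-Content C:\db2.sql -Raw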