Why doesn't PowerShell update the database? - mysql

I have the following Powershell script:
$a = Get-Content C:\users\diana\desktop\names.txt
snmpwalk -v 2c -c root $a .1.3.6.1.2.1.25.3.3.1.2 > c:\users\diana\desktop\cpu.txt
snmpwalk -v 2c -c root $a .1.3.6.1.2.1.25.5.1.1.2 > c:\users\diana\desktop\ramvid.txt
snmpwalk -v 2c -c root $a .1.3.6.1.2.1.25.2.2 > c:\users\diana\desktop\ram.txt
get-content C:\users\diana\desktop\ramvid.txt | %{ [int]$used+=$_.split(' ')[3]; } ; echo $used > C:\users\diana\desktop\naujas.txt
get-content C:\users\diana\desktop\ram.txt | %{ [int]$total=$_.split(' ')[3]; } ; echo $total > C:\users\diana\desktop\ramfiltruotas.txt
[decimal]$b=($used*100)/$total
[math]::floor($b) > C:\users\diana\desktop\naujas2.txt
get-content C:\users\diana\desktop\cpu.txt | %{ [int]$array=$_.split(' ')[3]; }
$c=($array | Measure-Object -Average).average
echo $c > C:\users\diana\desktop\naujas3.txt
[void][system.reflection.Assembly]::LoadWithPartialName("MySQL.Data")
$myconnection = New-Object MySql.Data.MySqlClient.MySqlConnection
$myconnection.ConnectionString = "database=db;server=localhost;Persist Security Info=false;user id=root;pwd= "
$myconnection.Open()
$command = $myconnection.CreateCommand()
$command.CommandText = "UPDATE db.server SET (cpu='$c',ram='$b') WHERE server_name like '192.168.95.139'";
$myconnection.Close()
The upper part of the code works great, but when it comes to the MySQL part nothing happens: not a single error, and the table doesn't update.
Can someone point out where the problem is?

Looks like you're never executing the command. Add:
$command.ExecuteNonQuery()
You should also dispose of the command when you're done:
$command.Dispose()
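For completeness, here is a sketch of the corrected tail end of the script. Note that MySQL's UPDATE syntax does not allow parentheses around the SET list, and using parameters instead of string interpolation sidesteps quoting problems ($c and $b come from the earlier part of the script):

```powershell
[void][System.Reflection.Assembly]::LoadWithPartialName("MySQL.Data")
$myconnection = New-Object MySql.Data.MySqlClient.MySqlConnection
$myconnection.ConnectionString = "database=db;server=localhost;Persist Security Info=false;user id=root;pwd="
$myconnection.Open()
try {
    $command = $myconnection.CreateCommand()
    # No parentheses around the SET list; parameters instead of interpolated values
    $command.CommandText = "UPDATE db.server SET cpu=@cpu, ram=@ram WHERE server_name LIKE '192.168.95.139'"
    [void]$command.Parameters.AddWithValue("@cpu", $c)
    [void]$command.Parameters.AddWithValue("@ram", $b)
    $rows = $command.ExecuteNonQuery()   # returns the number of affected rows
    $command.Dispose()
}
finally {
    $myconnection.Close()
}
```

Checking the return value of ExecuteNonQuery is a cheap way to confirm the UPDATE actually matched a row.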


log file variables in function eat up all memory

I have the following simple function, which is used several times in a script that iterates through a directory and checks the age of the files in it.
function log($Message) {
$logFilePath = 'C:\logPath\myLog.txt'
$date = Get-Date -Format 'yyyyMMddHHmmss'
$logMessage = "{0}_{1}" -f $date,$Message
if(Test-Path -Path $logFilePath) {
$logFileContent = Get-Content -Path $logFilePath
} else {
$logFileContent = ''
}
$logMessage,$logFileContent | Out-File -FilePath $logFilePath
}
I figured out that this eats up all the RAM, and I don't understand why. I thought variables scoped to a function are destroyed once the function has run. I fixed the RAM issue by adding Remove-Variable logMessage,logFileContent,logFilePath,date to the very end of the function, but I would like to know how this RAM issue could be solved otherwise, and why the variables within the function are not destroyed automatically.
.NET (and therefore PowerShell) uses a garbage collector, so freeing memory isn't instant: Garbage Collection in Powershell to Speed Scripts. Memory management is also probably better in PowerShell 7. I tried repeating your function many times, but the memory usage didn't go above a few hundred megs.
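As a quick experiment you can force a collection and watch the process's working set. This is a sketch for diagnosis only (it assumes the log function from the question is defined); forcing GC is rarely something to leave in production code:

```powershell
# Hypothetical repro: call the question's log function many times
1..10000 | ForEach-Object { log "test message $_" }

# Force a full garbage collection and wait for finalizers
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()

# Inspect this PowerShell process's working set in MB
(Get-Process -Id $PID).WorkingSet64 / 1MB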
There's probably some more efficient .net way to prepend a line to a file: Prepend "!" to the beginning of the first line of a file
I have a weird way to prepend a line. I'm not sure how well this would work with a large file. With a 700 meg file, the Working Set memory stayed at 76 megs.
$c:file
one
two
three
$c:file = 'pre',$c:file
$c:file
pre
one
two
three
ps powershell
Handles NPM(K) PM(K) WS(K) CPU(s) Id SI ProcessName
------- ------ ----- ----- ------ -- -- -----------
607 27 67448 76824 1.14 3652 3 powershell
Although, as commented, I can hardly believe that this function gobbles up your memory, you could optimize it:
function log ([string]$Message) {
$logFilePath = 'C:\logPath\myLog.txt'
# prefix the message with the current date
$Message = "{0:yyyyMMddHHmmss}_{1}" -f (Get-Date), $Message
if (Test-Path -Path $logFilePath -PathType Leaf) {
# newest log entry on top: append current content
$Message = "{0}`r`n{1}" -f $Message, (Get-Content -Path $logFilePath -Raw)
}
Set-Content -Path $logFilePath -Value $Message
}
I just want to rule out that the RAM usage is caused by prepending to the file. Have you tried not storing the log contents in a variable? i.e.
$logMessage,(Get-Content -Path $logFilePath) | Out-File -FilePath $logFilePath
Edit 5/8/20 - Turns out that prepending (when done efficiently) isn't as slow as I thought - it is on the same order as using -Append. However, the code (that js2010 pointed to) is long and ugly; if you really need to prepend to the file, this is the way to do it.
I modified the OP's code a bit to automatically insert a new line.
function log-prepend{
param(
$content,
$filePath = 'C:\temp\myLogP.txt'
)
$file = get-item $filePath
if(!$file.exists){
write-error "$file does not exist";
return;
}
$filepath = $file.fullname;
$tmptoken = (get-location).path + "\_tmpfile" + $file.name;
write-verbose "$tmptoken created as a buffer";
$tfs = [System.io.file]::create($tmptoken);
$fs = [System.IO.File]::Open($file.fullname,[System.IO.FileMode]::Open,[System.IO.FileAccess]::ReadWrite);
try{
$date = Get-Date -Format 'yyyyMMddHHmmss'
$logMessage = "{0}_{1}`r`n" -f $date,$content
$msg = $logMessage.tochararray();
$tfs.write($msg,0,$msg.length);
$fs.position = 0;
$fs.copyTo($tfs);
}
catch{
write-verbose $_.Exception.Message;
}
finally{
$tfs.close();
$fs.close();
if($error.count -eq 0){
write-verbose ("updating $filepath");
[System.io.File]::Delete($filepath);
[System.io.file]::Move($tmptoken,$filepath);
}
else{
$error.clear();
[System.io.file]::Delete($tmptoken);
}
}
}
Here was my original answer that shows how to test the timing using a stopwatch.
When you prepend to a log file, you're reading the entire log file into memory, then writing it back.
You really should be using append - that would make the script run a lot faster.
function log($Message) {
$logFilePath = 'C:\logPath\myLog.txt'
$date = Get-Date -Format 'yyyyMMddHHmmss'
$logMessage = "{0}_{1}" -f $date,$Message
$logMessage | Out-File -FilePath $logFilePath -Append
}
Edit: To convince you that prepending to a log file is a bad idea, here's a test you can do on your own system:
function logAppend($Message) {
$logFilePath = 'C:\temp\myLogA.txt'
$date = Get-Date -Format 'yyyyMMddHHmmss'
$logMessage = "{0}_{1}" -f $date,$Message
$logMessage | Out-File -FilePath $logFilePath -Append
}
function logPrepend($Message) {
$logFilePath = 'C:\temp\myLogP.txt'
$date = Get-Date -Format 'yyyyMMddHHmmss'
$logMessage = "{0}_{1}" -f $date,$Message
if(Test-Path -Path $logFilePath) {
$logFileContent = Get-Content -Path $logFilePath
} else {
$logFileContent = ''
}
$logMessage,$logFileContent | Out-File -FilePath $logFilePath
}
$processes = Get-Process
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach ($p in $processes)
{
logAppend($p.ProcessName)
}
$stopwatch.Stop()
$stopwatch.Elapsed
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach ($p in $processes)
{
logPrepend($p.ProcessName)
}
$stopwatch.Stop()
$stopwatch.Elapsed
I've run this several times until I got a few thousand lines in the log file.
Going from 1603 to 1925 lines, my results were:
Append: 7.0167008 s
Prepend: 21.7046793 s

Script for MySQL backup to multiple files

I am creating a script to back up the MySQL instance running on a Windows 2012 server, using PowerShell. Unlike other tips found here, I want to generate a .sql file for each of the databases.
This post shows how to create multiple files. I adapted it to PowerShell:
Get-ChildItem -Path "$($MYSQL_HOME)\data" | cmd /C "$($MYSQL_HOME)\bin\ mysql -u -s -r $($dbuser) -p$($dbpass) -e 'databases show' '| while read dbname; cmd /C "$($MYSQL_HOME)\bin\mysqldump.exe --user = $($dbuser) --password = $($dbpass) --databases $dbname> $($BKP_FOLDER)\$dbname($BACKUPDATE).sql "
but it returns an error at the while.
What should I change so that it generates multiple .sql files, one for each database?
Your entire commandline is b0rken. Throw it away and start over. Try something like this:
Set-Location "$MYSQL_HOME\bin"
& .\mysql.exe -N -s -r -u $dbuser -p$dbpass -e 'show databases' | % {
& .\mysqldump.exe -u $dbuser -p$dbpass --single-transaction $_ |
Out-File "$BKP_FOLDER\${_}$BACKUPDATE.sql" -Encoding Ascii
}
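One refinement you may want: `show databases` also lists MySQL's built-in schemas. A variation of the loop above (a sketch; it assumes the same $dbuser, $dbpass, $BKP_FOLDER, and $BACKUPDATE variables, and that the current directory is $MYSQL_HOME\bin) can filter them out first:

```powershell
# Built-in schemas you usually don't want in per-database dumps
$systemDbs = 'information_schema', 'performance_schema', 'mysql', 'sys'

& .\mysql.exe -N -s -r -u $dbuser -p$dbpass -e 'show databases' |
    Where-Object { $systemDbs -notcontains $_ } |
    ForEach-Object {
        & .\mysqldump.exe -u $dbuser -p$dbpass --single-transaction $_ |
            Out-File "$BKP_FOLDER\${_}$BACKUPDATE.sql" -Encoding Ascii
    }
```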
Example that takes as input a file produced by mysqldump.
#Variables#####
# You must specify the full path, e.g. C:\Temp\archivo.txt
$file = $args[0]
$c = ""
$i = 0
$startSafe = 0;
###############
New-Item -ItemType Directory -Force -Path .\data
$x = 0;
foreach ($f in [System.IO.File]::ReadLines($file)) {
if ($f -like '-- Dumping data for table *') {
$startSafe = 1;
$x = $f.split('`')[1];
write $x;
}
if ($startSafe) {
if($f -eq "UNLOCK TABLES;"){$i += 1; $f >> .\data\$x.out.sql; $startSafe = 0; $x = $i}
if($f -ne "UNLOCK TABLES;"){ $f >> .\data\$x.out.sql;}
}
}

Powershell - Delete all lines in a CSV up to and including a specific string for all files in a folder.

I have the following code which searches for a string in a file and returns the line number to a variable. It then deletes all lines up to and including this line. I need to modify this code to run for all .CSV files in a directory.
$a = Select-String draft.csv -Pattern RESULTS -CaseSensitive | Select -expand LineNumber
$content = Get-Content draft.csv
$content | Foreach {$n=1} {if ($n++ -ge ($a+1)) {$_}} > draft.csv
I was able to accomplish this task using the following. If anyone has a faster method, I'm glad to hear it.
ForEach ($file in Get-ChildItem *.csv)
{
$filename = $file.fullname
$a = Select-String $filename -Pattern RESULTS -CaseSensitive | Select -expand LineNumber
$content = Get-Content $filename
$content | Foreach {$n=1} {if ($n++ -ge ($a+1)) {$_}} > $filename
Echo $filename "is Complete.... Moving On!"
}
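A somewhat faster variant (a sketch, untested against your data) reads each file once and uses Select-Object -Skip to drop everything up to and including the matched line. Since Select-String's LineNumber is 1-based, skipping exactly that many lines also removes the RESULTS line itself:

```powershell
ForEach ($file in Get-ChildItem *.csv) {
    # First match only; -ExpandProperty gives the bare line number
    $line = Select-String $file.FullName -Pattern RESULTS -CaseSensitive |
        Select-Object -First 1 -ExpandProperty LineNumber
    if ($line) {
        # Parentheses force the file to be fully read before it is rewritten
        (Get-Content $file.FullName | Select-Object -Skip $line) |
            Set-Content $file.FullName
    }
}
```

The `if ($line)` guard skips files that don't contain RESULTS instead of emptying them.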

Exporting PowerShell Foreach Loop to HTML

I've been trying to write a script that pings a list of computers from a text file and exports the output to an HTML file.
Using a ForEach loop and an if/else statement I have been able to get a working ping script that displays in PowerShell, but I haven't been able to export the results to an HTML file.
When I run the script the HTML file opens but only displays the line "Here are the ping results for $date"
I'm pretty new to PowerShell so any kind of input or help would be appreciated!
$ComputersAry = Get-Content -Path "C:\Script\ping.txt"
$filepath = "C:\Script\"
$date = "{0:yyyy_MM_dd-HH_mm}" -f (get-date)
$file = $filepath + "Results_" + $date + ".htm"
New-Item $filepath -type directory -force -Verbose
$Header = @"
<style>
TABLE {border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}
TH {border-width: 1px;padding: 3px;border-style: solid;border-color: black;background-color: #6495ED;}
TD {border-width: 1px;padding: 3px;border-style: solid;border-color: black;}
</style>
<title>
LRS Ping Results
</title>
"@
Foreach ($MachineName in $ComputersAry) {
$PingStatus = Gwmi Win32_PingStatus -Filter "Address ='$MachineName'" | Select-Object StatusCode
if($PingStatus.StatusCode -eq 0){
$output = write-host "$MachineName,Ping Success!!,$Date"
} else {
$output = write-host "$MachineName,Ping FAIL, please investigate cause ASAP!!"
}
}
$pre= "Here are the ping results for $date"
$output | Select-Object Name, Status, Date | ConvertTo-HTML -Head $Header -PreContent $pre | Out-File $file
Invoke-Item $file
Nothing gets assigned to $output when you use Write-Host. Try this instead:
$output = "$MachineName,Ping Success!!,$Date"
...
$output = "$MachineName,Ping FAIL, please investigate cause ASAP!!"
Write-Host tells PowerShell to write directly to the host's display. This bypasses the Output (stdout) stream. You could replace Write-Host with Write-Output, but almost no one does, because everything goes to the output stream by default anyway. When a string like "Hello World" reaches the end of the pipeline and there isn't an Out-File or Out-Printer, it ends up in the Output stream, and when the result of the pipeline execution is assigned to a variable, the variable gets whatever is in the output stream.
Try doing the following, save the code as a script and run it.
PS C:\Scripts> .\Demo.ps1 | ConvertTo-Html | Out-File C:\Scripts\out.htm
$result = '' | Select Online
$ComputersAry = GC C:\Scripts\2.txt
Foreach ($MachineName in $ComputersAry) {
$PingStatus = Gwmi Win32_PingStatus -Filter "Address ='$MachineName'" | Select-Object StatusCode
if($PingStatus.StatusCode -eq 0){
$result.Online = "$MachineName,Ping Success!!"
} else {
$result.Online = "$MachineName,Ping FAIL, please investigate cause ASAP!!"
}
$result
}
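Combining both answers, here is a sketch that emits one object per machine and converts the whole collection to HTML at the end. The property names Name, Status, and Date are chosen to match the OP's `Select-Object Name, Status, Date`; $Header and $file are the variables defined earlier in the OP's script:

```powershell
$ComputersAry = Get-Content -Path "C:\Script\ping.txt"
$date = "{0:yyyy_MM_dd-HH_mm}" -f (Get-Date)

# Collect one object per machine; the foreach expression's output lands in $output
$output = foreach ($MachineName in $ComputersAry) {
    $PingStatus = Gwmi Win32_PingStatus -Filter "Address='$MachineName'" |
        Select-Object StatusCode
    [pscustomobject]@{
        Name   = $MachineName
        Status = if ($PingStatus.StatusCode -eq 0) { 'Ping Success!!' }
                 else { 'Ping FAIL, please investigate cause ASAP!!' }
        Date   = $date
    }
}

$output | ConvertTo-Html -Head $Header -PreContent "Here are the ping results for $date" |
    Out-File $file
```

Because ConvertTo-Html receives real objects rather than Write-Host text, each machine becomes a table row with Name, Status, and Date columns.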

Invoke-sqlcmd unexpected behaviour

I am using invoke-sqlcmd to bcp out a file.
All appears to be fine and dandy, except when I try to delete the file before creating it.
It then takes a random amount of time before the file is available.
Any ideas why?
Faulty Code (do replace $dir and $sql_srv with your own values, of course)
$dir = "\\srv-ocmr\d$\temp\"
$sql_srv = "srv-ocmr\rec1ocm"
if (test-path $($dir+"*.dat")) {remove-item $($dir+"*.dat") } # culprit line
$str = $("bcp sysobjects out " + $dir + "so.dat -S" + $sql_srv + " -T -N " )
invoke-sqlcmd -ServerInstance $sql_srv -Query "exec xp_cmdshell '$str' " -Database master -Verbose
while (-not (test-path $($dir+"*.dat"))) {
sleep -Seconds 1
test-path $($dir+"*.dat")
}
Remove the culprit line, and voilà, everything works like a charm:
$dir = "\\srv-ocmr\d$\temp\"
$sql_srv = "srv-ocmr\rec1ocm"
$str = $("bcp sysobjects out " + $dir + "so.dat -S" + $sql_srv + " -T -N " )
invoke-sqlcmd -ServerInstance $sql_srv -Query "exec xp_cmdshell '$str' " -Database master -Verbose
while (-not (test-path $($dir+"*.dat"))) {
sleep -Seconds 1
test-path $($dir+"*.dat")
}
Replacing remove-item by cmd /c del doesn't change anything, the same with using a local dir instead of a UNC.
Have you tried:
Get-ChildItem -Path:$dir -Filter:'*.dat' | Remove-item -Force
I'm suspecting oddness happening with Test-Path and wildcards. The command I posted uses the pipeline: if Get-ChildItem finds no files, Remove-Item silently does nothing; if files are found, Remove-Item will try to remove them with extreme prejudice.