Can I access 'dxdiag' from the PowerShell console - powershell-remoting

How can I access "dxdiag" with PowerShell? I want to run a script to gather some information from a few remote computers.

If you use the /x parameter, you can have dxdiag output to an XML file, which is then easily parsed from PowerShell. Basically just something like this:
# Drop output in temp dir
$logFile = $env:TEMP + "\dxDiagLog.xml"
# Piping to Out-Null forces it to wait for dxdiag to complete before continuing. Otherwise
# it tries to load the file before it actually gets written
dxdiag.exe /whql:off /dontskip /x $logFile | Out-Null
[xml]$dxDiagLog = Get-Content $logFile
$dxDiagLog.DxDiag.DirectSound.SoundDevices.SoundDevice | ft Description, DriverProvider
Which dumps this for output on my machine:
Description                              DriverProvider
-----------                              --------------
Speakers (Realtek High Definition Audio) Realtek Semiconductor Corp.
Polycom CX700 (Polycom CX700)            Microsoft
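
Since the question mentions gathering this from a few remote computers, the same approach can be wrapped in Invoke-Command so the XML is generated and parsed on each target. A minimal sketch, assuming PowerShell remoting (WinRM) is enabled on the targets; the computer names are placeholders:
# Hypothetical computer names; assumes WinRM/PowerShell remoting is enabled on the targets
$computers = 'PC01', 'PC02'
Invoke-Command -ComputerName $computers -ScriptBlock {
    # Generate the XML report in the remote machine's temp dir and wait for dxdiag to finish
    $logFile = Join-Path $env:TEMP 'dxDiagLog.xml'
    dxdiag.exe /whql:off /dontskip /x $logFile | Out-Null
    [xml]$dxDiagLog = Get-Content $logFile
    # Return one object per sound device, tagged with the source computer
    $dxDiagLog.DxDiag.DirectSound.SoundDevices.SoundDevice |
        Select-Object @{n='Computer';e={$env:COMPUTERNAME}}, Description, DriverProvider
}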

In my case the command would return immediately, and only later would it create the file.
(& dxdiag.exe /whql:off /dontskip /t `"$path`") | Out-Null
The problem was the ampersand & (the call operator), which made the command exit before completion.
So either use:
dxdiag.exe /whql:off /dontskip /x $logFile | Out-Null
Or:
Start-Process -FilePath "C:\Windows\System32\dxdiag.exe" -ArgumentList "/dontskip /whql:off /t C:\Temp\dxdiag.txt" -Wait
From: https://community.spiceworks.com/topic/2116806-how-can-i-run-dxdiag-on-a-remote-pc

Related

Replace a string during the Mysqldump process on Windows

In a PowerShell script, I have a mysqldump command which writes its dump to stdout.
The goal is to replace all occurrences of a string in that output before pushing it into a file, because there is not enough disk space on the machine to hold two separate files (the dump is around 30 GB).
I have tried this (with the Invoke-Expression wrapper and the mysql args removed):
mysqldump [...args] | ForEach-Object -Process {$_ -replace 'sourceText','targetText' | Add-Content $dumpDataFile}
Or this:
mysqldump [...args] | Foreach-Object {$_ -replace 'sourceText','targetText'} | Set-Content $dumpDataFile
but it eats up all the memory on the machine.
I have also tried replacing the content in the result file afterwards, but that always ends up copying to another file.
I also thought about reading line by line and writing the replaced lines to a new file, removing lines from the original every X lines, but the methods I have found to cut lines from a file also end up eating all the memory.
On Linux I would have used sed; I know it exists for Windows, but I do not want to add a dependency to the script.
Here is the command that is run:
$expr = "& 'C:\Program Files\MySQL\MySQL Server 5.7\bin\mysqldump.exe' --defaults-extra-file=env.cnf --log-error=err.log --no-create-info foo | ForEach-Object -Process {$_ -replace 'foo','bar' | Add-Content dump.sql}"
Invoke-Expression $expr
UPDATE
I have found that even piping to Out-Null eats up all the memory:
& 'C:\Program Files\MySQL\MySQL Server 5.7\bin\mysqldump.exe' --defaults-extra-file=env.cnf --log-error=err.log --no-create-info foo | out-null
Also, the script runs on an Amazon virtual machine which has PowerShell 4.
UPDATE 2
This also eats up all the memory, but it does not when run from cmd:
& 'C:\Program Files\MySQL\MySQL Server 5.7\bin\mysqldump.exe' --defaults-extra-file=env.cnf --log-error=err.log --no-create-info foo > dump.sql
Do you know how to call the full replace command with cmd? I cannot manage to escape the mysqldump executable path.
UPDATE 3
I realized that my dump contains huge tables, which results in some of the INSERT lines being extremely long (hence the memory usage, maybe). I tried without extended inserts, but then the import takes too long.
If disk space is at a premium, how about compressing the data? If NTFS compression isn't good enough, let's write the output into a GZipStream. It should offer good savings for text data, so the file on disk would be considerably smaller.
First off, a compression function (idea from a blog post):
function Compress-Stream {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true, ValueFromPipeline=$true)]
        [AllowEmptyString()]
        [string]$Row
    )
    begin {
        $ms = New-Object System.IO.MemoryStream
        $cs = New-Object System.IO.Compression.GZipStream($ms, [System.IO.Compression.CompressionMode]::Compress)
        $sw = New-Object System.IO.StreamWriter($cs)
    }
    process {
        if(-not [string]::IsNullOrWhiteSpace($Row)) {
            $sw.Write($Row + [environment]::NewLine)
        }
    }
    end {
        # Close the StreamWriter first so buffered text is flushed through the GZipStream
        try {$sw.Close(); $sw.Dispose()} catch{}
        try {$cs.Close(); $cs.Dispose()} catch{}
        $s = [System.Convert]::ToBase64String($ms.ToArray());
        try {$ms.Close(); $ms.Dispose()} catch {}
        $s
    }
}
Sample usage is to query the DBA Overflow data dump. It's much more manageable than SO. On my system the result set is 13 MB uncompressed, 3.5 MB compressed.
# SQL Server, so sqlcmd for illustration.
# Pipe results to compression and pipe compressed data into a file
sqlcmd -E -S .\sqli001 -d dbaoverflow -Q "select id, postid from votes order by id;" `
| compress-stream | Set-Content -Encoding ascii -Path c:\temp\data.b64
This should provide a compressed text file. To process it, use MemoryStream and GZipStream again:
$d = get-content c:\temp\data.b64
$data = [System.Convert]::FromBase64String($d)
$ms = New-Object System.IO.MemoryStream
$ms.Write($data, 0, $data.Length)
$ms.Seek(0,0) | Out-Null
$sr = New-Object System.IO.StreamReader(New-Object System.IO.Compression.GZipStream($ms, [System.IO.Compression.CompressionMode]::Decompress))
# $sr can now read decompressed data. For example,
$sr.ReadLine()
id postid
$sr.ReadLine()
----------- -----------
$sr.ReadLine()
1 2
Doing the replacements and writing the final result into another file should be easy enough.
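For that step, a minimal sketch that reads decompressed lines from $sr (set up as above), replaces, and writes the result out; the output path and the search strings are placeholders:
$out = New-Object System.IO.StreamWriter('c:\temp\result.sql')
while (($line = $sr.ReadLine()) -ne $null) {
    $out.WriteLine($line -replace 'sourceText','targetText')
}
$out.Dispose()
$sr.Dispose()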
In the end I used Python to replace the string in the dump file while sending it to mysql.
It is fast enough and light on memory.
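For reference, the same streaming replace can be done from PowerShell itself by reading mysqldump's stdout through .NET streams instead of the PowerShell pipeline, which avoids buffering the whole dump in memory. A sketch, assuming the paths and arguments from the question; only one line is held at a time, so the extremely long extended-insert lines remain the limiting factor:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = 'C:\Program Files\MySQL\MySQL Server 5.7\bin\mysqldump.exe'
$psi.Arguments = '--defaults-extra-file=env.cnf --log-error=err.log --no-create-info foo'
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true
$proc = [System.Diagnostics.Process]::Start($psi)
# Use an absolute path: .NET resolves relative paths against the process working directory
$writer = New-Object System.IO.StreamWriter('C:\Temp\dump.sql')
try {
    while (($line = $proc.StandardOutput.ReadLine()) -ne $null) {
        $writer.WriteLine($line.Replace('foo','bar'))
    }
} finally {
    $writer.Dispose()
    $proc.WaitForExit()
}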

Can't concatenate value of 2 variables in batch script

I have created a batch file that reads a .json file containing npm packages and their versions.
Everything works except concatenating the values of 2 variables before writing them to the file.
This is the InstallPackage.json file:
{
    "dependencies": {
        "@types/react": "16.3.10",
        "react": "16.3.2",
        "react-dom": "16.3.2",
        "react-scripts": "1.1.4"
    },
    "devdependencies": {}
}
This is the batch file:
<# : batch portion (contained within a PowerShell multi-line comment)
@echo off & setlocal
setlocal EnableDelayedExpansion
rem # define LF as a linefeed; the two blank lines after "set LF=^" are required
set LF=^


set JSON=
for /f "delims=" %%x in (InstallPackage.json) do (
    set "JSON=!JSON!%%x!LF!"
)
rem # re-eval self with PowerShell and capture results
for /f "delims=" %%I in ('powershell "iex (${%~f0} | out-string)"') do set "%%~I"
rem # output captured results
set JSON[
rem # end main runtime
goto :EOF
: end batch / begin PowerShell hybrid code #>
add-type -AssemblyName System.Web.Extensions
$JSON = new-object Web.Script.Serialization.JavaScriptSerializer
$obj = $JSON.DeserializeObject($env:JSON)
# output object in key=value format to be captured by Batch "for /f" loop
foreach ($key in $obj.keys) {
    foreach($package in $obj[$key].keys){
        ### error happens here ###
        echo $package#$obj[$key][$package] >> packages.txt
    }
}
instead of getting
@types/react#16.3.10
I get:
@types/react#System.Collections.Generic.Dictionary`2[System.String,System.Object][dependencies][@types/react]
It's a simple concatenation; I don't know why it's not working.
Your PowerShell output line isn't quite PowerShell-ese. It's sort of PowerCmd or something. Replace
echo $package#$obj[$key][$package] >> packages.txt
with
"JSON[{0:d}]={1}#{2}" -f $i++, $package, $obj[$key][$package] >> packages.txt
... if you want to populate the JSON[] array when PowerShell exits back to the Batch environment, or
"{0}#{1}" -f $package, $obj[$key][$package] >> packages.txt
... if you only want the concatenated values sent to packages.txt. And if that's the case, get rid of the for /F loop within Batch. You can also use gc within PowerShell to read the JSON, which is much more graceful than trying to populate a Batch variable with a multi-line value.
<# : batch portion (contained within a PowerShell multi-line comment)
@echo off & setlocal
set "JSON=InstallPackage.json"
rem # re-eval self with PowerShell and capture results
powershell -noprofile "iex (${%~f0} | out-string)"
rem # end main runtime
goto :EOF
: end batch / begin PowerShell hybrid code #>
add-type -AssemblyName System.Web.Extensions
$JSON = new-object Web.Script.Serialization.JavaScriptSerializer
$obj = $JSON.DeserializeObject((gc $env:JSON))
# output object in key#value format to packages.txt
foreach ($key in $obj.keys) {
    foreach($package in $obj[$key].keys){
        "{0}#{1}" -f $package, $obj[$key][$package] >> packages.txt
    }
}
And I've got one more recommendation. Instead of writing packages.txt using the append redirect, your intentions would be clearer if you used the Out-File cmdlet. Do you expect packages.txt to be created from scratch on every run? Or will packages.txt continue to grow with multiple runs? If you revisit this code a year from now, will you remember your intentions?
This would be clearer:
# output object in key#value format to packages.txt
&{
    foreach ($key in $obj.keys) {
        foreach($package in $obj[$key].keys){
            "{0}#{1}" -f $package, $obj[$key][$package]
        }
    }
} | out-file packages.txt
If your intention is to grow packages.txt with successive runs, then use
out-file packages.txt -append
See? This way, packages.txt will explicitly be either overwritten or appended on each run, and will not require any checks whether the file already exists or expectations that the file will not already exist. It's less ambiguous this way.
So much code for something so simple. At least for Xidel:
xidel -s InstallPackage.json -e "let $a:=$json/dependencies return $a() ! join((.,$a(.)),'#')" >> packages.txt
'packages.txt':
@types/react#16.3.10
react#16.3.2
react-dom#16.3.2
react-scripts#1.1.4
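For completeness: on PowerShell 3.0 or later, the JavaScriptSerializer plumbing can be skipped entirely with ConvertFrom-Json. A minimal sketch of the PowerShell half, keeping the same # separator:
# Assumes PowerShell 3.0+ for ConvertFrom-Json and Get-Content -Raw
$obj = Get-Content InstallPackage.json -Raw | ConvertFrom-Json
$obj.dependencies.PSObject.Properties |
    ForEach-Object { '{0}#{1}' -f $_.Name, $_.Value } |
    Out-File packages.txt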

PowerShell Copy Directories - Using A CSV Listing - Source - Destination Path

I am trying to copy directories - folders and subfolders - to another location using a CSV file that lists the source and destination of each directory or folder to be copied.
The CSV has a Source column and a Destination column, with one row per directory to copy.
I have referenced this thread:
https://serverfault.com/questions/399325/copying-list-of-files-through-powershell
Import-CSV C:\Users\WP\Desktop\a.csv | foreach{Copy-item "$_.Source" "$_.Destination"}
Error Received
CategoryInfo : ObjectNotFound: (@{Source=C:String) [Copy-Item], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.CopyItemCommand
The other question I have: if the CSV specifies a destination folder that does not exist, can I use the CSV to tell PowerShell to create that folder?
Thank you for your advice.
By default, PowerShell will not expand a property access on an object placed inside double quotes. Only the '$_' is expanded and '.Source' is tacked on to the end of the string, so from the view of the shell, your command looks something like Copy-item "@{Source=C:\Users\WP\Desktop\a;Destination=C:\Users\WP\Desktop\a}.Source" "@{Source=C:\Users\WP\Desktop\a;Destination=C:\Users\WP\Desktop\a}.Destination", which is probably not what you mean.
Here is the syntax that should work (I also included -Recurse so that it will copy the items inside the directory as well):
Import-CSV C:\Users\WP\Desktop\a.csv | foreach{Copy-item -Path $_.Source -Destination $_.Destination -Recurse}
Note: if you want to access the properties of an object inside double quotes, use the subexpression syntax "$($_.Source)".
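A quick illustration of the difference, using a made-up object in place of an imported CSV row:
$row = [pscustomobject]@{ Source = 'C:\src'; Destination = 'C:\dst' }
"$row.Source"      # -> @{Source=C:\src; Destination=C:\dst}.Source
"$($row.Source)"   # -> C:\src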
For a csv like this:
Source,Destination
D:\junk\test1,D:\junk\test3
D:\junk\test2,D:\junk\test4
You can use code like the following:
$csv = Import-Csv D:\junk\test.csv
$csv | ForEach-Object {
    if (-not (Test-Path $_.Destination)) {
        # -Path (not -Name) accepts a full path like D:\junk\test3
        New-Item -Path $_.Destination -ItemType Directory -Force -WhatIf
    }
    Copy-Item $_.Source $_.Destination -Recurse -Force -WhatIf
}
Suggestions for learning more about PowerShell:
Use WhatIf to test things.
Research what each line of this code does.
Experiment with code to see what it does.
Learn and use the debugger (PowerShell ISE) to help you write better code.
Remove the WhatIf parameters from the code to have it execute for real...
If you have dozens of problems that all involve doing the same thing with each element of a list, you might want to consider getting or writing a generic CSV template expander tool, like the Expand-csv function below. With this tool you start with a CSV file and a template, and generate a script that contains all the commands.
Sample.csv looks like this:
Source,Destination
C:\Users\WP\Desktop\a,C:\Users\WP\Desktop\c
C:\Users\WP\Desktop\b,C:\Users\WP\Desktop\d
Sample.tmplt looks like this:
Copy-Item -Path $Source -Destination $Destination -Recurse
And the command to invoke Expand-csv looks like this:
Expand-csv Sample.csv Sample.tmplt > Sample.ps1
The output file, Sample.ps1, contains one copy command for each entry in the CSV file.
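For the sample inputs above, Sample.ps1 would look like this:
Copy-Item -Path C:\Users\WP\Desktop\a -Destination C:\Users\WP\Desktop\c -Recurse
Copy-Item -Path C:\Users\WP\Desktop\b -Destination C:\Users\WP\Desktop\d -Recurse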
And here is the definition of Expand-csv:
<# This function is a table driven template tool.
It's a refinement of an earlier attempt.
It generates output from a template and
a driver table. The template file contains plain
text and embedded variables. The driver table
(in a csv file) has one column for each variable,
and one row for each expansion to be generated.
5/13/2015
#>
function Expand-csv {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
        [string] $driver,
        [Parameter(Mandatory=$true)]
        [string] $template
    )
    Process
    {
        $OFS = "`r`n"
        $list = Import-Csv $driver
        [string]$pattern = Get-Content $template
        foreach ($item in $list) {
            foreach ($key in $item.psobject.properties) {
                Set-Variable -name $key.name -value $key.value
            }
            $ExecutionContext.InvokeCommand.ExpandString($pattern)
        }
    }
}

PowerShell script to go through a directory of csvs and convert them to html

This is a PowerShell question, not a SharePoint question.
I'm using a script to grab an inventory of SharePoint features, web parts, etc. It outputs each type of report as a CSV file in the same directory, so I end up with a directory on my computer full of CSV files.
I'd like to run another PowerShell script after the first one that converts these CSVs into HTML files for easily readable reports.
I'm getting stuck on the part where I would Import-Csv each file and create a similarly named HTML file for each one.
Here's what I have so far. Can anyone help me complete this to do what I want it to do? To use Import-Csv, I have to specify the file name, as you can see in $dir. Is there another way?
$dir = "C:\Users\me\Desktop\output\TestInvSiteCollections.csv"
dir -LiteralPath $dir | % {Import-Csv $dir}
Or use this somehow:
Import-Csv -LiteralPath $dir | ConvertTo-Html | Out-File "C:\Users\me\Desktop\output\myhtmlfile.html"
I would do it like this:
Get-ChildItem -Path 'C:\Users\me\Desktop\output\*.csv' | ForEach-Object {
    Import-Csv $_ | ConvertTo-Html | Out-File -FilePath (Join-Path -Path $_.DirectoryName -ChildPath ($_.BaseName + '.html'));
}
I'm not entirely sure I find html tables easier to read than csv files. Excel's filtering and sorting is too useful.
Here's your code. It should split the name of the file and add the extension.
Import-Csv -LiteralPath $dir | ConvertTo-Html | Out-File ($dir.Split(".")[0]+".html")
This is a slightly more verbose method, but I generally prefer readable code over conciseness for maintainability:
# get the list of csv files
$csvFiles = Get-ChildItem $path -Filter *.csv
foreach ($file in $csvFiles)
{
    # create FileInfo object
    [System.IO.FileInfo]$fileInfo = "$path\$file"
    # get base name of file
    $baseName = [System.IO.Path]::GetFileNameWithoutExtension($file.Name)
    # do HTML conversion
    Import-Csv $fileInfo.FullName | ConvertTo-Html | Out-File "$htmlPath\$baseName.html"
}
This is working code, assuming you have $path and $htmlPath defined somewhere, and it can obviously be modified to suit your needs.

PowerShell code for MySQL is not working

I have the following PowerShell code, where I am trying to find the latest backup file of a MySQL database and then import that file.
The script produces the desired output right up to the last step: when I copy that output and execute it in a separate cmd window it runs smoothly, but when PowerShell tries to do the same thing it fails with the following error. Please help me.
Error message
C:\wamp\bin\mysql\mysql5.5.24\bin\mysql.exe --user=root --password=xxx testdest < "C:\mysqltemp\testsrc_2013-12-23_10-46-AM.sql"
cmd.exe : The system cannot find the file specified.
At C:\Users\IBM_ADMIN\AppData\Local\Temp\8a7b4576-97b2-42aa-a0eb-42bb934833a6.ps1:19 char:4
+ cmd <<<< /c " "$pathtomysqldump" --user=$param1 --password=$param2 $param3 < $param5 "
+ CategoryInfo : NotSpecified: (The system cann...file specified.:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
The script is as follows:
##Select latest file created by Export of mysql dumper
$a=(get-childitem C:\mysqltemp | sort LastWriteTime -Descending | Select-Object Name | select -first 1 -ExpandProperty Name)
$pathtomysqldump = "C:\wamp\bin\mysql\mysql5.5.24\bin\mysql.exe"
#Write-Host "Print variable A -------------"
#$a
$a=$a.Replace(" ", "")
#Write-Host "After Triming ---------------"
#$a
$param1="root"
$param2="xxx"
$param3="testdest"
#$param4="""'<'"""
$param5="""C:\mysqltemp\$a"""
#$p1="$param1 $param2 $param3 < $param5"
# Invoke backup Command. /c forces the system to wait to do the backup
Write-Host " "$pathtomysqldump" --user=$param1 --password=$param2 $param3 < $param5 "
cmd /c " "$pathtomysqldump" --user=$param1 --password=$param2 $param3 < $param5 "
Thanks, and I appreciate your help and time.
This is a common misunderstanding involving calling command lines in the Windows operating system, particularly from PowerShell.
I highly recommend using the Start-Process cmdlet to launch a process instead of calling cmd.exe. It's much easier to mentally parse out and understand the path to the executable, and all of the command line parameters separately. The problem with your current script is that you're trying to call an executable file with the following name: C:\wamp\bin\mysql\mysql5.5.24\bin\mysql.exe --user=root --password=xxx testdest < "C:\mysqltemp\testsrc_2013-12-23_10-46-AM.sql", which has been wrapped in a call to cmd.exe. Obviously, that file does not exist, because you're including all of the parameters as part of the filesystem path.
There are too many layers going on here to make it simple to understand. Instead, use Start-Process similar to the following example:
# 1. Define path to mysql.exe
$MySQL = "C:\wamp\bin\mysql\mysql5.5.24\bin\mysql.exe"
# 2. Define some parameters
$Param1 = 'value1';
$Param2 = 'value2';
$Param3 = 'value 3 with spaces';
# 3. Build the command line arguments
# NOTE: Since the value of $Param3 has spaces in it, we must
# surround the value with double quotes in the command line
$ArgumentList = '--user={0} --password={1} "{2}"' -f $Param1, $Param2, $Param3;
Write-Host -Object "Arguments are: $ArgumentList";
# 4. Call Start-Process
# NOTE: The -Wait parameter tells it to "wait"
# The -NoNewWindow parameter prevents a new window from popping up
# for the process.
Start-Process -FilePath $MySQL -ArgumentList $ArgumentList -Wait -NoNewWindow;
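Applied to the original command, the stdin redirection that cmd provided with < maps to the -RedirectStandardInput parameter of Start-Process. A sketch, assuming the paths from the question:
# Equivalent of: mysql.exe --user=root --password=xxx testdest < "C:\mysqltemp\<latest>.sql"
$MySQL = "C:\wamp\bin\mysql\mysql5.5.24\bin\mysql.exe"
$dump  = "C:\mysqltemp\$a"   # $a = latest dump file name, as computed earlier in the script
Start-Process -FilePath $MySQL `
    -ArgumentList '--user=root', '--password=xxx', 'testdest' `
    -RedirectStandardInput $dump `
    -Wait -NoNewWindow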