I would like to find all blanks in a CSV file. If a blank character is found on a line, that line should appear on the screen and I should be asked whether I want to keep the entire line containing the white space or remove it.
Let's say the directory is C:\Cr\Powershell\test, and in it there is one CSV file, abc.csv.
I tried doing it like this, but in PowerShell ISE $_.PSObject.Properties isn't recognized:
$csv = Import-Csv C:\Cr\Powershell\test\*.csv | Foreach-Object {
    $_.PSObject.Properties | Foreach-Object { $_.Value = $_.Value.Trim() }
}
I apologize for not including more of the code I tried so far, but those were silly attempts since I have only just begun.
This looks helpful but I don't know exactly how to adapt it for my problem.
OK man, here you go:
$yes = New-Object System.Management.Automation.Host.ChoiceDescription "&Yes", "Retain line."
$no = New-Object System.Management.Automation.Host.ChoiceDescription "&No", "Delete line."
$n = @()
$f = Get-Content .\test.csv
foreach ($item in $f) {
    if ($item -like "* *") {
        $res = $host.UI.PromptForChoice("Title", "Want to keep this line?`n$item", [System.Management.Automation.Host.ChoiceDescription[]]($yes, $no), 0)
        switch ($res) {
            0 { $n += $item }
            1 { }
        }
    } else {
        $n += $item
    }
}
$n | Set-Content .\test.csv
If you have questions, please post in the comments and I will explain.
Get-Content is probably a better approach than Import-Csv, because that'll allow you to check an entire line for spaces instead of having to check each individual field. For fully automated processing you'd just use a Where-Object filter to remove non-matching lines from the output:
Get-Content 'C:\Cr\Powershell\test\input.csv' |
    Where-Object { $_ -notlike '* *' } |
    Set-Content 'C:\Cr\Powershell\test\output.csv'
However, since you want to prompt for each individual line that contains spaces, you need a ForEach-Object (or a similar construct) and a nested conditional, like this:
Get-Content 'C:\Cr\Powershell\test\input.csv' | ForEach-Object {
    if ($_ -notlike '* *') { $_ }
} | Set-Content 'C:\Cr\Powershell\test\output.csv'
The simplest way to prompt a user for input is Read-Host:
$answer = Read-Host -Prompt 'Message'
if ($answer -eq 'y') {
    # do one thing
} else {
    # do another
}
In your particular case you'd probably do something like this for any matching line:
$answer = Read-Host "$_`nKeep the line? [y/n] "
if ($answer -ne 'n') { $_ }
The above checks if the answer is not n to make removal of the line a conscious decision.
Other ways to prompt for user input are choice.exe (which has the additional advantage of allowing a timeout and a default answer):
choice.exe /c YN /d N /t 10 /m "$_`nKeep the line"
if ($LastExitCode -ne 2) { $_ }  # choice reports the 1-based index of the pressed key as its exit code, so 2 means N
or the host UI:
$title = $_
$message = 'Keep the line?'
$yes = New-Object Management.Automation.Host.ChoiceDescription '&Yes'
$no = New-Object Management.Automation.Host.ChoiceDescription '&No'
$options = [Management.Automation.Host.ChoiceDescription[]]($yes, $no)
$answer = $Host.UI.PromptForChoice($title, $message, $options, 1)
if ($answer -ne 1) { $_ }
I'm leaving it as an exercise for you to integrate whichever prompting routine you choose with the rest of the code, but as a starting point, a minimal sketch combining the Read-Host variant with the filtering loop might look like this (paths assumed from the question; the output goes to a new file, since the pipeline can't write back to the file it's still reading):
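Get-Content 'C:\Cr\Powershell\test\abc.csv' | ForEach-Object {
    if ($_ -like '* *') {
        $answer = Read-Host "$_`nKeep the line? [y/n] "
        if ($answer -ne 'n') { $_ }   # keep unless the user explicitly answers n
    } else {
        $_                            # no spaces, pass the line through
    }
} | Set-Content 'C:\Cr\Powershell\test\abc_clean.csv'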
I have the following simple function, which is used several times in a script that iterates through a directory and checks the age of the files in it.
function log($Message) {
    $logFilePath = 'C:\logPath\myLog.txt'
    $date = Get-Date -Format 'yyyyMMddHHmmss'
    $logMessage = "{0}_{1}" -f $date, $Message
    if (Test-Path -Path $logFilePath) {
        $logFileContent = Get-Content -Path $logFilePath
    } else {
        $logFileContent = ''
    }
    $logMessage, $logFileContent | Out-File -FilePath $logFilePath
}
I figured out that this eats up all the RAM, and I don't understand why. I thought the variables scoped to a function are destroyed once the function has run. I fixed the RAM issue by adding Remove-Variable logMessage,logFileContent,logFilePath,date to the very end of the function, but I would like to know how this RAM issue could be solved otherwise, and why the variables within the function are not destroyed automatically.
PowerShell runs on .NET, which uses a garbage collector, so freeing memory isn't instant: Garbage Collection in Powershell to Speed Scripts. Memory management is probably also better in PowerShell 7. I tried repeating your function many times, but the memory usage didn't go above a few hundred megs.
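If you suspect the memory is simply waiting to be collected, you can force a collection while testing (a diagnostic only, not something to leave in the script):
# Force a full garbage collection and wait for finalizers (diagnostic use)
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()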
There's probably a more efficient .NET way to prepend a line to a file: Prepend "!" to the beginning of the first line of a file
I have a weird way to prepend a line, using PowerShell's drive-qualified variable syntax ($c:file reads and writes the content of the item 'file' on the C: drive through variable notation). I'm not sure how well this would work with a large file; with a 700 MB file, the Working Set memory stayed at 76 MB.
$c:file
one
two
three
$c:file = 'pre',$c:file
$c:file
pre
one
two
three
ps powershell
Handles NPM(K) PM(K) WS(K) CPU(s) Id SI ProcessName
------- ------ ----- ----- ------ -- -- -----------
607 27 67448 76824 1.14 3652 3 powershell
Although, as commented, I can hardly believe that this function gobbles up your memory, you could optimize it:
function log ([string]$Message) {
$logFilePath = 'C:\logPath\myLog.txt'
# prefix the message with the current date
$Message = "{0:yyyyMMddHHmmss}_{1}" -f (Get-Date), $Message
if (Test-Path -Path $logFilePath -PathType Leaf) {
# newest log entry on top: append current content
$Message = "{0}`r`n{1}" -f $Message, (Get-Content -Path $logFilePath -Raw)
}
Set-Content -Path $logFilePath -Value $Message
}
I just want to rule out that the RAM usage is caused by prepending to the file. Have you tried not storing the log contents in a variable? i.e.
$logMessage,(Get-Content -Path $logFilePath) | Out-File -FilePath $logFilePath
Edit 5/8/20 - Turns out that prepending (when done efficiently) isn't as slow as I thought - it is on the same order as using -Append. However, the code (that js2010 pointed to) is long and ugly; but if you really need to prepend to the file, this is the way to do it.
I modified the OP's code a bit to automatically insert a new line.
function log-prepend {
    param(
        $content,
        $filePath = 'C:\temp\myLogP.txt'
    )
    $file = Get-Item $filePath
    if (!$file.Exists) {
        Write-Error "$file does not exist"
        return
    }
    $filePath = $file.FullName
    $tmpToken = (Get-Location).Path + "\_tmpfile" + $file.Name
    Write-Verbose "$tmpToken created as buffer"
    $tfs = [System.IO.File]::Create($tmpToken)
    $fs = [System.IO.File]::Open($file.FullName, [System.IO.FileMode]::Open, [System.IO.FileAccess]::ReadWrite)
    try {
        $date = Get-Date -Format 'yyyyMMddHHmmss'
        $logMessage = "{0}_{1}`r`n" -f $date, $content
        # write the new entry to the temp file first, then append the old content
        $msg = $logMessage.ToCharArray()
        $tfs.Write($msg, 0, $msg.Length)
        $fs.Position = 0
        $fs.CopyTo($tfs)
    }
    catch {
        Write-Verbose $_.Exception.Message
    }
    finally {
        $tfs.Close()
        $fs.Close()
        if ($error.Count -eq 0) {
            # replace the original file with the temp file
            Write-Verbose "updating $filePath"
            [System.IO.File]::Delete($filePath)
            [System.IO.File]::Move($tmpToken, $filePath)
        }
        else {
            $error.Clear()
            [System.IO.File]::Delete($tmpToken)
        }
    }
}
Here was my original answer that shows how to test the timing using a stopwatch.
When you prepend to a log file, you're reading the entire log file into memory, then writing it back.
You really should be using append - that would make the script run a lot faster.
function log($Message) {
    $logFilePath = 'C:\logPath\myLog.txt'
    $date = Get-Date -Format 'yyyyMMddHHmmss'
    $logMessage = "{0}_{1}" -f $date, $Message
    $logMessage | Out-File -FilePath $logFilePath -Append
}
Edit: To convince you that prepending to a log file is a bad idea, here's a test you can do on your own system:
function logAppend($Message) {
    $logFilePath = 'C:\temp\myLogA.txt'
    $date = Get-Date -Format 'yyyyMMddHHmmss'
    $logMessage = "{0}_{1}" -f $date, $Message
    $logMessage | Out-File -FilePath $logFilePath -Append
}
function logPrepend($Message) {
    $logFilePath = 'C:\temp\myLogP.txt'
    $date = Get-Date -Format 'yyyyMMddHHmmss'
    $logMessage = "{0}_{1}" -f $date, $Message
    if (Test-Path -Path $logFilePath) {
        $logFileContent = Get-Content -Path $logFilePath
    } else {
        $logFileContent = ''
    }
    $logMessage, $logFileContent | Out-File -FilePath $logFilePath
}
$processes = Get-Process
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach ($p in $processes)
{
logAppend($p.ProcessName)
}
$stopwatch.Stop()
$stopwatch.Elapsed
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach ($p in $processes)
{
logPrepend($p.ProcessName)
}
$stopwatch.Stop()
$stopwatch.Elapsed
I've run this several times until I got a few thousand lines in the log file.
Going from 1603 to 1925 lines, my results were:
Append: 7.0167008 s
Prepend: 21.7046793 s
The gap keeps widening as the file grows, because each prepend rereads and rewrites the entire file, while an append only writes the new entry.
Yesterday I asked this question. Now I would like to do almost the same exercise with a small change: if there is a blank character on a line (going line by line through the CSV), ask the user whether the blank character should be removed or not; the difference is that ONLY the blank should be removed, not the whole line.
The code which works for my previous question is:
$yes = New-Object System.Management.Automation.Host.ChoiceDescription "&Yes", "Retain line."
$no = New-Object System.Management.Automation.Host.ChoiceDescription "&No", "Delete line."
$n = @()
$f = Get-Content .\test.csv
foreach ($item in $f) {
    if ($item -like "* *") {
        $res = $host.UI.PromptForChoice("Title", "Want to keep this line?`n$item", [System.Management.Automation.Host.ChoiceDescription[]]($yes, $no), 0)
        switch ($res) {
            0 { $n += $item }
            1 { }
        }
    } else {
        $n += $item
    }
}
$n | Set-Content .\test.csv
What I think I should use is the Trim() function to achieve this.
So I think I should modify the if clause like below (apologies for the silly syntax mistakes I might make):
$yes = New-Object System.Management.Automation.Host.ChoiceDescription "&Yes", "Retain blank."
$no = New-Object System.Management.Automation.Host.ChoiceDescription "&No", "Delete blank."
$n = @()
$f = Get-Content .\test.csv
foreach ($item in $f) {
    if ($item -like "* *") {
        $res = $host.UI.PromptForChoice("Title", "Want to keep the blank on this line?`n$item", [System.Management.Automation.Host.ChoiceDescription[]]($yes, $no), 0)
        switch ($res) {
            0 { $n += $item.Trim() }
            1 { }
        }
    } else {
        $n += $item.Trim()
    }
}
$n | Set-Content .\test.csv
This runs, but it still deletes the line regardless of whether I trimmed it first. I need to fix it so the line is kept or trimmed, but not discarded.
EDIT:
Adjusting the switch ($res) like this doesn't work:
switch ($res) {
    0 { $n += $item.Trim() }
    1 { $n += $item }
}
} else {
    $n += $item.Trim()
}
Trim() without parameters removes all whitespace (not just spaces) from the beginning and end of a string. You can't use it to remove spaces anywhere else in a string. Instead, use the -replace operator:
$_ -replace ' '
Note that this time you need to output the unmodified string not only if it doesn't contain a space, but also if the user chooses to keep the existing space(s).
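Put into your loop, the adjustment might look like this (a sketch based on your posted code: choice 0 keeps the line untouched, choice 1 keeps it with the spaces removed, so no line is ever discarded):
$yes = New-Object System.Management.Automation.Host.ChoiceDescription "&Yes", "Retain blank."
$no = New-Object System.Management.Automation.Host.ChoiceDescription "&No", "Delete blank."
$n = @()
$f = Get-Content .\test.csv
foreach ($item in $f) {
    if ($item -like "* *") {
        $res = $host.UI.PromptForChoice("Title", "Want to keep the blank(s) on this line?`n$item", [System.Management.Automation.Host.ChoiceDescription[]]($yes, $no), 0)
        switch ($res) {
            0 { $n += $item }              # keep the line as-is, spaces included
            1 { $n += $item -replace ' ' } # keep the line, but strip the spaces
        }
    } else {
        $n += $item                        # no spaces, pass through unchanged
    }
}
$n | Set-Content .\test.csv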
So I have a script that runs at logon to search for PSTs on a user's machine, then copies them to a holding area to wait for migration.
When the search/copy is complete it outputs to a CSV that looks something like this:
Hostname,User,Path,Size_in_MB,Creation,Last_Access,Copied
COMP1,user1,\\comp1\c$\Test PST.pst,20.58752,08/12/2015,08/12/2015,Client copied
COMP1,user1,\\comp1\c$\outlook\outlook.pst,100,08/12/2015,15,12,2015,In Use
The same logon script has an IF that imports the CSV, and if the copied status is 'In Use' it makes further attempts at copying the PST into the holding area. If it's successful, it exports the results to the CSV file.
My question is: is there any way of getting it to amend the existing CSV, changing the copy status? I can get it to add the new line to the end, but not update.
This is my 'try again' script:
# imports lines of the csv where the PST file was found to be in use
$PST_IN_USE = Import-Csv "\\comp4\TEMPPST\PST\$HOSTNAME - $USER.csv" | where { $_.copied -eq "In Use" }
foreach ($PST_USE in $PST_IN_USE) {
    $NAME = Get-ItemProperty $PST_IN_USE.Path | select -ExpandProperty Name
    $NEW_NAME = $USER + "_" + $PST_IN_USE.Size_in_MB + "_" + $NAME
    # attempts to copy the file to the pst staging area, then rename it
    try {
        Copy-Item $PST_IN_USE.Path "\\comp4\TEMPPST\PST\$USER" -ErrorAction SilentlyContinue
        Rename-Item "\\comp4\TEMPPST\PST\$USER\$NAME" -NewName $NEW_NAME
        # edits the existing csv file replacing "In Use" with "Client Copied"
        $PST_IN_USE.Copied -replace "In Use","Client Copied"
    }
    # silences any errors
    catch { }
    $PST_IN_USE | Export-Csv "\\comp4\TEMPPST\PST\$HOSTNAME - $USER.csv" -NoClobber -NoTypeInformation -Append
}
This is the resulting CSV:
Hostname,User,Path,Size_in_MB,Creation,Last_Access,Copied
COMP1,user1,\\comp1\c$\Test PST.pst,20.58752,08/12/2015,08/12/2015,Client copied
COMP1,user1,\\comp1\c$\outlook\outlook.pst,100,08/12/2015,15,12,2015,In Use
COMP1,user1,\\comp1\c$\outlook\outlook.pst,100,08/12/2015,15,12,2015,Client copied
It's almost certainly something really simple, but if it is, it's something I've yet to come across in my scripting. I'm mostly working in IF / ELSE land at the moment!
If you want to change the CSV file, you have to write it out completely again, not just append new lines. In your case this means:
# Get the data
$data = Import-Csv ...
# Get the 'In Use' entries
$inUse = $data | where Copied -eq 'In Use'
foreach ($x in $inUse) {
...
$x.Copied = 'Client Copied'
}
# Write the file again
$data | Export-Csv ...
The point here is, you grab all the lines from the CSV, modify those that you process and then write the complete collection back to the file again.
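A concrete sketch with the file name from your script (the copy/rename part is elided; the point is the read-modify-rewrite pattern, with Export-Csv used without -Append):
$csvPath = "\\comp4\TEMPPST\PST\$HOSTNAME - $USER.csv"
$data = Import-Csv $csvPath
foreach ($x in ($data | Where-Object { $_.Copied -eq 'In Use' })) {
    # ... attempt the copy/rename here, as in your script ...
    $x.Copied = 'Client Copied'   # modify the object in memory
}
$data | Export-Csv $csvPath -NoTypeInformation   # rewrite the whole file once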
I've cracked it. It's almost certainly a long-winded way of doing it, but it works and is relatively clean too.
# imports lines of the csv where the PST file was found to be in use
$PST_IN_USE = Import-Csv "\\comp4\TEMPPST\PST\$HOSTNAME - $USER.csv" | where { $_.copied -eq "In Use" }
$PST_IN_USE | select -ExpandProperty path | foreach {
    # name of pst
    $NAME = Get-ItemProperty $_ | select -ExpandProperty Name
    # size of pst in MB without decimals
    $SIZE = Get-ItemProperty $_ | select -ExpandProperty Length | foreach { $_ / 1000000 }
    # path of pst
    $PATH = $_
    # new name of pst when copied to the destination
    $NEW_NAME = $USER + "_" + $SIZE + "_" + $NAME
    try {
        Copy-Item $_ "\\comp4\TEMPPST\PST\$USER" -ErrorAction SilentlyContinue
        try {
            Rename-Item "\\comp4\TEMPPST\PST\$USER\$NAME" -NewName $NEW_NAME -ErrorAction SilentlyContinue | Out-Null
        }
        catch { $NEW_NAME = "Duplicate exists" }
        $COPIED = "Client copied"
    }
    catch {
        $COPIED = "In use"
        $NEW_NAME = " "
    }
    $NEW_FILE = Test-Path "\\comp4\TEMPPST\PST\$HOSTNAME - $USER 4.csv"
    if ($NEW_FILE -eq $FALSE) {
        "Hostname,User,Path,Size_in_MB,Creation,Last_Access,Copied,New_Name" |
            Set-Content "\\comp4\TEMPPST\PST\$HOSTNAME - $USER 4.csv"
    }
    "$HOSTNAME,$USER,$PATH,$SIZE,$CREATION,$LASTACCESS,$COPIED,$NEW_NAME" |
        Add-Content "\\comp4\TEMPPST\PST\$HOSTNAME - $USER 4.csv"
}
$a = Import-Csv "\\comp4\TEMPPST\PST\$HOSTNAME - $USER.csv" | where { $_.copied -ne "in use" }
$b = Import-Csv "\\comp4\TEMPPST\PST\$HOSTNAME - $USER 4.csv"
$a + $b | Export-Csv "\\comp4\TEMPPST\PST\$HOSTNAME - $USER 8.csv" -NoClobber -NoTypeInformation
Thanks for the help. Sometimes it takes a moment's break and a large cup of coffee to see things a different way.
I have a performance issue with the code below. I want to parse some information from a JSON file into a CSV. The JSON itself has around 200k lines, and the conversion performs badly: it takes over an hour to process such a file.
I think the problem might be the Add-Content cmdlet, as I'm running this on a normal HDD. Could you please let me know if you see any improvements to the code or any changes I could make?
$file = "$disk\TEMP\" + $mask
$res = (Get-Content $file) | ConvertFrom-Json
$file = "$disk\TEMP\result.csv"
Write-Host "Creating CSV from JSON" -ForegroundColor Green
Add-Content $file ("{0},{1},{2},{3},{4}" -f "TargetId", "EventType", "UserId", "Username", "TimeStamp")
$l = 0
foreach ($line in $res) {
    if ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GOrder') {
        # nothing here
    } elseif ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GFile') {
        Add-Content $file ("{0},{1},{2},{3},{4}" -f $line.AssetId, $line.EventType, $line.UserId, $line.UserName, $line.TimeStamp)
        $l = $l + 1
    } else {
        Add-Content $file ("{0},{1},{2},{3},{4}" -f $line.TargetId, $line.EventType, $line.UserId, $line.UserName, $line.TimeStamp)
        $l = $l + 1
    }
}
OK, a few lessons here, I think. First off, don't rewrite the Export-Csv cmdlet; instead, convert your info into an array of objects and output it all at once. That way you write to the file only once, which should increase your speed dramatically. Also, don't build a ForEach > If > ElseIf > Else chain when the same logic already exists in the switch statement. Try something like this:
$Results = switch ($res) {
    { $_.EventType -eq 'DirectDownloadCompleted' -and $_.TargetDefinition -eq 'GOrder' } { Continue }
    { $_.EventType -eq 'DirectDownloadCompleted' -and $_.TargetDefinition -eq 'GFile' } { $_ | Select @{l='TargetId';e={$_.AssetId}},EventType,UserId,Username,TimeStamp; Continue }
    Default { $_ | Select TargetId,EventType,UserId,Username,TimeStamp }
}
$Results | Export-CSV $file -NoType
$l = $Results.Count
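If you ever do need to write line by line (for instance to keep memory flat on very large inputs), note that the slow part of the original code is that Add-Content opens and closes the file on every call; a .NET StreamWriter keeps the handle open instead. A rough sketch of that alternative, reusing the $file and $res variables from the original script (the ::new() syntax needs PowerShell 5.0 or later):
$writer = [System.IO.StreamWriter]::new($file)
try {
    $writer.WriteLine('TargetId,EventType,UserId,Username,TimeStamp')
    foreach ($line in $res) {
        # skip GOrder direct downloads entirely
        if ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GOrder') { continue }
        # GFile direct downloads report AssetId in place of TargetId
        $id = if ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GFile') { $line.AssetId } else { $line.TargetId }
        $writer.WriteLine(('{0},{1},{2},{3},{4}' -f $id, $line.EventType, $line.UserId, $line.UserName, $line.TimeStamp))
    }
}
finally {
    $writer.Close()
}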
I've got a function which works fine.
It pulls the first character of the first name and the whole last name from text boxes in a PowerShell GUI, and then creates a sAMAccountName from both.
Now I need only the first 8 characters of the generated sAMAccountName.
Here is the function:
Function Set-sAMAccountName {
    Param([Switch]$Csv = $false)
    if (!$Csv) {
        $GivenName = $txtFirstName.Text
        $Surname = $txtLastName.Text
    }
    else {}
    Switch ($XML.Options.Settings.sAMAccountName.Style | Where { $_.Enabled -eq $True } | Select -ExpandProperty Format) {
        "FirstName.LastName"   { "{0}.{1}" -f $GivenName, $Surname }
        "FirstInitialLastName" { "{0}{1}" -f ($GivenName)[0], $Surname }
        "LastNameFirstInitial" { "{0}{1}" -f $Surname, ($GivenName)[0] }
        Default                { "{0}.{1}" -f ($GivenName)[0], $Surname }
    }
}
Any ideas?
Thanks a lot in advance.
Substring works like this:
you pass the index where you want to start reading
you pass the number of characters to read (if you don't pass anything, it reads to the end of the string)
So in your case it starts reading at index 0 and takes 8 characters:
$str = "a simple string"
$newString = $str.Substring(0,8)
I really recommend reading about string manipulation here.
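One caveat before wiring it in: Substring(0,8) throws an ArgumentOutOfRangeException if the string is shorter than 8 characters, so it's safer to guard the length first. A small sketch using the names from the snippets above:
$sam = Set-sAMAccountName
# only truncate when the generated name is actually longer than 8 characters
if ($sam.Length -gt 8) { $sam = $sam.Substring(0, 8) }
$txtsAM.Text = $sam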
Okay, I got it now!
I've added an -and condition that checks the length of the sAMAccountName with -lt 8, and it's working now. The sAMAccountName is now 8 characters long.
This was the code before:
$txtName_TextChanged={
Write-Verbose "Creating required account fields"
if ($XML.Options.Settings.DisplayName.Generate -eq $True) {$txtDN.Text = Set-DisplayName}
if ($XML.Options.Settings.sAMAccountName.Generate -eq $True) {$txtsAM.Text = (Set-sAMAccountName)}
if ($XML.Options.Settings.UPN.Generate -eq $True) {$txtUPN.Text = Set-UPN}
}
And after the change:
$txtName_TextChanged={
Write-Verbose "Creating required account fields"
if ($XML.Options.Settings.DisplayName.Generate -eq $True) {$txtDN.Text = Set-DisplayName}
if ($XML.Options.Settings.sAMAccountName.Generate -eq $True -and $txtsAM.Text.Length -lt 8) {$txtsAM.Text = (Set-sAMAccountName)}
if ($XML.Options.Settings.UPN.Generate -eq $True) {$txtUPN.Text = Set-UPN}
}