PowerShell Error: Import-CSV Cannot Open File - csv

My PowerShell code is producing a strange error that I have searched for thoroughly, to no avail.
I am trying to create a script that calculates some basic statistics for a number of CSV files, which seems to work until I try to create and populate new columns.
The error code I'm getting is:
Import-Csv : Cannot open file "C:\zMFM\Microsoft.Powershell.Commands.GenericMeasureInfo".
At C:\zMFM\StatsDebug.ps1:46 char:21
+ $STATS2 = Import-CSV <<<< $STATS
    + CategoryInfo          : OpenError: (:) [Import-Csv], FileNotFoundException
    + FullyQualifiedErrorId : FileOpenFailure,Microsoft.Powershell.Commands.ImportCsvCommand
At C:\zMFM\statsdebug.ps1:55 char:20
That's followed by an error about calling a method on a null-valued expression, but I assume fixing this Import-Csv problem will in turn fix that one. Here's my code; any help would be great, thanks!
$i = 1
#$colSTDDEV = New-Object System.Data.DataColumn StdDev,([double])
$colVZA = New-Object System.Data.DataColumn VZA,([double])
#$colVAZ = New-Object System.Data.DataColumn VAZ,([double])

While ($i -le 211) {
    #Set the variable to the filename with the iteration number
    $filename = "c:\zMFM\z550Output\20dSummer\fixed20dSum550Output$i.csv"
    #Check to see if a file with $filename exists. If not, skip to the next iteration of $i.
    #If so, run the code to collect the statistics for each variable and output them each to a different file
    If (Test-Path $filename) {
        #Calculate the Standard Deviation
        #First get the average of the values in the column
        $STDEVInputFile = Import-CSV $filename
        #Find the average and count for column 'td'
        $STDEVAVG = $STDEVInputFile | Measure-Object td -Average | Select Count, Average
        $DevMath = 0
        #Sum the squares of the differences between the mean and each value in the array
        Foreach ($Y in $STDEVInputFile) {
            $DevMath += [math]::pow(($Y.Average - $STDEVAVG.Average), 2)
            #Divide by the number of samples minus one
            $STDEV = [Math]::sqrt($DevMath / ($STDEVAVG.Count - 1))
        }
        #Calculate the basic statistics for column 'td' with the MEASURE-OBJECT cmdlet
        $STATS = Import-CSV $Filename |
            Measure-Object td -ave -max -min
        #Append the standard deviation variable to the statistics array and add the value
        $STATS2 = Import-CSV $Stats
        $VZA = $filename.VZA
        #$VAZ = $filename.VAZ                   #COMMENTED FOR DEBUGGING
        #$STATS2.Columns.Add($colSTDDEV)        #COMMENTED FOR DEBUGGING
        #$STATS2[0].StdDev = $STDEV             #COMMENTED FOR DEBUGGING
        $STATS2.Columns.Add($colVZA)
        $STATS2[0].VZA = $VZA
        #$STATS2.Columns.Add($colVAZ)           #COMMENTED FOR DEBUGGING
        #$STATS2[0].VZA = $VAZ                  #COMMENTED FOR DEBUGGING
        #Export the $STATS file containing everything you need in the correct folder
        Export-CSV -notype "c:\zMFM\z550Output\20dSummer\20dSum550Statistics.csv"
    }
    $i++
}

$STDEVInputFile = Import-CSV $filename
...
$STATS = Import-CSV $Filename |
Measure-Object td -ave -max -min
# $STATS here will be type [GenericMeasureInfo], so you cannot use this as a filename.
$STATS2 = Import-CSV $Stats
# Are you trying to import $filename a third time here? You can't use $Stats because it's a [GenericMeasureInfo] object, not a [string].
Based on your code, it looks like you're trying to import a CSV with a filename of type [Microsoft.PowerShell.Commands.GenericMeasureInfo]. Filenames are strings. Also, why are you importing $filename twice? Would recommend importing it once, then operating off of the variable you saved it to.
Would also recommend changing your while loop to 1..211 | ForEach-Object. That way you won't have to worry about whether $i is less than or equal to your static number. Not a huge deal, but would make the code a little more readable.
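For example, the skeleton could look something like this (same file-name pattern as in your script; the per-file work is omitted here):
1..211 | ForEach-Object {
    $filename = "c:\zMFM\z550Output\20dSummer\fixed20dSum550Output$($_).csv"
    if (Test-Path $filename) {
        # ...per-file statistics work goes here...
    }
}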
TL;DR
Import-CSV $Stats is the problem. $Stats is not a valid filename, so you can't open/import it.
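One way out, sketched under the assumption that VZA really is a column in the input CSV and that $STDEV has already been computed as in your loop: keep working with the objects you already have instead of re-importing, and attach the extra values as calculated properties before exporting.
$rows  = Import-CSV $filename
$stats = $rows | Measure-Object td -Average -Maximum -Minimum

# Add StdDev and VZA as extra columns on the Measure-Object result, then export
$stats |
    Select-Object Count, Average, Maximum, Minimum,
        @{ Name = 'StdDev'; Expression = { $STDEV } },
        @{ Name = 'VZA';    Expression = { $rows[0].VZA } } |
    Export-CSV -NoTypeInformation "c:\zMFM\z550Output\20dSummer\20dSum550Statistics.csv"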

Related

What is the most efficient way to replace all \ with \\, within a huge JSON File?

I have to replace all occurrences of \ with \\ within a huge JSON Lines file. I wanted to use PowerShell, but there might be other options too.
The source file has 4,000,000 lines and is about 6 GB.
The PowerShell script I was using took too much time; I let it run for 2 hours and it wasn't done yet. A runtime of half an hour would be acceptable.
$Importfile = "C:\file.jsonl"
$Exportfile = "C:\file2.jsonl"
(Get-Content -Path $Importfile) -replace "[\\]", "\\" | Set-Content -Path $Exportfile
If the replacement is simply a conversion of a single backslash to a double backslash, the file can be processed row by row.
Using a StringBuilder puts data into a memory buffer, which is flushed on disk every now and then. Like so,
$src = "c:\path\MyBigFile.json"
$dst = "c:\path\MyOtherFile.json"
$sb = New-Object Text.StringBuilder
$reader = [IO.File]::OpenText($src)
$i = 0
$MaxRows = 10000
while($null -ne ($line = $reader.ReadLine())) {
# Replace slashes
$line = $line.replace('\', '\\')
# ' markdown coloring is confused by backslash-apostrophe
# so here is an extra one just for looks
[void]$sb.AppendLine($line)
++$i
# Write builder contents into file every now and then
if($i -ge $MaxRows) {
add-content $dst $sb.ToString() -NoNewline
[void]$sb.Clear()
$i = 0
}
}
# Flush the builder after the while loop if there's data
if($sb.Length -gt 0) {
add-content $dst $sb.ToString() -NoNewline
}
$reader.close()
Use the -ReadCount parameter of the Get-Content cmdlet (and set it to 0).
-ReadCount
Specifies how many lines of content are sent through the pipeline at a
time. The default value is 1. A value of 0 (zero) sends all of the
content at one time.
This parameter does not change the content displayed, but it does
affect the time it takes to display the content. As the value of
ReadCount increases, the time it takes to return the first line
increases, but the total time for the operation decreases. This can
make a perceptible difference in large items.
Example (runs roughly 17× faster for a file of about 20 MB):
$file = 'D:\bat\files\FileTreeLista.txt'

(Measure-Command {
    $xType = (Get-Content -Path $file) -replace "[\\]", "\\"
}).TotalSeconds, $xType.Count -join ', '

(Measure-Command {
    $yType = (Get-Content -Path $file -ReadCount 0) -replace "[\\]", "\\"
}).TotalSeconds, $yType.Count -join ', '

Get-Item $file | Select-Object FullName, Length
13,3288848, 338070
0,7557814, 338070
FullName Length
-------- ------
D:\bat\files\FileTreeLista.txt 20723656
Based on your earlier question How can I optimize this Powershell script, converting JSON to CSV?, you should try to use the PowerShell pipeline for this, especially as it concerns large input and output files.
The point is that you shouldn't judge performance by single parts of the solution, because that usually leaves a wrong impression; the performance of a complete (PowerShell) pipeline solution is supposed to be better than the sum of its parts. Besides, it saves a lot of memory, and the result is lean PowerShell syntax...
In your specific case, if set up correctly, the CPU will be replacing the slashes, rebuilding the JSON strings and converting them to objects while the hard disk is busy reading and writing the data.
To fold the slash replacement into the PowerShell pipeline together with the ConvertFrom-JsonLines cmdlet:
Get-Content .\file.jsonl |
    ForEach-Object { $_.Replace('\', '\\') } |
    ConvertFrom-JsonLines |
    ForEach-Object { $_.events.items } |
    Export-Csv -Path $Exportfile -NoTypeInformation -Encoding UTF8

Can't get it to provide proper output

I have this function and I cannot get it to work; the $DecimalConversion output is not coming out. I think I have some syntax errors.
function Get-DecimalNumber() {
    $FileCheck = Test-Path "C:\Conversions\conversions.csv"
    if ($FileCheck) {
        Do
        {
            [int]$GetDecimal = Read-Host "Write a number between 1-255" | Out-Null
        }
        while ($GetDecimal -notmatch '\d{1,3}' -or (1..255) -notcontains $GetDecimal)
        $DecimalConversion = "{0:X}" -f $GetDecimal
        $DecimalConversion
    }
    else {
        Write-Warning "Can not find conversions.csv, creating now under C:\Conversions\"
        New-Item "C:\Conversions\conversions.csv" -Force | Out-Null
    }
}
$getfunction=Get-DecimalNumber
You could probably use a better while condition. However, your issue is caused by the Out-Null cmdlet on Read-Host.
With it in place, $GetDecimal never receives the value you type, because Out-Null swallows the pipeline output before the assignment happens. Just remove it and it should work.
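A minimal sketch of that fix (the tightened regex and the [int] cast in the range check are my additions, not part of the original function):
do {
    $GetDecimal = Read-Host "Write a number between 1-255"
} while ($GetDecimal -notmatch '^\d{1,3}$' -or (1..255) -notcontains [int]$GetDecimal)

$DecimalConversion = "{0:X}" -f [int]$GetDecimal
$DecimalConversion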
Final code, I think this looks better, let me know what you think!
function Get-DecimalNumber {
    <#
    .Description
    The Get-DecimalNumber function gets user input for a decimal number and
    converts it into hexadecimal and binary numbers. Then this data is added to an
    excel file (.csv) and the date of conversion is displayed in short form m/d/yyyy.
    #>
    $ErrorActionPreference = 'silentlycontinue'          #Silences errors
    $Test = Test-Path "C:\temp\test\conversions.csv"     #Variable to test path
    if (! $Test) {                                       #Checking if path does not exist
        Write-Warning "conversions.csv File Not Present, creating under C:\temp\test\"
        New-Item 'C:\temp\test\conversions.csv' -Force | Out-Null; break; exit    #Creating path with file & suppressing output
    }
    else {
        [int]$Num = Read-Host "Enter number from 1-255"
        if ($Num -gt 255 -or $Num -le 1) {
            Write-Warning "You did not enter a number in the specified range"; break; exit
        }
        else {
            $Hex = [Convert]::ToString($Num, 16)         #Converting from decimal to hexadecimal
            $Bin = [Convert]::ToString($Num, 2)          #Converting from decimal to binary
            Write-Host "Decimal to Hex and Binary:"
            $NewHashTable1 = @{ }                        #Creating hashtable
            $NewHashTable1.Add('Decimal', $Num)          #Adding values from variables to hash table
            $NewHashTable1.Add('Hexadecimal', $Hex)
            $NewHashTable1.Add('Binary', $Bin)
            $NewHashTable1                               #Output the previously created hashtable to the screen
            $NewHashTable1 >> "C:\temp\test\conversions.csv"    #Appending hashtable to .csv file
            Write-Output "`n"
            Get-Date -Format d                           #Output date in short format
            $Now = Get-Date -Format d
            $Now >> "C:\temp\test\conversions.csv"       #Output date to .csv file
        }
    }
}

PowerShell: Get 2 strings into a hashtable and out to .csv

PowerShell newbie here,
I need to:
Get text files in recursive local directories that have a common string, students.txt in them.
Get another string, gc.student="name,name" in the resulting file set from #1 and get the name(s).
Put the filename from #1, and just the name,name from #2 (not gc.student="") into a hashtable where the filename is paired with its corresponding name,name.
Output the hashtable to an Excel spreadsheet with 2 columns: File and Name.
I've figured out, having searched and learned here and elsewhere, how to output #1 to the screen, but not how to put it into a hashtable with #2:
$documentsfolder = "C:\records\"
foreach ($file in Get-ChildItem $documentsfolder -recurse | Select-String -pattern "students.txt") {$file}
I'm thinking to get name in #2 I'll need to use a RegEx since there might only be 1 name sometimes.
And for the output to Excel, this: | Export-Csv -NoType output.csv
Any help moving me on is appreciated.
I think this should get you started. The explanations are in the code comments.
# base directory
$documentsfolder = 'C:\records\'

# get files with names ending with students.txt
$files = Get-ChildItem $documentsfolder -Recurse | Where-Object { $_.Name -like "*students.txt" }

# process each of the files; collecting the loop output lets it be piped to Export-Csv afterwards
$results = foreach ($file in $files)
{
    $fileContents = Get-Content $file
    $fileName = $file.Name

    # series of matches to clean up different parts of the content
    # first find the gc.... pattern
    $fileContents = ($fileContents | Select-String -Pattern 'gc.student=".*"').Matches.Value
    # then select the string with double quotes
    $fileContents = ($fileContents | Select-String '".*"').Matches.Value
    # then remove the leading and trailing double quotes
    $fileContents = $fileContents -replace '^"','' -replace '"$',''

    # emit custom objects so that your CSV headers will have nice column names
    [pscustomobject]@{ file = $fileName; name = $fileContents }
}

$results | Export-Csv -NoType output.csv

Newly Created Column is Null.... Why?

I'm trying to do a simple task in PowerShell where some basic statistics are calculated for a number of columns in a CSV file. I'm nearly done, but I keep getting an error where new columns I create are coming up as being Null. I cannot see where I am going wrong, here.
Specifically, the line of code causing the error is
$STATS2.Columns.Add($colVZA) |
The tables created when importing $filename do have columns named VZA, VAZ, etc., so that's not the problem.
It seems like adding and populating columns should be a simple task, so I'm sure I'm missing something simple here. Here's my code:
#######################
function Get-Type
{
    param($type)

    $types = @(
        'System.Boolean',
        'System.Byte[]',
        'System.Byte',
        'System.Char',
        'System.Datetime',
        'System.Decimal',
        'System.Double',
        'System.Guid',
        'System.Int16',
        'System.Int32',
        'System.Int64',
        'System.Single',
        'System.UInt16',
        'System.UInt32',
        'System.UInt64')

    if ( $types -contains $type ) {
        Write-Output "$type"
    }
    else {
        Write-Output 'System.String'
    }
} #Get-Type
#######################
<#
.SYNOPSIS
Creates a DataTable for an object
.DESCRIPTION
Creates a DataTable based on an objects properties.
.INPUTS
Object
Any object can be piped to Out-DataTable
.OUTPUTS
System.Data.DataTable
.EXAMPLE
$dt = Get-psdrive| Out-DataTable
This example creates a DataTable from the properties of Get-psdrive and assigns output to $dt variable
.NOTES
Adapted from script by Marc van Orsouw see link
Version History
v1.0 - Chad Miller - Initial Release
v1.1 - Chad Miller - Fixed Issue with Properties
v1.2 - Chad Miller - Added setting column datatype by property as suggested by emp0
v1.3 - Chad Miller - Corrected issue with setting datatype on empty properties
v1.4 - Chad Miller - Corrected issue with DBNull
v1.5 - Chad Miller - Updated example
v1.6 - Chad Miller - Added column datatype logic with default to string
v1.7 - Chad Miller - Fixed issue with IsArray
.LINK
http://thepowershellguy.com/blogs/posh/archive/2007/01/21/powershell-gui-scripblock-monitor-script.aspx
#>
function Out-DataTable
{
    [CmdletBinding()]
    param([Parameter(Position=0, Mandatory=$true, ValueFromPipeline = $true)] [PSObject[]]$InputObject)

    Begin
    {
        $dt = New-Object Data.DataTable
        $First = $true
    }
    Process
    {
        foreach ($object in $InputObject)
        {
            $DR = $DT.NewRow()
            foreach ($property in $object.PsObject.get_properties())
            {
                if ($first)
                {
                    $Col = New-Object Data.DataColumn
                    $Col.ColumnName = $property.Name.ToString()
                    if ($property.value)
                    {
                        if ($property.value -isnot [System.DBNull]) {
                            $Col.DataType = [System.Type]::GetType("$(Get-Type $property.TypeNameOfValue)")
                        }
                    }
                    $DT.Columns.Add($Col)
                }
                if ($property.Gettype().IsArray) {
                    $DR.Item($property.Name) = $property.value | ConvertTo-XML -As String -NoTypeInformation -Depth 1
                }
                else {
                    $DR.Item($property.Name) = $property.value
                }
            }
            $DT.Rows.Add($DR)
            $First = $false
        }
    }
    End
    {
        Write-Output @(,($dt))
    }
} #Out-DataTable
$i = 1
While ($i -le 211) {
    #Set the variable to the filename with the iteration number
    $filename = "c:\zMFM\z550Output\20dSummer\fixed20dSum550Output$i.csv"
    #Check to see if a file with $filename exists. If not, skip to the next iteration of $i. If so, run the code to collect the statistics for each variable and output them each to a different file
    If (Test-Path $filename) {
        #Calculate the Standard Deviation
        #First get the average of the values in the column
        $STDEVInputFile = Import-CSV $filename
        #Find the average and count for column 'td'
        $STDEVAVG = $STDEVInputFile | Measure-Object td -Average | Select Count, Average
        $DevMath = 0
        #Sum the squares of the differences between the mean and each value in the array
        Foreach ($Y in $STDEVInputFile) {
            $DevMath += [math]::pow(($Y.Average - $STDEVAVG.Average), 2)
            #Divide by the number of samples minus one
            $STDEV = [Math]::sqrt($DevMath / ($STDEVAVG.Count - 1))
        }
        #Calculate the basic statistics for column 'td' with the MEASURE-OBJECT cmdlet
        $STATS = Import-CSV $Filename |
            Measure-Object td -ave -max -min |
            #Export the statistics as a CSV
            Export-CSV -notype "c:\zMFM\z550Output\20dSummer\tempstats$i.csv"
        $GetColumns = Import-CSV $filename
        #Append the standard deviation variable to the statistics table and add the value
        $STATS2 = Import-CSV "c:\zMFM\z550Output\20dSummer\tempstats$i.csv"
        $StatsTable = Get-PSDrive | Out-DataTable
        #$colSTDDEV = New-Object System.Data.DataColumn StdDev,([double])
        $colVZA = New-Object System.Data.DataColumn VZA,([double])
        #$colVAZ = New-Object System.Data.DataColumn VAZ,([double])
        $colVZA = $GetColumns[0].VZA
        #$colVAZ = $GetColumns[0].VAZ            #COMMENTED FOR DEBUGGING
        #$colSTDDEV = $STDEV
        #$StatsTable.Columns.Add($colSTDDEV)     #COMMENTED FOR DEBUGGING
        #$StatsTable[0].StdDev = $STDEV          #COMMENTED FOR DEBUGGING
        $StatsTable.Columns.Add($colVZA) |
        #$StatsTable[0].VZA = $VZA
        #$StatsTable.Columns.Add($colVAZ)        #COMMENTED FOR DEBUGGING
        #$StatsTable[0].VZA = $VAZ               #COMMENTED FOR DEBUGGING
        #Export the $STATS file containing everything you need in the correct folder
        Export-CSV -notype "c:\zMFM\z550Output\20dSummer\20dSum550Statistics.csv"
    }
    $i++
}
Even though each object in $STATS2 has the same properties, the $STATS2 object itself is just a simple array, an unstructured list of objects - it doesn't have a Columns property with an Add() method:

$STATS2 . Columns . Add($colVZA)
   |         |          |
[array]    $null    this fails
You can convert the array you get from Import-Csv into a DataTable object (which does have Columns) by inspecting each property of the first object in the array, like the Out-DataTable sample on the TechNet Script Gallery does.
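A rough sketch of that idea (simplified: every column stays typed as a string here, whereas the gallery version also infers data types; $filename is the per-iteration CSV path from your loop):
$rows  = Import-Csv $filename
$table = New-Object System.Data.DataTable

# one column per property of the first imported row
foreach ($prop in $rows[0].PSObject.Properties) {
    [void]$table.Columns.Add($prop.Name)
}
# one DataRow per imported row
foreach ($row in $rows) {
    $dr = $table.NewRow()
    foreach ($prop in $row.PSObject.Properties) {
        $dr[$prop.Name] = $prop.Value
    }
    $table.Rows.Add($dr)
}

# $table now really does have a Columns collection you can extend
[void]$table.Columns.Add((New-Object System.Data.DataColumn VZA, ([double])))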

Error When Calculating Standard Deviation in PowerShell

I'm attempting to calculate some statistics for the values in each column of a CSV file using PowerShell. The Measure-Object cmdlet seems like it will do the trick for everything I need other than the standard deviation. I tracked down a description online of how to calculate the standard deviation using the [Math] class, but when I run the code, I get the following error on the lines containing Pow():
Method invocation failed because system.management.automation.psobject doesn't contain a method named 'op_Subtraction'.
Here's my code, any help would be appreciated:
$i = 1
While ($i -le 211) {
    #Set the variable to the filename with the iteration number
    $filename = "c:\zMFM\z550Output\20dSummer\fixed20dSum550Output$i.csv"
    #Check to see if a file with $filename exists. If not, skip to the next iteration of $i. If so, run the code to collect the statistics for each variable and output them each to a different file
    If (Test-Path $filename) {
        #Calculate the Standard Deviation
        #First get the average of the values in the column
        $STDEVInputFile = Import-CSV $filename
        #Find the average and count for column 'td'
        $STDEVAVG = $STDEVInputFile | Measure-Object td -Average | Select Count, Average
        $DevMath = 0
        #Sum the squares of the differences between the mean and each value in the array
        Foreach ($Y in $STDEVInputFile) {
            $DevMath += [math]::pow(($Y - $STDEVAVG.Average), 2)
            #Divide by the number of samples minus one
            $STDEV = [Math]::sqrt($DevMath / ($STDEVAVG.Count - 1))
        }
        #Calculate the basic statistics for column 'td' with the MEASURE-OBJECT cmdlet
        $STATS = Import-CSV $Filename |
            Measure-Object td -ave -max -min
        #Append the standard deviation variable to the statistics array and add the value
        $colSTDDEV = New-Object System.Data.DataColumn StdDev,([double])
        $colVZA = New-Object System.Data.DataColumn VZA,([double])
        $colVAZ = New-Object System.Data.DataColumn VAZ,([double])
        $VZA = $Stats.VZA
        $VAZ = $Stats.VAZ
        $STATS.Columns.Add($colSTDDEV)
        $STATS[0].StandardDev = $STDEV
        $STATS.Columns.Add($colVZA)
        $STATS[0].StandardDev = $VZA
        $STATS.Columns.Add($colVAZ)
        $STATS[0].StandardDev = $VAZ
        #Export the $STATS file containing everything you need in the correct folder
        Export-CSV -notype "c:\zMFM\z550Output\20dSummer\20dSum550Statistics.csv"
    }
    $i++
}
$DevMath += [math]::pow(($Y - $STDEVAVG.Average), 2)
You may have to replace this with:
$DevMath += [math]::pow(($Y.Average - $STDEVAVG.Average), 2)
since $Y is a whole imported CSV row (a PSObject), not a numeric value.
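For reference, a minimal sketch of the sample standard deviation for the td column (a variant of the code above rather than the exact fix: Import-Csv hands the column values back as strings, so they are cast to [double] before the subtraction):
$rows  = Import-CSV $filename
$stats = $rows | Measure-Object td -Average

$sumSquares = 0
foreach ($row in $rows) {
    # cast the string value from the CSV to a number before doing arithmetic on it
    $sumSquares += [math]::Pow(([double]$row.td - $stats.Average), 2)
}
# divide by the number of samples minus one, then take the square root
$stdev = [math]::Sqrt($sumSquares / ($stats.Count - 1))
$stdev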