Convert DataTable to CSV

I want to be able to Import-Csv into a PowerShell DataTable so I can edit each cell with a PowerShell script, e.g. $table.Rows[0].name = 100. Import-Csv doesn't give me that kind of "table"; it just reads the CSV file into plain objects.
$tabName = "TableA"
$table = New-Object system.Data.DataTable "$tabName"
$col1 = New-Object system.Data.DataColumn ColumnName1,([string])
$col2 = New-Object system.Data.DataColumn ColumnName2,([int])
$table.columns.add($col1)
$table.columns.add($col2)
$row = $table.NewRow()
$row.ColumnName1 = "A"
$row.ColumnName2 = "1"
$table.Rows.Add($row)
$row = $table.NewRow()
$row.ColumnName1 = "B"
$row.ColumnName2 = "2"
$table.Rows.Add($row)
$table.PrimaryKey = $table.Columns[0]
$table | Export-Csv C:\test.csv
I want a way to do $table = Import-Csv C:\test.csv and then be able to run $table.Rows[0].ColumnName1 = "C" so that "A" changes to "C". Then I can re-export it after making changes.

You can export a DataTable object to CSV simply by piping the table into the Export-Csv cmdlet:
$table | Export-Csv C:\table.csv -NoType
However, by doing that you lose all type information of the table columns (the Export-Csv cmdlet can only save information about the type of the objects that represent the rows, not about the type of their properties).
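To see the loss in practice: after a round trip through CSV, every property comes back as a string. A small illustration, reusing the table and path from above:
# Export the DataTable rows, then read them back with Import-Csv
$table | Export-Csv C:\table.csv -NoTypeInformation
$reimported = Import-Csv C:\table.csv
# ColumnName2 was declared as [int] in the table, but it comes back as a plain string
$reimported[0].ColumnName2.GetType().Name   # -> String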
A better way to save and restore a DataTable object is to save the table as XML:
$writer = New-Object IO.StreamWriter 'C:\path\to\data.xml'
$table.WriteXml($writer, [Data.XmlWriteMode]::WriteSchema)
$writer.Close()
$writer.Dispose()
and restore the XML into a DataSet:
$ds = New-Object Data.DataSet
$ds.ReadXml('C:\path\to\data.xml', [Data.XmlReadMode]::ReadSchema)
$table = $ds.Tables[0]
Make sure to export and import the schema along with the data, because that's where the type information is stored.
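Once the table is restored this way, the edit-and-re-export workflow from the question is straightforward. A minimal sketch, reusing $table and the path from above:
# Edit a cell; the column keeps the type defined in the schema
$table.Rows[0].ColumnName1 = 'C'
# Write the modified table back out, again including the schema
$writer = New-Object IO.StreamWriter 'C:\path\to\data.xml'
$table.WriteXml($writer, [Data.XmlWriteMode]::WriteSchema)
$writer.Close()
$writer.Dispose()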

The result of Import-Csv, as you found, is not a .NET DataTable object. There is a function called "Out-DataTable" that converts to a DataTable: https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd . I don't know whether it works in both directions, though.
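If you'd rather not depend on that gallery script, here is a minimal hand-rolled sketch of the same idea. It assumes the CSV has at least one row and treats every column as a string (which is all Import-Csv gives you anyway); the file name and column name are taken from the question:
# Read the CSV into plain PSObjects
$rows = Import-Csv C:\test.csv
# Build an empty DataTable whose columns mirror the CSV header
$table = New-Object System.Data.DataTable 'TableA'
foreach ($prop in $rows[0].PSObject.Properties) {
    [void]$table.Columns.Add($prop.Name, [string])
}
# Copy each CSV record into a new DataTable row
foreach ($r in $rows) {
    $newRow = $table.NewRow()
    foreach ($prop in $r.PSObject.Properties) {
        $newRow[$prop.Name] = $prop.Value
    }
    $table.Rows.Add($newRow)
}
# Cells can now be edited in place, e.g.:
$table.Rows[0].ColumnName1 = 'C'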

How to clone a PowerShell PSCustomObject variable to disconnect it from another variable? [duplicate]

I have a PowerShell script in which I do the following:
$somePSObjects = Import-CSV $csvPath
$somePSObjectHashtables = New-Object Hashtable[] $somePSObjects.Length
0..($somePSObjects.Length - 1) | ForEach-Object {
    $i = $_
    $somePSObjectHashtables[$i] = @{}
    $somePSObjects[$_].PSObject.Properties | ForEach-Object {
        $somePSObjectHashtables[$i][$_.Name] = $_.Value
    }
}
I need to do this because I want to make several distinct copies of the data in the CSV to perform several distinct manipulations. In a sense I'm performing an "INNER JOIN" on the resulting array of PSObjects. I can easily iterate through $somePSObjectHashtables with a ForEach-Object and call Hashtable.Clone() on each member of the array. I can then use New-Object PSObject -Property $somePSObjectHashtables[$i] to get a deep copy of the PSObject.
My question is, is there some easier way of making the deep copy, without an intermediary Hashtable?
Note that here is a shorter, maybe a bit cleaner version of this (that I quite enjoy):
$data = Import-Csv .\test.csv
$serialData = [System.Management.Automation.PSSerializer]::Serialize($data)
$data2 = [System.Management.Automation.PSSerializer]::Deserialize($serialData)
Note, however, that weirdly it does not keep the ordering of ordered hashtables:
$data = [ordered] @{
    1 = 1
    2 = 2
}
$serialData = [System.Management.Automation.PSSerializer]::Serialize($data)
$data2 = [System.Management.Automation.PSSerializer]::Deserialize($serialData)
$data2
Will output:
Name Value
---- -----
2 2
1 1
While with other types it works just fine:
$data = [PsCustomObject] @{
    1 = 1
    2 = 2
}
$data = @(1, 2, 3)
For getting really deep copies we can use binary serialization (assuming that all data are serializable; this is definitely the case for data that come from CSV):
# Get original data
$data = Import-Csv ...
# Serialize and Deserialize data using BinaryFormatter
$ms = New-Object System.IO.MemoryStream
$bf = New-Object System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
$bf.Serialize($ms, $data)
$ms.Position = 0
$data2 = $bf.Deserialize($ms)
$ms.Close()
# Use deep copied data
$data2
Here's an even shorter one that I use as a function:
using namespace System.Management.Automation
function Clone-Object ($InputObject) {
    <#
    .SYNOPSIS
    Use the serializer to create an independent copy of an object, useful when using an object as a template
    #>
    [psserializer]::Deserialize(
        [psserializer]::Serialize(
            $InputObject
        )
    )
}
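A quick usage sketch (test.csv is the same placeholder file as above; the Name column is just an assumed example):
# Copy the imported data; edits to the copy don't affect the original
$data = Import-Csv .\test.csv
$copy = Clone-Object $data
$copy[0].Name = 'changed'   # $data[0].Name is left untouched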

Powershell: Custom object to CSV

I created a custom object that basically stores a date and a few integers in its keys:
$data = [Ordered]@{
"Date" = $currentdate.ToString('dd-MM-yyyy');
"Testers" = $totalTesterCount;
"StNoFeedback" = $tester_status.NoFeedback;
"StNotSolved" = $tester_status.NotSolved;
"StSolved" = $tester_status.Solved;
"StNoIssues" = $tester_status.NoIssues;
"OSNoFeedback" = $tester_os.NoFeedback;
"OSW7" = $tester_os.W7;
"OSW10" = $tester_os.W10;
"OfficeNoFeedback" = $tester_Office.NoFeedback;
"OfficeO10" = $tester_Office.O10;
"OfficeO13" = $tester_Office.O13;
"OfficeO16" = $tester_Office.O16;
}
I need to output it to a CSV file in a way that every value is written in a new column.
I tried using $data | Export-Csv dump.csv
but my CSV looks like this:
#TYPE System.Collections.Specialized.OrderedDictionary
"Count","IsReadOnly","Keys","Values","IsFixedSize","SyncRoot","IsSynchronized"
"13","False","System.Collections.Specialized.OrderedDictionary+OrderedDictionaryKeyValueCollection","System.Collections.Specialized.OrderedDictionary+OrderedDictionaryKeyValueCollection","False","System.Object","False"
Not even close to what I want to achieve. How do I get something closer to:
date,testers,stnofeedback....
04-03-2016,2031,1021....
I created the object because it was supposed to be easy to export it as CSV. Maybe there is an entirely different, better approach? Or is my object lacking something?
You didn't create an object, you created an ordered dictionary. A dictionary can't be exported to CSV directly, as it's a single object which holds multiple key-value entries.
([ordered]@{}).GetType()
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True OrderedDictionary System.Object
To export a dictionary you need to use GetEnumerator() to get the objects one by one, which would result in this CSV:
$data = [Ordered]@{
"Date" = (get-date).ToString('dd-MM-yyyy')
"Testers" = "Hello world"
}
$data.GetEnumerator() | ConvertTo-Csv -NoTypeInformation
"Name","Key","Value"
"Date","Date","04-03-2016"
"Testers","Testers","Hello world"
If you want a single object, cast the hashtable of properties to a PSObject using [pscustomobject]@{}.
$data = [pscustomobject]@{
"Date" = (get-date).ToString('dd-MM-yyyy')
"Testers" = "Hello world"
}
$data | ConvertTo-Csv -NoTypeInformation
"Date","Testers"
"04-03-2016","Hello world"
Or if you're using PS 1.0 or 2.0:
$data = New-Object -TypeName psobject -Property @{
"Date" = (get-date).ToString('dd-MM-yyyy')
"Testers" = "Hello world"
}
$data | ConvertTo-Csv -NoTypeInformation
"Testers","Date"
"Hello world","04-03-2016"

PowerShell Error: Import-CSV Cannot Open File

My PowerShell code is tripping some strange error codes, which I have thoroughly searched for to no avail.
I am trying to create a script that calculates some basic statistics for a number of CSV files, which seems to work until I try to create and populate new columns.
The error code I'm getting is:
Import-Csv: Cannot open file "C:\zMFM\Microsoft.Powershell.Commands.GenericMeasureInfo".
At C:\zMFM\StatsDebug.ps1:46 Char:21
+$STATS2 = Import-CSV <<< $STATS
+CategoryInfo : OpenError (:) [Import-Csv], FileNotFoundException
+FullyQualifiedErrorId : FileOpenFailure.Microsoft.Powershell.Commands.ImportCsvCommand at C:\zMFM\statsdebug.ps1:55 char:20
That's followed by an error using a null expression, but I assume fixing this problem with Import-Csv will in turn fix that error. Here's my code, any help would be great, thanks!
$i = 1
#$colSTDDEV = New-Object System.Data.DataColumn StdDev,([double])
$colVZA = New-Object System.Data.DataColumn VZA,([double])
#$colVAZ = New-Object System.Data.DataColumn VAZ,([double])
While ($i -le 211) {
#Set the variable to the filename with the iteration number
$filename = "c:\zMFM\z550Output\20dSummer\fixed20dSum550Output$i.csv"
#Check to see if a file with $filename exists. If not, skip to the next iteration of $i. If so, run the code to collect the
#statistics for each variable and output them each to a different file
If (Test-Path $filename) {
#Calculate the Standard Deviation
#First get the average of the values in the column
$STDEVInputFile = Import-CSV $filename
#Find the average and count for column 'td'
$STDEVAVG = $STDEVInputFile | Measure-Object td -Average | Select Count, Average
$DevMath = 0
# Sum the squares of the differences between the mean and each value in the array
Foreach ($Y in $STDEVInputFile) {
$DevMath += [math]::pow(($Y.Average - $STDEVAVG.Average), 2)
#Divide by the number of samples minus one
$STDEV = [Math]::sqrt($DevMath / ($STDEVAVG.Count-1))
}
#Calculate the basic statistics for column 'td' with the MEASURE-OBJECT cmdlet
$STATS = Import-CSV $Filename |
Measure-Object td -ave -max -min
#Append the standard deviation variable to the statistics array and add the value
$STATS2 = Import-CSV $Stats
$VZA = $filename.VZA
#$VAZ = $filename.VAZ #COMMENTED FOR DEBUGING
#$STATS2.Columns.Add($colSTDDEV) #COMMENTED FOR DEBUGING
#$STATS2[0].StdDev = $STDEV #COMMENTED FOR DEBUGING
$STATS2.Columns.Add($colVZA)
$STATS2[0].VZA = $VZA
#$STATS2.Columns.Add($colVAZ) #COMMENTED FOR DEBUGING
#$STATS2[0].VZA = $VAZ #COMMENTED FOR DEBUGGING
#Export the $STATS file containing everything you need in the correct folder
Export-CSV -notype "c:\zMFM\z550Output\20dSummer\20dSum550Statistics.csv"
}
$i++
}
$STDEVInputFile = Import-CSV $filename
...
$STATS = Import-CSV $Filename |
Measure-Object td -ave -max -min
# $STATS here will be type [GenericMeasureInfo], so you cannot use this as a filename.
$STATS2 = Import-CSV $Stats
# Are you trying to import $filename a third time here? You can't use $Stats because it's a [GenericMeasureInfo] object, not a [string].
Based on your code, it looks like you're trying to import a CSV with a filename of type [Microsoft.PowerShell.Commands.GenericMeasureInfo]. Filenames are strings. Also, why are you importing $filename twice? Would recommend importing it once, then operating off of the variable you saved it to.
Would also recommend changing your while loop to 1..211 | ForEach-Object. That way you won't have to worry about whether $i is less than or equal to your static number. Not a huge deal, but would make the code a little more readable.
TL;DR
Import-CSV $Stats is the problem. $Stats is not a valid filename, so you can't open/import it.
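A rough sketch of how that part of the loop might be reworked: import each file once, keep the Measure-Object result in a variable, and attach the extra values with Add-Member instead of trying to re-import the stats object. $stdDev and $vza stand in for the values you already compute inside the loop; the paths and the column name 'td' are taken from your script:
1..211 | ForEach-Object {
    $filename = "c:\zMFM\z550Output\20dSummer\fixed20dSum550Output$_.csv"
    if (Test-Path $filename) {
        # Import the file once and reuse the variable
        $csv = Import-Csv $filename
        # Basic statistics for column 'td'
        $stats = $csv | Measure-Object td -Average -Maximum -Minimum
        # Attach the extra values to the stats object instead of re-importing it
        $stats | Add-Member -MemberType NoteProperty -Name StdDev -Value $stdDev
        $stats | Add-Member -MemberType NoteProperty -Name VZA -Value $vza
        # Emit the enriched object so everything is collected into one CSV
        $stats
    }
} | Export-Csv -NoTypeInformation "c:\zMFM\z550Output\20dSummer\20dSum550Statistics.csv"
Note this writes one statistics file for all 211 inputs; if you need one output per input, move the Export-Csv inside the if block and vary the path.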

Powershell Storing sqlResults into an array

I have a PowerShell script that queries the database and returns two columns which form a key-value pair. Let's call them a & b.
How do I store this in a map to be used at a later date? Below is my SQL code; it runs and prints the columns out to the screen.
$Connection = New-Object MySql.Data.MySqlClient.MySqlConnection
$Connection.ConnectionString = $ConnectionString
$Connection.Open()
$Command = New-Object MySql.Data.MySqlClient.MySqlCommand($Query, $Connection)
$DataAdapter = New-Object MySql.Data.MySqlClient.MySqlDataAdapter($Command)
$DataSet = New-Object System.Data.DataSet
$RecordCount = $dataAdapter.Fill($dataSet, "data")
$DataSet.Tables[0]
I'm just not sure how to store the key-value pairs in a map to be used later. Both columns are numeric.
Thanks.
You want to cycle through each object in the results?
$Records = $DataSet.Tables[0]
$Records.Keys | ForEach-Object {$Records.$_}
This will allow you to iterate through the objects and run some bit of code for each of them.
For instance, I've got a hashtable like this:
$Records = [hashtable]@{'Name'='Stephen';'Hair'='Red'}
The results:
$Records.Keys | ForEach-Object {"object `$Records.$_ = $($Records.$_)"}
>object $Records.Hair = Red
>object $Records.Name = Stephen
If this isn't what you're looking for, please comment so that I can try to help you to a solution.
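If the goal is literally a key/value map built from the two returned columns, a small sketch along these lines should work; I'm assuming the columns are called a and b as in the question, so substitute the real column names:
# Build a hashtable keyed by column 'a' with column 'b' as the value
$map = @{}
foreach ($row in $DataSet.Tables[0].Rows) {
    $map[$row.a] = $row.b
}
# Look the value up later by its key (42 is just an example key)
$map[42]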

Powershell SQL query result not fitting

Hi, I have constructed a script that works fine except for one thing: sometimes the returned string is so long that it doesn't fit in the PowerShell console, and when I later send the text to a RichTextBox I get all the "..." at the end instead of the whole string.
$username = "myaccount"
$sqlconnection = New-Object system.data.sqlclient.sqlconnection
$sqlconnection.ConnectionString ="server=myserver\sccm;database=sccm;trusted_connection=true;"
$sqlconnection.Open()
$sqlcmd = New-Object system.data.sqlclient.sqlcommand
$sqlcmd = $sqlconnection.CreateCommand()
$sqlcmd.CommandText = "SELECT Info from SCCM.dbo.log WHERE Username = '$username'"
$sqlcmd.Connection = $sqlconnection
$data = New-Object system.data.sqlclient.sqldataadapter $sqlcmd
$dataset = New-Object system.data.dataset
$data.Fill($dataset)
$global:result = $dataset.Tables
I cannot specify the -Width parameter anywhere, so I am lost on how to get the full length of the result.
Rather than displaying it in PowerShell, you could save the dataset as a CSV file:
#Fill the dataset with the SQL response. Using [void] redirects console output to null (don't display)
[void]$data.Fill($dataset)
#Pipe the contents of the dataset. Use Select to select all columns/properties excluding those that were created by the DataSet object (not actual data)
#Pipe to Export-CSV to create a CSV file, use -notypeinformation flag to skip Object type information from the file (e.g. System.String etc)
$dataset.Tables[0] | Select * -ExcludeProperty RowError, RowState, HasErrors, Table, ItemArray | Export-CSV -notypeinformation -path C:\Somedir\Somefile.csv