I am using PowerShell 4 to make a series of web requests. From one call I get a generic array - for the sake of discussion it looks like this
$data = '[{"Id":"1","Name":"One"},{"Id":"2","Name":"Two"}]'
I am trying to parse this data to pull out the Name properties. However, when I use the following call it writes a line with Name and no information under it:
$data | ConvertFrom-Json | Select-Object Name
But if I save the output to an intermediate variable first:
$o1 = $data | ConvertFrom-Json
$o1 | Select-Object Name
I get the proper output.
The object types are different but I don't understand why. Here's output from relevant Get-Member calls:
$data | ConvertFrom-Json | gm
TypeName: System.Object[]
and
$o1 | gm
TypeName: System.Management.Automation.PSCustomObject
Can anyone help me understand what I'm doing wrong in terms of my collection management? I'd like to be able to do this in one statement.
It seems like the parentheses are needed for some mysterious reason, as the OP pointed out in the comments. Adding parentheses was the fix for me as well.
I am not sure why this breaks without parentheses, but I can confirm that the same code without them is not an issue when executed in PowerShell Core.
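For completeness, here is the one-statement version of the fix, using the sample data from the question:
# Wrapping the conversion in parentheses completes that part of the pipeline first,
# so Select-Object receives the individual objects rather than a single array
($data | ConvertFrom-Json) | Select-Object Name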
Related
Fairly new to PowerShell, and I have given myself a bit of a challenge which I think should be possible; I'm just not sure about the best way to go about it.
We have a user with a CSV that has a large number of columns (it can vary from 20 to 50) and anywhere between 1 and 10,000 rows. The data is, say, ClientName,Address1,Address2,Postcode etc. (although these can vary wildly depending on the source of the data - external companies). This needs importing into a system using a pre-built routine which looks at the file and expects the database column headers as the CSV headers, so say ClientDisplay,Ad_Line1,Ad_Line2,PCode etc.
I was thinking along the lines of a generic PowerShell 'mapping' form which could read the headers from ExternalSource.csv and either a DatabaseHeaders.csv (or a direct SQL query lookup), display them as columns in a form, and let you highlight one from each column and click a 'link' button. Once you have been through all the columns in ExternalSource.csv, a 'generate csv' button would take the mapped headers and append the correct data columns from ExternalSource.csv.
Am I barking up the wrong tree completely by trying to use PowerShell? At the moment it's a very time-consuming process, so I'm just trying to make life easier for the users.
Any advice appreciated..
Thanks,
Jon
You can use the Select-Object cmdlet with dynamic columns (calculated properties) to shape the data into the form you need.
Something like:
Import-Csv -Path 'source.csv' |
    Select-Object Id, Name, @{ Name = 'Ad_Line1'; Expression = { $_.Address1 } } |
    Export-Csv -Path 'target.csv' -NoTypeInformation
In this example, @{ Name='Ad_Line1'; Expression={ $_.Address1 } } is a dynamic column that creates a column named Ad_Line1 with the value of Address1.
It is possible to read the column mappings from a file; you will have to write some code to read the file, select the properties, and create the format.
A very simple solution could be to read the Select-Object part from another script file, so you can differentiate that part for each import.
A (simple, naive, low-performance) solution could look like this (untested code):
# Read the input file
$sourceData = Import-Csv -Path $inputFile

# Read the source and target column names from the mapping file
$mappings = Import-Csv -Path $mappingFile | Select-Object Source, Target

# Apply the transformations: add an extra column for each mapping
$transformed = $sourceData
foreach ($mapping in $mappings) {
    $transformed = $transformed | Select-Object *, @{ Name = $mapping.Target; Expression = { $_.$($mapping.Source) } }
}

# Export the transformed data
$transformed | Export-Csv -Path $outputFile -NoTypeInformation
Alternatively, it is possible to convert the data into XML with Import-Csv | Export-CliXml, apply an XSLT template to the XML to perform the transformation, and then save the XML objects back to CSV with Import-CliXml | Export-Csv.
See this blog by Scott Hanselman on how you can use XSLT with PowerShell.
I want to load a JSON file and show it in a powershell GridView. I was hoping this would work:
'[{"a":1,"b":2},{"a":3,"b":4},{"a":5,"b":6}]' | ConvertFrom-Json | Out-GridView
But that just shows me this unhelpful view:
How can I transform the list into something the grid view understands?
('[{"a":1,"b":2},{"a":3,"b":4},{"a":5,"b":6}]' | ConvertFrom-Json) | Out-GridView
# or
$converted = '[{"a":1,"b":2},{"a":3,"b":4},{"a":5,"b":6}]' | ConvertFrom-Json
$converted | Out-GridView
This is a peculiarity of ConvertFrom-Json and anything that uses it implicitly (like Invoke-RestMethod). It doesn't seem to pass objects along the pipeline as you would expect, so you must complete the pipeline to get the objects, and then use them afterwards.
One way to do that is to assign it to a variable; another way is to wrap it in parentheses ( ).
I am not certain why this is the case, but I imagine it's an implementation detail about what it does internally and how it returns its objects.
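A small experiment on Windows PowerShell seems to back this up (a sketch; the $json variable name is just for illustration, and results may differ on PowerShell Core):
# ConvertFrom-Json appears to emit the whole array as one object instead of
# enumerating it, so downstream cmdlets see a single System.Object[]
$json = '[{"a":1,"b":2},{"a":3,"b":4},{"a":5,"b":6}]'
($json | ConvertFrom-Json | Measure-Object).Count    # 1 - one array object
(($json | ConvertFrom-Json) | Measure-Object).Count  # 3 - parentheses force enumeration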
I was trying to see if I could dig into this more, using ForEach-Object to see what was going wrong, but instead it just worked. So here's another way to get it working, still in a single pipeline (by using a superfluous ForEach-Object):
'[{"a":1,"b":2},{"a":3,"b":4},{"a":5,"b":6}]' | ConvertFrom-Json | ForEach-Object { $_ } | Out-GridView
# or, the shorter alias version
'[{"a":1,"b":2},{"a":3,"b":4},{"a":5,"b":6}]' | ConvertFrom-Json | % { $_ } | ogv
I'm trying to cycle through a csv and replace any values in a column named Enabled from True to A.
Import-Csv .\test.csv | Where-Object {$_.Enabled -eq 'True'} --> what goes here to replace 'True' with 'A'?
Where-Object acts like a filter, so the rows (objects) that get passed to the rest of the pipeline will only be the ones where Enabled is True, which will prevent you from including the others in your output file (I'm assuming you want a complete file at the end).
So I would recommend using ForEach-Object and then modifying based on a condition inside there, but still passing each object through (modified or not):
Import-Csv .\test.csv | ForEach-Object {
if ($_.Enabled -eq 'True') {
$_.Enabled = 'A'
}
$_
} | Export-Csv .\test-modified.csv -NoTypeInformation
briantist's answer works just fine. If you really wanted to get crazy with it, you could create an Excel COM object, select the workbook/sheet, select the entire "Enabled" column, snip out the empty cells and column header, then loop through and essentially do the same thing briantist described. This way you can also do things like add conditional formatting, etc. It just depends on what you are trying to do.
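For what it's worth, a rough, untested sketch of that COM approach might look like the following (the file path, sheet index, and the assumption that the data was saved as an .xlsx with "Enabled" as a header in row 1 are all illustrative, not from the original question):
# Hypothetical Excel COM automation sketch - adjust the path and ranges to suit
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$workbook  = $excel.Workbooks.Open('C:\temp\test.xlsx')
$worksheet = $workbook.Worksheets.Item(1)

# Find the "Enabled" column by scanning the header row
$lastColumn    = $worksheet.UsedRange.Columns.Count
$enabledColumn = (1..$lastColumn | Where-Object { $worksheet.Cells.Item(1, $_).Text -eq 'Enabled' })[0]

# Replace 'True' with 'A' in that column, skipping the header row
$lastRow = $worksheet.UsedRange.Rows.Count
for ($row = 2; $row -le $lastRow; $row++) {
    if ($worksheet.Cells.Item($row, $enabledColumn).Text -eq 'True') {
        $worksheet.Cells.Item($row, $enabledColumn).Value2 = 'A'
    }
}

$workbook.Save()
$workbook.Close()
$excel.Quit()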
I'm very new to PowerShell, and I'm trying to convert some older batch files into PowerShell and add some features.
At the moment I have a CSV file which I've used in the past as a sort of "environment" file; previously I would run batch jobs against this CSV file.
I have a line
Import-Csv "csvfile" | select-object -property * | out-gridview -passthru
The CSV file is built something like:
Name,location,folder
Test,e,Testsite
Test1,c,windows
test2,c,temp
Basically I want to select one of the rows, click OK, and assign the 3 items to variables: $foldername, $driveLetter, $destinationDirectory.
I've looked high and low and I can't seem to manage it. I did find one example on StackOverflow which I shamelessly copied, massaged, and got to work... but that gridview is prebuilt by the OP of that post and doesn't have the things that piping to Out-GridView -PassThru gives you (filter box and scroll bar). I was able to assign variables using that method, but my CSV is pretty big, and I want the grid to auto-size itself and keep the filtering and scrolling.
You need to use the -OutputMode Single parameter of Out-GridView to restrict the selection to a single item from the gridview.
Import-Csv "csvfile" |
select-object -property * |
out-gridview -OutputMode Single -Title 'Select a row' |
ForEach-Object {
$foldername,$driveLetter,$destinationDirectory = $_.Name,$_.location,$_.folder
}
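As a hypothetical follow-up (the Join-Path call and the resulting path layout are just illustrative), picking the first sample row leaves the variables ready to use:
# If the 'Test,e,Testsite' row was selected above, then $foldername = 'Test',
# $driveLetter = 'e' and $destinationDirectory = 'Testsite'
$destinationPath = Join-Path -Path "$($driveLetter):\" -ChildPath $destinationDirectory
Write-Host "Folder '$foldername' maps to '$destinationPath'"   # Folder 'Test' maps to 'e:\Testsite'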
I am assuming this warning is causing the problem:
WARNING: GeoReplicationEnabled property will be deprecated in a future
release of Azure PowerShell. The value will be merged into the
AccountType property
because when I run this command
Get-AzureWebsite | export-csv -Path "C:\Users\km\Desktop\AzureProject\Hello Pay-As-You-Go-Website.csv"
my CSV file is totally fine
So the problem I am having is this:
When I execute this command
Get-AzureStorageAccount | Format-Table -Property StorageAccountName, Location, AccountType, StorageAccountStatus
The result is like this
StorageAccountName Location AccountType  StorageAccountStatus
------------------ -------- ------------ --------------------
HelloSushi         East US  Standard_GRS Created
WARNING: GeoReplicationEnabled property will be deprecated in a future
release of Azure PowerShell. The value will be merged into the
AccountType property.
and when I extend that command to export the result to CSV, like this:
Get-AzureStorageAccount | Format-Table -Property StorageAccountName, Location, AccountType, StorageAccountStatus | export-csv -Path "C:\Users\km\Desktop\AzureProject\Susco Pay-As-You-Go-Storage.csv"
but when I check the CSV file, it does not make sense at all; it is not the same output.
So, I would like the CSV to show the result exactly as it appears when I run this:
Get-AzureStorageAccount | Format-Table -Property StorageAccountName, Location, AccountType, StorageAccountStatus
How can I do that?
Try Out-File instead of Export-Csv; it gives exactly the same output as the console.
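A quick sketch of that suggestion (the output path is just an example); note this produces a plain-text rendering of the table, not a structured CSV:
Get-AzureStorageAccount |
    Format-Table -Property StorageAccountName, Location, AccountType, StorageAccountStatus |
    Out-File -FilePath "C:\Users\km\Desktop\AzureProject\storage-accounts.txt"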
You can't pipe a Format-* cmdlet (Format-Table, Format-List, etc.) to Export-Csv; this link explains why.
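If a real, structured CSV is what you're after, a common workaround (a sketch, not taken from the linked explanation) is to use Select-Object instead of Format-Table so that actual objects, rather than formatting instructions, reach Export-Csv:
Get-AzureStorageAccount |
    Select-Object StorageAccountName, Location, AccountType, StorageAccountStatus |
    Export-Csv -Path "C:\Users\km\Desktop\AzureProject\Susco Pay-As-You-Go-Storage.csv" -NoTypeInformation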