I've been trying to assign a cmdlet to a variable and then be able to inspect the variable's value as it was written in the original cmdlet, as opposed to just executing it or only seeing its usual properties with commands like:
get-variable seevars | fl *
or:
get-childitem -path variable:\seevars | fl *
E.g. I create a variable:
$seevars = get-childitem variable: | where-object -property name -match 'somePrefix*'
After a while, I may forget what $seevars contains, or may want to modify it, and so will want to inspect $seevars, expecting to see the value:
'get-childitem variable: | where-object -property name -match somePrefix*'
Instead I get its properties in their technical format (I assume), from which it will be hard to reconstruct the original, literal cmdlet.
Is this at all possible?
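One approach that might get close, assuming you are free to change how the variable is created in the first place, is to store the command itself as a script block instead of its output; a script block keeps its source text and can still be run on demand (the variable name and pattern below are just the ones from the example):

# Store the command, not its result
$seevars = { get-childitem variable: | where-object -property name -match 'somePrefix*' }

# Inspect the original command text later
$seevars.ToString()

# Execute it when you actually want the results
& $seevars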
I have hundreds of rows in a CSV file which contains JSON data like below. Below is a sample of each row.
{"Id":"value","RecordType":"value","CreationTime":"value","Operation":"value"}
I tried to convert it into CSV as below, but no luck so far.
Expected format of CSV file:
Id RecordType CreationTime Operation
value value value value
$properties = @('Id', 'RecordType', 'CreationTime', 'Operation')
(Get-Content -Path $pathToCsvFile -Raw | ConvertFrom-Json) |
Select-Object -Property $properties |
Export-Csv -NoTypeInformation -Path $pathToNewCsvFile
If someone has an idea about this, please help me. I tried ConvertFrom-Json but it fails with this error:
ConvertFrom-Json : Invalid JSON primitive: "id"
Here are the first two rows of CSV data.
{"Id":"ac325bc9-97f0-4b29-8fc4-90b80b945f6c","RecordType":20,"CreationTime":"2019-09-14T08:07:22","Operation":"AnalyzedByExternalApplication","OrganizationId":"f38a5ecd-2813-4862-b11b-ac1d563c806f","UserType":0,"UserKey":"3fee8456-6d20-4794-8219-5a7c381e965f","Workload":"PowerBI","UserId":"abcd#mail.com","ClientIP":"000.000.50.177","UserAgent":"MSOLAP 15.0 Client","Activity":"AnalyzedByExternalApplication","ItemName":"Other","DatasetName":"XYZ Driven Company","ObjectId":"Other","IsSuccess":true,"RequestId":"6836be8e-6e97-4bc9-a838-bf6e7b71e0c8","ActivityId":"7E92AE6A-F548-448D-93A8-6F5736DEA085"}
{"Id":"3a20c8a9-ef44-483a-b9c0-43e10deae9ae","RecordType":20,"CreationTime":"2019-09-14T08:07:20","Operation":"AnalyzedByExternalApplication","OrganizationId":"f38a5ecd-2813-4862-b11b-ac1d563c806f","UserType":0,"UserKey":"3fee8456-6d20-4794-8219-5a7c381e965f","Workload":"PowerBI","UserId":"abcd#mail.com","ClientIP":"000.000.50.177","UserAgent":"MSOLAP 15.0 Client","Activity":"AnalyzedByExternalApplication","ItemName":"Other","DatasetName":"XYZ Driven Company","ObjectId":"Other","IsSuccess":true,"RequestId":"02e5d772-057b-45b6-ae60-22b7fa610f98","ActivityId":"7E92AE6A-F548-448D-93A8-6F5736DEA085"}
I want this data in another CSV file as below; each value after ":" should go into the CSV as part of the corresponding row.
Id RecordType CreationTime Operation OrganizationId UserType UserKey Workload UserId ClientIP UserAgent Activity ItemName DatasetName ObjectId IsSuccess RequestId ActivityId
ac325bc9-97f0-4b29-8fc4-90b80b945f6c 20 2019-09-14T08:07:22 AnalyzedByExternalApplication f38a5ecd-2813-4862-b11b-ac1d563c806f 0 3fee8456-6d20-4794-8219-5a7c381e965f PowerBI abcd#mail.com 000.000.50.177 MSOLAP 15.0 Client AnalyzedByExternalApplication Other xyz Driven Company Other TRUE 6836be8e-6e97-4bc9-a838-bf6e7b71e0c8 7E92AE6A-F548-448D-93A8-6F5736DEA085
3a20c8a9-ef44-483a-b9c0-43e10deae9ae 20 2019-09-14T08:07:20 AnalyzedByExternalApplication f38a5ecd-2813-4862-b11b-ac1d563c806f 0 3fee8456-6d20-4794-8219-5a7c381e965f PowerBI abcd#mail.com 000.000.50.177 MSOLAP 15.0 Client AnalyzedByExternalApplication Other XYZ Driven Company Other TRUE 02e5d772-057b-45b6-ae60-22b7fa610f98 7E92AE6A-F548-448D-93A8-6F5736DEA085
Here is the raw data from the CSV when opened in a text editor.
"{""Id"":""ac325bc9-97f0-4b29-8fc4-90b80b945f6c"",""RecordType"":20,""CreationTime"":""2019-09-14T08:07:22"",""Operation"":""AnalyzedByExternalApplication"",""OrganizationId"":""f38a5ecd-2813-4862-b11b-ac1d563abchrf"",""UserType"":0,""UserKey"":""3fee8456-6d20-4794-8219-5a7c38abcdfe"",""Workload"":""Pxyswer"",""UserId"":""abcd#mail.com"",""ClientIP"":""123.456.50.177"",""UserAgent"":""MSOLAP 15.0 Client"",""Activity"":""AnalyzedByExternalApplication"",""ItemName"":""Other"",""DatasetName"":""ABCD Driven Company"",""ObjectId"":""Other"",""IsSuccess"":true,""RequestId"":""6836be8e-6e97-4bc9-a838-bf6e7b71e0c8"",""ActivityId"":""7E92AE6A-F548-448D-93A8-6F5736DEA085""}"
If your input file contains just that single example data row, the code you posted will work. If the input file contains multiple lines like that, your code will not work, because the file as a whole would be invalid JSON.
Valid JSON:
{"Id":"value","RecordType":"value","CreationTime":"value","Operation":"value"}
Valid JSON:
[
{"Id":"value","RecordType":"value","CreationTime":"value","Operation":"value"},
{"Id":"value","RecordType":"value","CreationTime":"value","Operation":"value"}
]
Invalid JSON:
{"Id":"value","RecordType":"value","CreationTime":"value","Operation":"value"}
{"Id":"value","RecordType":"value","CreationTime":"value","Operation":"value"}
To convert the latter kind of input data you need to convert each row as a separate JSON document:
$properties = 'Id', 'RecordType', 'CreationTime', 'Operation'
Get-Content 'C:\path\to\input.csv' |
ConvertFrom-Json |
Select-Object $properties |
Export-Csv 'C:\path\to\output.csv' -NoType
To export all input fields except particular ones you'd define the properties to exclude rather than the ones to include:
$exclude = 'foo', 'bar'
Get-Content 'C:\path\to\input.csv' |
ConvertFrom-Json |
Select-Object -Property * -ExcludeProperty $exclude |
Export-Csv 'C:\path\to\output.csv' -NoType
Edit:
Apparently your input file is a CSV with only one column and no header, so you can import it via Import-Csv, but you need to specify the column header yourself. Expand the field to get the individual JSON values, then proceed as described above.
$properties = 'Id', 'RecordType', 'CreationTime', 'Operation'
Import-Csv 'C:\path\to\input.csv' -Header foo |
Select-Object -Expand foo |
ConvertFrom-Json |
Select-Object $properties |
Export-Csv 'C:\path\to\output.csv' -NoType
If you want all JSON values exported, simply omit the Select-Object $properties step.
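A sketch of that full export, still using the same made-up paths and the foo header from above:

Import-Csv 'C:\path\to\input.csv' -Header foo |
Select-Object -Expand foo |
ConvertFrom-Json |
Export-Csv 'C:\path\to\output.csv' -NoType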
I have a simple json file as follows:
[
{"ClientName": "Test Site 1", "ClientID": "000001"},
{"ClientName": "Test Site 2", "ClientID": "000002"},
{"ClientName": "Test Site 3", "ClientID": "000003"}
]
When I use the following PowerShell command:
ConvertFrom-Json (Get-Content TestSites.json -Raw)
I get back a System.Object[]. This doesn't allow me to pipe the output to another function I have which accepts "ClientName" and "ClientID" parameters.
However, when I assign that object to another variable, like this:
$myobj = ConvertFrom-Json (Get-Content TestSites.json -Raw)
$myobj is actually a System.Management.Automation.PSCustomObject which is capable of being passed to my function.
How can I just pipe the results of the original command without having to assign it to another variable first?
I hope that makes sense.
Your JSON is an array, correct? PowerShell will unroll arrays in the pipeline unless you explicitly change that behavior. Assuming your JSON is stored in the variable $json as a single string, consider the following examples.
ConvertFrom-Json $json | ForEach-Object{$_.gettype().fullname}
System.Object[]
(convertFrom-Json $json) | ForEach-Object{$_.gettype().fullname}
System.Management.Automation.PSCustomObject
System.Management.Automation.PSCustomObject
System.Management.Automation.PSCustomObject
You should be able to wrap the expression in parentheses to change the outcome. In the second example the 3 objects are sent individually down the pipe; in the first, the output is sent as a single object array.
My explanation needs work, but I am sure the cause is how PowerShell deals with arrays and the pipeline; "unrolling" is a common word used to describe it.
So depending on your use case you might just be able to wrap the expression in parentheses so it gets evaluated before the pipe to ForEach-Object in my example.
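Applied to the command from the question it might look something like this, where Invoke-MyFunction is only a stand-in for whatever function actually accepts the ClientName and ClientID parameters:

# The parentheses force the expression to be evaluated first, so the
# resulting array is enumerated and its elements go down the pipe one by one
(ConvertFrom-Json (Get-Content TestSites.json -Raw)) | Invoke-MyFunction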
If you have an array such as System.Object[] you could try piping via foreach:
ConvertFrom-Json (Get-Content TestSites.json -Raw) | %{ $_ | your-function }
If you want to pass the whole array down the pipe as-is, you can try adding a comma (aka a unary comma) before the variable:
,$myobj | whatever
You can probably see how the latter works by comparing the following:
$myobj | Get-Member # Shows the type of the elements of the array
Get-Member -InputObject $myobj # Shows the type of the array
,$myobj | Get-Member # Shows the type of the array
I have a PowerShell script which replaces
"version" : "xxx"
with
"version" : "myBuildNumber"
Now I have encountered that there are multiple of these in my file.
I only want to replace the first occurrence.
I have already tried Powershell - Replace first occurrences of String, but it does not work with my regex.
Here's my script:
(Get-Content myFile.txt) -replace '(?<pre>"version"[\s]*:[\s]*)(?<V>"[^\"]*")', "`$1`"$Env:BUILD_VERSION`"" | Out-File myFile.txt
Since you are patching a JSON file, regex isn't the way to go. Instead you should parse the JSON, access and change the property you want, and write it back. Because only the top-level version property is changed, any "version" values nested deeper in the file are left untouched, which gives you the "first occurrence only" behavior:
$filePath = 'your_Path_To_project.json'
# Read and parse the whole file
$json = (Get-Content $filePath -Raw | ConvertFrom-Json)
# Change only the top-level version property
$json.version = $Env:BUILD_VERSION
# -Depth 10 keeps nested objects intact (ConvertTo-Json's default depth is 2)
$json | ConvertTo-Json -Depth 10 | Set-Content $filePath
I am trying to take a filename such as John_Doe_E_DOB_1/1/46_M(This is the gender)_ID_0000000_IMG_FileName_Date-of-File_1/1/15_Doc-page-1 and create a CSV file to open in Excel, with column headers for Last Name, First Name, MI, ID No, File Name, and Date of File, along with the doc type. Here's my code so far:
Get-ChildItem -Path C:\Users\name\desktop\test -Recurse |
ForEach-Object { $_ | Add-Member -Name "Owner" -MemberType NoteProperty -Value (Get-Acl $_.FullName).Owner -PassThru } |
Sort-Object FullName |
Select-Object BaseName, Name, Owner |
Export-Csv -Force -NoTypeInformation C:\Users\name\desktop\test\thing.csv
All this does is drop that really long file name into the first column, and then add the extension at the end in another column. Example:
John_Doe_E_DOB_1/1/46_M(This is the gender)_ID_0000000_IMG_FileName_Date-of-File_1/1/15_Doc-page-1 would be in column 1, and
John_Doe_E_DOB_1/1/46_M(This is the gender)_ID_0000000_IMG_FileName_Date-of-File_1/1/15_Doc-page-1.txt <----- would be the only difference in column 2
How can I split this up for over a million files of different lengths and sizes, and get it to break into the categories listed above? All help would be greatly appreciated.
I would replace the Select stage of your pipeline with a call to a filter function like this:
filter GenObj {
    # Split the file's base name (no directory path, no extension) on underscores
    $parts = $_.BaseName.Split('_')
    New-Object pscustomobject -Property @{
        Owner         = (Get-Acl $_.FullName).Owner
        FirstName     = $parts[0]
        LastName      = $parts[1]
        MiddleInitial = $parts[2]
        # Fill in the rest
    }
}
Get-ChildItem -Path C:\Users\name\desktop\test -Recurse |
Sort-Object fullname |
GenObj |
Export-Csv -Force -NoTypeInformation C:\Users\name\desktop\test\thing.csv
This will create a new custom object with all the properties on it that correspond to the parts of the filename you want to extract.
This string splitting approach may not work depending on how you handle names with no middle initial.
Also be aware that if you are processing a million files, the use of Sort-Object will cause every single FileInfo object (one for every file) to be buffered in memory so the sort can be performed. You may well run out of memory and the command will fail. I would consider removing Sort-Object in this scenario.
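Without the sort the pipeline streams, so objects flow through one at a time instead of piling up in memory; a sketch of that variant, reusing the GenObj filter and the same paths as above:

Get-ChildItem -Path C:\Users\name\desktop\test -Recurse |
GenObj |
Export-Csv -Force -NoTypeInformation C:\Users\name\desktop\test\thing.csv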