Merging two JSONs in PowerShell

I have the following two objects, which were read from two JSON files using:
$Env = ConvertFrom-Json "$(get-content "C:\chef\environments.json")"
$Roles = ConvertFrom-Json "$(get-content "C:\chef\roles.json")"
Here's the output after conversion:
PS C:\chef> $Env
run_list
--------
{recipe[djin_chef-max_any::default]}
PS C:\chef> $Roles
7-zip             : @{home=%SYSTEMDRIVE%\7-zip}
cookbook_versions :
default           : @{env=development}
modmon            : @{env=dev}
paypal            : @{artifact=%5BINTEGRATION%5D}
seven_zip         : @{url=https://djcm-zip-local/djcm/chef}
task_sched        : @{credentials=XN$q}
windows           : @{password=K1N5}
I need to merge these two JSON objects in PowerShell, and I tried the following:
PS C:\chef> $Roles+$Env
Method invocation failed because [System.Management.Automation.PSObject] does not contain a method named 'op_Addition'.
At line:1 char:1
+ $Roles+$Env
+ ~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (op_Addition:String) [], RuntimeException
+ FullyQualifiedErrorId : MethodNotFound
Why am I getting this error, and is there a more elegant way of doing this if my approach is wrong?

$Env only has one property, so you could add a new member to $Roles:
$Roles | Add-Member -NotePropertyName run_list -NotePropertyValue $Env.run_list
This syntax works in PowerShell v3, but you also listed v2 in your tags, so for v2:
$Roles | Add-Member -MemberType NoteProperty -Name run_list -Value $Env.run_list
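If the objects had more than one property each, a minimal sketch of a more general merge (my addition, not from the question) would be to copy every property of $Env onto $Roles in a loop:
# Copy each property of $Env onto $Roles; add -Force to overwrite
# any property that already exists on $Roles.
foreach ($prop in $Env.PSObject.Properties) {
    $Roles | Add-Member -MemberType NoteProperty -Name $prop.Name -Value $prop.Value
}
# Re-serialize the merged object back to JSON if needed.
$Roles | ConvertTo-Json -Depth 10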

Related

PowerShell 7 Remoting from Azure DevOps Pipeline

We were using Azure DevOps Pipelines and PowerShell Remoting to execute PS1 scripts:
trigger: none
jobs:
- job: PSRemoting
  timeoutInMinutes: 5760
  pool:
    name: 'DevOps-Agent2-VM'
  steps:
  - checkout: none
  - task: PowerShellOnTargetMachines@3
    displayName: 'PSRemoting'
    inputs:
      Machines: '$(VM-PublicIP)'
      UserName: '$(VM-UserName)'
      UserPassword: '$(VM-Password)'
      InlineScript: |
        . C:\Scripts\Test-Parallel.ps1
We have added the -Parallel feature, which requires PowerShell 7, and our scripts are failing now:
2022-11-02T13:37:18.3246154Z ForEach-Object : Parameter set cannot be resolved using the specified named parameters.
At line:1 char:1
+ & 'C:\windows\System32\WindowsPowerShell\v1.0\powershell.exe' -NoLogo ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (ForEach-Object ...med parameters.:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
At C:\Scripts\Test-Parallel.ps1:3 char:10
+ $items | ForEach-Object -Parallel {
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : MetadataError: (:) [ForEach-Object], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : AmbiguousParameterSet,Microsoft.PowerShell.Commands.ForEachObjectCommand
##[error]Non-Zero exit code: '1' for ComputerName: 'VM1'
##[error]Atleast one remote job failed. Consult logs for more details. ErrorCodes(s): 'RemoteDeployer_NonZeroExitCode'
##[section]Finishing: PSRemoting
Here is Test-Parallel.ps1:
$items = 1..100
$items | ForEach-Object -Parallel {
    Write-Host "$(Get-Date) | Sleeping for 1 second..."
    Start-Sleep 1
} -ThrottleLimit 10
How can we force PowerShell to run version 7 with our Azure DevOps Pipeline task?
Try this in your Test-Parallel.ps1 script:
workflow Test-PipelineDeploy {
    $items = 1..100
    # In a workflow the throttle limit goes on the foreach statement itself,
    # and Write-Host is not available as a workflow activity, so use Write-Output.
    foreach -Parallel -ThrottleLimit 10 ($item in $items) {
        Write-Output "$(Get-Date) | Sleeping for 1 second..."
        Start-Sleep 1
    }
}
Then call Test-PipelineDeploy
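If PowerShell 7 is installed on the target VM, another option worth trying (a sketch, assuming the default install path; not part of the original answer) is to invoke pwsh explicitly from the InlineScript so that ForEach-Object -Parallel is available to the existing script:
# InlineScript body: run the script under the PowerShell 7 engine instead of
# Windows PowerShell 5.1. Assumes pwsh.exe is at its default install location.
& 'C:\Program Files\PowerShell\7\pwsh.exe' -NoProfile -File 'C:\Scripts\Test-Parallel.ps1'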

JSON PowerShell memory issue

I use a command to read a JSON file; this all works perfectly until the file becomes large.
I currently have a JSON file of about 1.5 GB. I read the file in PowerShell using the following command:
get-content -Path C:\TEMP\largefile.json | out-string | ConvertFrom-Json
It returns the following error:
out-string : Exception of type 'System.OutOfMemoryException' was thrown.
+ ... oices = get-content -Path C:\TEMP\largefile.json | out-string | Conve ...
+ ~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Out-String], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.OutStringCommand
I've increased the memory as shown here:
get-item wsman:localhost\Shell\MaxMemoryPerShellMB
WSManConfig: Microsoft.WSMan.Management\WSMan::localhost\Shell
Type Name SourceOfValue Value
---- ---- ------------- -----
System.String MaxMemoryPerShellMB 8096
Any ideas on how to process this?
Edit (additions based on comments):
When I remove the out-string I get this error:
ConvertFrom-Json : Exception of type 'System.OutOfMemoryException' was thrown.
+ ... oices = get-content -Path C:\TEMP\largefile.json | ConvertFrom-Json ...
+ ~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Out-String], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.OutStringCommand
The Powershell version that I have is: 5.1.17763.1490
The file contains multiple columns regarding PDF files. These files are exported via an API into JSON, so it contains file metadata such as the owner and creation date, but also the actual PDF file in the Body column, which will later be decoded into an actual PDF file. The structure is as follows:
[{"Id":"ID","ParentId":"parent","Name":"filename","OwnerId":"owner","CreatedDate":"date","Body":"*******"}
{"Id":"ID","ParentId":"parent","Name":"filename","OwnerId":"owner","CreatedDate":"date","Body":"*******"}
{"Id":"ID","ParentId":"parent","Name":"filename","OwnerId":"owner","CreatedDate":"date","Body":"*******"}
{"Id":"ID","ParentId":"parent","Name":"filename","OwnerId":"owner","CreatedDate":"date","Body":"*******"}
{"Id":"ID","ParentId":"parent","Name":"filename","OwnerId":"owner","CreatedDate":"date","Body":"*******"}
]
Thanks for the details.
For this issue I would try to convert each line separately and stream it through your process:
Get-Content C:\TEMP\largefile.json | ForEach-Object {
    $line = $_.Trim().TrimStart('[').TrimEnd(']')
    if ($line) { $line | ConvertFrom-Json }
}
As already suggested, I wouldn't be surprised if these memory issues didn't appear in PowerShell (Core); if possible, I recommend giving that a try as well.
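Building on that, here is a sketch of how each record could be processed as it streams past, so that no record has to stay in memory (the output folder and the assumption that Body is Base64-encoded are mine, not from the post):
# Convert one record at a time and write the decoded Body straight to disk.
# Create C:\TEMP\pdf first; adjust the path and the Base64 assumption to your data.
Get-Content C:\TEMP\largefile.json | ForEach-Object {
    $line = $_.Trim().TrimStart('[').TrimEnd(']')
    if (-not $line) { return }          # skip lines that held only the brackets
    $record = $line | ConvertFrom-Json
    $outFile = Join-Path 'C:\TEMP\pdf' ($record.Name + '.pdf')
    [System.IO.File]::WriteAllBytes($outFile, [Convert]::FromBase64String($record.Body))
}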

Do Azure workflows need additional authentication considerations?

I'm unable to deploy availability sets in parallel from a simple table, using PowerShell ISE, onto my MSDN subscription.
Table
Type   RG   Name                 Loc
AvSet  NLG  NLGUTCDCPWFEAVL01    eastus2
AvSet  NLG  NLGUTCDCPAPPAVL01    eastus2
AvSet  NLG  NLGUTCDCPCCDBAVL01   eastus2
This works when executed without a workflow.
$c=Import-Csv C:\Users\ayanm\Downloads\NLG.csv|? type -eq 'AVSet'
foreach ($b in $c)
{New-AzureRmAvailabilitySet -ResourceGroupName $b.RG -Name $b.name -Location $b.loc}
But when I try to put it in a workflow, it doesn't work:
Workflow Deploy-AVSet
{$c=Import-Csv C:\Users\ayanm\Downloads\NLG.csv|? type -eq 'AVSet'
foreach -Parallel ($b in $c)
{New-AzureRmAvailabilitySet -ResourceGroupName $b.RG -Name $b.name -Location $b.loc}
}
Error:
Microsoft.PowerShell.Utility\Write-Error : Run Login-AzureRmAccount to login.
At Deploy-AVSet:4 char:4
+ CategoryInfo : NotSpecified: (:) [Write-Error], RemoteException
+ FullyQualifiedErrorId : System.Management.Automation.RemoteException,Microsoft.PowerShell.Commands.WriteErrorCommand
Checked PowerShell version: 5.1. Updated all modules. Rebooted the computer. Is this an unsupported workflow activity?
https://blogs.technet.microsoft.com/heyscriptingguy/2013/01/02/powershell-workflows-restrictions/
The Login-AzureRmAccount cmdlet doesn't accept an MSDN (Microsoft account) credential object, so I added an O365 account as an owner of the subscription and am now able to deploy in parallel.
Workflow Deploy-AVSet
{$c=Import-Csv C:\Users\ayanm\Downloads\NLG.csv|? type -eq 'AVSet'
$cred = New-Object System.Management.Automation.PSCredential "name@domain.onmicrosoft.com",$(ConvertTo-SecureString "Password" -AsPlainText -Force)
foreach ($b in $c)
{AzureRM.Resources\Login-AzureRmAccount -Credential $cred
New-AzureRmAvailabilitySet -ResourceGroupName $b.RG -Name $b.name -Location $b.loc -PlatformFaultDomainCount $b.faultdomain -PlatformUpdateDomainCount $b.UpdateDomain
}
}
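A sketch of a variation that avoids embedding the plaintext password in the script (the Credential parameter and the Get-Credential prompt are my additions, not from the answer): pass the credential into the workflow as a parameter and log in inside each branch as before.
Workflow Deploy-AVSet
{
    param([System.Management.Automation.PSCredential]$Credential)
    $c = Import-Csv C:\Users\ayanm\Downloads\NLG.csv | ? type -eq 'AVSet'
    foreach -Parallel ($b in $c)
    {
        # Each parallel branch gets its own session, so log in per branch.
        AzureRM.Resources\Login-AzureRmAccount -Credential $Credential
        New-AzureRmAvailabilitySet -ResourceGroupName $b.RG -Name $b.name -Location $b.loc
    }
}
Deploy-AVSet -Credential (Get-Credential)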

Issue with Get-Shortcut function not loading in separate script

So I have the following script I am trying to run, which keeps erroring out:
. .\stshortcut.ps1 |
Get-Shortcut . |
Where Target -eq "cmd.exe" |
%{$myPath, $myNewName = $null;
  Write-Warning "Processing $($_.Link)";
  If (-Not (Test-Path .\BadShortcuts -PathType Container)) {New-Item -WhatIf -ItemType Directory BadShortcuts | Out-Null};
  [string]$myPath = $_.Arguments.Split()[-1] -replace '"';
  [string]$myNewName = $_.Link -replace "\.lnk$";
  Rename-Item -WhatIf -Force -Path $myPath -NewName $myNewName;
  (Get-Item -Force $myNewName).Attributes = '';
  Move-Item -WhatIf $_.LinkPath .\BadShortcuts;}
The error I get is as follows:
The term 'Get-Shortcut' is not recognized as the name of a cmdlet,
function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and try
again.
At C:\Shared\APPS\FixShortcutX2.ps1:1 Char 13
+ Get Shortcut <<<<< . |
+ CategoryInfo : ObjectNotFound: (Get-Shortcut:String)[],
CommandNotFoundException
+ FullyQualifiedErrorID : CommandNotFoundException
The stshortcut.ps1 script has the Get-Shortcut and Set-Shortcut functions, and they are called to do exactly that. I got this script from
https://www.reddit.com/r/PowerShell/comments/4su2jg/zeroday_malware_renamed_folders_on_a_shared_drive/
which is an answer script to fix a macro virus from a Word doc attachment (sent from a spoofed email address).
Any assistance is GREATLY appreciated.
EDIT x2: After some further helpful advice and editing, I am now receiving the following:
Where-Object : Cannot bind parameter 'FilterScript' . Cannot convert the "Target" value of type "System.String" to type "System.Management.Automation.ScriptBlock".
At C:\Shared\Apps\FixShortcutX2.ps1:3 char:6
+ Where <<<<< Target -eq "cmd.exe" |
+CategoryInfo : InvalidArgument: (:) [Where-Object], ParameterBindingException
+ FullyQualifiedErrorId : CannotConvertArgumentNoMessage,Microsoft.PowerShell.Commands.WhereObjectCommand
You need to dot-source stshortcut.ps1 before you can use Get-Shortcut.
Change the first line:
.\stshortcut.ps1 |
To
. .\stshortcut.ps1
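A minimal sketch of how the corrected opening might look (the script-block form of Where-Object is my addition; it also sidesteps the EDIT x2 error, since the simplified Where Target -eq syntax needs PowerShell v3):
# Dot-source the helper script on its own line so Get-Shortcut is defined,
# then start the pipeline with Get-Shortcut itself.
. .\stshortcut.ps1
Get-Shortcut . |
Where-Object { $_.Target -eq "cmd.exe" } |
%{ Write-Warning "Processing $($_.Link)" <# ...rest of the original loop unchanged... #> }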

PowerShell - Where-Object : A positional parameter cannot be found that accepts argument 'System.Object[]'

I want to add the Exchange profile pictures to our users in Active Directory.
I tried to do this in PowerShell, but because I'm pretty new to PowerShell I'm already stuck.
My Script:
$Name = @()
$Photo = @()
Import-Csv C:\temp\adusers.csv |
ForEach-Object {
$Name += $_.DisplayName
}
Import-Csv C:\temp\photolinks.csv |
ForEach-Object {
$Photo += $_.PSChildName
}
$Name | ForEach-Object {
Where $Photo -Like "*$Name*" {
Set-UserPhoto "$Name" -PictureData ([System.IO.File]::ReadAllBytes("F:\1 path\" + $Photo))
}
}
I created a PS script to get all our AD users beforehand, and I also created a list of our user photos with gci. I exported those results to CSV, and I now want to import that data and tell PowerShell to add a photo with Set-UserPhoto for every user in the adusers list.
The error message I got is the following:
Where-Object : A positional parameter cannot be found that accepts argument 'System.Object[]'.
At C:\Users\user1\Desktop\setuserphoto.ps1:14 char:13
+ Where $Photo -Like "*$Name*" {
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Where-Object], ParameterBindingException
+ FullyQualifiedErrorId : PositionalParameterNotFound,Microsoft.PowerShell.Commands.WhereObjectCommand
Can anyone help me here?
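For illustration only, here is a sketch of one way the matching loop could be written without nesting a bare Where inside ForEach-Object (paths and column names are taken from the question; this is untested):
# Read both lists up front.
$names  = (Import-Csv C:\temp\adusers.csv).DisplayName
$photos = (Import-Csv C:\temp\photolinks.csv).PSChildName

foreach ($name in $names) {
    # Find the photo file whose name contains this display name.
    $match = @($photos | Where-Object { $_ -like "*$name*" })
    if ($match.Count -eq 1) {
        Set-UserPhoto -Identity $name -PictureData ([System.IO.File]::ReadAllBytes("F:\1 path\" + $match[0]))
    }
}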