How to have Powershell read multiple values from a single CSV cell?

I am trying to apply permissions to multiple folders at once. I have a .csv file with the folder paths in one column and the security groups in a second column. Most folders will have multiple groups with access. The .csv is in the following format:
folderpath groupname
---------- ---------
C:\Folder1 Group1
C:\Folder2 Group2, Group3
My code will not currently apply the permissions because there are multiple values in some groupname cells. It will apply if only one group is listed.
#Specify CSV location
$csv = import-csv G:\testcsv.csv
#Start loop
foreach($masterlist in $csv)
{
$folderpath = $masterlist.folderpath
$groupname = $masterlist.groupname
#Apply permission to folderpath
add-ntfsaccess -path $folderpath -account $groupname -accessrights modify,synchronize
}
For example, if I run the above code, Group1 will successfully be given permission to C:\Folder1. But C:\Folder2 will have nothing applied because Group2 and Group3 are both in the same cell. Can I adjust my code without having to manually separate hundreds of these cells in the .csv?
Thank you for your help!!!!!

Just create a foreach loop over the groups inside your existing loop :) simplest method
#Start loop
foreach($masterlist in $csv)
{
$folderpath = $masterlist.folderpath
$groupnames = $masterlist.groupname -split "," | ForEach-Object { $_.Trim() } # split on commas and trim any stray spaces
foreach ($group in $groupnames) {
#Apply permission to folderpath
add-ntfsaccess -path $folderpath -account $group -accessrights modify,synchronize
}
}
This is an ad-hoc response, but I think you get the idea.
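To confirm the entries landed, assuming the NTFSSecurity module (the module that provides add-ntfsaccess) is the one in use here, its companion Get-NTFSAccess cmdlet can list what is now on the folder:
# Should now show modify/synchronize entries for both Group2 and Group3
Get-NTFSAccess -Path C:\Folder2 | Select-Object Account, AccessRights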

Related

What is the ItemType for SSRS Catalog Type 14?

SQL Server 2019. I manually uploaded an xlsx file to the server and it was placed in an "Excel Workbooks" group. When I look in the reportserver catalog table I see the Type value is 14. I'm trying to write a PowerShell script to upload a bunch of xlsx files instead of having to do it manually, one by one. I need to know the ItemType. I can use "Resource", but it doesn't upload it to the "Excel Workbooks" group; it uploads it to the "Resources" group. I did list everything in my report server and see the TypeName for the xlsx I manually uploaded is "ExcelWorkbook", but that is not a valid ItemType. Any suggestions? Below is the PowerShell I'm using (I'm still new to PowerShell). Thanks!
$WebServiceUrl = "http://xxxx"
$ReportFolder = "PDF_Reports2"
$SourceDirectory = $PSScriptRoot
$Overwrite = $true
$SSRSProxy = New-WebServiceProxy -Uri $WebServiceUrl'/ReportServer/ReportService2010.asmx?WSDL' -UseDefaultCredential
# LIST ITEMS IN SERVER
#$SSRSProxy.ListChildren("/",$true)
$type = $SSRSProxy.GetType().Namespace
$datatype = ($type + '.Property')
$Property =New-Object ($datatype);
$Property.Name = "MimeType"
$Property.Value = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
$SourceDirectory = "c:\tmp\SSRS\"
$ItemType = "Resource" # Resource works, but it gets put in the Resources group, I want it in the Excel Workbooks group.
$ReportFolder = "/PDF_Reports2"
ForEach ($rdlfile in Get-ChildItem $SourceDirectory -Filter "*.xlsx" | Where-Object { $_.Attributes -ne "Directory" } )
{
$byteArray = [System.IO.File]::ReadAllBytes($rdlfile.FullName)
write-host $rdlfile.FullName
$Warnings = @();
$SSRSProxy.CreateCatalogItem($ItemType, $rdlfile, $ReportFolder, $Overwrite, $byteArray, $Property, [ref]$Warnings)
$warnings.Length
}
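One way to find out which ItemType strings the server accepts is to ask the proxy itself; the ReportingService2010 endpoint exposes a ListItemTypes method, so, assuming the $SSRSProxy created above, a quick check would look something like this:
# Ask the server which catalog item types it supports
$SSRSProxy.ListItemTypes()
# Cross-check against the TypeName the server reports for the manually uploaded workbook
$SSRSProxy.ListChildren("/", $true) | Where-Object { $_.TypeName -eq "ExcelWorkbook" } | Select-Object Name, Path, TypeName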

Transform exported CSV (includes embedded JSON) and save relevant columns and keys in new CSV file - Powershell

I am currently writing a script for the following problem:
Initial Problem
Data was exported from an Audit System into a CSV. The CSV itself consists of several columns, of which one column has JSON data inside. Sadly there aren't many options to influence the export / the structure of the export. Since the amount of data included there is tough to filter and to analyse, the exported CSVs (when needed) need to be transformed so that only relevant columns and JSON keys remain within the new to-be-exported CSV. It has to be a new CSV as the file needs to potentially be shared. A text file to be imported contains the relevant JSON keys that should remain in the to-be-exported CSV.
About the JSON: the keys can vary based on the events that are exported. Let's say there are 3-4 different variants, but the text file to be imported contains, for all 3-4 variants, the relevant subkeys that need to be included as new columns in the export. If a subkey does not exist, it's okay for that particular column to be empty in the export.
Initial Thoughts
a) Import the CSV and the file that is listing the relevant JSON keys that should be kept
b) Expand the JSON
c) Select the JSON entries that are relevant
d) Merge everything into new columns
e) Export again into a new file
What Questions are open / Where are the Problems?
I have written some code (I started my PS experience just 2 days ago) and am wondering about the following:
Are there any recommendations to improve the code? Given my very recent PS adventure, there are probably many obvious things that have to be improved.
Is there a way to make the export go straight into CSV format without the manual -join and then using Out-File? I noticed that for my final test cases (I cannot share those because the data is extremely hard to anonymize) I didn't manage to come up with a delimiter (tried ",", ";" and "`t") that isn't also included in parts of the imported cells. Excel (when importing from text) doesn't seem to have an issue, though, loading and parsing the data as CSV and recognizing the columns and boundaries correctly.
Happy to hear any tips!
Current Code
### Configure variables
$inputPath = "C:\Users\test\Downloads\inputTest.csv"
$formatTemplate = "C:\Users\test\Downloads\templateTest.txt"
$outputPath = "C:\Users\test\Downloads\outputTest.csv"
### Load the columns from template file to perform transformation depending on the required AuditData Fields. The file contains a list of relevant JSON keys from the Audit Data columns
$selectedAuditDataFields = Get-Content $formatTemplate
### Load CSV, select needed columns and expand the JSON entries within the AuditData column
$importCsvCompact = Import-Csv -Path $inputPath -Delimiter "," | Select-Object -Property CreationDate, UserIds, Operations, @{name = "AuditData"; Expression = {$_.AuditData | ConvertFrom-Json }}
### Calculate the number of Rows (import and export CSV have same number of rows) and Columns (3 standard columns plus template columns) for the CSV to be exported
$exportCsvNumberOfRows = $importCsvCompact.Count
$exportCsvNumberOfColumns = $selectedAuditDataFields.Length + 3
### Add header to to-be-exported-CSV
$header = [object[]]::new($exportCsvNumberOfColumns);
$header[0] = "CreationDate"
$header[1] = "UserIds"
$header[2] = "Operations"
for($columnIncrement = 3; $columnIncrement -ne $exportCsvNumberOfColumns; $columnIncrement++) {
$header[$columnIncrement] = ($selectedAuditDataFields[$columnIncrement-3])
}
$toAppend = $header -join ","
$toAppend | Out-File -FilePath $outputPath -Append
### initiate array for each transformed row and initiate counter of current row
$processingRowCounter = 0
### traverse each row of the CSV import and setup the new column structure
### connect the 3 standard columns with a subset of the expanded JSON entries (based on the imported template)
$importCsvCompact | ForEach-Object {
$csvArrayColumn = [object[]]::new($exportCsvNumberOfColumns);
$csvArrayColumn[0] = $importCsvCompact.CreationDate[$processingRowCounter]
$csvArrayColumn[1] = $importCsvCompact.UserIds[$processingRowCounter]
$csvArrayColumn[2] = $importCsvCompact.Operations[$processingRowCounter]
for($columnIncrement = 3; $columnIncrement -ne $exportCsvNumberOfColumns; $columnIncrement++) {
$csvArrayColumn[$columnIncrement] = $importCsvCompact.AuditData.($selectedAuditDataFields[$columnIncrement-3])[$processingRowCounter]
}
$processingRowCounter++
$directExport = $csvArrayColumn -join ","
$directExport | Out-File -FilePath $outputPath -Append
Write-Host "Processed $processingRowCounter Rows..."
}
Testfiles
templateTest.txt
https://easyupload.io/vx7k75
inputTest.csv
https://easyupload.io/ab77q9
Current Version based on Feedback
### Configure variables
$inputPath = "C:\Users\forstchr\Downloads\inputTest.csv"
$formatTemplate = "C:\Users\forstchr\Downloads\templateTest.txt"
$outputPath = "C:\Users\forstchr\Downloads\outputTest.csv"
### Load the columns from template file to perform transformation depending on the required AuditData Fields. The file contains a list of relevant JSON keys from the Audit Data columns
$selectedAuditDataFields = Get-Content $formatTemplate
### Calculate the number of Rows (import and export CSV have same number of rows) and Columns (3 standard columns plus template columns) for the CSV to be exported
$exportCsvNumberOfRows = $importCsvCompact.Count
$exportCsvNumberOfAuditColumns = $selectedAuditDataFields.Length
###Load CSV, select needed columns and expand the JSON entries within the AuditData column
Import-Csv -Path $inputPath -Delimiter "," | Select-Object -Property CreationDate, UserIds, Operations, @{name = "AuditData"; Expression = {$_.AuditData | ConvertFrom-Json }} | % {
[pscustomobject]@{
'CreationDate' = $_.CreationDate
'UserIds' = $_.UserIds
'Operations' = $_.Operations
# the next part is not correct but hopefully displays what I am trying to achieve with the JSON in the AuditData column
for($auditFieldIncrement = 0; $auditFieldIncrement -ne $exportCsvNumberOfAuditColumns; $auditFieldIncrement++) {
'$selectedAuditDataFields[$auditFieldIncrement]' = $_.AuditData.($selectedAuditDataFields[$auditFieldIncrement])
}
}
} | Export-csv $outputPath
I have had to produce a "cleansed" csv file in one project. My general approach was as follows: import the existing csv data, and send it through the pipeline.
ForEach-Object, do some processing, storing the results in variables. The last step in processing creates a hashtable typecast as a pscustomobject, and this result is passed through the pipeline. The output of the second pipeline is fed to Export-Csv. Export-Csv does all the joining and the commas for me, and also encloses the output fields in quotes, making them strings.
Here is a code snippet that illustrates the approach. The cleansing consists of reformatting dates so that they use a standard 14 digit format, and reformatting currency amounts so that they don't contain dollar signs. But that is not relevant to you.
Import-csv checking.csv | % {
$balance += [decimal]$(Get-Amount $_.AMOUNT)
[pscustomobject]@{
'TRNTYPE' = Get-Type $_.AMOUNT
'DTPOSTED' = (Get-Date $_.DATE).Tostring('yyyyMMddHHmmss')
'TRNAMT' = Get-Amount $_.AMOUNT
'FITID' = $fitid++ #this is a stopgap
'NAME' = $_.DESCRIPTION
'MEMO' = $memo
}
} |
Export-csv transactions.csv
Get-Type is a function that yields 'CREDIT' or 'DEBIT' depending on the sign of the amount. Get-Amount is a function that gives a numeric amount without commas and dollar signs. Those functions are defined at the beginning of the script. Note that, when you call a powershell function, there are no parentheses involved. That was a big jolt to me, but it's actually a feature of powershell.
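Building on that approach, the dynamic part of the question (one column per key listed in the template file) can be handled by filling an ordered hashtable in a loop and casting it afterwards, since a hashtable literal can't contain a for loop. A rough sketch, reusing the variable names from the question:
$selectedAuditDataFields = Get-Content $formatTemplate
Import-Csv -Path $inputPath -Delimiter "," | ForEach-Object {
    # start with the three fixed columns
    $row = [ordered]@{
        CreationDate = $_.CreationDate
        UserIds      = $_.UserIds
        Operations   = $_.Operations
    }
    # expand the embedded JSON once per row
    $auditData = $_.AuditData | ConvertFrom-Json
    # one column per template key; a missing key simply leaves the cell empty
    foreach ($field in $selectedAuditDataFields) {
        $row[$field] = $auditData.$field
    }
    [pscustomobject]$row
} | Export-Csv -Path $outputPath -NoTypeInformation
Export-Csv then takes care of the delimiters and quoting, so no manual -join or Out-File is needed.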

Powershell Make a function for directories and add a prefix at the end

Make a function that makes 3 directories with the name John_S as the prefix, appended with the numbers 1, 2, 3.
Example
1. John_S1
2. John_S2
3. John_S3
Use a loop (ForEach)
Use a variable for the number of iterations
What I have so far...
$DirName = "John_S"
function mulcheck {New-item "$DirName"}
$i = 1
foreach($DirName in $DirNames)
{$newname = $DirName Rename-Item $($DirName) $newname $i++}
The easiest way to generate the numbers 1 through 3 is with the .. range operator:
foreach($suffix in 1..3){
mkdir "John_S${suffix}"
}
To make the function re-usable with something other than John_S, declare a [string] parameter for the prefix:
function New-Directories([string]$Prefix) {
foreach($suffix in 1..3){
mkdir "${Prefix}${suffix}"
}
}
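The exercise also asks for a variable controlling the number of iterations; that can simply become a second parameter (the -Count name below is just an illustration):
function New-Directories([string]$Prefix, [int]$Count = 3) {
    foreach ($suffix in 1..$Count) {
        mkdir "${Prefix}${suffix}"
    }
}
# e.g. New-Directories -Prefix "John_S" -Count 3   creates John_S1, John_S2, John_S3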
If I understand your latest comment correctly, you want a function that takes the name of a new folder and checks if a folder with that name already exists in the root path. When that is the case, it should create a new folder with the given name, but with an index number appended to it, so it has a unique name.
For that you can use something like this:
function New-Folder {
[CmdletBinding()]
param (
[Parameter(Mandatory = $false)]
[string]$RootPath = $pwd, # use the current working directory as default
[Parameter(Mandatory = $true)]
[string]$FolderName
)
# get an array of all directory names (name only) of the folders with a similar name already present
$folders = @((Get-ChildItem -Path $RootPath -Filter "$FolderName*" -Directory).Name)
$NewName = $FolderName
if ($folders.Count) {
$count = 1
while ($folders -contains $NewName) {
# append a number to the FolderName
$NewName = "{0}{1}" -f $FolderName, $count++
}
}
# we now have a unique foldername, so create the new folder
$null = New-Item -Path (Join-Path -Path $RootPath -ChildPath $NewName) -ItemType Directory
}
New-Folder -FolderName "John_S"
If you run this several times, you will have created several folders like John_S, John_S1, John_S2, and so on.

Rename Files & Folders Keywords - Using a CSV Look Up File

I would like to rename files and folders based on keywords found in a CSV file.
The CSV holds the search and replace keywords that will make up file and folder names.
Search | Replace
Document | DTX
Processing | PRX
Implementation | IMX
...
Not all the file names include each word in the file name.
Not all the folders will include each word in the folder name.
PowerShell will have to search the child items, i.e. the folder and file names.
If it finds the word (match), substitute from the CSV.
I have looked at these threads to help me:
Using Powershell to recursively rename directories using a lookup file
powershell script to rename all files in directory
http://code.adonline.id.au/batch-rename-files/
I have only managed the snippet below:
$folder = "C:\Folders" #target folder containing files
$csv = "C:\FileNameKeywords.csv" #path to CSV file
cd ($folder);
Import-Csv ($csv) | foreach {
Rename-Item -Path $_.Path -NewName $_.Filename
}
It only replaces one at a time.
Question:
How can I recursively search and replace in file and Folder Names using a CSV as a look up or reference file.
When you have the need to look up values by other values the usual go-to data structure is a dictionary, or in PowerShell terms a hashtable. Read your CSV into a dictionary like this:
$keywords = @{}
Import-Csv $csv | ForEach-Object {
$keywords[$_.Search] = $_.Replace
}
Then traverse your folder tree and build the new filenames by replacing each key with its respective value:
Get-ChildItem $folder -Recurse | ForEach-Object {
$newname = $_.Name
foreach ($word in $keywords.Keys) {
$newname = $newname.Replace($word, $keywords[$word])
}
if ($_.Name -ne $newname) {
Rename-Item -Path $_.FullName -NewName $newname
}
}
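One caveat when folders get renamed too: renaming a parent before its children invalidates the child paths that were enumerated up front. Sorting the items so the deepest paths are processed first sidesteps that; a small variation on the loop above:
Get-ChildItem $folder -Recurse |
    Sort-Object { ($_.FullName -split '\\').Count } -Descending |
    ForEach-Object {
        $newname = $_.Name
        foreach ($word in $keywords.Keys) {
            $newname = $newname.Replace($word, $keywords[$word])
        }
        if ($_.Name -ne $newname) {
            Rename-Item -Path $_.FullName -NewName $newname
        }
    }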
I'll give it a shot. I'm assuming Search and Replace are your headers in this scenario, in addition to your $folder and $csv variables.
$csvobject=import-csv $csv
Foreach($obj in $csvobject){
$search=$obj.search
$replace=$obj.replace
get-childitem -path $folder | where { $_.name -like "*$search*" } | rename-item -newname { $_.name -replace "$search", "$replace" }
}
The -replace operator handles regex, so you will need to make sure any special characters are properly escaped.
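For instance, a keyword containing regex metacharacters (say a hypothetical "C++ Design" entry) would need escaping before it is handed to -replace; [regex]::Escape does that:
# Escape the keyword so characters like + are matched literally
"Hello C++ Design.docx" -replace [regex]::Escape("C++ Design"), "CDX"
# -> Hello CDX.docx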

Powershell - find file by 'file name' and rename based on CSV

I have a set of files (OldName) in a Windows directory that I would like to rename (NewName) based on the following CSV file:
OldName,NewName
Sources/texas play_PGC_Cpgc_entryPoint_Cbp_1.f4v,01 Texas Play.f4v
Sources/texas play_PGC_Cpgc_entryPoint_Dbp_1.f4v,02 First Song.f4v
Sources/texas play_PGC_Cpgc_entryPoint_Ebp_1.f4v,03 Yellow Rose.f4v
I'm not sure how to loop thru the CSV file... finding each file and replacing.
Any thoughts would be appreciated.
First Import Your CSV file into powershell
$AnyVariableName = Import-Csv "$env:USERPROFILE\Desktop\directoryname.txt"
Note: In my example, the path to the CSV file is on my desktop, but it may be different in yours.
Then use a foreach loop to rename the items
foreach ($objName in $AnyVariableName){
Rename-Item $objName.OldName $objName.NewName
}
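Note that the OldName values in this CSV are relative paths (Sources/...), and Rename-Item resolves relative paths against the current location. Joining them onto the folder that contains Sources (the base path below is just a placeholder) keeps the lookup working regardless of where the script runs from:
$base = "C:\path\to\project"   # placeholder: the folder that contains the Sources subfolder
foreach ($objName in $AnyVariableName) {
    Rename-Item -Path (Join-Path $base $objName.OldName) -NewName $objName.NewName
}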
One way to do it is to create two lists and loop though each of them. The CSV file will be a reference list, so we'll grab the contents and convert it from CSV then store it in a variable
$CSVRef = Get-Content "C:\Path\To\CSV.csv" | ConvertFrom-CSV
Then we'll get the list of files whose names you want to change, and loop through each file. From inside the loop you can run another loop to find the current name in your reference list, and then change it to the new name.
Get-ChildItem "C:\path\to\f4v\files" -Filter *.f4v | ForEach-Object {
#Grab the current item in a variable to access it within the second loop
$CurrentFile = $_
$CSVRef | ForEach-Object {
if ($CurrentFile.Name -ilike $_.OldName) {
Rename-Item $CurrentFile.FullName $_.NewName
}
}
}
So during the second loop we compare the file name with every "OldName" item in the CSV file list. If the OldName matches the current file we're looping through, then we run Rename-Item and provide it the NewName. It should automatically rename the file.
Combining both examples works great
$CSVRef = Import-Csv "C:\Temp\Filename.txt"
Get-ChildItem "C:\Temp\FileFolder" -Filter *.pdf | ForEach-Object {
$CurrentFile = $_
ForEach ($objName in $CSVRef) {
if ($CurrentFile.Name -ilike $objName.OLDNAME) {
Rename-Item $CurrentFile.FullName $objName.NEWNAME
}
}
}
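If the CSV grows large, the inner loop can also be replaced with a hashtable lookup, the same dictionary idea used in the keyword-renaming answer above. Since OldName holds a relative path like Sources/..., keying the table on its leaf name keeps the comparison against $_.Name simple; a sketch with the same variable names:
$CSVRef = Import-Csv "C:\Temp\Filename.txt"
# Build a lookup of leaf file name -> new name
$renameMap = @{}
foreach ($objName in $CSVRef) {
    $renameMap[(Split-Path $objName.OldName -Leaf)] = $objName.NewName
}
Get-ChildItem "C:\Temp\FileFolder" -Filter *.f4v | ForEach-Object {
    if ($renameMap.ContainsKey($_.Name)) {
        Rename-Item -Path $_.FullName -NewName $renameMap[$_.Name]
    }
}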