Leading zeros are dropped when using the ConvertTo-Csv cmdlet - csv

Below is the script I am running.
The script works fine and gives the proper output, however it removes the leading zeros from a couple of columns. To export the data to the CSV file I am using
$res.data.Tables[0] | ConvertTo-Csv -NoType | ForEach-Object {$_.Replace('"','')} | Out-File $fileName -Force
Please suggest how to retain the leading zeros (at least two digits) in the integer fields.
param([int]$accountingDay = 1, [string]$outputFolder)
$server = "ADMSQL01"
$db = "cc111db"
function exec-storedprocedure($storedProcName,
[hashtable] $parameters=@{},
[hashtable] $outparams=@{},
$conn) {
function put-outputparameters($cmd, $outparams) {
foreach ($outp in $outparams.Keys) {
$p = $cmd.Parameters.Add("@$outp", (get-paramtype $outparams[$outp]))
$p.Direction=[System.Data.ParameterDirection]::Output
$p.Size=4
}
}
function get-outputparameters($cmd,$outparams){
foreach ($p in $cmd.Parameters) {
if ($p.Direction -eq [System.Data.ParameterDirection]::Output) {
$outparams[$p.ParameterName.Replace("@","")]=$p.Value
}
}
}
function get-paramtype($typename) {
switch ($typename) {
'uniqueidentifier' {[System.Data.SqlDbType]::UniqueIdentifier}
'int' {[System.Data.SqlDbType]::Int}
'xml' {[System.Data.SqlDbType]::Xml}
'nvarchar' {[System.Data.SqlDbType]::NVarchar}
default {[System.Data.SqlDbType]::Varchar}
}
}
$close = ($conn.State -eq [System.Data.ConnectionState]'Closed')
if ($close) {
$conn.Open()
}
$cmd = New-Object System.Data.SqlClient.SqlCommand($sql,$conn)
$cmd.CommandType = [System.Data.CommandType]'StoredProcedure'
$cmd.CommandText = $storedProcName
foreach ($p in $parameters.Keys) {
$cmd.Parameters.AddWithValue("@$p",[string]$parameters[$p]).Direction = [System.Data.ParameterDirection]::Input
}
put-outputparameters $cmd $outparams
$ds = New-Object System.Data.DataSet
$da = New-Object System.Data.SqlClient.SqlDataAdapter($cmd)
[Void]$da.fill($ds)
if ($close) {
$conn.Close()
}
get-outputparameters $cmd $outparams
return @{data=$ds;outputparams=$outparams}
}
# setup the 'framework' to use PowerShell with SQL
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection
# SQL Server connection string
$sqlConnection.ConnectionString = 'server=' + $server + ';integrated security=TRUE;database=' + $db
# execute stored procedure
$res=exec-storedprocedure -storedProcName 'sp_OPTUS_summary' -parameters @{inAccountFieldValues=$null;inAccountViewID=1;inAccountLevel=4;inAccountingDay=$accountingDay;inAccountPeriod=3;inUserGroupID=2} -outparams @{} $sqlConnection
if ($res.data.Tables.Count) {
# store results in file
$curYear = Get-Date -Format yyyy
$curMonth = Get-Date -Format MMM
$curTime = (Get-Date -Format s).Replace(':', ' ')
$fileName = $outputFolder + '\scheduled-' + $curYear + '-' + $curMonth + '-' + $accountingDay + '-' + $curTime + '.csv'
$res.data.Tables[0] | ConvertTo-Csv -NoType | ForEach-Object {$_.Replace('"','')} | Out-File $fileName -Force
}

Integers don't have "leading zeroes". If you want to export formatted output you need to convert the respective fields to formatted strings, e.g. like this:
$res.data.Tables[0] |
Select-Object FieldA, FieldB, @{n='FieldC';e={'{0:d2}' -f $_.FieldC}},
@{n='FieldD';e={'{0:d2}' -f $_.FieldD}}, FieldE, ... |
ConvertTo-Csv -NoType |
...
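To make that concrete, here is a self-contained sketch of the calculated-property approach (object and property names are made up for illustration):

```powershell
# Hypothetical rows; FieldC holds integers whose leading zeros must survive export
$rows = [pscustomobject]@{ FieldA = 'x'; FieldC = 7 },
        [pscustomobject]@{ FieldA = 'y'; FieldC = 42 }

$rows |
    Select-Object FieldA, @{ n = 'FieldC'; e = { '{0:d2}' -f $_.FieldC } } |
    ConvertTo-Csv -NoTypeInformation
# FieldC is now emitted as the zero-padded strings 07 and 42
```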

Related

What is a good way to read data from CSV and convert it to JSON?

I am trying to read the data from a CSV file which has 2,200,000 records using PowerShell and store each record in a JSON file, but this takes almost 12 hours.
Sample CSV Data:
We are only concerned with the first column's values.
Code:
function Read-IPData
{
$dbFilePath = Get-ChildItem -Path $rootDir -Filter "IP2*.CSV" | ForEach-Object{ $_.FullName }
Write-Host "file path - $dbFilePath"
Write-Host "Reading..."
$data = Get-Content -Path $dbFilePath | Select-Object -Skip 1
Write-Host "Reading data finished"
$count = $data.Count
Write-host "Total $count records found"
return $data
}
function Convert-NumbetToIP
{
param(
[Parameter(Mandatory=$true)][string]$number
)
try
{
$w = [int64]($number/16777216)%256
$x = [int64]($number/65536)%256
$y = [int64]($number/256)%256
$z = [int64]$number%256
$ipAddress = "$w.$x.$y.$z"
Write-Host "IP Address - $ipAddress"
return $ipAddress
}
catch
{
Write-Host "$_"
continue
}
}
Write-Host "Getting IP Addresses from $dbFileName"
$data = Read-IPData
Write-Host "Checking whether output.json file exist, if not create"
$outputFile = Join-Path -Path $rootDir -ChildPath "output.json"
if(!(Test-Path $outputFile))
{
Write-Host "$outputFile does not exist, creating..."
New-Item -Path $outputFile -type "file"
}
foreach($item in $data)
{
$row = $item -split ","
$ipNumber = $row[0].trim('"')
Write-Host "Converting $ipNumber to ipaddress"
$toIpAddress = Convert-NumbetToIP -number $ipNumber
Write-Host "Preparing document JSON"
$object = [PSCustomObject]@{
"ip-address" = $toIpAddress
"is-vpn" = "true"
"@timestamp" = (Get-Date).ToString("o")
}
$document = $object | ConvertTo-Json -Compress -Depth 100
Write-Host "Adding document - $document"
Add-Content -Path $outputFile $document
}
Could you please help optimize this code? Is there a better way to do it, or a way to use something like multi-threading?
Here is a possible optimization:
function Get-IPDataPath
{
$dbFilePath = Get-ChildItem -Path $rootDir -Filter "IP2*.CSV" | ForEach-Object FullName | Select-Object -First 1
Write-Host "file path - $dbFilePath"
$dbFilePath # implicit output
}
function Convert-NumberToIP
{
param(
[Parameter(Mandatory=$true)][string]$number
)
[Int64] $numberInt = 0
if( [Int64]::TryParse( $number, [ref] $numberInt ) ) {
if( ($numberInt -ge 0) -and ($numberInt -le 0xFFFFFFFFl) ) {
# Convert to IP address like '192.168.23.42'
([IPAddress] $numberInt).ToString()
}
}
# In case TryParse() returns $false or the number is out of range for an IPv4 address,
# the output of this function will be empty, which converts to $false in a boolean context.
}
$dbFilePath = Get-IPDataPath
$outputFile = Join-Path -Path $rootDir -ChildPath "output.json"
Write-Host "Converting CSV file $dbFilePath to $outputFile"
$object = [PSCustomObject]@{
'ip-address' = ''
'is-vpn' = 'true'
'@timestamp' = ''
}
# Enclose foreach loop in a script block to be able to pipe its output to Set-Content
& {
foreach( $item in [Linq.Enumerable]::Skip( [IO.File]::ReadLines( $dbFilePath ), 1 ) )
{
$row = $item -split ','
$ipNumber = $row[0].trim('"')
if( $ip = Convert-NumberToIP -number $ipNumber )
{
$object.'ip-address' = $ip
$object.'@timestamp' = (Get-Date).ToString('o')
# Implicit output
$object | ConvertTo-Json -Compress -Depth 100
}
}
} | Set-Content -Path $outputFile
Remarks for improving performance:
Avoid Get-Content, especially for line-by-line processing it tends to be slow. A much faster alternative is the File.ReadLines method. To skip the header line, use the Linq.Enumerable.Skip() method.
There is no need to read the whole CSV into memory first. Using ReadLines in a foreach loop does lazy enumeration, i.e. it reads only one line per loop iteration. This works because it returns an enumerator instead of a collection of lines.
Avoid try and catch if exceptions occur often, because the "exceptional" code path is very slow. Instead use Int64.TryParse() which returns a boolean indicating successful conversion.
Instead of "manually" converting the IP number to bytes, use the IPAddress class which has a constructor that takes an integer number. Use its method .GetAddressBytes() to get an array of bytes in network (big-endian) order. Finally use the PowerShell -join operator to create a string of the expected format.
Don't allocate a [pscustomobject] for each row, which has some overhead. Create it once before the loop and inside the loop only assign the values.
Avoid Write-Host (or any output to the console) within inner loops.
Unrelated to performance:
I've removed the New-Item call to create the output file, which isn't necessary because Set-Content automatically creates the file if it doesn't exist.
Note that the output is in NDJSON format, where each line is a self-contained JSON document. In case you actually want this to be a regular JSON file, enclose the output in [ ] and insert a comma , between each row.
Modified processing loop to write a regular JSON file instead of NDJSON file:
& {
'[' # begin array
$first = $true
foreach( $item in [Linq.Enumerable]::Skip( [IO.File]::ReadLines( $dbFilePath ), 1 ) )
{
$row = $item -split ','
$ipNumber = $row[0].trim('"')
if( $ip = Convert-NumberToIP -number $ipNumber )
{
$object.'ip-address' = $ip
$object.'@timestamp' = (Get-Date).ToString('o')
$row = $object | ConvertTo-Json -Compress -Depth 100
# write array element delimiter if necessary
if( $first ) { $row; $first = $false } else { ",$row" }
}
}
']' # end array
} | Set-Content -Path $outputFile
You can optimize the function Convert-NumberToIP like below:
function Convert-NumberToIP {
param(
[Parameter(Mandatory=$true)][uint32]$number
)
# either do the math yourself like this:
# $w = ($number -shr 24) -band 255
# $x = ($number -shr 16) -band 255
# $y = ($number -shr 8) -band 255
# $z = $number -band 255
# '{0}.{1}.{2}.{3}' -f $w, $x, $y, $z # output the dotted IP string
# or use .Net:
$n = ([IPAddress]$number).GetAddressBytes()
[array]::Reverse($n)
([IPAddress]$n).IPAddressToString
}
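For example (input value made up), 3232235777 is 0xC0A80101, so the function yields the dotted form:

```powershell
Convert-NumberToIP -number 3232235777   # -> 192.168.1.1
```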

Data inserted into MySQL through powershell script unexpectedly inserts two rows

I have a working script that inserts data such as the computer name, date, IP address, and two other values that my PowerShell script reads from a txt file.
I don't know why it performs a double insert into the MySQL server, where the first insert is good and the second is missing the data from the txt file.
The script I've posted below successfully inserts data into my database, however it inserts two separate rows: one with the appropriate data in the End1 and End2 columns and one with incorrect data.
Code to capture and insert data:
$txt = "app_log.txt"
function find_nr {
get-content $txt -ReadCount 1000 |
foreach { $_ -match "P3#X" }
}
$string = find_nr
$separator = "\;"
function seperate {
$string.Split($separator,6)
}
function nr_lines {
$i = 99901; seperate | % {$i++;"$($i-1) `t $_"}
}
function find {
$line = $args[0] | Select-String -Pattern "99905" -CaseSensitive
($line.line.split(' ') |Where-Object {$_.Trim() -ne ''})[1]
}
function liczba {
$result = nr_lines
find $result
}
liczba
function find_sn {
get-content $txt -ReadCount 1000 |
foreach { $_ -match "P2#X" }
}
$string_sn = find_sn
$separator_sn = "\/"
function seperate_sn {
$string_sn.Split($separator_sn,20)
}
function sn_lines {
$i = 55501; seperate_sn | % {$i++;"$($i-1) `t $_"}
}
function find_sn {
$line_sn = $args[0] | Select-String -Pattern "55518" -CaseSensitive
($line_sn.line.split(' ') |Where-Object {$_.Trim() -ne ''})[1]
}
function sn {
$result2 = sn_lines
find_sn $result2
}
sn
[System.Reflection.Assembly]::LoadWithPartialName("MySql.Data")
$name = $env:COMPUTERNAME
$ipv4 = Test-Connection -ComputerName (hostname) -Count 1 | foreach { $_.ipv4address }
$time = (Get-Date).ToString('yyyy-MM-dd HH:mm:ss')
$end1 = liczba
$end2 = sn
Start-Sleep -s 5
[string]$sMySQLUserName = 'user'
[string]$sMySQLPW = 'pass'
[string]$sMySQLDB = 'db'
[string]$sMySQLHost = '1.0.0.0'
[string]$sConnectionString = "server="+$sMySQLHost+";port=3306;uid=" + $sMySQLUserName + ";pwd=" + $sMySQLPW + ";database="+$sMySQLDB+";SslMode=none"
$oConnection = New-Object MySql.Data.MySqlClient.MySqlConnection($sConnectionString)
$Error.Clear()
try
{
$oConnection.Open()
}
catch
{
write-warning ("DB error: $sMySQLDB on IP: $sMySQLHost. Error: "+$Error[0].ToString())
}
$oMYSQLCommand = New-Object MySql.Data.MySqlClient.MySqlCommand
$oMYSQLCommand.CommandText="
INSERT into `db.scanner` (name,ipv4,date,raports,serialnumber) VALUES('$name','$ipv4','$time','$end1','$end2')"
$oMYSQLCommand.Connection=$oConnection
$oMYSQLCommand.ExecuteNonQuery()
$oConnection.Close()
The script runs every third day of the week at user login.
This is how it looks in mysql db:
ID NAME IP DATE END1 END2
239 S02C-KASA1 1.0.0.1 2021-09-02 08:15:37 0
238 S02C-KASA1 1.0.0.1 2021-09-02 08:15:37 1476 CAO1802176616FC
From the data it appears the inserts are happening at the same time. Does anyone have an idea what is causing this?
Edit:
This is a app_log.txt data:
Send : P23#sae\
Resp.: ...P2#X0;1;0;1;1;2;20;9;29/23.00/08.00/05.00/00.00/100.00/101.00/101.00/0/0./0./0./0./0./0./0./29930676.49/BDS13287171EF\
Resp.: .
Send : P24#sa9\
Resp.: ..P3#X2015;2;10;1771;59;0;249.00/0.00/0.00/0.00/0.00/0.00/0.00/80\
Resp.: .
find_nr takes this line: P3#X2015;2;10;1771;59;0;249.00/0.00/0.00/0.00/0.00/0.00/0.00/80\
find_sn takes this line:
P2#X0;1;0;1;1;2;20;9;29/23.00/08.00/05.00/00.00/100.00/101.00/101.00/0/0./0./0./0./0./0./0./29930676.49/BDS13287171EF\

How to call a function within a foreach parallel loop

Good evening.
I'm trying to use parallelism for the first time, but I don't understand how to call a function within a ForEach-Object -Parallel loop.
I get a series of errors like this: Cannot bind argument to parameter 'Path' because it is null.
This is what I've done so far:
$FolderPath = "C:\myfolder\"
function AppendLog ($client) {
$so = New-CimSessionOption -Protocol 'DCOM'
$s = New-CimSession -ComputerName $client -SessionOption $so
Add-Content -Path (join-path $folderpath "LOGS.txt") -Value ( (get-date -Format "[yyyy.MM.dd HH:mm:ss]").tostring() + $client + " -PING: OK")
$arch = Get-CimInstance -Query "select * from win32_operatingsystem" -CimSession $s | select -expandproperty osarchitecture
Add-Content -Path (join-path $folderpath "LOGS.txt") -Value ( (get-date -Format "[yyyy.MM.dd HH:mm:ss]").tostring() + $client + " -ARCH:" + $arch )
$lastboot = Get-CimInstance -Query "select * from win32_operatingsystem" -CimSession $s | select -expandproperty lastbootuptime
Add-Content -Path (join-path $folderpath "LOGS.txt") -Value ( (get-date -Format "[yyyy.MM.dd HH:mm:ss]").tostring() + $client + " -BOOT:" + $lastboot )
}
$funcDef = $function:AppendLog.ToString()
$clients = get-content -path (join-path $folderPath "client.txt")
$clients | ForEach-Object -parallel {
if (test-connection $_ -count 2 -Quiet)
{
$function:AppendLog = $using:funcDef
AppendLog ($_)
}
} -throttlelimit 3
Could you explain to me how to pass my path?
My bad on the comment, the error you're getting is most likely coming from your function. The error is being thrown by Join-Path:
PS /> Join-Path $null 'Logs.txt'
Join-Path : Cannot bind argument to parameter 'Path' because it is null.
At line:1 char:11
+ Join-Path $null 'Logs.txt'
+ ~~~~~
+ CategoryInfo : InvalidData: (:) [Join-Path], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.JoinPathCommand
The reason is that $FolderPath doesn't exist in the scope of your parallel loop. $folderpath should be replaced with $using:folderpath inside your function.
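A minimal sketch of that scope rule (path and file names hypothetical):

```powershell
$folderPath = 'C:\myfolder\'
1..3 | ForEach-Object -Parallel {
    # $folderPath by itself would be $null inside this runspace;
    # $using:folderPath copies the caller's value in.
    Join-Path $using:folderPath "log_$_.txt"
}
```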
As a side note, appending to the same file from a parallel execution doesn't seem to be a good idea.
Lastly, I understand if this is meant to test how ForEach-Object -Parallel works, but if the cmdlet allows remote querying / remote execution against multiple hosts at the same time, let the cmdlet handle that for you; it is more efficient.
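For instance, both New-CimSession and Get-CimInstance accept arrays, so a single call can fan out to all hosts (computer names hypothetical):

```powershell
$clients = 'PC01', 'PC02', 'PC03'                  # hypothetical host names
$opt = New-CimSessionOption -Protocol Dcom
$sessions = New-CimSession -ComputerName $clients -SessionOption $opt
Get-CimInstance -ClassName Win32_OperatingSystem -CimSession $sessions |
    Select-Object PSComputerName, OSArchitecture, LastBootUpTime
$sessions | Remove-CimSession
```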
As for the code, this is what I would use with what you already have:
$FolderPath = "C:\myfolder\"
$sessionOption = New-CimSessionOption -Protocol 'DCOM'
$clients = Get-Content -Path (Join-Path $FolderPath -ChildPath "Client.txt")
$results = $clients | ForEach-Object -Parallel {
$out = @{
Time = [datetime]::Now.ToString('[yyyy.MM.dd HH:mm:ss]')
ComputerName = $_
}
if ($ping = Test-Connection $_ -Count 2 -Quiet)
{
$session = New-CimSession -ComputerName $_ -SessionOption $using:sessionOption
$OSInfo = Get-CimInstance -CimSession $session -ClassName win32_operatingSystem
Remove-CimSession $session
}
$out.Ping = $ping
$out.OSArchitecture = $OSInfo.OSArchitecture
$out.LastBoot = $OSInfo.LastBootUpTime
[pscustomobject]$out
} -ThrottleLimit 3
$results | Export-Csv (Join-Path $FolderPath 'LOGS.csv') -NoTypeInformation
This will output an object like this below:
Time ComputerName Ping OSArchitecture LastBoot
---- ------------ ---- -------------- --------
[2021.06.19 20:06:00] ComputerName01 True 64-bit 6/16/2021 11:47:16 AM
[2021.06.19 20:07:00] ComputerName02 False
[2021.06.19 20:08:00] ComputerName03 True 64-bit 6/13/2021 11:47:16 AM
[2021.06.19 20:09:00] ComputerName04 True 64-bit 6/14/2021 11:47:16 AM
[2021.06.19 20:10:00] ComputerName05 True 64-bit 6/15/2021 11:47:16 AM
Which can be exported nicely to a CSV instead of a text file. P.D.: sorry for the syntax highlighting :(

Importing a CDATA string literal from a JSON array into PowerShell

So I have an XML value extractor for a large rule database.
I stored the values in a JSON file with sections for each file, and the values tied to their names.
However, some of the values are CDATA snippets.
I can pull the CDATA out as a string literal to store in the JSON, but I cannot seem to find a way to get PowerShell to let me put it back into the XML.
Cannot set "Value" because only strings can be used as values to set XmlNode properties.
At C:\Users\vagrant\Desktop\RuleSetter.ps1:24 char:21
+ $rule.value = $ruleFileSet. ($singleRuleXmlFile.directory.nam ...
+
+ CategoryInfo : NotSpecified: (:) [], SetValueException
+ FullyQualifiedErrorId : XmlNodeSetShouldBeAString
The Getter looks like this
$directories = Get-ChildItem -dir -path C:\inetpub\wwwroot\
foreach($instName in $directories){
if($instName -notlike 'client' -and $instName -notlike 'bin'-and $instName -notlike 'aspnet_client'-and $instName -notlike 'AccessLibertyLogs'){
$obj = [hashtable]@{}
$head = [hashtable]@{Institution = $instName}
$obj.Add('head',$head)
$files = Get-ChildItem C:\inetpub\wwwroot\$instName\filepath -Include Rules.XML -Recurse
$files |
ForEach-Object {
[xml]$temp = Get-Content -path $_.FullName
[string]$title = $_.Directory.name
[hashtable]$output = @{}
foreach($rule in $temp.RuleCollection.Rules.Rule){
$t = ""
if($rule.value -isnot [string]){
$t = $rule.value.innerXml
}else{
$t = $rule.value
}
$output.Add($rule.name,$t)
}
$obj.Add($title,$output)
}
$transfer = New-Object -TypeName PSObject -Property $obj
$location = "C:\Users\vagrant\desktop\json\" + $instName + ".Json"
$transfer | ConvertTo-Json -Compress | Out-file -FilePath $location
}
}
And the Setter like this
Param(
[string]$location,
[string]$ruleConfigFileLocation)
$allInstConfigFiles = Get-ChildItem -Path C:\Users\vagrant\Desktop\json\
foreach($singleInstFile in $allInstConfigFiles) {
$instaName = $singleInstFile.BaseName
$allRuleXmlFileList = Get-ChildItem C:\inetpub\wwwroot\$instaName\Liberty\Applications\Origination\Configuration -Include Rules.XML -Recurse
ForEach($singleRuleXmlFile in $allRuleXmlFileList) {
[xml]$singleRuleXmlFileContents = Get-Content -path $singleRuleXmlFile.FullName
$singleInstFileContent = Get-Content $singleInstFile.FullName | ConvertFrom-Json
foreach($ruleFileSet in $singleInstFileContent){
foreach($rule in $singleRuleXmlFileContents.RuleCollection.Rules.Rule){
#write-host $ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name)
if($ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name) -as [xml])
{
$rule.value = $ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name).OuterXml
write-host 'skipped'
}else{
write-host 'not skipped'
$rule.value = $ruleFileSet.($singleRuleXmlFile.directory.name).($rule.name)
}
}
}
}
}
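If the goal is to restore an actual CDATA section rather than escaped text, one option (not shown in the code above) is the XmlDocument.CreateCDataSection() method; a minimal sketch with made-up element names:

```powershell
[xml]$doc = '<Rule name="r1"><Value/></Rule>'
# Build a CDATA node from the string pulled out of the JSON
$cdata = $doc.CreateCDataSection('<inner>payload</inner>')
$node = $doc.SelectSingleNode('/Rule/Value')
$null = $node.AppendChild($cdata)
$doc.OuterXml  # <Rule name="r1"><Value><![CDATA[<inner>payload</inner>]]></Value></Rule>
```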

How to include folder name from list dynamically into mailto URL?

I have this script. It scans a folder location and maps the names of the folders to their owners, which are pulled from a CSV file; it then gets each user's email address from AD and adds it to a new column as a clickable mailto: link. This is then all output in a table on an HTML page.
I've gone through a few iterations of this and am now at the final stage.
My issue now is how to pull the folder name into the mailto body HTML.
Import-Module ActiveDirectory
function Encode($str) {
return ( $str -replace ' ', '%20' -replace '\n', '%0A%0D' )
}
function name($filename, $folderowners, $directory, $output){
$subject = Encode("Folder Access Request")
$body = Encode("Please can I have access to the following folder $directory")
$server = hostname
$date = Get-Date -format "dd-MMM-yyyy HH:mm"
$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#ffffff;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:0px;border-style:solid;border-color:black;}"
$a = $a + "TR{border-width:1px;padding-left:5px;border-style:solid;border-color:black;}"
$a = $a + "TD{border-width:1px;padding-left:5px;border-style:solid;border-color:black;}"
$a = $a + "body{ font-family:Calibri; font-size:11pt;}"
$a = $a + "</style>"
$c = " <br></br> Content"
$c = $c +"<p>More Content</p>"
$x = ""
$b = Import-Csv $folderowners
$mappings = @{}
$b | % { $mappings.Add($_.FolderName, $_.Owner) }
Get-ChildItem $directory | where {$_.PSIsContainer -eq $True} | select Name, Path, @{n="Owner";e={$mappings[$_.Name]}}, @{n="Email";e={"mailto:"+((Get-ADUser $mappings[$_.Name] -Properties mail).mail)}} | sort -property Name |
ConvertTo-Html -head $a -PostContent $c | % {
$body = Encode("Please can I have access to the following folder " + $_.Name)
$_ -replace '(mailto:)([^<]*)',
"`$2"
} | Out-File $output
}
name "gdrive" "\\server\departmentfolders$\location\gdrive.csv" "x:" "\\server\departmentfolders$\location\gdrive.html"
This now works, and the body of the email shows the path, but it doesn't include the folder name, just the path location \server\departmentfolders$. It's very nearly there; I just need the folder name...
If you want to use variables in the replacement string you have to use double quotes instead of single quotes around the string:
$_ -replace '...', "... $directory ..."
In that case you have to escape other elements in the replacement string, though, namely inner double quotes and back-references ($1, $2) to groups ((...)) in the regular expression:
$_ -replace '(...)(...)', "<a href=`"`$1`$2...`">..."
Also you should encode spaces (%20) and line breaks (%0A%0D) in the URL.
function Encode($str) {
return ( $str -replace ' ', '%20' -replace '\n', '%0A%0D' )
}
The whole thing might look like this:
Import-Module ActiveDirectory
function Encode($str) {
return ( $str -replace ' ', '%20' -replace '\n', '%0A%0D' )
}
function name($filename, $folderowners, $directory, $output) {
$subject = Encode("Folder Access Request")
...
Get-ChildItem $directory | ... |
ConvertTo-Html -head $a -PostContent $c | % {
if ( $_ -match '^<tr><td>([^<]*)' ) {
$body = Encode("Please can I have access to the following folder " +
"{0}\{1}" -f ($directory, $matches[1]))
}
$_ -replace '(mailto:)([^<]*)',
"`$2"
} | Out-File $output
}
...