I am trying to fetch PDF files from OneDrive. However, the only PDF file is not coming up in my link so that I can share it with others via email, even though I am able to get into the folder where my PDF files are stored.
Below is the code I tried; I am not getting the PDF file.
$linklist = [System.Collections.ArrayList]@()
$prefix="https://abc.sharepoint.com/:b:/r/personal/svc_prod_bcs_user_abc_com/Documents/ads/SETF/"
$a = Get-Content '.\New document 2.json' | ConvertFrom-Json
$a | ForEach-Object {
$TestEvent = @($a.TestEvent)
$Folder = @($a.Folder)
$NEL = @($a.NEL)
for ($i = 0; $i -lt $NEL.Count; $i++) {
$path = 'D:\OneDriveData\OneDrive - abc Enterprises\ads\SETF\'+$Folder[$i]+'\'
$pdf = Get-ChildItem -Path $path -Filter "$TestEvent[$i]" | Sort {$_.LastWriteTime} | select -last 1
$url1 = ''+$Folder[$i]+''+'<br>'+'<br>'
write-host $url1
$null = $linklist.Add($url1)
}
}
"LINKS_VAR=$linklist" | Out-File $ENV:JOB_BASE_NAME'.properties' -Encoding ASCII
I'm having to whip up a process that will read multiple json files created by another process.
I have code that can read a single file, but we need to process these results in bulk.
Here's my current code:
$json = Get-ChildItem $filePath -recurse | Where-Object { $_.LastWriteTime -gt [DateTime] $filesNewerThan } | ConvertFrom-Json
$json.delegates | ForEach-Object {
    foreach ($File in $_.files)
    {
        [PSCustomObject]@{
            LastName  = $_.lastName
            ZipCode   = $File.zipCode
            BirthDate = $File.birthdate
            Address   = $File.Address
        }
    }
}
Right now I'm getting an error about an "invalid JSON primitive", which I'm guessing is because I don't have Get-Content anywhere in my code.
What is the issue with my code?
ConvertFrom-Json currently (as of PowerShell 7.0) doesn't support file-path input, only file-content input (the actual JSON string), which means that you need to involve Get-Content:
$json = Get-ChildItem -File $filePath -Recurse |
Where-Object { $_.LastWriteTime -gt [DateTime] $filesNewerThan } |
ForEach-Object { Get-Content -Raw -LiteralPath $_.FullName | ConvertFrom-Json }
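A hedged continuation (untested, and assuming every file has the same delegates/files shape shown in the question) that feeds the parsed objects into the per-delegate projection:
# Sketch: apply the projection from the question to every parsed file.
$json | ForEach-Object {
    $_.delegates | ForEach-Object {
        foreach ($File in $_.files) {
            [PSCustomObject]@{
                LastName  = $_.lastName
                ZipCode   = $File.zipCode
                BirthDate = $File.birthdate
                Address   = $File.Address
            }
        }
    }
}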
I have a zip file which contains several CSV files inside it. How do I read the contents of those CSV files without extracting the zip files using PowerShell?
I have been using the Read-Archive cmdlet, which is included as part of the PowerShell Community Extensions (PSCX).
This is what I have tried so far.
$path = "$env:USERPROFILE\Downloads\"
$fullpath = Join-Path $path filename.zip
Read-Archive $fullpath | Foreach-Object {
Get-Content $_.Name
}
But when I run the code, I get this error message:
Get-Content : An object at the specified path filename.csv does not exist, or has been filtered by the -Include or -Exclude parameter.
However, when I run Read-Archive $fullpath, it lists all the files inside the zip file.
There are multiple ways of achieving this:
1. Here's an example using the Ionic.Zip DLL:
clear
Add-Type -Path "E:\sw\NuGet\Packages\DotNetZip.1.9.7\lib\net20\Ionic.Zip.dll"
$zip = [Ionic.Zip.ZipFile]::Read("E:\E.zip")
$file = $zip | where-object { $_.FileName -eq "XMLSchema1.xsd"}
$stream = new-object IO.MemoryStream
$file.Extract($stream)
$stream.Position = 0
$reader = New-Object IO.StreamReader($stream)
$text = $reader.ReadToEnd()
$text
$reader.Close()
$stream.Close()
$zip.Dispose()
It picks the file by name (XMLSchema1.xsd) and extracts it into the memory stream. You then need to read the memory stream into whatever you like (a string in my example).
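Since the original question is about CSV files inside the zip, a hedged follow-up (assuming the extracted entry really is CSV text): you can parse $text in memory with ConvertFrom-Csv instead of keeping it as a plain string.
# Sketch: turn the extracted text into objects without writing anything to disk.
$csvData = $text | ConvertFrom-Csv
$csvData | Format-Table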
2. In PowerShell 5, you could use Expand-Archive; see: https://technet.microsoft.com/en-us/library/dn841359.aspx?f=255&MSPPError=-2147217396
It extracts the entire archive into a folder:
Expand-Archive "E:\E.zip" "e:\t"
Keep in mind that extracting the entire archive takes time, and you will then have to clean up the temporary files.
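A rough sketch of that extract, read, clean-up cycle (the paths are placeholders):
# Sketch: extract to a temporary folder, read the CSVs, then remove the temporary files.
$tempDir = Join-Path $env:TEMP 'zip_extract_temp'
Expand-Archive "E:\E.zip" $tempDir
$data = Get-ChildItem $tempDir -Filter *.csv | ForEach-Object { Import-Csv $_.FullName }
Remove-Item $tempDir -Recurse -Force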
3. And one more way to extract just 1 file:
$shell = new-object -com shell.application
$zip = $shell.NameSpace("E:\E.zip")
$file = $zip.items() | Where-Object { $_.Name -eq "XMLSchema1.xsd"}
$shell.Namespace("E:\t").copyhere($file)
4. And one more way using native means:
Add-Type -assembly "system.io.compression.filesystem"
$zip = [io.compression.zipfile]::OpenRead("e:\E.zip")
$file = $zip.Entries | where-object { $_.Name -eq "XMLSchema1.xsd"}
$stream = $file.Open()
$reader = New-Object IO.StreamReader($stream)
$text = $reader.ReadToEnd()
$text
$reader.Close()
$stream.Close()
$zip.Dispose()
Based on Andrey's solution 4, I propose the following function
(keep in mind that the "ZipFile" class exists starting with .NET Framework 4.5):
Add-Type -assembly "System.IO.Compression.FileSystem"
function Read-FileInZip($ZipFilePath, $FilePathInZip) {
try {
if (![System.IO.File]::Exists($ZipFilePath)) {
throw "Zip file ""$ZipFilePath"" not found."
}
$Zip = [System.IO.Compression.ZipFile]::OpenRead($ZipFilePath)
$ZipEntries = [array]($Zip.Entries | where-object {
return $_.FullName -eq $FilePathInZip
});
if (!$ZipEntries -or $ZipEntries.Length -lt 1) {
throw "File ""$FilePathInZip"" couldn't be found in zip ""$ZipFilePath""."
}
if (!$ZipEntries -or $ZipEntries.Length -gt 1) {
throw "More than one file ""$FilePathInZip"" found in zip ""$ZipFilePath""."
}
$ZipStream = $ZipEntries[0].Open()
$Reader = [System.IO.StreamReader]::new($ZipStream)
return $Reader.ReadToEnd()
}
finally {
if ($Reader) { $Reader.Dispose() }
if ($Zip) { $Zip.Dispose() }
}
}
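A quick usage example (the zip path and entry name are made-up placeholders), piping the result through ConvertFrom-Csv since the original question was about CSV files inside the archive:
# Sketch: read one CSV entry straight out of the archive into objects.
$csvText = Read-FileInZip "E:\E.zip" "data/report.csv"
$rows = $csvText | ConvertFrom-Csv
$rows | Format-Table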
I want to convert a PowerShell result to an HTML file.
I have an array $f="gsds,jv,hfvw".
I want the HTML file to print each element of the array on a new line.
How do I do that?
Thanks!
I used this code
$f="6e47812,662348,8753478"
Get-Service |ConvertTo-Html -Body "$f"|out-file D:\service.html
And the output was:
6e47812,662348,8753478
Do you mean something like this?
$test = "abc","def","ghi"
foreach ($row in $test) {Write-Output "<p> $row </p>"}
Output:
<p> abc </p>
<p> def </p>
<p> ghi </p>
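To tie that back to the original ConvertTo-Html attempt, a hedged sketch (untested) that splits the comma-separated string and puts each element on its own line in the HTML body:
# Sketch: split the string, wrap each element in <p>, and pass the result as the HTML body.
$f = "6e47812,662348,8753478"
$body = ($f -split ',' | ForEach-Object { "<p>$_</p>" }) -join "`n"
Get-Service | ConvertTo-Html -Body $body | Out-File D:\service.html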
cls
$Path = 'C:\temp\test' #Path Directory
# Grab a recursive list of all subfolders
$SubFolders = dir $Path -Recurse | Where-Object {$_.PSIsContainer} | ForEach-Object -Process {$_.FullName}
# Iterate through the list of subfolders and grab the first file in each
ForEach ($Folder in $SubFolders)
{
$FullFileName = dir $Folder | Where-Object {!$_.PSIsContainer} | Sort-Object {$_.LastWriteTime} -Descending | Select-Object -First 1
# For every file, grab its location and output the robocopy command ready for use
ForEach ($File in $FullFileName)
{
$FilePath = $File.DirectoryName
$FileName = $File.Name
write-output "$FilePath $FileName"
}
}
I would like to combine all the CSV files in my local folder, but the result is empty. I am trying to keep the header of the first file, skip the headers in the rest of the files in the folder, and join them.
get-childItem "C:\Users\*.csv" | foreach {[System.IO.File]::AppendAllText
("C:\Users\finalCSV.csv", [System.IO.File]::ReadAllText($_.FullName))}
$getFirstLine = $true
get-childItem "C:\Users\*.csv" | foreach {
$filePath = $_
$lines = $lines = Get-Content $filePath
$linesToWrite = switch($getFirstLine) {
$true {$lines}
$false {$lines | Select -Skip 1}
}
$getFirstLine = $false
Add-Content "C:\Users\finalCSV.csv" $linesToWrite
}
My end result is that when I open finalCSV.csv it shows no results.
I think you are trying to overwork your solution. Just use Import-Csv and append to an array. Something like this:
$a = @(); ls *.csv | % {$a += (Import-Csv $_.FullName)}; $a
Works even if the columns are in a different order.
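If you then want the combined rows written back to a single file (as in the original finalCSV.csv attempt), a hedged follow-up sketch:
# Sketch: collect all rows and write one combined CSV; Export-Csv emits a single header row,
# with columns taken from the first object's properties.
$a = ls C:\Users\*.csv | ? { $_.Name -ne 'finalCSV.csv' } | % { Import-Csv $_.FullName }   # skip the output file if it already exists
$a | Export-Csv C:\Users\finalCSV.csv -NoTypeInformation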
I have a long-winded script that gets a user from a CSV file and matches the user against a CSV filename from a specific directory.
The user is matched against a CSV file named in this format: <data><samaccountname><text>.csv
The aim is to get an AD user from a list, then scan a folder of CSV files and match against the user. From there, restore the user's AD attributes.
The issue is that the output is always the last user, twice. I have commented out the export at the end so I can see what is on screen first.
Clear-Host
#Get username from users list and match against CSV file name.
$FDate = (get-date).ToString("yyyMMdd")
$Project = "<FolderPath>" #Project name used to setup folders and for reports etc
$ProjectRoot = "<path>\" # Backup folder
$RestorePath = $ProjectRoot + $Project #combined path for restoring
$UsersListFile = $ProjectRoot + '\Userlist.csv' #Userlist
$Results = @{} # Storage for all csv files
$PSObject = New-Object psobject
$Report = @() #For Export-CSV
$Results = gci $RestorePath -Filter '*.csv'
$i = 0
foreach ($File in $Results) {
$i += 1
Write-Host 'Number of passes - '$i
Write-Host 'Current file processing - '$file.Name -for Green
foreach ($User in (import-csv $UsersListFile)) {
$SAM = $User.SamAccountName
Write-Host 'Current User processing - '$SAM -ForegroundColor Magenta
if ($file.Name -match $SAM) {
Write-host "Filename and user $SAM match " -for Yellow
$Row= New-Object psobject
$ROW | Add-Member -type NoteProperty -name Name -value $SAM -force
$Report += $Row
foreach ($Attrib in (import-csv $restorepath\$file)) {
#Write-host 'Attributes in file - ' $attrib.samaccountname $Attrib.mail -for Yellow
#Use this to restore AD User data
}
} else {
Write-Host "No match" -ForegroundColor Red
}
}
}
#$Report | Export-Csv $RestorePath'\Test.csv' -NoTypeInformation -Force
$Report | Sort-Object Name
Update: I moved New-Object psobject to just above where $Row is populated, so a new object is created each time rather than overwriting the previous entry.
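As a hedged alternative (a sketch built on the variables already defined in the script, not a tested fix for the duplicate-output symptom): emitting a fresh [PSCustomObject] per match and letting the foreach loop collect the output avoids both reusing one object and the $Report += pattern.
# Sketch: one new object per filename/user match, collected directly from the loop output.
$Report = foreach ($File in $Results) {
    foreach ($User in (Import-Csv $UsersListFile)) {
        $SAM = $User.SamAccountName
        if ($File.Name -match $SAM) {
            [PSCustomObject]@{ Name = $SAM }
        }
    }
}
$Report | Sort-Object Name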