Powershell copy files based on CSV contents

Morning Powershell users,
I'm trying to copy files listed in a .CSV document from one directory to another based on their file name. My .CSV document is just one column of filenames, it looks like this:
2.jpg
4.jpg
5.jpg
8.jpg
12.jpg
I have a folder of .jpgs from 1-900 in G:\Numbered\ and a destination folder for where the files should go at G:\Selections. This seems like it should be relatively simple but I think I'm missing something in my code. This is what I have so far:
import-CSV G:\fileList1.csv | foreach {copy-item G:\Numbered G:\Selections}
I looked around and I think I may need to use Get-Childitem but I'm not sure where. Any help is greatly appreciated, thank you!

Assuming your CSV looks like this:
YourColumnName
2.jpg
4.jpg
n.jpg
Try this:
Import-Csv fileList1.csv | ForEach {Copy-Item "G:\Numbered\$($_.YourColumnName)" G:\Selections }
But, if your CSV has only 1 column without a header, like this:
2.jpg
4.jpg
n.jpg
you should use Get-Content instead, because it's not really a CSV file. Your code should then be:
Get-Content files.csv | ForEach {Copy-Item G:\Numbered\$_ G:\Selections }
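If some of the names in the list might not exist in G:\Numbered, a slightly more defensive variant (just a sketch, assuming the same header-less list and paths as above) checks each file before copying:
Get-Content G:\fileList1.csv | ForEach-Object {
    $source = Join-Path 'G:\Numbered' $_
    if (Test-Path -LiteralPath $source) {
        Copy-Item -LiteralPath $source -Destination 'G:\Selections'
    }
    else {
        Write-Warning "File not found: $source"
    }
}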

Related

Replace headers on export-csv from selectable list using powershell

fairly new to powershell and I have given myself a bit of a challenge which I think should be possible, I'm just not sure about the best way around it.
We have a user who has a csv with a large number of columns (can vary from 20-50); rows can vary between 1 and 10,000. The data is, say, ClientName,Address1,Address2,Postcode etc. (although these can vary wildly depending on the source of the data - external companies). This needs importing into a system using a pre-built routine which looks at the file and needs the database column headers as the csv headers, so say ClientDisplay,Ad_Line1,Ad_Line2,PCode etc.
I was thinking along the lines of a generic powershell 'mapping' form which could read the headers from ExternalSource.csv and either a DatabaseHeaders.csv or a direct sql query lookup, display them as columns in a form, and let you highlight one from each column and press a 'link' button. Once you have been through all the columns in ExternalSource.csv, a 'generate csv' button would take the mapped headers and append the correct data columns from ExternalSource.csv.
Am I barking up the wrong tree completely by trying to use powershell? At the moment it's a very time-consuming process, so I'm just trying to make life easier for users.
Any advice appreciated..
Thanks,
Jon
You can use the Select-Object cmdlet with dynamic columns to shape the data into the form you need.
Something like:
Import-Csv -Path 'source.csv' |
Select-Object Id, Name, @{ Name='Ad_Line1'; Expression={ $_.Address1 } } |
Export-Csv -Path 'target.csv'
In this example, the code @{ Name='Ad_Line1'; Expression={ $_.Address1 } } is a dynamic column that creates a column named Ad_Line1 with the value of Address1.
It is possible to read the column mappings from a file; you will have to write some code to read the file, select the properties, and create the format.
A very simple solution could be to read the Select-Object part from another script file, so you can differentiate that part for each import.
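For example (hypothetical file names, untested sketch): keep the calculated-property list for each source in its own small script file and let the import script run whichever one matches the incoming data.
# columns.ClientA.ps1 (hypothetical) - the column list for this particular source
@(
    @{ Name = 'ClientDisplay'; Expression = { $_.ClientName } }
    @{ Name = 'Ad_Line1';      Expression = { $_.Address1 } }
    @{ Name = 'Ad_Line2';      Expression = { $_.Address2 } }
    @{ Name = 'PCode';         Expression = { $_.Postcode } }
)
# import.ps1 - pick the column file that matches the source and apply it
$columns = & '.\columns.ClientA.ps1'
Import-Csv -Path 'ExternalSource.csv' |
    Select-Object -Property $columns |
    Export-Csv -Path 'target.csv' -NoTypeInformation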
A (simple, naive, low-performance) solution for the mapping-file route could look like this (untested code):
# read input file ($input is an automatic variable, so use a different name)
$inputData = Import-Csv -Path $inputFile
# read source, target name columns from mapping file
$mappings = Import-Csv -Path $mappingFile | Select Source, Target
# apply transformations
$transformed = $inputData
foreach($mapping in $mappings) {
    # collect the data, add an extra column for each mapping
    $transformed = $transformed | Select-Object *, @{ Name = $mapping.Target; Expression = { $_.$($mapping.Source) } }
}
# export transformed data
$transformed | Export-Csv -Path $outputFile
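For reference, the mapping file for that sketch would just be a two-column CSV (hypothetical contents), for example:
Source,Target
ClientName,ClientDisplay
Address1,Ad_Line1
Address2,Ad_Line2
Postcode,PCode
Note that the loop adds the mapped columns next to the originals; if you only want the target columns in the output, pipe through Select-Object $mappings.Target before exporting.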
Alternatively, it is possible to convert the data into XML with Import-Csv | Export-CliXml, apply an Xslt template on the Xml to perform a transformation, and save the Xml objects into Csv again with Import-CliXml | Export-Csv.
See this blog by Scott Hanselman on how you can use XSLT with PowerShell.

Renaming CSV column header and merge results with Powershell

So I'm just starting out with this whole Powershell thing and so far so good - until now. I just can't figure out how to do this!
I'm looking at manipulating CSV files which are output from one system (which I can't change at output), renaming some column headers and merging a couple of the results into one column so that it matches the input requirements to upload into another system (again, I can't change those parameters).
So, as an example.
The first file is created:
File1.csv
"A","B","C""1","2","3"
I want a powershell script that will output:
File2.csv
"X","Y""1","23"
So I can import it into another system.
I hope that all makes sense, and thanks in advance for any assistance.
I'm going to assume that your actual/desired formats of your files look like this:
"A","B","C"
"1","2","3"
"X","Y"
"1","23"
rather than having everything in one line. If that's correct, you can import File1.csv with Import-Csv, then rename and merge columns with calculated properties:
... | Select-Object @{n='X';e={$_.A}}, @{n='Y';e={$_.B + $_.C}} | ...
and write the result to File2.csv with Export-Csv.
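Put together with the import and export steps, a minimal end-to-end version (a sketch, using the file names from the question) looks like this:
Import-Csv File1.csv |
    Select-Object @{n='X';e={$_.A}}, @{n='Y';e={$_.B + $_.C}} |
    Export-Csv File2.csv -NoTypeInformation
Because Import-Csv reads every field as a string, $_.B + $_.C concatenates "2" and "3" into "23", which is exactly the merged value the target format expects; -NoTypeInformation keeps the #TYPE line out of the output file.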

Combining multiple .csv-files into one

I've got multiple csv files, all with the same header, all in the same folder, which I have to combine into one large csv file. Since every .csv-file has the same header, I guess using it once should be enough.
All files look like this (also delimited by ','):
Header1,Header2,Header3
Data1, Data2, Data3
Data4, Data5, Data6
Can you help out? I'm not very comfortable with Powershell yet; I tried out different bits of code but nothing really helped me out.
Thanks
If all the CSVs have the same columns, simply:
# Original CSV
$csv = import-csv c:\temp.csv
# 2nd CSV
$csv2 = import-csv c:\temp2.csv
# As simple as:
$csv += $csv2
If you want to import all CSV's in the current folder you can do something like the following:
Get-ChildItem *.csv | % { Import-Csv $_ }
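To write the combined result straight back out, the same idea extends to (a sketch; combined.csv is an assumed output name):
Get-ChildItem *.csv |
    ForEach-Object { Import-Csv $_.FullName } |
    Export-Csv combined.csv -NoTypeInformation
Since every file has the same header, Export-Csv writes the header once and the rows from all files follow. If the output lands in the same folder, move or exclude it before running again, otherwise it gets merged into itself.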
Start by copying the first file to an output file.
Then see this question and answer:
https://stackoverflow.com/a/2076557/1331076
It shows you how to remove the first line from a file.
You would have to modify it so that instead of replacing your existing file, you Add-Content to the output file.
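A rough sketch of that approach (untested; combined.csv is an assumed output name):
# copy the first file as-is, header included
$files = Get-ChildItem *.csv | Sort-Object Name
Copy-Item $files[0].FullName combined.csv
# append every other file, skipping its header line
$files | Select-Object -Skip 1 | ForEach-Object {
    Get-Content $_.FullName | Select-Object -Skip 1 | Add-Content combined.csv
}
This stays at the plain-text level, so it is faster than Import-Csv/Export-Csv on large files, but it assumes every file really does have identical columns in the same order.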

PowerShell - Import-Csv - referencing a column by its position

When dealing with csv files using the cmdlet Import-Csv in PowerShell, is there an easy / elegant way to query a column by its position instead of the header name?
Example with this data:
row1, abc
row2, def
row3, xyz
The below command works fine if the file has "header_column2" as the header of the 2nd column:
import-csv data.csv | where {$_.header_column2 -eq "xyz"}
I am looking for something like this (which obviously doesn't work!):
import-csv data.csv | where {$_.[2] -eq "xyz"}
I know that if I don't have headers in the file I could use the -Header parameter to specify those, but I am looking for something quick and handy to write on the fly for various files, where headers are not relevant.
Many thanks!
Since @PetSerAl doesn't understand the difference between comments and answers here at StackOverflow, I'll provide a similar answer.
You can list the properties in the object using .PSObject.Properties. This returns an IEnumerable that doesn't know about indexes, but if you convert it to an array, it will.
@($_.PSObject.Properties)[2].Value
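A quick usage sketch for a file that does have a header row (the array index is zero-based, so the second column is index 1):
Import-Csv data.csv | Where-Object { @($_.PSObject.Properties)[1].Value -eq "xyz" }
For a header-less file like the sample data you would still need -Header with throwaway names (e.g. Import-Csv data.csv -Header c1, c2) so the first row isn't consumed as the header.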

How to replace multiple values in a CSV or html file with PowerShell?

I wrote a script to read a list of folder objects into an xml file showing the properties for each folder. It also creates another xml file at a later point in time. After the delta xml file is created, I import both xml files and compare them based on the folder name to display which folders have been added or removed, and I save the results in html format to a file for viewing.
Everything works well, but I want to replace some of the values in the results. The Compare-Object cmdlet lets me display some attributes, but it indicates which side the change was on by putting => for a folder added in the delta file or <= for a folder removed in the delta file. I really would like to replace the column name SideIndicator and replace the => or <= values with something more intuitive. I played around with using a -Replace {$_ $original, $newvalue} type method. I found guidance on the Hey Scripting Guy blog and some other examples around, but none seemed to do what I want.
What is the best way to approach this? Currently I'm not storing the compared results, just formatting and converting to HTML. Any advice is appreciated. I can post the code if needed, but it is about 60 lines long and I'm really looking for the best way to accomplish this, not necessarily someone to write the code.
Thanks!
If you just want to replace column names in the output display, you can create a custom table:
http://gallery.technet.microsoft.com/scriptcenter/ed188912-1a20-4be9-ae4f-8ac46cf2aae4
That is one approach, but I ended up modifying the html report by doing the following:
(Get-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm) |
Foreach-Object {$_ -replace "SideIndicator", "Change Status"} |
Set-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm
(Get-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm) |
Foreach-Object {$_ -replace "=>", "New"} |
Set-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm
(Get-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm) |
Foreach-Object {$_ -replace "<=", "Removed"} |
Set-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm
It didn't seem to work well as one statement that replaced the three different pieces I was looking for. Eventually I'll play around with it and streamline it some, but for now it does what I want.
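For reference, the three passes can usually be collapsed into one pipeline by chaining -replace, since each -replace returns the modified string (a sketch against the same file, untested with your report):
(Get-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm) |
    Foreach-Object { $_ -replace "SideIndicator", "Change Status" -replace "=>", "New" -replace "<=", "Removed" } |
    Set-Content $WorkingDirectory\FolderAudit_$UseDateTime.htm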