How to copy a folder and keep the structure? - function

I am writing a function that copies a folder and saves the copy under another name.
So I have my "Main" folder, and depending on the query I create copies of it named like this: "Main 1", "Main 2", "Main 3".
The way I managed to solve this was with the following function:
// Requires System.IOUtils (TDirectory, TFile, TPath) and System.Types (TStringDynArray) in the uses clause.
function TDMMonitor.CopyFolder(Origin, Destination: String): Boolean;
var
  aFiles: TStringDynArray;
  InFile, OutFile: string;
begin
  aFiles := TDirectory.GetFiles(Origin, '*.*', TSearchOption.soAllDirectories);
  for InFile in aFiles do
  begin
    OutFile := TPath.Combine(Destination, TPath.GetFileName(InFile));
    TFile.Copy(InFile, OutFile, True);
  end;
  Result := True;
end;
This works! But my problem right now is that the parent folder has subfolders, and those are not being copied correctly.
Here is a more visual example of what my function produces:
"Main" folder:
File.txt
File1.txt
Sub-Folder -> File3.txt
"Main 1" folder:
File.txt
File1.txt
File3.txt
How can I maintain the folder structure that the Main folder follows?

TDirectory.Copy works:
TDirectory.Copy('D:\Path\Main', 'D:\Path\Main 1');

TDirectory.GetFiles() returns an array of absolute paths to each file found. The soAllDirectories flag tells it to search through subfolders recursively. So, you will end up with an array of paths at potentially different levels.
TPath.GetFileName() strips off all folder path info, leaving just the file name.
Based on your example, you are searching recursively through C:\Main\ and you want to copy everything to C:\Main 1\. When the search gives you the file C:\Main\Sub-Folder\File3.txt, your use of TPath.GetFileName() discards C:\Main\Sub-Folder\, so you combine C:\Main 1\ with just File3.txt and end up copying C:\Main\Sub-Folder\File3.txt to C:\Main 1\File3.txt instead of to C:\Main 1\Sub-Folder\File3.txt.
That is why your subfolders are not copying correctly.
Rather than using TPath.GetFileName(), you would need to replace only the Origin portion of each returned absolute path (C:\Main\) with the Destination path (C:\Main 1\), leaving everything else in the path intact. Thus, C:\Main\Sub-Folder\File3.txt would become C:\Main 1\Sub-Folder\File3.txt.
Alternatively, don't use soAllDirectories at all: recurse through the subfolders manually using TDirectory.GetDirectories() instead, so that you handle only one level of folders at a time.
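To make the first approach (replacing the Origin prefix with the Destination prefix) concrete, here is the same path arithmetic sketched in Python, purely as an illustration; the function name copy_folder and its parameters are hypothetical, and the Delphi version would use the TDirectory/TPath calls described above:

import os
import shutil

def copy_folder(origin, destination):
    # Walk every file under origin, work out its path relative to origin,
    # and copy it to the same relative location under destination.
    for dirpath, dirnames, filenames in os.walk(origin):
        for name in filenames:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, origin)        # e.g. Sub-Folder\File3.txt
            dst = os.path.join(destination, rel)      # e.g. Main 1\Sub-Folder\File3.txt
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)

The key step is the relative-path computation, which keeps the Sub-Folder part of the path instead of discarding it.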

Related

How to read a file in a different directory in Julia?

Setup: I am in a directory named algorithm, and algorithm sits in a parent directory called src. There is another directory called data that is also in the src parent directory, i.e. the relative path of data is src/data and the relative path of algorithm is src/algorithm. There is also an info.csv file in the data directory.
If I wanted to read the contents of info.csv from a program located in the algorithm subdirectory, how would I do that? Doing something like CSV.File("..//data//info.csv"; delim = ";") does not appear to be working.
If you are using a REPL/notebook-style workflow, why not use the functionality of the REPL:
; # To enter shell mode
mv /your/path/file /destination/path/file #or cp if you want a copy
# backspace to return to Julia REPL
The relative path in CSV.File("..//data//info.csv"; delim = ";") is interpreted relative to the current directory path, set by the shell/OS. If you want it to be interpreted relative to the path where the script happens to be located (i.e. the algorithm directory), you can explicitly do that using @__DIR__:
@__DIR__ -> AbstractString
Expand to a string with the absolute path to the directory of the file containing the macrocall.
csvfilepath = joinpath(@__DIR__, "..", "data", "info.csv")
CSV.File(csvfilepath; delim = ";")
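For comparison only, here is the same distinction (a path resolved against the current working directory versus a path resolved against the file that contains the code) expressed in Python; the file names are just the ones from the question:

import os

# Resolved against the current working directory, wherever the process was started:
cwd_relative = os.path.join("..", "data", "info.csv")

# Resolved against the directory containing this script, analogous to @__DIR__:
script_dir = os.path.dirname(os.path.abspath(__file__))
script_relative = os.path.join(script_dir, "..", "data", "info.csv")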

What are the return values of os.walk() in python?

I have a directory structure like this:
dir/
└── subdir
My code:
import os
for d in os.walk('dir'):
    print(d)
I get the output:
('dir', ['subdir'], [])
('dir/subdir', [], [])
My question is what are those trailing [ ]s ?
There is one in the first tuple and two in the second, which confuses me.
It's worth checking out the Python docs for questions like this as they tend to have pretty solid documentation: https://docs.python.org/2/library/os.html#os.walk
Generate the file names in a directory tree by walking the tree either top-down or bottom-up. For each directory in the tree rooted at directory top (including top itself), it yields a 3-tuple (dirpath, dirnames, filenames).
So it will always return a 3-tuple.
For your first directory 'dir', it contains one directory called 'subdir', and it doesn't contain any files so there's an empty list for filenames.
It then has another entry for subdir, which is your 'dir/subdir'. 'subdir' doesn't have any directories or files under it, so you get empty lists for both dirnames and filenames. The key thing is that it always returns a 3-tuple, and the last two elements are always lists; if there are no subdirectories or files, those lists are simply empty.
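A minimal sketch of how the three pieces are usually unpacked and used (the directory name 'dir' is the one from your example):

import os

for dirpath, dirnames, filenames in os.walk('dir'):
    # dirpath: the directory currently being visited
    # dirnames: the subdirectories directly inside dirpath
    # filenames: the files directly inside dirpath
    for name in filenames:
        print(os.path.join(dirpath, name))  # full path of each file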

How to copy or move multiple files with same extension?

So I am trying to move a bunch of files with similar extensions from /home/ to /root/
Code I tried is
file copy /home/*.abc.xyz /root/
Also tried
set infile [glob -nocomplain /home/*.abc.xyz ]
if { [llength $infile] > 0 } {
    file copy $infile /root/
}
No success.
Your two attempts fail for different reasons:
There is no wildcard expansion in the arguments to file copy, or to any Tcl command for that matter, so file copy /home/*.abc.xyz /root/ will look for a single source with a literal * in its filename.
glob -nocomplain /home/*.abc.xyz is fine for collecting the sources, but glob returns a list of sources, and file copy requires each source to be passed as a separate argument, not as a single list. To expand a single list value of source files into multiple separate arguments, use the Tcl expansion operator {*}.
Therefore:
set infiles [glob -nocomplain *.tcl]
if {[llength $infiles]} {
    file copy {*}$infiles /tmp/tgt/
}
For a 1-line answer:
file copy {*}[glob /home/*.abc.xyz] /root/
The file copy (and file rename) commands have two forms (hence the reference to the manual page in the comment). The first form copies a single file to a new target. The second form copies all of the file name arguments into a new directory; this form insists that the directory name be the last argument, and you may have an arbitrary number of source file names preceding it. Also, file copy does not do glob expansion on its arguments, so, as you rightly surmised, you also need to use the glob command to obtain a list of the files to copy. The problem is that the glob command returns a list of file names and you passed that list as a single argument, i.e.
file copy $infile /root/
passes the list as a single argument and so the file copy command thinks it is dealing with the first form and attempts to find a file whose name matches that of the entire list. This file probably doesn't exist. Placing the error message in your question would have helped us to know for sure.
So what you want to do is take the list of files contained in the infile variable and expand it into separate argument words. Since this is a common situation, Tcl has some syntax to help (assuming you are not using some ancient version of Tcl). Try using the command:
file copy {*}$infile /root/
in place of your first attempt and see if that helps the situation.

Merge csv in another folder location

I'm trying to merge csv files into one text file using a batch file.
My batch file is located in C:\Users\aallen and the CSV files are located in C:\Users\aallen\Test
The batch file will only work when it's located in the same place as the CSV files.
I have tried the following commands with no joy:
1) cd "C:\Users\aallen\Test" copy *csv test.csv
2) copy "C:\Users\aallen\Test" *csv test.csv
What am I missing?
Collecting the information from the question and the comments: you want to combine several CSV files into one, but keep the header line only once.
more +1 can show a file while skipping its first line(s) (see more /?), but more +1 *.csv only skips the first line of the first file and keeps it in all the other files (just the opposite of what you need). So you have to process one file after the other with a for loop and check for the first file yourself (this can be done with a flag variable, first here). Redirect the whole loop to your result file.
#echo off
set "first=yes"
(for %%a in ("C:\Users\aallen\Test\*.csv") do (
  if defined first (
    type "%%a"
    set "first="
  ) else (
    more +1 "%%a"
  )
))>"d:\new location\test.csv"
Note: more at the command line prints just one screen and then pauses until you press a key. But when you use it in a batch file, it doesn't (well, to be honest, it does, but only after ~65000 lines; I hope your files are shorter).
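If that ~65000-line limit ever becomes a problem, the same merge can be expressed as a short Python sketch; the paths are the ones from the batch file above, and this is an untested outline rather than a drop-in replacement:

import glob

sources = sorted(glob.glob(r"C:\Users\aallen\Test\*.csv"))
with open(r"d:\new location\test.csv", "w") as out:
    for i, path in enumerate(sources):
        with open(path) as f:
            lines = f.readlines()
        # Keep the header line only from the first file.
        out.writelines(lines if i == 0 else lines[1:])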

Algorithm to delete every files in a directory, except some in a given list

Assume we have a directory with a structure like this; I marked directories with (+) and files with (-):
rootdir
  +a
    +a1
      -f1
      -f2
    +a2
      -f3
  +b
    +b1
      +b2
        -f4
        -f5
        -f6
    +b3
      -f7
      -f8
and a given list of files like
/a/a1/f1
/b/b1/b2/f5
/b/b3/f7
I am struggling to find a way to remove every file inside rootdir except the ones in the given list. After the program has executed, the root directory should look like this:
rootdir
  +a
    +a1
      -f1
  +b
    +b1
      +b2
        -f5
    +b3
      -f7
This example is just to make the problem easier to understand. In reality, the given list includes around 4 thousand files, and the root directory is ~15 GB in size with hundreds of thousands of files inside.
It would be easy to search inside a folder and remove the files that match a given list; let's just say we need to solve the reverse problem, keeping the files that match the given list.
Programs written in Perl/Python are preferred.
First, store your list of files you want to keep inside an associative container like a Python dict or a map of some kind.
Second, simply iterate (in Python, os.walk) over the entire directory structure, and every time you see a file, check if it is in the associative container of paths to keep. If not, delete it (in Python, os.unlink).
Alternatively:
First, create a temporary directory on the same filesystem.
Second, move (os.renames, which generates new subdirectories as needed) all the "keep" files to the temporary directory, with the same structure.
Third, overwrite (os.removedirs followed by os.rename, or just shutil.move) the original directory with the temporary one.
The os.walk path:
import os

keep = set(['/a/a1/f1', '/b/b1/b2/f5', '/b/b3/f7'])
for dirpath, dirnames, filenames in os.walk('./'):
    for name in filenames:
        path = os.path.join(dirpath, name).lstrip('.')
        print('check ' + path)
        if path not in keep:
            print('delete ' + path)
        else:
            print('keep ' + path)
It doesn't do anything except inform you.
I don't think os.walk is too slow, and it gives you the option of keeping files by regex patterns or any other criteria.
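And here is a rough sketch of the temporary-directory alternative described above; keep_only is a hypothetical helper, the keep set is assumed to hold paths relative to root (e.g. 'a/a1/f1'), and the final cleanup uses shutil.rmtree rather than os.removedirs so that leftover unwanted files do not block the removal:

import os
import shutil
import tempfile

def keep_only(root, keep):
    # Temporary directory next to root, so the moves stay on one filesystem.
    tmp = tempfile.mkdtemp(dir=os.path.dirname(os.path.abspath(root)))

    # Collect the paths first so moving files does not disturb the walk.
    all_files = [os.path.join(dirpath, name)
                 for dirpath, _, filenames in os.walk(root)
                 for name in filenames]

    for path in all_files:
        rel = os.path.relpath(path, root)
        if rel.replace(os.sep, '/') in keep:
            # os.renames creates the needed subdirectories under tmp
            # and prunes directories that become empty under root.
            os.renames(path, os.path.join(tmp, rel))

    shutil.rmtree(root, ignore_errors=True)  # whatever is left is unwanted
    os.rename(tmp, root)                     # the pruned tree takes the original's place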
Here is working code for your problem.
import os

def list_files(directory):
    for root, dirs, files in os.walk(directory):
        for name in files:
            yield os.path.join(root, name)

# Keep a set instead of a list for faster lookups.
files_to_keep = {'/home/vedang/Desktop/a.out', '/home/vedang/Desktop/ABC/temp.txt'}

for f in list_files('/home/vedang/Desktop'):
    if f not in files_to_keep:
        os.unlink(f)
Here is a function which accepts a set of files you wish to keep and the root directory from which you wish to begin deleting files.
It is a classic recursive depth-first search that removes empty directories after deleting all the unwanted files:
import os

def delete_files(keep_list: set, curr_dir):
    files = os.listdir(curr_dir)
    for f in files:
        path = f"{curr_dir}/{f}"
        if os.path.isfile(path):
            if path not in keep_list:
                os.remove(path)
        elif os.path.islink(path):
            # Symbolic links are removed unconditionally.
            os.unlink(path)
        elif os.path.isdir(path):
            delete_files(keep_list, path)
    # Prune the directory if it ended up empty.
    files = os.listdir(curr_dir)
    if not files:
        os.rmdir(curr_dir)
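A usage sketch, reusing the example paths from the earlier snippet (note that the entries in the keep set have to be spelled exactly the way the function builds them, i.e. curr_dir + "/" + name):

keep = {'/home/vedang/Desktop/a.out', '/home/vedang/Desktop/ABC/temp.txt'}
delete_files(keep, '/home/vedang/Desktop')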
Here is a solution from a different angle; suppose we are in a Linux environment.
First, run
find .
to get a long list with the path of every file and folder.
Second, take the list of paths to exclude from deletion (say thousands of them), append it to the previous list, and pipe the result through
| sort | uniq -c | grep -v "^2"
to get the list of files to delete.
Third, pipe that through
| xargs rm
to actually do the deletion.