How to search a directory tree in Tcl for files that end in .so or .a and build a list of those directories

I'm trying to write a set of Tcl scripts that help set up a user's environment for a set of libraries that are not on their standard LD_LIBRARY_PATH, as part of a support solution.
Since the system is rather sparse in terms of what I can install, I don't really have access to any Tcl extensions and am trying to do this in base Tcl as much as possible. I'm also relatively new to Tcl as a language.
What I'd like to do is search through a directory structure, locate the directories that contain .so and .a files, build a list of those, and eventually add them to the user's $LD_LIBRARY_PATH variable.
If I were doing this in a shell, I'd just use find with something like this:
find /dir -type f \( -name "*.so" -o -name "*.a" \) | awk -F/ 'sub(FS $NF,x)' | sort -u
I could hard-code the paths, but we want a single set of scripts that can manage several different applications.
Any ideas would be very much appreciated.

Tcllib has a fileutil module that does a recursive find:
package require fileutil
set filenames [::fileutil::findByPattern /dir -glob -- *.so *.a]
foreach filename $filenames {
    if {[file isfile $filename]} {
        set dirs([file dirname $filename]) 1
    }
}
puts [array names dirs]
If you want to use this but can't install packages, you can just take the procedures and add them to your code (with the appropriate attribution): http://core.tcl.tk/tcllib/dir?ci=63d99a74f4600441&name=modules/fileutil
Otherwise, just call the system's find command and parse the output (assuming your filenames do not contain newlines).
set filenames [split [exec find /dir -type f ( -name *.so -o -name *.a )] \n]
And the loop to extract the unique directories is similar. As a side benefit, the find invocation is actually easier to read, because you don't have to escape all the characters that are special to the shell.
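Putting that together, a minimal sketch of the find-based approach (libDirs is a hypothetical helper name; this assumes a Unix find on PATH and newline-free filenames):

```tcl
# Return the sorted, unique directories under $root that contain
# .so or .a files, using the system find(1) command.
proc libDirs {root} {
    set out [exec find $root -type f ( -name *.so -o -name *.a )]
    foreach filename [split $out \n] {
        if {$filename eq ""} continue
        # Arrays de-duplicate keys for free.
        set dirs([file dirname $filename]) 1
    }
    return [lsort [array names dirs]]
}

# The result can then be joined onto LD_LIBRARY_PATH, e.g.:
# set env(LD_LIBRARY_PATH) $env(LD_LIBRARY_PATH):[join [libDirs /dir] :]
```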

Related

Tcl: Copy all files in directory structure to a single location outside

I need to recursively find all files that end with .vhd inside a directory and then copy all of them to the directory from which the tcl script was called, i.e. the directory the user is in when the script is invoked.
So what I need is to find all .vhd files recursively, regardless of where they are, and copy them into a single folder. I tried glob, but it seems it cannot search recursively. How can this be achieved in Tcl?
You can use fileutil::findByPattern from tcllib to get the file names and copy them in a loop:
#!/usr/bin/env tclsh
package require fileutil

proc copy_tree {from_dir pat to_dir} {
    foreach f [::fileutil::findByPattern $from_dir -glob -- $pat] {
        file copy -force -- $f $to_dir
    }
}

copy_tree some/dir *.vhd destination/
Unless you can arrange for a tcllib installation in your environment, you will have to implement the directory traversal yourself. Below is but one of many options (adapted from ftw_4 at the Tclers' Wiki; this assumes Tcl 8.5+):
proc copy_tree {from_dirs pat to_dir} {
    set files [list]
    while {[llength $from_dirs]} {
        set from_dirs [lassign $from_dirs name]
        set from_dirs [list {*}[glob -nocomplain -directory $name -type d *] {*}$from_dirs]
        lappend files {*}[glob -nocomplain -directory $name -type f $pat]
    }
    if {[llength $files]} {
        file copy -force -- {*}$files $to_dir
    }
}
Some remarks: This implements an iterative, depth-first directory traversal, and
glob is used twice, once to collect sub-directories, once to collect the source files.
file copy is executed just once (for the entire collection of source files), not for every source file.
The -nocomplain flag on glob is convenient, but maybe not useful in your scenario (whatever that might be).

How to delete a specific text file (based on name) in Tcl after searching all folders, not just the current directory?

How do I search for a particular file name in the entire directory tree and then delete it in Tcl?
The easy way is using the fileutil::traverse package from tcllib, which safely searches a directory tree for matching files. You can then call file delete with those filenames.
Example:
#!/usr/bin/env tclsh
package require fileutil::traverse
# Look for all foo.txt files regardless of directory.
# Use this to get the OS-specific path delimiter instead of a hardcoded / or \\
set pattern [file join * foo.txt]
fileutil::traverse findFoo . -filter [list string match $pattern]
# I suggest a dry run first to make sure your filter is returning just the
# appropriate filename(s).
puts [findFoo files]
# When satisfied, delete for real.
# file delete -- {*}[findFoo files]
Perhaps this is the easiest way, although it is not portable:
puts [exec find . -name $filenameToDelete -print]
If that finds the right files, you can do this:
exec find . -name $filenameToDelete -delete

Tcl script to go through all folders in a directory and perform a function

File structure
File1
    test.pdb
    xyz.txt
File2
    test.pdb
    xyz.txt
File3
    test.pdb
    xyz.txt
I want to loop through all folders in the directory and run the following code, which is in the text file autopsf.tcl, in the Tk console:
package require autopsf
mol new test.pdb
autopsf -mol 0
package require solvate
solvate test_autopsf.psf test_autopsf.pdb -t 5 -o solvate
package require autoionize
autoionize -psf solvate.psf -pdb solvate.pdb -neutralize
I am running the following code at the moment:
for d in ./*/ ; do
    source autopsf.tcl
done
If you don't care about the order, you can do:
foreach dir [glob -type d *] {
    # safety check
    if {![file exists $dir/test.pdb]} continue
    # code to do the work here; note that we have directories
}
You can probably factor out the package require calls. Well-designed packages can live together fine, and putting them at the top is a useful way to make dependencies evident.
If you want the directories sorted, apply lsort to the output of glob. The default order from glob is whatever order the OS returns the directory entries in, which can depend on all sorts of things (including file ages and so on), so it should not be relied upon in code where a definite order of processing matters.
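As an illustration, a hypothetical sortedPdbDirs helper that applies lsort so the directories are visited in a stable order (the proc name and the test.pdb check are just this question's scenario):

```tcl
# Return the subdirectories of $root that contain a test.pdb file,
# in sorted order, so runs are reproducible across machines.
proc sortedPdbDirs {root} {
    set work {}
    foreach dir [lsort [glob -nocomplain -directory $root -type d *]] {
        if {[file exists [file join $dir test.pdb]]} {
            lappend work $dir
        }
    }
    return $work
}
```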
This code worked for me:
foreach file [glob -nocomplain "./*/"] {
    cd $file
    source autopsf.tcl
    cd ..
}

How to `rm -rf *` in Tcl

I want to delete all files in a directory using Tcl. (I'm using Xilinx Vivado's Tcl console under Win 10.) I found in the Tcl documentation that
file delete ?-force? ?--? pathname ?pathname ...?
should work.
But
file delete -force -- [glob *]
does nothing.
What's wrong with that?
Make that
file delete -force -- {*}[glob *]
... so that the path names returned by [glob] are turned into multiple arguments to [file delete] (using the expansion operator {*}), rather than a single argument holding the whole list of path names (which [file delete] would read as one, rather complex, file path).
Alternatively, on Tcl older than 8.5, use an explicit loop:
foreach path [glob *] {
    file delete -force -- $path
}
Additional things for you to consider:
do you need to be concerned about deleting files, not directories? Consider the -type option for the glob command.
if you need to work recursively, don't reinvent the wheel and use tcllib. The fileutil::traverse and fileutil packages are relevant.
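For the first point, a sketch of a files-only delete (deleteFilesIn is a hypothetical helper name) using glob -type f so subdirectories survive:

```tcl
# Delete only the plain files in $dir; subdirectories are left alone.
proc deleteFilesIn {dir} {
    set files [glob -nocomplain -directory $dir -type f *]
    if {[llength $files]} {
        file delete -force -- {*}$files
    }
}
```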

Tcl script to output a file which contains the sizes of all the files in the directory and subdirectories

Please help me with a script that outputs a file containing the names of the files in the subdirectories and their sizes in bytes. The argument to the program is the folder path. The output file should have the file name in the first column and its size in the second column.
Note: the folder contains subfolders, and inside the subfolders there are files.
I tried this way:
set fp [open files_memory.txt w]
set file_names [glob ../design_data/*/*]
foreach file $file_names {
    puts $fp "$file [lindex [exec du -sh $file] 0]"
}
close $fp
Result sample:
../design_data/def/ip2.def.gz 170M
../design_data/lef/tsmc13_10_5d.lef 7.1M
But I want only the file name to be printed, i.e. ip2.def.gz, tsmc13_10_5d.lef, etc. (not the entire path), and the file sizes should be aligned.
The fileutil package in Tcllib defines the command fileutil::find, which can recursively list the contents of a directory. You can then use foreach to iterate over the list and get the sizes of each of them with file size, before producing the output with puts, perhaps like this:
puts "$filename\t$size"
The $filename is the name of the file, and the $size is how large it is. You will have obtained these values earlier (i.e., in the line or two before!). The \t in the middle is turned into a TAB character. Replace with spaces or a comma or virtually anything else you like; your call.
To get just the last part of the filename, I'd do:
puts $fp "[file tail $file] [file size $file]"
This prints the full file size, not the abbreviated form, so if you really want 4k instead of 4096, keep using that (slow) incantation with exec du. (If the consumer is a program, or a programmer, writing out the size in full is probably better.)
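Putting those pieces together without tcllib, a sketch that reuses the iterative glob traversal shown earlier (writeSizes is a hypothetical name; format left-aligns the names so the sizes line up):

```tcl
# Walk $root iteratively and write "name size-in-bytes" lines to $outfile.
proc writeSizes {root outfile} {
    set fp [open $outfile w]
    set dirs [list $root]
    while {[llength $dirs]} {
        set dirs [lassign $dirs d]
        set dirs [list {*}[glob -nocomplain -directory $d -type d *] {*}$dirs]
        foreach f [glob -nocomplain -directory $d -type f *] {
            # %-30s pads the file name to 30 chars so the sizes align.
            puts $fp [format "%-30s %d" [file tail $f] [file size $f]]
        }
    }
    close $fp
}
```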
In addition to Donal's suggestion, there are more tools for getting files recursively:
recursive_glob (from the Tclx package) and
for_recursive_glob (also from Tclx)
fileutil::findByPattern (from the fileutil package)
Here is an example of how to use for_recursive_glob:
package require Tclx
for_recursive_glob filename {../design_data} {*} {
    puts $filename
}
This suggestion, in combination with Donal's should be enough for you to create a complete solution. Good luck.
Discussion
The for_recursive_glob command takes 4 arguments:
The name of the variable representing the complete path name
A list of directories to search (e.g. {/dir1 /dir2 /dir3})
A list of patterns to search for (e.g. {*.txt *.c *.cpp})
Finally, the body of the for loop, where you want to do something with the filename.
Based on my experience, for_recursive_glob cannot handle directories that you don't have permission to read (at least on Mac, Linux, and BSD platforms; I don't know about Windows), in which case the script will crash unless you catch the exception.
The recursive_glob command is similar, but it returns a list of filenames instead of structuring the iteration as a loop.
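The exception handling mentioned above is ordinary Tcl catch; a minimal illustration of the pattern using glob (which, like for_recursive_glob, raises an error — here because nothing matches):

```tcl
# catch returns 1 if the script raised an error and stores the message
# in the given variable; wrap the traversal call the same way.
if {[catch {glob /no/such/dir/*} err]} {
    puts "traversal failed: $err"
}
```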