I am trying to search for directories within subdirectories and return any that match a wildcard glob pattern.
The folder structure is as outlined below...
Rootdir
  -dir01
    -dir_match_01-OLD
    -dir_match_01
  -dir02
    -dir_match_02-OLD
    -dir_match_02
  -dir03
    -dir_match_03-OLD
    -dir_match_03
  -...
I am searching for directories that would reside in dir01, dir02, dir03 and so on.
I am using the following glob call to recursively search through the directories, which seems to be working correctly...
set rootdir "/home/rootdir/"
set searchstring "*-OLD"

foreach dir [glob -nocomplain -dir $rootdir -type d -- *] {
    set result [glob -nocomplain -dir $dir -type d -- $searchstring]
    puts $result
}
What I am finding is that if I don't use a wildcard in $searchstring and instead use an exact directory name that exists, I receive the output successfully. But if I then use the wildcard pattern *-OLD to search for all matching directories, it successfully finds them but prints them all on the same line.
/home/rootdir/dir01/directory01-OLD /home/rootdir/dir01/directory02-OLD /home/rootdir/dir01/directory03-OLD
I have tried to separate the entries by using regsub to replace the whitespace with \n but all it does is remove the whitespace...
/home/rootdir/dir01/directory01-OLD/home/rootdir/dir01/directory02-OLD/home/rootdir/dir01/directory03-OLD
Any suggestions in what I am doing wrong would be much appreciated, thanks.
The most obvious part is that glob always returns a list of names. You'd therefore need to do the innermost loop like this:
foreach dir [glob -nocomplain -dir $rootdir -type d -- *] {
    foreach result [glob -nocomplain -dir $dir -type d -- $searchstring] {
        puts $result
    }
}
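Alternatively, since glob's result is a proper Tcl list, you can print one name per line with join instead of a second loop; a minimal sketch:

foreach dir [glob -nocomplain -dir $rootdir -type d -- *] {
    set result [glob -nocomplain -dir $dir -type d -- $searchstring]
    if {[llength $result] > 0} {
        # Join the list elements with newlines for one-name-per-line output
        puts [join $result \n]
    }
}

The original output was a single list printed as one string, which is why the names all appeared on one line.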
However, for a fixed depth search, I think you can do it like this:
foreach dir [glob -nocomplain -dir $rootdir -type d -- */$searchstring] {
    puts $dir
}
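If the directory nesting is two levels deep instead, the same fixed-depth trick extends by adding another */ component to the pattern (a sketch, assuming exactly two levels):

foreach dir [glob -nocomplain -dir $rootdir -type d -- */*/$searchstring] {
    puts $dir
}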
If you need recursive (full directory tree) search, there are utility commands in Tcllib's fileutil package:
package require fileutil

proc myMatcher {pattern filename} {
    # Does the filename match the pattern, and is it a directory?
    expr {[string match $pattern $filename] && [file isdir $filename]}
}

set rootdir "/home/rootdir/"
set searchstring "*-OLD"

# Note the use of [list] to create a partial command application
# This is a standard Tcl technique; it's one of the things that [list] is designed to do
foreach dir [fileutil::find $rootdir [list myMatcher $searchstring]] {
    puts $dir
}
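As an illustration of that partial application (a hypothetical sketch; the candidate name is made up): fileutil::find appends each candidate name to the prefix and evaluates the result, so the callback is completed like this:

set prefix [list myMatcher $searchstring]
set name "dir_match_01-OLD"    ;# hypothetical candidate name
# {*} expands the prefix, so this evaluates: myMatcher *-OLD dir_match_01-OLD
puts [{*}$prefix $name]        ;# 1 only if the name matches and is an existing directory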
I want to read all files whose names contain .sdc.
The folder includes
alpha.sdc
beta.sdc
gamma.rpt
I tried the command
set a [open "proj/plrs/*.sdc" r]
but it is not working.
Andreas has the right ideas:
set files [glob proj/plrs/*.sdc]
set combined ""
foreach file $files {
    set fh [open $file r]
    append combined [read $fh]
    close $fh
}
To use the glob characters with cat, you'll need a shell to interpret them:
set combined [exec sh -c {cat proj/plrs/*.sdc}]
or expand the results of glob
set combined [exec cat {*}[glob proj/plrs/*.sdc]]
You could use Tcllib:
package require fileutil
set combined [fileutil::cat {*}[glob proj/plrs/*.sdc]]
Note that glob doesn't sort the files like the shell does, so you may want
set files [lsort [glob $pattern]]
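Applied to the question's layout, that would be (a sketch using the question's proj/plrs path):

set combined ""
foreach file [lsort [glob -nocomplain proj/plrs/*.sdc]] {
    set fh [open $file r]
    append combined [read $fh]
    close $fh
}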
Here is my command:
foreach fic [glob -nocomplain -dir $dir -types {f d r} *] {
    set infofile [list [file tail $fic] [file mtime $fic] [file atime $fic]]
    # ...
}
But I get an error: couldn't read directory "/Users/...": permission denied.
My solution is to add a file readable check:
foreach fic [glob -nocomplain -dir $dir -types {f d} *] {
    if {![file readable $fic]} continue
    set infofile [list [file tail $fic] [file mtime $fic] [file atime $fic]]
    # ...
}
I thought that when I added the r type, this kind of error would not appear. Is this a misunderstanding of the documentation?
Permissions on Windows are complex, to the point where you can only be really sure that you have permission to read a file immediately after you've successfully opened it for reading; the indications from glob and file readable are not definitive. The same is true on other operating systems, and in any case there's a race condition: the user could change the permissions between the file readable check and the subsequent operation. Because of that, while you can use glob -types r, you should not rely on it; it simply can't be guaranteed to be correct.
The fix for this? Handle errors from calls properly.
foreach fic [glob -nocomplain -dir $dir -types {f d r} *] {
    try {
        # More efficient than calling [file mtime] and [file atime] separately
        file stat $fic data
    } on error {} {
        # Couldn't actually handle the file. Ignore it and move on
        continue
    }
    set infofile [list [file tail $fic] $data(mtime) $data(atime)]
    # ...
}
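On Tcl versions before 8.6, where try is unavailable, the same guard can be written with catch; a minimal sketch:

foreach fic [glob -nocomplain -dir $dir -types {f d r} *] {
    # Skip anything whose metadata can't actually be read
    if {[catch {file stat $fic data}]} continue
    set infofile [list [file tail $fic] $data(mtime) $data(atime)]
    # ...
}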
I'm trying to copy a file using a wildcard and it isn't being interpreted correctly.
set projName [lindex $argv 0]
puts "$projName chosen"
set sysdefPath "$projName/$projName.runs/impl_1/*.sysdef"
file copy -force $sysdefPath ./src/generatedFiles/$projName.hdf
I've tried a couple of variations of this ({*}, (*), [*], {.*}) but none have worked. The result places the wildcard (*) literally in the search path instead of pattern matching it.
What is the correct way to perform this?
Output:
$ test.tcl -tclargs proj
# set projName [lindex $argv 0]
# puts "$projName chosen"
proj chosen
# set sysdefPath "$projName/$projName.runs/impl_1/*.sysdef"
# file copy -force $sysdefPath ./src/generatedFiles/$projName.hdf
error copying "proj/proj.runs/impl_1/*.sysdef": no such file or directory
    while executing
"file copy -force $sysdefPath ./src/generatedFiles/$projName.hdf"
    (file "./src/projTcls/build_bitstream.tcl" line 5)
Your shell will expand file patterns wherever it finds them. Tcl is not like that: you have to explicitly ask for the list of files matching a pattern using the glob command (untested):
set pattern $projName/$projName.runs/impl_1/*.sysdef
set sysdefPaths [glob -nocomplain -- $pattern]
switch -exact [llength $sysdefPaths] {
    0       {error "No files match $pattern"}
    1       {file copy -force [lindex $sysdefPaths 0] ./src/generatedFiles/$projName.hdf}
    default {error "Multiple files match $pattern: [list $sysdefPaths]"}
}
I have a C file at
C:\SVN\Code\fileio.c
This reads 2 audio files as
tuningFile = fopen("../../simulation/micdata.bin", "rb");
mic1File = fopen("../../simulation/mic1.pcm", "rb");
I need to write Tcl script code that will read the C file and replace these 2 occurrences with
tuningFile = fopen("C:/SVN/simulation/micdata.bin", "rb");
mic1File = fopen("C:/SVN/simulation/mic1.pcm", "rb");
Can anyone give a compact example that does something like the following:
- read the file line by line
- search for something like tuningFile = fopen(
- extract the path from it and change it to an absolute path
- recombine it with tuningFile = fopen(
- replace the original line with the modified line at the same location
The key is that you actually want to replace:
fopen("../../simulation/
with
fopen("C:/SVN/simulation/
That's easily done with string map. The rest of your problem is then just a matter of doing the file I/O, and pretty much any C source file that can be compiled by an ordinary compiler is best processed by loading it all into memory at once:
set filename {C:\SVN\Code\fileio.c}
set mapping [list {fopen("../../simulation/} {fopen("C:/SVN/simulation/}]
# Slurp the file in
set f [open $filename]
set data [read $f]
close $f
# Apply the mapping
set data [string map $mapping $data]
# Make the original a backup
file rename $filename $filename.bak
# Write back with a separate open
set f [open $filename w]
puts -nonewline $f $data
close $f
If you prefer, you can get the filename as an argument using, say, [lindex $argv 0]. The rest of the code doesn't care.
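A hypothetical invocation (the script name fixpaths.tcl is made up for the example):

# Run as: tclsh fixpaths.tcl C:\SVN\Code\fileio.c
set filename [lindex $argv 0]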
Here's a version that extracts the filename and uses file normalize on it:
set f [open $filename r]
set code [read $f]
close $f
set code [subst -novar -noback [regsub -all {((?:tuningFile|mic1File) = fopen\(")([^"]+)} $code {\1[file normalize "\2"]}]]
Breaking that up, this command
regsub -all {((?:tuningFile|mic1File) = fopen\(")([^"]+)} $code {\1[file normalize "\2"]}
will find the string tuningFile = fopen("../relative/file (or mic1File = ...) and replace it with the text
tuningFile = fopen("[file normalize "../relative/file"]
Then we feed that to subst so that embedded commands can be substituted, executing that file normalize command, resulting in the text
tuningFile = fopen("/full/path/to/file
Take 2: handle brackets in C code
$ pwd
/home/jackman/tmp/base/SVN/Code
$ tree ../..
../..
├── SVN
│ └── Code
│ ├── fileio.c
│ └── normalize.tcl
└── simulation
├── mic1.pcm
└── micdata.bin
3 directories, 4 files
$ cat fileio.c
int tuningFix[MAXTUNING];
tuningFile = fopen("../../simulation/micdata.bin", "rb");
mic1File = fopen("../../simulation/mic1.pcm", "rb");
$ cat normalize.tcl
#! tclsh
package require fileutil
set code [fileutil::cat [lindex $argv 0]]
# protect existing brackets
set bracketmap [list \[ \x01 \] \x02]
set code [string map $bracketmap $code]
# normalize filenames
set code [
subst -novar -noback [
regsub -all {((?:tuningFile|mic1File) = fopen\(")([^"]+)} $code {\1[file normalize "\2"]}
]
]
# restore brackets
set code [string map [lreverse $bracketmap] $code]
puts $code
$ tclsh normalize.tcl fileio.c
int tuningFix[MAXTUNING];
tuningFile = fopen("/home/jackman/tmp/base/simulation/micdata.bin", "rb");
mic1File = fopen("/home/jackman/tmp/base/simulation/mic1.pcm", "rb");
package require fileutil
set filename C:/SVN/Code/fileio.c
set mapping [list {fopen("../../simulation/} {fopen("C:/SVN/simulation/}]
proc replace {mapping data} {
    string map $mapping $data
}
::fileutil::updateInPlace $filename [list replace $mapping]
Should work too. (Definition of mapping nicked from Donal.) updateInPlace calls the command prefix in its second argument, passes the contents of the file to that command, and updates the file with the result from the command.
This is very nearly the same procedure as in Donal's answer, expressed in higher-level code. If you want a backup copy, do this before calling updateInPlace:
file copy $filename [file rootname $filename].bak
Documentation: fileutil package, list, proc, set, string
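If you would rather not define a named replace proc, the same command prefix can be built inline with apply; a sketch of the same idea:

package require fileutil

# The lambda receives the mapping (bound here) plus the file contents
# that updateInPlace appends, and returns the mapped text
::fileutil::updateInPlace $filename [list apply {{mapping data} {
    string map $mapping $data
}} $mapping]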
Based on the great help from the users who commented, I was able to do the task as follows:
proc replaceFileTemp {} {
    global pth_fileio_orig
    # create a backup for copying back later
    set pth_backup [file rootname $pth_fileio_orig].bak
    file copy $pth_fileio_orig $pth_backup
    # get the path of the currently executing script file
    set thisFilePth [dict get [info frame [info frame]] file]
    # get the folder containing the current file
    set thisFileFolderPth [file dirname $thisFilePth]
    # set the replacement string/path
    set replacementPth [file dirname $thisFileFolderPth]
    # the original string to be replaced
    set origPth "../../simulation/toplevel"
    # load the fileutil package for file manipulation
    package require fileutil
    set mapping [list $origPth $replacementPth]
    proc replace {mapping data} {
        string map $mapping $data
    }
    # replace the original string with the replacement string for all occurrences in the file
    ::fileutil::updateInPlace $pth_fileio_orig [list replace $mapping]
}

# set the path to the toplevel C file
set pth_fileio_orig [file normalize "../../../fileio.c"]
replaceFileTemp
I have a main Tcl proc that sources tons of other Tcl procs in other folders and subsequent subdirectories. For example, the main proc has:
source $basepath/folderA/1A.tcl
source $basepath/folderA/2A.tcl
source $basepath/folderA/3A.tcl
source $basepath/folderB/1B.tcl
source $basepath/folderB/2B.tcl
source $basepath/folderB/3B.tcl
and it seems kind of stupid to do it that way when I always know I will source everything in folderA and folderB. Is there a function (or simple way) that'll allow me to just source all the .tcl files in an entire folder?
Building on ramanman's reply, here's a routine that tackles the problem using the built-in Tcl file commands and which works its way down the directory tree recursively.
# findFiles
# basedir - the directory to start looking in
# pattern - A pattern, as defined by the glob command, that the files must match
proc findFiles { basedir pattern } {
    # Fix the directory name, this ensures the directory name is in the
    # native format for the platform and contains a final directory separator
    set basedir [string trimright [file join [file normalize $basedir] { }]]
    set fileList {}

    # Look in the current directory for matching files, -type {f r}
    # means only readable normal files are looked at, -nocomplain stops
    # an error being thrown if the returned list is empty
    foreach fileName [glob -nocomplain -type {f r} -path $basedir $pattern] {
        lappend fileList $fileName
    }

    # Now look for any subdirectories in the current directory
    foreach dirName [glob -nocomplain -type {d r} -path $basedir *] {
        # Recursively call the routine on the subdirectory and append any
        # new files to the results
        set subDirList [findFiles $dirName $pattern]
        if { [llength $subDirList] > 0 } {
            foreach subDirFile $subDirList {
                lappend fileList $subDirFile
            }
        }
    }
    return $fileList
}
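For the original question, that routine can then drive the sourcing directly (assuming findFiles as defined above):

foreach script [findFiles $basepath *.tcl] {
    source $script
}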
It gets trivial with Tcllib on board:
package require fileutil

foreach file [fileutil::findByPattern $basepath *.tcl] {
    source $file
}
Perhaps a little more platform-independent, and using built-in commands instead of piping to a process:
foreach script [glob [file join $basepath folderA *.tcl]] {
    source $script
}
Repeat for folderB, or handle both folders in one pass as sketched below.
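A minimal sketch of the one-pass variant (adding -nocomplain so an empty folder isn't an error):

foreach folder {folderA folderB} {
    foreach script [glob -nocomplain [file join $basepath $folder *.tcl]] {
        source $script
    }
}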
If you have more stringent selection criteria, and don't need to run on other platforms, using find may be more flexible.
Here is one way:
set includes [open "|find $basedir -name \*.tcl -print" r]
while { [gets $includes include] >= 0 } {
    source $include
}
close $includes
Based on a previous answer, this version handles cycles created by symbolic links and in the process eliminates duplicate files due to symbolic links as well.
# findFiles
# directory - the directory to start looking in
# pattern - A pattern, as defined by the glob command, that the files must match
proc findFiles {directory pattern} {
    # Fix the directory name, this ensures the directory name is in the
    # native format for the platform and contains a final directory separator
    set directory [string trimright [file join [file normalize $directory] { }]]

    # Starting with the passed in directory, do a breadth first search for
    # subdirectories. Avoid cycles by normalizing all file paths and checking
    # for duplicates at each level.
    set directories [list]
    set parents $directory
    while {[llength $parents] > 0} {
        # Find all the children at the current level
        set children [list]
        foreach parent $parents {
            set children [concat $children [glob -nocomplain -type {d r} -path $parent *]]
        }

        # Normalize the children
        set length [llength $children]
        for {set i 0} {$i < $length} {incr i} {
            lset children $i [string trimright [file join [file normalize [lindex $children $i]] { }]]
        }

        # Make the list of children unique
        set children [lsort -unique $children]

        # Find the children that are not duplicates, use them for the next level
        set parents [list]
        foreach child $children {
            if {[lsearch -sorted $directories $child] == -1} {
                lappend parents $child
            }
        }

        # Append the next level directories to the complete list
        set directories [lsort -unique [concat $directories $parents]]
    }

    # Get all the files in the passed in directory and all its subdirectories
    set result [list]
    foreach directory $directories {
        set result [concat $result [glob -nocomplain -type {f r} -path $directory -- $pattern]]
    }

    # Normalize the filenames
    set length [llength $result]
    for {set i 0} {$i < $length} {incr i} {
        lset result $i [file normalize [lindex $result $i]]
    }

    # Return only unique filenames
    return [lsort -unique $result]
}
Same idea as schlenk:
package require Tclx

for_recursive_glob scriptName $basepath *.tcl {
    source $scriptName
}
If you only want folderA and folderB and not other folders under $basepath:
package require Tclx

for_recursive_glob scriptName [list $basepath/folderA $basepath/folderB] *.tcl {
    source $scriptName
}
The answer by Joseph Bui works well except that it skips files in the initial folder.
Change:
set directories [list]
to:
set directories [list $directory]
to fix it.