I used the file tail and file rootname commands to capture the part of a file name before the ".".
However, some of my files have names that look like this (e.g. abc.txt.gz).
I only want abc, not abc.txt. Please advise.
Current Output
main_name = abc.txt
What I want for final output:
main_name = abc
My script is as below:
foreach xxx $filename {
    puts "main_name = [file rootname [file tail $xxx]]"
}
Repeatedly running the file rootname command will remove an extension on every iteration. So by continuing that until the output is the same as the input, you know all extensions have been removed:
foreach name [glob *] {
    set tail [file tail $name]
    set root [file rootname $tail]
    while {$root ne $tail} {
        set tail $root
        set root [file rootname $tail]
    }
    puts "main name = $root"
}
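If you need this in several places, the loop can be wrapped up in a small helper proc (the name stripAllExtensions is just for illustration):

proc stripAllExtensions {name} {
    set tail [file tail $name]
    # Keep stripping until file rootname stops changing anything
    while {[set root [file rootname $tail]] ne $tail} {
        set tail $root
    }
    return $tail
}

puts [stripAllExtensions abc.txt.gz]   ;# => abc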
A different approach: since the "." is not platform-specific, you can simply cut at the first dot:
foreach name [glob *] {
    set tail [file tail $name]
    set dotIdx [string first "." $tail]
    if {$dotIdx > -1} {
        set tail [string range $tail 0 $dotIdx-1]
    }
    puts "main name = $tail"
}
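A quick interactive check of that approach (abc.txt.gz is the example name from the question; the index arithmetic in the end index needs Tcl 8.5 or later):

% set tail abc.txt.gz
abc.txt.gz
% string range $tail 0 [string first "." $tail]-1
abc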
Related
I want to split this path so that I have the directory name in a variable, but it gets truncated.
Path to split:
/home/user/T.A.T.E.-ano_ays-(ff-A-a)-ownage
code:
bind pub "-|-" !aaa pub:aaa
proc pub:aaa { nick uhost hand chan arg } {
    set checkpath "/home/user/T.A.T.E.-ano_ays-(ff-A-a)-ownage"
    set dirname [file rootname [file tail $checkpath]]
    putnow "PRIVMSG $chan :dirname $dirname"
}
output:
dirname T.A.T.E
correct would be:
dirname T.A.T.E.-ano_ays-(ff-A-a)-ownage
How can I fix the output of dirname?
This is what file split was made for:
% lindex [file split /home/user/T.A.T.E.-ano_ays-(ff-A-a)-ownage] end
T.A.T.E.-ano_ays-(ff-A-a)-ownage
file tail without postprocessing via file rootname would also work:
% file tail /home/user/T.A.T.E.-ano_ays-(ff-A-a)-ownage
T.A.T.E.-ano_ays-(ff-A-a)-ownage
file rootname cuts the trailing filepath component at (not including) the last dot.
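A quick interactive demonstration (report.2024.txt is just a made-up example):

% file rootname report.2024.txt
report.2024
% file rootname T.A.T.E.-ano_ays-(ff-A-a)-ownage
T.A.T.E

Which is exactly the truncation seen above.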
Here is my command:
foreach fic [glob -nocomplain -dir $dir -types {f d r} *] {
    set infofile [list [file tail $fic] [file mtime $fic] [file atime $fic]]
    # ...
}
But I get an error: couldn't read directory "/Users/...": permission denied...
My solution is to add a file readable check:
foreach fic [glob -nocomplain -dir $dir -types {f d} *] {
    if {![file readable $fic]} continue
    set infofile [list [file tail $fic] [file mtime $fic] [file atime $fic]]
    # ...
}
I thought that when I added the r type, this kind of error would not appear. Is this a misunderstanding of the documentation?
Permissions on Windows are complex, to the point where you can only be really sure that you've got permission to read a file immediately after you've successfully opened it for reading. The indications from glob and file readable are not definitive. The same is true on other operating systems, and in any case there's a race condition: the user could change the permissions between your checking with file readable and your calling some other operation. Because of that, while you can use glob -types r, you should not rely on it. It simply can't be guaranteed to be correct.
The fix for this? Handle errors from calls properly.
foreach fic [glob -nocomplain -dir $dir -types {f d r} *] {
    try {
        # More efficient than calling [file mtime] and [file atime] separately
        file stat $fic data
    } on error {} {
        # Couldn't actually handle the file. Ignore
        continue
    }
    set infofile [list [file tail $fic] $data(mtime) $data(atime)]
    # ...
}
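Note that try requires Tcl 8.6 or later; on Tcl 8.5 the same pattern can be written with catch (a sketch):

foreach fic [glob -nocomplain -dir $dir -types {f d r} *] {
    # Skip any file we can't actually stat
    if {[catch {file stat $fic data}]} {
        continue
    }
    set infofile [list [file tail $fic] $data(mtime) $data(atime)]
    # ...
}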
I am having a problem, as described below.
Problem Description
I have some files with the "sv" extension, and I am using glob to extract the matching files. Now I need to split these matching file names, extract the elements, and create different lists.
For example
set files [glob -type f modified_rtl_files/*/*{wrapper_hsm}*]
This command gives me these two files :
modified_rtl_files/mbist_wrapper/mbist_wpper_hsm_pkram.sv modified_rtl_files/mbi_wrapper/mbist_wrapper_hsm_sysram.sv
Now I am extracting the filename with the below command
foreach element $files {
    set c [split [file tail [file rootname $element]] "_"]
    echo $c
}
This is giving me
pkram
sysram
But I need to assign them to two independent variables:
$mem1
$mem2
$mem1 should be pkram
$mem2 should be sysram
Whenever I try something like the following, both elements get printed:
foreach element $files {
    set c [split [file tail [file rootname $element]] "_"]
    set d [lindex $c 3]
    set mem1 [lindex $d 0]
    puts $mem1
}
It is printing both
pkram
sysram
I want independent variables.
Try adding the values you want to a list instead of numbered variables:
set files {modified_rtl_files/mbist_wrapper/mbist_wpper_hsm_pkram.sv modified_rtl_files/mbi_wrapper/mbist_wrapper_hsm_sysram.sv}
set mem {}
foreach f $files {
    lappend mem [file rootname [lindex [split $f _] end]]
}
puts [lindex $mem 0] ;# => pkram
puts [lindex $mem 1] ;# => sysram
Or use the lmap command to make it more concise:
set mem [lmap f $files {file rootname [lindex [split $f _] end]}]
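If you really do need two separate scalar variables, lassign (available since Tcl 8.5) can unpack the list built above (a sketch):

lassign $mem mem1 mem2
puts $mem1 ;# => pkram
puts $mem2 ;# => sysram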
You can do generated names with set:
foreach element [lsort $files] {
    set mem[incr i] [lindex [split [file tail [file rootname $element]] "_"] 3]
    puts [set mem$i]; # <<< How to *READ* a variable with a generated name
}
(Always sort the output of glob if you care about the order; it's not guaranteed to come in any particular order or with any consistency.)
BUT DON'T DO THIS!
Generated variable names are a pain to work with precisely because they can't be used with the $ language syntax. You're much better off using (associative) arrays, because those do work with it:
foreach element [lsort $files] {
    set mem([incr i]) [lindex [split [file tail [file rootname $element]] "_"] 3]
    puts $mem($i)
}
The values will not be in mem1 and mem2, but rather mem(1) and mem(2).
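You can check the result with parray, which prints an array's contents (output assuming the loop ran over the two example files above):

% parray mem
mem(1) = pkram
mem(2) = sysram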
I have to perform the following operations:
copy a file from one location to another
search for a word in the given file
move the file pointer to the beginning of that line
place the data copied from the other file at that location
3 files are as follows:
C:\Program Files(X86)\Route\*.tcl
C:\Sanity_Automation\Route\*.tcl
C:\Script.tcl
First I need to copy the files from the Route folder in Program Files to
Sanity_Automation\Route\*.tcl.
Then I need to search for the "CloseAllOutputFile" keyword in
C:/Sanity_Automation/Route/SystemTest.tcl.
Once found, move the cursor to the beginning of the line where the "CloseAllOutputFile" keyword was found,
and place the data found in Script.tcl at that location.
Firstly, that first "file" is actually a pattern. We need to expand that to a list of real filenames. We do that with glob.
# In braces because there are backslashes
set pattern {C:\Program Files(X86)\Route\*.tcl}
# De-fang the backslashes
set pattern [file normalize $pattern]
# Expand
set sourceFilenames [glob $pattern]
Then we want to copy them. We could do this with:
set target {C:\Sanity_Automation\Route}
file copy {*}$sourceFilenames [file normalize $target]
But really we also want to build up a list of moved files so that we can process them in the next step. So we do this:
set target {C:\Sanity_Automation\Route}
foreach f $sourceFilenames {
    set t [file join $target [file tail $f]]
    file copy $f $t
    lappend targetFilenames $t
}
OK, now we're going to do the insertion processing. Let's start by getting the data to insert:
set f [open {C:\Script.tcl}]
set insertData [read $f]
close $f
Now, we want to go over each of the files, read them in, find where to do the insertion, actually do the insertion if we find the place, and then write the files back out. (You do text edits by read/modify-in-memory/write rather than trying to modify the file directly. Always.)
# Iterating over the filenames
foreach t $targetFilenames {
    # Read in
    set f [open $t]
    set contents [read $f]
    close $f
    # Do the search (this is the easiest way!)
    if {[regexp -indices -line {^.*CloseAllOutputFile} $contents where]} {
        # Found it, so do the insert
        set idx [lindex $where 0]
        set before [string range $contents 0 [expr {$idx-1}]]
        set after [string range $contents $idx end]
        set contents $before$insertData$after
        # We did the insert, so write back out
        set f [open $t "w"]
        puts -nonewline $f $contents
        close $f
    }
}
Normally, I'd do the modify as part of the copy, but we'll do it your way here.
Try this:
set sourceDir [file join / {Program Files(X86)} Route]
set destinationDir [file join / Sanity_Automation Route]
# Read the script to be inserted
set insertFnm [file join / Script.tcl]
set fil [open $insertFnm]
set insertData [read $fil]
close $fil
# Loop around all the Tcl scripts in the source directory
foreach inFnm [glob [file join $sourceDir *.tcl]] {
    # Determine the name of the output file
    set scriptName [file tail $inFnm]
    set outFnm [file join $destinationDir $scriptName]
    # Open source and destination files, for input and output respectively
    set inFil [open $inFnm]
    set outFil [open $outFnm w]
    while {[gets $inFil line] >= 0} {
        if {[string match *CloseAllOutputFile* $line]} {
            puts $outFil $insertData
            puts $outFil ""; # Ensure there's a newline at the end of the insertion
        }
        puts $outFil $line
    }
    # Close input and output files
    close $inFil
    close $outFil
}
It seems to work for me.
I have a main Tcl proc that sources tons of other Tcl procs in other folders and their subdirectories. For example, in the main proc it has:
source $basepath/folderA/1A.tcl
source $basepath/folderA/2A.tcl
source $basepath/folderA/3A.tcl
source $basepath/folderB/1B.tcl
source $basepath/folderB/2B.tcl
source $basepath/folderB/3B.tcl
and it seems kind of stupid to do it that way when I always know I will source everything in folderA and folderB. Is there a function (or simple way) that'll allow me to just source all the .tcl files in an entire folder?
Building on ramanman's reply, here's a routine that tackles the problem using the built-in Tcl file commands and which works its way down the directory tree recursively.
# findFiles
# basedir - the directory to start looking in
# pattern - A pattern, as defined by the glob command, that the files must match
proc findFiles { basedir pattern } {
    # Fix the directory name; this ensures the directory name is in the
    # native format for the platform and contains a final directory separator
    set basedir [string trimright [file join [file normalize $basedir] { }]]
    set fileList {}
    # Look in the current directory for matching files; -type {f r}
    # means only readable normal files are looked at, and -nocomplain stops
    # an error being thrown if the returned list is empty
    foreach fileName [glob -nocomplain -type {f r} -path $basedir $pattern] {
        lappend fileList $fileName
    }
    # Now look for any subdirectories in the current directory
    foreach dirName [glob -nocomplain -type {d r} -path $basedir *] {
        # Recursively call the routine on the subdirectory and append any
        # new files to the results
        set subDirList [findFiles $dirName $pattern]
        if { [llength $subDirList] > 0 } {
            foreach subDirFile $subDirList {
                lappend fileList $subDirFile
            }
        }
    }
    return $fileList
}
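With findFiles defined, the original sourcing problem reduces to (a sketch, assuming $basepath holds the root folder as in the question):

foreach scriptFile [findFiles $basepath *.tcl] {
    source $scriptFile
}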
It gets trivial with tcllib on board:
package require fileutil
foreach file [fileutil::findByPattern $basepath *.tcl] {
    source $file
}
Perhaps a little more platform-independent, using built-in commands instead of piping to a process:
foreach script [glob [file join $basepath folderA *.tcl]] {
    source $script
}
Repeat for folderB.
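Alternatively, since glob accepts several patterns in one call, both folders can be handled in a single loop (a sketch):

foreach script [glob [file join $basepath folderA *.tcl] [file join $basepath folderB *.tcl]] {
    source $script
}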
If you have more stringent selection criteria, and don't worry about running on any other platforms, using find may be more flexible.
Here is one way:
set includes [open "|find $basedir -name \*.tcl -print" r]
while { [gets $includes include] >= 0 } {
    source $include
}
close $includes
Based on a previous answer, this version handles cycles created by symbolic links and in the process eliminates duplicate files due to symbolic links as well.
# findFiles
# basedir - the directory to start looking in
# pattern - A pattern, as defined by the glob command, that the files must match
proc findFiles {directory pattern} {
    # Fix the directory name; this ensures the directory name is in the
    # native format for the platform and contains a final directory separator
    set directory [string trimright [file join [file normalize $directory] { }]]
    # Starting with the passed in directory, do a breadth first search for
    # subdirectories. Avoid cycles by normalizing all file paths and checking
    # for duplicates at each level.
    set directories [list]
    set parents $directory
    while {[llength $parents] > 0} {
        # Find all the children at the current level
        set children [list]
        foreach parent $parents {
            set children [concat $children [glob -nocomplain -type {d r} -path $parent *]]
        }
        # Normalize the children
        set length [llength $children]
        for {set i 0} {$i < $length} {incr i} {
            lset children $i [string trimright [file join [file normalize [lindex $children $i]] { }]]
        }
        # Make the list of children unique
        set children [lsort -unique $children]
        # Find the children that are not duplicates; use them for the next level
        set parents [list]
        foreach child $children {
            if {[lsearch -sorted $directories $child] == -1} {
                lappend parents $child
            }
        }
        # Append the next level directories to the complete list
        set directories [lsort -unique [concat $directories $parents]]
    }
    # Get all the files in the passed in directory and all its subdirectories
    set result [list]
    foreach directory $directories {
        set result [concat $result [glob -nocomplain -type {f r} -path $directory -- $pattern]]
    }
    # Normalize the filenames
    set length [llength $result]
    for {set i 0} {$i < $length} {incr i} {
        lset result $i [file normalize [lindex $result $i]]
    }
    # Return only unique filenames
    return [lsort -unique $result]
}
Same idea as schlenk:
package require Tclx
for_recursive_glob scriptName $basepath *.tcl {
    source $scriptName
}
If you only want folderA and folderB and not other folders under $basepath:
package require Tclx
for_recursive_glob scriptName [list $basepath/folderA $basepath/folderB] *.tcl {
    source $scriptName
}
The answer by Joseph Bui works well except that it skips files in the initial folder.
Change:
set directories [list]
To:
set directories [list $directory]
to fix it.