TCL script to output a file which contains the sizes of all the files in the directory and subdirectories - tcl

Please help me with a script that outputs a file containing the names of the files in the subdirectories and their sizes in bytes. The argument to the program is the folder path. The output file should have the file name in the first column and its size in the second column.
Note: the folder contains subfolders, and inside the subfolders there are files.
I tried this way:
set fp [open files_memory.txt w]
set file_names [glob ../design_data/*/*]
foreach file $file_names {
    puts $fp "$file [lindex [exec du -sh $file] 0]"
}
close $fp
Result sample:
../design_data/def/ip2.def.gz 170M
../design_data/lef/tsmc13_10_5d.lef 7.1M
But I want only the file name to be printed, i.e. ip2.def.gz, tsmc13_10_5d.lef, etc. (not the entire path), and the file sizes should be aligned.

The fileutil package in Tcllib defines the command fileutil::find, which can recursively list the contents of a directory. You can then use foreach to iterate over the list and get the sizes of each of them with file size, before producing the output with puts, perhaps like this:
puts "$filename\t$size"
The $filename is the name of the file, and the $size is how large it is. You will have obtained these values earlier (i.e., in the line or two before!). The \t in the middle is turned into a TAB character. Replace with spaces or a comma or virtually anything else you like; your call.
To get just the last part of the filename, I'd do:
puts $fp "[file tail $file] [file size $file]"
This writes out the file size in full, not the abbreviated form, so if you really want 4k instead of 4096, keep using that (slow) incantation with exec du. (If the consumer is a program, or a programmer, writing out the size in full is probably better.)
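Putting those pieces together, one possible complete script is sketched below. It assumes Tcllib's fileutil package is installed, takes the folder path as the first command-line argument (per the question), writes to files_memory.txt, and uses format to keep the size column aligned:
package require fileutil

set dir [lindex $argv 0]
set fp  [open files_memory.txt w]

# fileutil::find lists everything under $dir recursively (files and directories)
foreach file [fileutil::find $dir] {
    if {[file isfile $file]} {
        # file tail drops the directory part; format pads the name so sizes line up
        puts $fp [format "%-40s %12d" [file tail $file] [file size $file]]
    }
}
close $fp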

In addition to Donal's suggestion, there are more tools for getting files recursively:
recursive_glob (from the Tclx package)
for_recursive_glob (also from Tclx)
fileutil::findByPattern (from the fileutil package)
Here is an example of how to use for_recursive_glob:
package require Tclx
for_recursive_glob filename {../design_data} {*} {
    puts $filename
}
This suggestion, in combination with Donal's, should be enough for you to create a complete solution. Good luck.
Discussion
The for_recursive_glob command takes 4 arguments:
The name of the variable representing the complete path name
A list of directories to search (e.g. {/dir1 /dir2 /dir3})
A list of patterns to search for (e.g. {*.txt *.c *.cpp})
Finally, the body of the for loop, where you want to do something with the filename.
In my experience, for_recursive_glob cannot handle directories that you don't have permission to read (at least on Mac, Linux, and BSD platforms; I don't know about Windows). In that case the script will error out unless you catch the exception.
The recursive_glob command is similar, but it returns a list of filenames instead of structuring the iteration as a loop.
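For example, a rough sketch of catching that error; note that wrapping the whole loop in catch aborts the traversal at the first unreadable directory rather than skipping it:
package require Tclx

if {[catch {
    for_recursive_glob filename {../design_data} {*} {
        puts "[file tail $filename] [file size $filename]"
    }
} err]} {
    puts stderr "traversal stopped: $err"
}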

Related

Reading cmd arguments in TCL file

I am trying to run a tcl script through a .bat file. I want to read some cmd arguments in the tcl script. Below is my code:
Command to run:
D:\Cadence\Sigrity2021.1\tools\bin\PowerSI.exe -tcl abcd.tcl %new_var%.spd %new_file_name%
Below is how I am trying to read the variable in the tcl file:
sigrity::open document [lindex $argv 0] {!}
It opens up Cadence Sigrity, but I see the below error:
How do I read cmd arguments in tcl?
If you have no other way to do it that you can find (and it sounds like that might be the case) then you can fake it by writing a helper file with content like this, filling in the real arguments in the appropriate places:
# Name of script to call
set ::argv0 "abcd.tcl"
# Arguments to pass
set ::argv {}
lappend ::argv "%new_var%.spd"
lappend ::argv "%new_file_name%"
# Number of arguments (rarely used)
set ::argc [llength $::argv]
# Do the call
source $::argv0
Then you can pass that file to PowerSI and it will set things up and chain to the real file. It's messy, but practical.
If you're writing this from Tcl, use the list command to do the quoting of the strings (instead of putting them in double quotes) as it will do exactly the right thing for you. If you're writing the file from another language, you'll want to make sure you put backslashes in before \, ", $ and [ characters. The fiddliness of doing that depends on your language.
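If you are generating that helper file from Tcl itself, a small sketch could look like this (run_helper.tcl and the argument values are invented for illustration; list does the quoting for you):
# hypothetical values standing in for the real arguments
set realScript abcd.tcl
set realArgs   [list design.spd run_01]

set fd [open run_helper.tcl w]
puts $fd [list set ::argv0 $realScript]
puts $fd [list set ::argv $realArgs]
puts $fd {set ::argc [llength $::argv]}
puts $fd {source $::argv0}
close $fd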

How to copy or move multiple files with the same extension?

So I am trying to move a bunch of files with similar extensions from /home/ to /root/
Code I tried is
file copy /home/*.abc.xyz /root/
Also tried
set infile [glob -nocomplain /home/*.abc.xyz ]
if { [llength $infile] > 0 } {
    file copy $infile /root/
}
No success.
Your two attempts fail for different reasons:
There is no wildcard expansion in arguments to file copy, or any Tcl command, for that matter: file copy /home/*.abc.xyz /root/. This will look for a single source with a literal * in its filename.
glob -nocomplain /home/*.abc.xyz is fine for collecting the sources, but glob returns a list of sources. file copy requires each source to be passed as a separate argument, not as a single one. To expand a single collection value of source files into multiple separate arguments, use the Tcl expansion operator {*}.
Therefore:
set infiles [glob -nocomplain *.tcl]
if {[llength $infiles]} {
    file copy {*}$infiles /tmp/tgt/
}
For a 1-line answer:
file copy {*}[glob /home/*.abc.xyz] /root/.
The file copy (and file rename) commands have two forms (hence the reference to the manual page in the comment). The first form copies a single file to a new target. The second form copies all the file name arguments to a new directory; this form insists that the directory name be the last argument, with an arbitrary number of source file names preceding it.
Also, file copy does not do glob expansion on its arguments, so as you rightly surmised, you also need to use the glob command to obtain a list of the files to copy. The problem is that the glob command returns a list of file names and you passed that list as a single argument, i.e.
file copy $infile /root/
passes the list as a single argument and so the file copy command thinks it is dealing with the first form and attempts to find a file whose name matches that of the entire list. This file probably doesn't exist. Placing the error message in your question would have helped us to know for sure.
So what you want to do is take the list of files contained in the infile variable and expand it into separate argument words. Since this is a common situation, Tcl has some syntax to help (assuming you are not using some ancient version of Tcl). Try using the command:
file copy {*}$infile /root/
in place of the file copy line in your second attempt and see if that helps the situation.
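As a quick illustration of the two forms (the file names here are made up):
# first form: one source, the target may be a new file name
file copy notes.txt notes_backup.txt

# second form: several sources, the last argument must be an existing directory
file copy a.txt b.txt c.txt /root/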

How to search a directory tree in TCL for files that end in .so or .a and build a list of those directories

I'm trying to write a set of TCL scripts that helps setup a user's environment for a set of libraries that are not in their standard LD_LIBRARY_PATH as part of a support solution.
Since the system is rather sparse in terms of what I can install, I don't really have access to any TCL extensions and am trying to do this in base TCL as much as possible. I'm also relatively new to TCL as a language.
What I'd like to do is search through a directory structure, locate the directories that have .so and .a files in them, build a list of those, and eventually add them to the user's $LD_LIBRARY_PATH variable.
If I were doing this in a shell, I'd just use find with something like this:
find /dir -type f \( -name "*.so" -o -name "*.a" \) | awk -F/ 'sub(FS $NF,x)' | sort -u
I could hard-code the paths, but we want a single set of scripts that can manage several different applications.
Any ideas would be very much appreciated.
Tcllib has a fileutil module that does a recursive find:
package require fileutil
set filenames [::fileutil::findByPattern /dir -glob {*.so *.a}]
foreach filename $filenames {
    if {[file isfile $filename]} {
        set dirs([file dirname $filename]) 1
    }
}
puts [array names dirs]
If you want to use this, but can't install something, you can just take the procedures and add them to your code (with the appropriate attribution) -- http://core.tcl.tk/tcllib/dir?ci=63d99a74f4600441&name=modules/fileutil
Otherwise, just call the system's find command and parse the output (assuming that your filenames do not contain newlines).
set filenames [split [exec find /dir -type f ( -name *.so -o -name *.a )] \n]
And the loop to extract the unique directories is similar. As a side benefit, the find invocation is actually easier to read because you don't have to escape all the chars that are special to the shell.
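Sketching out that loop (the LD_LIBRARY_PATH handling at the end is just one way to apply the result):
# filenames comes from the exec find call shown above
foreach filename $filenames {
    set dirs([file dirname $filename]) 1
}

# join the unique directory names and append them to LD_LIBRARY_PATH
set extra [join [lsort [array names dirs]] :]
if {[info exists ::env(LD_LIBRARY_PATH)] && $::env(LD_LIBRARY_PATH) ne ""} {
    set ::env(LD_LIBRARY_PATH) "$::env(LD_LIBRARY_PATH):$extra"
} else {
    set ::env(LD_LIBRARY_PATH) $extra
}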

How to copy multiple files with Tcl file command

From the Tcl online manual I see that Tcl's file copy command can take multiple source files as arguments:
file copy ?-force? ?--? source ?source ...? targetDir
However, I have the following code:
set flist [list a.txt b.txt]
file copy $flist [file join D:\\ test dest]
And get this error message:
error copying "a.txt b.txt": no such file or directory
How do I properly pass a file list as source argument to the file copy command?
The right way to do this is to use expansion:
file copy {*}$flist {D:\test\dest}
The {*} expands the list that follows it into separate words; it's precisely what's needed here.
I've also written the destination directory as a brace-quoted literal.
Still on Tcl 8.4 or before? Upgrade! Or use this:
eval file copy $flist [list {D:\test\dest}]
It's quite a lot harder to use eval right than {*}, so really upgrade.
Or even do:
foreach f $flist {
    file copy $f {D:\test\dest}
}
Given that IO operations will dominate the performance, you shouldn't notice any speed difference for doing it this way.
The problem is that the list is passed as a whole to the command instead of as individual elements. Use the {*} operator to break the list down into its individual elements.
The short answer is: don't pass the whole list as a single argument the way you have done.
This works in your example:
set flist "a.txt b.txt"
file copy {*}$flist [file join D:\\ test dest]
More correct would be to build the list with the list command and use the {*} expansion syntax.
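For completeness, that more correct version looks like this (the destination directory is assumed to exist):
set flist [list a.txt b.txt]
file copy {*}$flist [file join D:\\ test dest]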

Need a Tcl library to read/write configuration files

My Tcl application should read and store a lot of configuration parameters. I'd like to use a regular disk file as the storage rather than the registry or something else.
It would be great to store parameters hierarchically. All my parameters are strings, numbers, and lists of them. Configuration file(s) may be placed in any directory (not only the user's home). Normally the application expects the configuration file in the current directory.
Do you know any ready-to-use Tcl library?
More general question: what is the "Tcl-way" to read/write application configuration?
Thanks.
If the configuration does not necessarily need to be human-readable, I suggest you consider SQLite -- it began as a Tcl extension, and therefore Tcl's SQLite bindings are more mature than any other language's.
See: http://www.sqlite.org/tclsqlite.html
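A minimal sketch of that approach, assuming the sqlite3 package is available and using a made-up config.db file with a simple key/value table:
package require sqlite3

# open (or create) the configuration database in the current directory
sqlite3 confdb config.db
confdb eval {
    CREATE TABLE IF NOT EXISTS config (key TEXT PRIMARY KEY, value TEXT)
}

# store a parameter (Tcl variables are bound directly into the SQL)
set key window/width
set val 800
confdb eval {INSERT OR REPLACE INTO config VALUES($key, $val)}

# read it back
puts [confdb onecolumn {SELECT value FROM config WHERE key = $key}]
confdb close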
If you don't need random access (that is, configuration files are not huge and each can be slurped completely at once) and don't require processing by external tools, you could just use flat text files containing, say, Tcl lists. The "trick" is that in Tcl each value must have a valid string representation (when asked) and can be reconstructed from its string representation. You get that for free, that is, no special package is required and all you have to provide is some sort of structure to bind serialized values to their names.
To demonstrate:
set a "a string"
set b 536
set c {this is a list {with sublist}}
proc cf_write {fname args} {
    set fd [open $fname w]
    chan configure $fd -encoding utf-8
    set data [list]
    foreach varName $args {
        upvar 1 $varName var
        lappend data [list $varName $var]
    }
    puts $fd $data
    close $fd
}
proc cf_read fname {
    set fd [open $fname]
    chan configure $fd -encoding utf-8
    set data [read $fd]
    close $fd
    set data
}
set cfile [file join [file dirname [info script]] conf.txt]
cf_write $cfile a b c
foreach entry [cf_read $cfile] {
    lassign $entry name value
    puts "$name: $value"
}
You'll get this output:
a: a string
b: 536
c: this is a list {with sublist}
Now if you feel like having something more fancy or "interoperable", look at YAML or JSON (you'll need to write a serializer for this one, though) or INI formats, all available from Tcllib and hence plain Tcl.
Even fancier would be using XML via TDOM (an expat-based C extension). SQLite, which has already been proposed, is even more capable than that (it provides random access to the data and is able to operate on huge data sets). But it seems that for your task these tools are too heavy-weight.
Note that my example deliberately shows how to store/restore an arbitrary ad-hoc list of variables, so the cf_write procedure builds the Tcl list to be stored by itself. Of course, nothing prevents you from building such a list yourself, allowing hierarchical structures of arbitrary complexity. One caveat is that in that case you might (or might not) face the problem of deconstructing the restored list, but if you stick to the general rule of each element being a name/value pair, as in my example, the deconstruction shouldn't be hard.
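One hedged sketch of that idea, using a nested dict as the hierarchical container (the section and key names are invented):
# build a hierarchical configuration as a nested dict
set conf [dict create]
dict set conf network host example.com
dict set conf network port 8080
dict set conf ui colors [list red green blue]

# store it: a dict is just a Tcl list of alternating names and values
set fd [open conf.txt w]
puts $fd $conf
close $fd

# restore it and pull values back out
set fd [open conf.txt]
set conf [read $fd]
close $fd
puts [dict get $conf network port]   ;# -> 8080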
Tcllib contains a package inifile for handling Windows .ini file format configuration files. As it's part of Tcllib it should be available on all platforms (I've just checked and it loads OK on my Solaris 8 box). It allows you to both read and write .ini files and access the configuration by section and key.
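From memory, basic usage looks roughly like this; check the Tcllib inifile documentation for the exact command set (settings.ini and the section/key names are invented, and the file is assumed to already exist):
package require inifile

# open the file for reading and writing
set ini [ini::open settings.ini r+]

# write a value into a section, then flush it to disk
ini::set $ini general username alice
ini::commit $ini

# read it back, with a default if the key is missing
puts [ini::value $ini general username unknown]

ini::close $ini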