Using globs in a Perl replace one-liner in a Tcl script

I want to run a Perl one-liner in a Tcl script, as below:
exec perl -i -pe {s/SUBSTRING/REPLACING_STRING/g} testFile;
This works fine. But if I want to modify all the files, like below:
exec perl -i -pe {s/SUBSTRING/REPLACING_STRING/g} *;
it gives me the error message:
Can't open '*': No such file or directory.
while executing
exec perl -i -pe {s/SUBSTRING/REPLACING_STRING/g} *;
I tried bracing the '*', but that did not solve the problem. Requesting help...

Assuming that the files a, b, and c are present in the current working directory, executing echo * in the shell prints a b c. This is because the shell command evaluator recognizes wildcard characters and splices in a list of zero or more file names where the wildcard expression was found.
Tcl's command evaluator does not recognize wildcard characters, but passes them unsubstituted to the command being invoked. If that command can work with wildcards it will do so. The exec command doesn't, which means it passes the wildcard expression as-is to the program named in the command string.
Testing this, we get
% exec echo *
*
because what we asked the shell to execute was simply
echo *
If we want a wildcard expression expanded to a list of file names, we need an explicit call to the glob command:
% exec echo [glob *]
"a b c"
which still isn't quite right, since the list wasn't automatically spliced into the command string: instead the shell got
echo {a b c}
(Note: I’m faking echo on Windows here, the actual output might be different.)
To both expand and splice the list of file names, we need this:
% exec echo {*}[glob *]
a b c
The {*} prefix tells the Tcl command evaluator to expand the word that follows and splice the resulting words into the command line as individual arguments, as if the original command had been written
echo a b c
This example, with a more concise explanation than I've given here, is in the documentation for exec:
"If you are converting invocations involving shell globbing, you should remember that Tcl does not handle globbing or expand things into multiple arguments by default. Instead you should write things like this:"
exec ls -l {*}[glob *.tcl]
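Applied back to the question's original command, the same pattern gives something like this (a sketch along the lines of the documentation's example; the -type f option is my addition, to keep directories out of the expansion since Perl's -i only wants plain files):
# expand the glob result into separate file-name arguments for perl
exec perl -i -pe {s/SUBSTRING/REPLACING_STRING/g} {*}[glob -type f *]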
PS:
If one has loaded the fileutil package:
package require fileutil
this can be written as a one-liner in Tcl too:
foreach file [glob *] {::fileutil::updateInPlace $file {apply {str {regsub -all SUBSTRING $str REPLACING_STRING}}}}
or with line breaks and indentation for readability:
foreach file [glob *] {
    ::fileutil::updateInPlace $file {
        apply {str {
            regsub -all SUBSTRING $str REPLACING_STRING
        }}
    }
}
Documentation: apply, exec, fileutil package, foreach, glob, package, regsub, {*}

Related

Reading cmd arguments in TCL file

I am trying to run a Tcl script through a .bat file. I want to read some cmd arguments in the Tcl script. Below is my code:
Command to run:
D:\Cadence\Sigrity2021.1\tools\bin\PowerSI.exe -tcl abcd.tcl %new_var%.spd %new_file_name%
Below is how I am trying to read the variable in the tcl file:
sigrity::open document [lindex $argv 0] {!}
It opens up Cadence Sigrity, but then I see an error.
How do I read cmd arguments in Tcl?
If you have no other way to do it that you can find (and it sounds like that might be the case) then you can fake it by writing a helper file with content like this, filling in the real arguments in the appropriate places:
# Name of script to call
set ::argv0 "abcd.tcl"
# Arguments to pass
set ::argv {}
lappend ::argv "%new_var%.spd"
lappend ::argv "%new_file_name%"
# Number of arguments (rarely used)
set ::argc [llength $::argv]
# Do the call
source $::argv0
Then you can pass that file to PowerSI and it will set things up and chain to the real file. It's messy, but practical.
If you're writing this from Tcl, use the list command to do the quoting of the strings (instead of putting them in double quotes) as it will do exactly the right thing for you. If you're writing the file from another language, you'll want to make sure you put backslashes in before \, ", $ and [ characters. The fiddliness of doing that depends on your language.
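As a rough sketch of that (the helper file name and the argument values here are made up for illustration), generating the helper from Tcl with list looks like this; list quotes each word, so spaces, brackets, and dollar signs in the values cannot break the generated script:
# hypothetical values; in practice they come from wherever your real arguments live
set script "abcd.tcl"
set args   [list "some_design.spd" "some_file_name"]

set f [open helper.tcl w]
puts $f [list set ::argv0 $script]
puts $f [list set ::argv $args]
puts $f [list set ::argc [llength $args]]
puts $f [list source $script]
close $f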

invoke a tcl script from another tcl script with multiple arguments

I am invoking a script (a Tcl script) from the current script and seeing this "invalid command name" error. The Tcl script just checks whether the proper version of a package is installed or not.
#!/bin/tclsh
# i am doing this for multiple packages in a loop
set list {/usr/local/script}
lappend list -check
lappend list -package
lappend list tcl-devel
lappend list version
[eval exec $list]
output:
invalid command name "
checking the version [ ok ] #expected output
-checks successful! #expected output
"
while executing
"[eval exec $list]"
I don't understand why I get this "invalid command name" error. Can anyone help?
The problem is that you've successfully run the command, have got the results back, and are then trying to use those results as the name of a command because you put [brackets] around the eval exec. Either remove the brackets, or put a command name before them so that you use the result as an argument.
set list …
# Leaving out the details of how you build the list
eval exec $list
or:
set list …
# Leaving out the details of how you build the list
set result [eval exec $list]
puts "result is \"$result\""

How to zip multiple files through tcl script in Linux box?

I have a set of code in Tcl where I'm trying to zip the files, but I'm getting the below error:
zip warning: name not matched: a_1.txt a_2.txt a_3.txt a_4.txt
On the other hand, when I do the same thing from the command prompt, I'm able to execute it successfully.
#!/usr/local/bin/tclsh
set outdir /usr/test/
set out_files abc.10X
array set g_config { ZIP /usr/bin/zip }
set files "a_1.txt a_2.txt a_3.txt a_4.txt"
foreach inp_file $files {
    append zipfiles "$inp_file "
}
exec $g_config(ZIP) $outdir$out_files zipfiles
Tcl really cares about the boundaries between words, and doesn't split things up unless asked to. This is good, as it means that things like filenames with spaces in them don't confuse it, but in this case it causes you some problems.
To ask it to split the list up, precede the read of the word from the variable with {*}:
exec $g_config(ZIP) $outdir$out_files {*}$files
This is instead of this:
exec $g_config(ZIP) $outdir$out_files $files
# Won't work; uses "strange" filename
or this:
exec $g_config(ZIP) $outdir$out_files zipfiles
# Won't work; uses filename that is the literal "zipfiles"
# You have to use $ when you want to read from a variable and pass the value to a command.
Got a very old version of Tcl where {*} doesn't work? Upgrade to 8.5 or 8.6! Or at least use this:
eval {exec $g_config(ZIP) $outdir$out_files} $files
(You need the braces there in case you put a space in outdir…)
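For reference, a corrected version of the script from the question might then look like this (same paths as in the question; a sketch rather than a tested script):
#!/usr/local/bin/tclsh
set outdir /usr/test/
set out_files abc.10X
array set g_config { ZIP /usr/bin/zip }
set files [list a_1.txt a_2.txt a_3.txt a_4.txt]

# {*} expands the list so zip sees four separate file-name arguments,
# not one argument with spaces in it
exec $g_config(ZIP) $outdir$out_files {*}$files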

How to Convert Regex Pattern Match to Lowercase for URL Standardization/Tidying

I am currently trying to convert all links and files and tags on my site from UPPERCASE.ext and CamelCase.ext to lowercase.ext.
I can match the links in pages using a regular expression match for href="[^"]*" and src="[^"]*"
This seems to work fine for identifying the link and images in the HTML.
However what I need to do with this is to take the match and run a ToLowercase() function on the matches. Since I have a lot of pages that I'd like to parse through, I'm looking to make a short shell script that will run on a specified directory and pattern match the specified regexes and perform a lowercase operation on them.
Perl one-liner to rename all regular files to lowercase:
perl -le 'use File::Find; find({wanted=>sub{-f && rename($_, lc)}}, "/path/to/files");'
If you want to be more specific about what files are renamed you could change -f to a regex or something:
perl -le 'use File::Find; find({wanted=>sub{/\.(txt|htm|blah)$/i && rename($_, lc)}}, "/path/to/files");'
EDIT: Sorry, after rereading the question I see you also want to replace occurrences within the files themselves:
find /path/to/files -name "*.html" -exec perl -pi -e 's/\b(src|href)="(.+)"/$1="\L$2"/gi;' {} \;
EDIT 2: Try this one, as the find command uses + instead of \; which is more efficient, since multiple files are passed to perl at once (thanks to @ikegami from another post). It also handles both ' and " around the URL. Finally, it uses {} instead of // for substitutions, since you are substituting URLs (maybe the /s in the URL are confusing perl or your shell?). It shouldn't matter, and I tried both on my system with the same effect (both worked fine), but it's worth a shot:
find . -name "*.html" -exec perl -pi -e \
'$q=qr/"|\x39/; s{\b(src|href)=($q?.+$q?)\b}{$1=\L$2}gi;' {} +
PS: I also have a Macbook and tested these using bash shell with Perl versions 5.8.9 and 5.10.0.
With bash, you can declare a variable to only hold lower case values:
declare -l varname
read varname <<< "This Is LOWERCASE"
echo $varname # ==> this is lowercase
Or, you can convert a value to lowercase (bash version 4, I think)
x="This Is LOWERCASE"
echo ${x,,} # ==> this is lowercase
you want this?
kent$ echo "aBcDEF"|sed 's/.*/\L&/g'
abcdef
or this
kent$ echo "aBcDEF"|awk '$0=tolower($0)'
abcdef
with your own regex:
kent$ echo 'FOO src="htTP://wWw.GOOGLE.CoM" BAR BlahBlah'|sed -r 's/src="[^"]*"/\L&/g'
FOO src="http://www.google.com" BAR BlahBlah
You could use sed with -i (in-place edit):
sed -i'' -re's/(href|src)="[^"]*"/\L&/g' /path/to/files/*

how to pass command line parameter containing '<' to 'exec'

$ date > '< abcd'
$ cat '< abcd'
<something>
$ tclsh8.5
% exec cat {< abcd}
couldn't read file " abcd": no such file or directory
Whoops. This is due to the specification of exec:
"If an arg (or pair of args) has one of the forms described below then it is used by exec to control the flow of input and output among the subprocess(es). Such arguments will not be passed to the subprocess(es). In forms such as "< fileName", fileName may either be in a separate argument from "<" or in the same argument with no intervening space."
Is there a way to work around this?
Does the value have to be passed as an argument? If not, you can use something like this:
set strToPass "< foo"
exec someProgram << $strToPass
For filenames, you can (almost always) pass the fully qualified name instead. The fully qualified name can be obtained with file normalize:
exec someProgram [file normalize "< foo"] ;# Odd filename!
But if you need to pass in an argument where < (or >) is the first character, you're stuck. The exec command always consumes such arguments as redirections; unlike with the Unix shell, you can't just use quoting to work around it.
But you can use a helper program. Thus, on Unix you can do this:
exec /bin/sh -c "exec someProgram \"$strToPass\""
(The subprogram just replaces itself with what you want to run passing in the argument you really wanted. You might need to use string map or regsub to put backslashes in front of problematic metacharacters.)
On Windows, you have to write a batch file and run that, which has a lot of caveats and nasty side issues, especially for GUI applications.
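Back on the Unix side, a minimal sketch of the escaping step mentioned above (the set of metacharacters handled here is an assumption, and someProgram is the same placeholder as before):
set strToPass {< foo "bar" $HOME}
# backslash-escape the characters that sh treats specially inside double quotes
set escaped [string map {\\ \\\\ \" \\\" $ \\$ ` \\`} $strToPass]
exec /bin/sh -c "exec someProgram \"$escaped\""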
One simple solution: ensure the word does not begin with the redirection character:
exec cat "./< abcd"
One slightly more complex:
exec sh -c {cat '< abcd'}
# also
set f {< abcd}
exec sh -c "cat '$f'"
This page on the Tcl Wiki talks about the issue a bit.
Have you tried this?
% exec {cat < abcd}
Try:
set myfile "< abcd"
exec cat $myfile