How to pass a Tcl script a string and interpret it properly

I'm trying to modify a Tcl script that pushes bitfiles onto FPGAs using Xilinx's xsct tool. Here's what it looks like:
connect hw_server TCP:127.0.0.1:3122
targets -set -filter {jtag_cable_name =~ "Digilent JTAG-HS2 21xxx" && name =~ "xc7*"}
fpga [lindex $argv 0]
after 100
targets -set -filter {jtag_cable_name =~ "Digilent JTAG-HS2 21xxx" && name =~ "Micro*"}
loadhw system.hdf
stop
dow [lindex $argv 1]
con -block
Now that I have multiple FPGAs, I'd like to make the jtag_cable_name an argument. I've tried this to no avail:
connect hw_server TCP:127.0.0.1:3121
targets -set -filter {jtag_cable_name =~ [eval [lindex[$argv 0]] && name =~ "xc7*"}
fpga [lindex $argv 1]
after 100
targets -set -filter {jtag_cable_name =~ [eval [lindex[$argv 0]] && name =~ "Micro*"}
loadhw system.hdf
stop
dow [lindex $argv 2]
con -block
The call to the .tcl script looks like:
load_fpga.tcl "Digilent JTAG-HS2 21xxx" my_bitfile.bit my_elf.elf
How can I correctly pass the string and keep it in quotes like the original is?

What you don't realise is that in Tcl, {Hello World} is a string.
If you are familiar with languages like Perl or Ruby, then you will be familiar with the concept of literal and interpolated strings. In Tcl, there are three syntaxes for strings:
Anything that doesn't contain whitespace (space, tab, newline) is a string. Whitespace may also be escaped, and escaped whitespace is not treated as a separator. The following are strings:
hello
hello\ world
Anything grouped by " is an interpolated string. Interpolated strings may contain variables or commands, which will be evaluated and substituted. The following strings all say "hello world":
set x hello
"hello world"
"$x world"
"[set x] world"
Anything grouped by {} is a literal string. Literal strings aren't interpolated; that is, no variables or commands are evaluated. The following are literal strings:
{hello world} ;# hello world
{$x world} ;# $x world
There are two things you can do to get what you want:
Replace {} with "". This will simply turn the literal string into an interpolated string:
"jtag_cable_name =~ \"[lindex $argv 0]\" && name =~ \"xc7*\""
Note how you need to escape the " characters inside the string.
Use the subst command. The subst command performs substitutions on a string:
[subst {jtag_cable_name =~ "[lindex $argv 0]" && name =~ "xc7*"}]
The small advantage of this is that you don't need to escape the ". Read the manual page for subst: it's highly flexible, allowing you to select which kinds of substitution you want. For example, it can evaluate only commands while leaving the $ syntax alone.
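For example, a quick illustration of those subst switches (plain Tcl, nothing xsct-specific):
set x hello
subst -novariables {[set x] and $x}   ;# => "hello and $x"   (commands only)
subst -nocommands  {[set x] and $x}   ;# => "[set x] and hello"   (variables only)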

You probably want:
set jtag_cable_name [lindex $argv 0]
# ...
targets -set -filter [format {jtag_cable_name =~ "%s" && name =~ "xc7*"} $jtag_cable_name]
You can assign all the command line args like this:
lassign $argv jtag_cable_name bit_filename elf_filename
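Putting both pieces together, a minimal sketch of the whole script might look like this (the server address, filter patterns and loadhw file are copied from the question; adjust them to your setup):
# Sketch only: assumes the same hw_server, cable filters and system.hdf as above.
lassign $argv jtag_cable_name bit_filename elf_filename
connect hw_server TCP:127.0.0.1:3121
targets -set -filter [format {jtag_cable_name =~ "%s" && name =~ "xc7*"} $jtag_cable_name]
fpga $bit_filename
after 100
targets -set -filter [format {jtag_cable_name =~ "%s" && name =~ "Micro*"} $jtag_cable_name]
loadhw system.hdf
stop
dow $elf_filename
con -block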

Related

Expect - avoid sending escape prompt sequences via ssh

The script is intended to retrieve the contents of some directory when it is getting full.
For development, 'full' was set at 15% and the directory is /var/crash.
expect "#*" {
foreach part $full {
puts "part: $part"
set dir [split $part]
puts "dir: $dir [llength $dir]"
set d [lindex $dir 0]
puts "d: $d"
send -s -- "ls -lhS $d\n"
expect "#*" { puts "for $dir :: $expect_out(buffer)"}
}
}
send "exit\r"
The output of the script is:
part: /var/crash 15%
dir: {/var/crash} 15% 2
d: /var/crash
send: sending "ls -lhS \u001b[01;31m\u001b[K/var\u001b[m\u001b[K/crash\n" to { exp7 }
expect: does "" (spawn_id exp7) match glob pattern "#*"? no
expect: does "ls -lhS \u00071;31m\u0007/var\u0007\u0007/" (spawn_id exp7) match glob pattern "#*"? no
expect: does "ls -lhS \u00071;31m\u0007/var\u0007\u0007/crash\r\n" (spawn_id exp7) match glob pattern "#*"? no
As can be seen, although $d is /var/crash, when it is sent via ssh it becomes something like \u001b[01;31m\u001b[K/var\u001b[m\u001b[K/crash.
I cannot change the remote machine definitions for the command prompt.
How to get rid of these escape sequences that are sent?
Edit: Info about $full as requested
The proc analyze_df just tries to filter out the meaningful data.
proc analyze_df {cmd txt} {
    set full [list]
    set lines [split $txt \n]
    foreach l $lines {
        if {[string match $cmd* $l]} { continue }
        set lcompact [regsub -all {\s+} $l " "]
        set data [split $lcompact]
        if {[string match 8?% [lindex $data 4]] \
                || [string match 9?% [lindex $data 4]] \
                || [string match 1??% [lindex $data 4]] \
                || [string match 5?% [lindex $data 4]] \
                || [string match 1?% [lindex $data 4]] } {
            lappend full "[lindex $data 5] [lindex $data 4]"
        }
    }
    return $full
}
Here is the missing extract showing how $full gets its value:
set command0 "df -h | grep /var"
send -- "$pass\r"
expect {
    -nocase "denied*" {puts "$host denied"; continue}
    -nocase "Authentication failed*" {puts "$host authentication failed"; continue}
    "$*" {send -s -- "$command0\n"}
    timeout {puts "$host TIMEOUT"; continue}
}
expect "$*" {puts "$host -> $expect_out(buffer)" }
set full [analyze_df $command0 $expect_out(buffer)]
Taking the suggestion received, perhaps it's grep that is adding the escape sequences, no?
You don't show how $full gets its value. But it must already have the escape codes. When printing $d those escape codes are interpreted by the terminal, so they may not be obvious. But Expect/Tcl definitely doesn't insert them. This is also confirmed by the braces around the first element when you print $dir. If this element was plain /var/crash, there would be no braces.
Your remark about the command prompt would suggest that $full may be taken from there. Maybe you cannot permanently change the remote machine's command prompt, but you should be able to change it for your session by setting the PS1 environment variable.
Another trick that may help in such situations is to do set env(TERM) dumb before spawning the ssh command. If the prompt (or other tools) correctly use the tput command to generate their escape codes, a dumb terminal will result in empty strings. This won't work if the escape codes are hard-coded for one specific TERM. But that's a bug on the remote side.
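A minimal sketch of that trick, assuming the ssh session is spawned roughly like this ($user and $host are placeholders, not names from your script):
set env(TERM) dumb   ;# advertise a terminal with no capabilities
spawn ssh $user@$host
# ... the existing password/prompt handling continues unchanged ...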
If you're absolutely stuck with that input data (and can't tell things to not mangle it with those ANSI terminal colour escape codes) then you can strip them out with:
set dir [split [regsub -all {\u001b[^a-zA-Z]*[a-zA-Z]} $part ""]]
This makes use of the fact that the escape sequences start with the escape character (encoded as \u001b) and continue to the first ASCII letter. Replacing them all with the empty string should de-fang them cleanly.
You are recommended to try things like altering the TERM environment variable before calling spawn so that you don't have to do such cleaning. That tends to be easier than attempting to "clean up" the data after the fact.
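For example, applied to a hand-rebuilt copy of the garbled element from your output (so the exact bytes here are only an approximation):
set part "\u001b\[01;31m\u001b\[K/var\u001b\[m\u001b\[K/crash 15%"
puts [regsub -all {\u001b[^a-zA-Z]*[a-zA-Z]} $part ""]   ;# prints: /var/crash 15%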

How to match a string and print the next word after that?

Let's say I have the following file and have to look for .model and print the next two words before the "(". Here are the contents of the file that I need to read:
.model Q2N2222 NPN(Is=14.34f Xti=3 Eg=1.11 Vaf=74.03 Bf=255.9 Ne=1.307
Ise=14.34f Ikf=.2847 Xtb=1.5 Br=6.092 Nc=2 Isc=0 Ikr=0 Rc=1
+ Cjc=7.306p Mjc=.3416 Vjc=.75 Fc=.5 Cje=22.01p Mje=.377 Vje=.75
+ Tr=46.91n Tf=411.1p Itf=.6 Vtf=1.7 Xtf=3 Rb=10)
* National pid=19 case=TO18
* 88-09-07 bam creation
*$
.model Q2N3904 NPN(Is=6.734f Xti=3 Eg=1.11 Vaf=74.03 Bf=416.4 Ne=1.259
.model Q2N3906 PNP(Is=1.41f Xti=3 Eg=1.11 Vaf=18.7 Bf=180.7 Ne=1.5 Ise=0
Here is the code I have written so far, but I couldn't get anything out of it. I need some help.
proc find_lib_parts {f_name} {
    set value [string first ".lib" $f_name]
    if {$value != -1} {
        #open the file
        set fid [open $f_name "r"]
        #read the fid and split it in to lines
        set infos [split [read $fid] "\n"]
        close $fid
        set res {}
        append res "MODEL FOUND:\n"
        if {[llength $line] > 2 && [lindex $line 0] eq {model}} {
            #lappend res [lindex $data 2] \n
            lappend res [split $line "("]\n
        }
        if {[llength $line] > 2 && [lindex $line 0] eq {MODEL}} {
            #lappend res [lindex $data 2] \n
            lappend res [split $line "("]\n
        }
    }
    return $res
In this case, a regular expression is by far the simplest way of doing such a search. Assuming the words are always on the same line, it's easy:
proc find_lib_parts {f_name} {
    set fid [open $f_name]
    set infos [split [read $fid] "\n"]
    close $fid
    set found {}
    foreach line $infos {
        if {[regexp {\.model\s+(\w+\s+\w+)\(} $line -> twoWords]} {
            lappend found $twoWords
        }
    }
    return $found
}
For your input data sample, that'll produce a result like this:
{Q2N2222 NPN} {Q2N3904 NPN} {Q2N3906 PNP}
If there's nothing to find, you'll get an empty list. (I assume you pass filenames correctly anyway, so I omitted that check.)
The regular expression, which should virtually always be enclosed in {braces} in Tcl, is this:
\.model\s+(\w+\s+\w+)\(
It's relatively simple. The pieces of it are:
\.model — literal “.model” (with an escape of the . because it is a RE metacharacter)
\s+ — some whitespace
( — start a capturing group (the bit we put into the twoWords variable)
\w+ — a “word”, one or more alphanumeric (or underscore) characters
\s+ — some whitespace
\w+ — a “word”, one or more alphanumeric (or underscore) characters
) — end of the capturing group
\( — literal “(”, escaped
The regexp command matches this, returning whether or not it matched (effectively a boolean, since we're not using the -all option here), and assigns the groups to the variables named afterwards: -> gets the whole matched string (yes, that's a legal variable name; I like to use it for regexp variables whose contents I don't care about) and twoWords gets the interesting substring.
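A hypothetical call, assuming the sample lines above are saved in a file named models.lib (the name is purely illustrative):
set parts [find_lib_parts models.lib]
foreach p $parts {
    puts $p
}
which prints Q2N2222 NPN, Q2N3904 NPN and Q2N3906 PNP, one per line.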

Reading a file with "[" and manipulating each line in Tcl

I have a file with the below lines (file.list):
insert_buffer [get_ports { port }] BUFF1 -new_net net -new_cell cell
I'm reading the file with the below script (read.tcl):
#! /usr/local/bin/tclsh
foreach arg $argv {
    set file [open $arg r]
    set data [read $file]
    foreach line [split $data "\n"] {
        puts $line
        set name [lindex $line [expr [lsearch -all $line "-new_cell"]+1]]
        puts $name
    }
    close $file
}
While running the above script (read.tcl file.list) I get an error, since I have "[" in file.list and the script thinks it's the beginning of a Tcl command.
list element in braces followed by "]" instead of space
while executing
"lsearch -all $line "-new_cell""
("foreach" body line 5)
invoked from within
"foreach line [ split $data "\n" ] {
How can I read the file correctly and overcome the "[" symbol?
I don't really understand why you are doing what you are doing (processing one Tcl script by another), but you have to make sure that each line is a valid Tcl list before submitting it to lsearch.
lsearch -all [split $line] "-new_cell"
Only split will turn an arbitrary string (containing characters special to Tcl) into a valid Tcl list.
This is one of the few times in Tcl that you need to worry about what type of data you have. $line holds a string. Don't use list commands on strings because there's no guarantee that an arbitrary string is a well-formed list.
Do this:
set fields [split $line]
# don't use "-all" here: you want a single index, not a list of indices.
set idx [lsearch -exact $fields "-new_cell"]
if {$idx == -1} {
    # do something here if there's no -new_cell in the line
} else {
    set name [lindex $fields $idx+1]
}
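For the sample line from file.list, a quick check in tclsh would go something like this (the $idx+1 index arithmetic needs Tcl 8.5 or later):
set line {insert_buffer [get_ports { port }] BUFF1 -new_net net -new_cell cell}
set fields [split $line]
set idx [lsearch -exact $fields "-new_cell"]
puts [lindex $fields $idx+1]   ;# prints: cell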
In order to apply a list operation on the variable, it has to be a valid list. The variable $line is not a valid list.
It is better to use regexp rather than lsearch
regexp -- {-new_cell\s+(\S+)} $line match value
puts $value
Output :
cell

How to escape special characters while using lsearch?

How to escape special characters (e.g. "[]") while using lsearch?
Consider the following scenario:
>> set L { a b c [] }
>> a b c []
>> lsearch $L b
>> 1
>> lsearch $L "[]"
>> -1
I'm looking to get 3 when I run lsearch $L "[]"
When looking for fixed strings rather than patterns, it is easiest to use the -exact option to lsearch. You also need to make sure Tcl doesn't do substitution on the search string, for example by enclosing it inside curly braces. Otherwise you'll tell Tcl to look for an empty string (the result of executing an empty command string):
lsearch -exact $L {[]}
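With the list from the question, that returns the index you were after:
>> lsearch -exact $L {[]}
>> 3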

How to define a variable with argument expansion

The following command runs as expected:
lappend {*}{arr 1}
puts [lindex $arr 0]
Now I am trying to make a variable of "{*}{arr 1}" like this:
set X "{*}{arr 1}"
lappend $X
But this does not work; it seems $X is taken as one whole value, and argument expansion is not applied.
So is it a requirement that argument expansion cannot be done through a variable?
The {*} is a syntactic feature of Tcl (from Tcl 8.5 onwards) just as […], "…" or $ is. You have to write it in the script in order for it to count as argument expansion; otherwise it's just a sequence of three characters.
If you want something like
set X "{*}{arr 1}"
lappend $X
to work, you need to pass it through eval:
set X "{*}{arr 1}"
eval lappend $X
Note that this then means that X actually contains a script fragment; this can have all sorts of “interesting” consequences. Try this for size:
set X "{*}{arr 1};puts hiya"
eval lappend $X
Use of eval in modern Tcl is usually a sign that you're going about stuff the wrong way; the key use in old scripts was for doing things similar to that which we'd use {*} for now.
No, within double quotes, { and } actually lose their meaning, and so does {*}. Notice that puts "{}" and puts {} are different.
The closest I can think of to do what you're trying to do would be to use something like this:
set X {arr 1}
lappend {*}$X
So if you now execute puts [lindex $arr 0], you get 1 as output.