How to access a list from proc in global space - tcl

I need LIST outside of the proc for further processing, but puts $LIST shows the error message "no such variable".
I have also tried upvar #0 LIST LIST instead of global, with the same result.
I suspect the troublemaker is calling the proc with "list ...". If I omit "list" in the call, the global command does what it should, but of course the code as a whole no longer works properly.
proc receiver {chan} {
global LIST
set data [gets $chan]
set LIST [split $data ,]
}
puts $LIST
set chan [open com5 r]
fconfigure $chan -mode "9600,n,8,1" -blocking 1 -buffering none -translation binary
fileevent $chan readable [list receiver $chan]
How can I get access to LIST in the global scope, outside of the proc?

The problem is partially that the variable hasn't been written at all yet when the puts command is called, and partially that you are not actually used to working asynchronously.
You need to wait for something to arrive before you can print the variable out. The vwait command is ideal for this (as it runs the Tcl event loop while waiting). We can tell it to wait for the (global) LIST variable to be written to: when it has been, we can safely read it.
proc receiver {chan} {
global LIST
set data [gets $chan]
set LIST [split $data ","]
}
set chan [open com5 r]
fconfigure $chan -mode "9600,n,8,1" -blocking 1 -buffering none -translation binary
fileevent $chan readable [list receiver $chan]
vwait LIST
puts $LIST

I think you must declare LIST as global in the root namespace:
proc receiver {chan} {
global LIST
set data [gets $chan]
set LIST [split $data ,]
}
global LIST
puts $LIST
set chan [open com5 r]
fconfigure $chan -mode "9600,n,8,1" -blocking 1 -buffering none -translation binary
fileevent $chan readable [list receiver $chan]

It's almost ok, but the variables incoming over the serial port are updated every second.
If that is the requirement, then:
(1) Print the value read from the channel directly in the callback proc receiver.
(2) Enter the event loop just once, without binding to a global variable such as LIST.
proc receiver {chan} {
set data [gets $chan]
puts [split $data ","]; # (1) print the return value
}
set chan [open com5 r]
fconfigure $chan -mode "9600,n,8,1" -blocking 1 -buffering none -translation binary
fileevent $chan readable [list receiver $chan]
vwait forever
puts "Quitted from event loop ..."
This will enter into an event loop that is bound to an undefined variable forever, not set from within your script. So it will not quit unless you stop the executable (e.g., tclsh) or unless you do not provide for an explicit ending condition, e.g.:
proc receiver {chan} {
global counter
set data [gets $chan]
puts [split $data ","]; # (1) print the return value
if {[incr counter] == 5} {
global forever
set forever 1; # (2) Exit the event loop after receiver has been called 5 times
}
}
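If, instead of counting lines, you just want the script to stop after a fixed amount of time, you can schedule the ending condition with after. This is only a sketch of that idea; the 10000 ms figure is an arbitrary example:
# End the event loop after 10 seconds, whatever has been received by then.
after 10000 [list set forever 1]
vwait forever
puts "Quit from event loop ..."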

Related

Check abnormal connection drop before writing

I am new to Tcl scripting and am writing production code to open a socket to our server, write a command, and then read its output. Below is my code:
set chan [socket 123.345.45.33 23]
fconfigure $chan -buffering line
foreach item [dict keys $command] {
set cmd [dict get $command $item]
set res [Data_write $chan "get data $cmd"]
}
where the Data_write procedure is defined as follows:
proc Data_write { channel data } {
if {[eof $channel]} {
close $channel
ST_puts "Data_write: Error while writing to channel"
return -1
} else {
puts $channel $data
return 0
}
}
I am not sure how we can achieve the validations below:
set chan [socket 123.345.45.33 23] - the socket connection opened successfully
fconfigure on the channel succeeded
How do I know, before any write, that an abnormal connection drop has happened on the channel?
set chan [socket 123.345.45.33 23] - the socket connection opened successfully
fconfigure on the channel succeeded
These are simple enough: if there's a failure, you get a Tcl error, which is a sort of exception. You can use catch or try to trap the error if you want:
try {
set chan [socket 123.345.45.33 23]
fconfigure $chan -buffering line
} on error msg {
puts "a serious problem happened: $msg"
# Maybe also exit here...
}
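If you want to react differently to a particular kind of failure, try can also match on the error code rather than trapping every error the same way. A sketch, assuming the connection failure surfaces with the usual POSIX error code:
try {
    set chan [socket 123.345.45.33 23]
    fconfigure $chan -buffering line
} trap {POSIX ECONNREFUSED} {msg} {
    puts "nobody is listening on that address/port: $msg"
    exit 1
} on error {msg} {
    puts "a serious problem happened: $msg"
    exit 1
}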
How do I know, before any write, that an abnormal connection drop has happened on the channel?
The bad news is that you can't know this. The OS itself won't really know until you do one of two things: write to the channel, or read from the channel. (There are sometimes hints available, such as fileevent events firing, but they're not certain at all.) Instead, you need to trap the error when you actually do the write. See above for the general pattern.
Remember: Tcl operations throw errors when they fail, and EOF is not an error when reading, but is an error when writing.
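Applied to the Data_write proc from the question, that means trapping the failure of the actual puts (plus a flush, so the data is pushed out immediately) instead of testing eof beforehand. A rough sketch, keeping the question's -1/0 return convention and its ST_puts helper:
proc Data_write { channel data } {
    if {[catch {
        puts $channel $data
        flush $channel
    } err]} {
        # The write failed, e.g. because the connection was dropped.
        ST_puts "Data_write: error while writing to channel: $err"
        close $channel
        return -1
    }
    return 0
}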
Use socket -async and the readable and writable fileevents to make the whole connection process event oriented.
In the writable event you can check the status of the connection using fconfigure $channel -error. If something went wrong during the connection attempt, the socket is made writable and the error condition is reported via the -error option. If this is empty, you can configure the readable event and start processing data from the socket. In any readable event handler you should check for eof after reading, and disable the readable event handler or close the socket once eof is seen, as a socket in the eof state becomes constantly readable.
This roughly works out to be:
proc OnWritable {chan} {
set err [fconfigure $chan -error]
if {$err ne "" } {
puts stderr "ERROR: $err"
exit 1
}
fconfigure $chan -blocking 0 -buffering none -encoding binary -translation lf
fileevent $chan writable {}
fileevent $chan readable [list OnReadable $chan]
}
proc OnReadable {chan} {
set data [read $chan]
puts "[string length $data]: $data"
if {[eof $chan]} {
fileevent $chan readable {}
puts "closed $chan"
set ::forever "closed"
}
}
set sock [socket -async $address $port]
fileevent $sock writable [list OnWritable $sock]
vwait ::forever
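One caveat with socket -async: the writable event only fires once the connect attempt completes (successfully or with an error), so if the host never answers at all, nothing fires for a long time. A hedged sketch of a connect timeout is to schedule an after script before entering the event loop and cancel it with after cancel $::connect_timeout at the top of OnWritable once the connection has completed (the ::connect_timeout name and the 5000 ms value are just examples):
# Give up if the connection has not completed within 5 seconds.
set ::connect_timeout [after 5000 {
    puts stderr "ERROR: connect timed out"
    exit 1
}]
set sock [socket -async $address $port]
fileevent $sock writable [list OnWritable $sock]
vwait ::forever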

How to save incoming variables from serial port into a list

I want to save 7 variables incoming over a serial port. The transmission starts with an empty line, followed by 7 lines, each consisting of a single variable. There are no blanks, but a carriage return at every line end. A variable can also consist of blanks only. This is carried out repeatedly.
If the empty line would cause a problem, it could be omitted in my external device.
#!/usr/bin/env wish
console show
set Term(Port) com5
set Term(Mode) "9600,n,8,1"
set result [list]
set data {}
proc receiver {chan} {
set data [gets $chan]
concat {*}[split $data \n]
set ::result [split $data "\n"]
#puts $data
#puts $::result
#foreach Element $::result {
#puts $Element}
#puts "Element 0 [lindex $::result 0]"
#puts "Element 1 [lindex $::result 1]"
return
}
set chan [open $Term(Port) r+]
fconfigure $chan -mode $Term(Mode) -translation binary -buffering none -blocking 0
fileevent $chan readable [list receiver $chan]
puts $data shows the following:
START
ChME3
562264
Lok3
Lok4
Lok6
All the 7 variables are visible, but with empty lines in between. The empty line between "Lok4" and "Lok6" seems to be ok, since this is a variable consisting of blanks.
I tried to create a list with set ::result [split $data "\n"]. But that isn't working properly. With foreach Element $::result {puts $Element} the console shows the 7 variables:
START
ChME3
562264
Lok3
Lok4
.
Lok6
I have inserted the dot between Lok4 and Lok6 manually here in the blockquote, just for display purposes. In reality it's a variable consisting of only blanks.
Although it looks like a list, if I try
puts "Element 0 [lindex $::result 0]"
puts "Element 1 [lindex $::result 1]"
it shows
Element 0 START
Element 1
Element 0 ChME3
Element 1
Element 0 562264
and so on.
Element 1 remains empty and Element 0 is assigned each variable in turn.
So it is clearly not a list. But I wonder why foreach Element $::result {puts $Element} seems to work. What do I have to change to get a real list that I can actually retrieve? Or do I have to create a new list of my own?
The result is retrieved using gets and turned into a list using split in this one step:
[split [gets $chan] {}]
To stash this list away, assign the list value to a variable that is scoped beyond the surrounding proc, e.g., a global or namespace variable:
set ::result [split [gets $chan] {}]
In context:
proc receiver {chan} {
set data [gets $chan]
set ::result [concat {*}[split $data \n]]
# set ::result [split [gets $chan] {}]
# puts $::result; # debug print-out
return
}
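If the goal is to end up with one list holding the 7 values of a complete transmission, rather than overwriting ::result line by line, a sketch that accumulates lines with lappend could look like the following. It assumes the empty line really does announce a new block, as described in the question, and uses a made-up ::collect variable as the working buffer:
proc receiver {chan} {
    if {[gets $chan data] < 0} {
        # No complete line available yet (or eof); nothing to do.
        return
    }
    set data [string trim $data \r]; # drop a stray carriage return, if any
    if {$data eq ""} {
        # The empty line announces a new transmission: start a fresh buffer.
        set ::collect [list]
        return
    }
    lappend ::collect $data
    if {[llength $::collect] == 7} {
        set ::result $::collect; # one complete block of 7 values
    }
}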
GUI integration
I have already created such a GUI where I want to put these variables
into labels
Connect your label widget to the global variable ::result, so the label is updated whenever the variable is changed in proc receiver.
label .l -textvar ::result
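If each of the 7 values should end up in its own label rather than one label showing the whole list, one possible sketch is to fan ::result out into an array with a write trace; the widget names .l0 ... .l6 and the ::value array are made up for illustration:
for {set i 0} {$i < 7} {incr i} {
    label .l$i -textvariable ::value($i)
    pack .l$i
}
trace add variable ::result write {apply {{name1 name2 op} {
    # Copy each element of the freshly written list into its own array slot.
    set i 0
    foreach v $::result {
        set ::value($i) $v
        incr i
    }
}}}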

In a Tcl script, how can I use puts to write a string to the console and to a file at the same time?

# Prints the string in a file
puts $chan "$timestamp - Running test: $test"
# Prints the string on the console
puts "$timestamp - Running test: $test"
Is there a way I can send the output of puts to the screen and to a log file at the same time? Currently I have both the above two lines one after the other in my script to achieve this.
Or is there any other solution in Tcl?
Use the following proc instead of puts:
proc multiputs {args} {
if { [llength $args] == 0 } {
error "Usage: multiputs ?channel ...? string"
} elseif { [llength $args] == 1 } {
set channels stdout
} else {
set channels [lrange $args 0 end-1]
}
set str [lindex $args end]
foreach ch $channels {
puts $ch $str
}
}
Examples:
# print on stdout only
multiputs "1"
# print on stderr only
multiputs stderr "2"
set brieflog [open brief.log w]
set fulllog [open detailed.log w]
# print on stdout and in the log files
multiputs stdout $brieflog $fulllog "3"
This isn't something I've used extensively, but it seems to work (Tcl 8.6+ only):
You need the channel transform tcl::transform::observe package:
package require tcl::transform::observe
Open a log file for writing and set buffering to none:
set f [open log.txt w]
chan configure $f -buffering none
Register stdout as a receiver:
set c [::tcl::transform::observe $f stdout {}]
Anything written to the channel $c will now go to both the log file and stdout.
puts $c foobar
Note that it would seem to make more sense to have the channel transformation on top of stdout, with the channel to the log file as receiver, but I haven't been able to make that work.
Documentation: chan, open, package, puts, set, tcl::transform::observe (package)

fileevent and after in same event loop

To parse a log file, I want to do something like this:
(1) tail the file
(2) after some time, write parsed data and do other things
Here is my (sample) script
#!/usr/bin/tclsh
proc readfile {fd} {
while {[gets $fd line] >= 0} {
puts $line
}
}
proc writefile {} {
puts xxx
flush stdout
}
if {$::argc > 0} {
set fd [open [list | tail --follow=name --retry [lindex $argv 0] 2>@1]]
} else {
set fd stdin
}
after 3000 writefile
fileevent $fd readable [list readfile $fd]
vwait done
close $fd
Tailing works fine but the script for after isn't triggered.
Any idea what I'm doing wrong?
In the readfile proc, you are using a while loop, which causes the proc to get stuck there, and that is why the after script is not triggered.
#!/usr/bin/tclsh
proc readfile {fd} {
global done
puts "READ FILE CALLED..."
gets $fd line; # Removed 'while' loop here
puts "->>>>$line<<<<<<<<<"
### Your condition to exit the event loop ###
### set done 1; # set 'done' once that condition is met ###
### so that the vwait below will return ###
}
proc writefile {} {
puts "WRITE FILE CALLED"
puts xxx
flush stdout
}
if {$::argc > 0} {
set fd [open [list | tail --follow=name --retry [lindex $argv 0] 2>@1]]
} else {
set fd stdin
}
after 3000 writefile
fileevent $fd readable [list readfile $fd]
vwait done
close $fd
Output :
dinesh#dinesh-VirtualBox:~/pgms/tcl$ ./ubi.tcl
WRITE FILE CALLED
xxx
ubi
READ FILE CALLED...
->>>>ubi<<<<<<<<<
cool
READ FILE CALLED...
->>>>cool<<<<<<<<<
working
READ FILE CALLED...
->>>>working <<<<<<<<<
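A slightly different fix keeps the while loop but puts the pipeline into non-blocking mode; gets then returns -1 as soon as no complete line is available, the loop drains whatever has arrived, and the callback returns to the event loop so the after script still fires. A sketch under that assumption:
fconfigure $fd -blocking 0
proc readfile {fd} {
    # Drain every complete line that is currently buffered, then return
    # to the event loop so timers (like the after 3000) can run.
    while {[gets $fd line] >= 0} {
        puts "->>>>$line<<<<<<<<<"
    }
    if {[eof $fd]} {
        set ::done 1; # ends the vwait
    }
}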

Regarding named pipes behaviour in Tcl

I have a question regarding named pipes in Tcl.
First I created the pipe with mkfifo:
mkfifo foo
Then execute the following tcl script:
set fifo [open "foo" r]
fconfigure $fifo -blocking 1
proc read_fifo {} {
global fifo
puts "calling read_fifo"
gets $fifo x
puts "x is $x"
}
puts "before file event"
fileevent $fifo readable read_fifo
puts "after file event"
When I run the Tcl script, it waits for an event without outputting anything.
Then, when I write to the fifo:
echo "hello" > foo
Now the Tcl script prints out:
before file event
after file event
Why is the 'read_fifo' function call not getting triggered here?
Could anyone help me understand this behaviour?
fileevent relies on the event loop, which you don't enter.
fileevent just tells Tcl to call read_fifo when the channel becomes readable.
If you want blocking IO, then just call gets. This blocks until an entire line has been read.
set fifo [open "foo" r]
fconfigure $fifo -blocking 1
gets $fifo x
puts "x is $x"
If you do it event-driven, you need fileevent and non-blocking IO, and you have to enter the event loop (e.g. with vwait forever).
set fifo [open "foo" r]
fconfigure $fifo -blocking 0
proc read_fifo {fifo} {
    puts "calling read_fifo"
    if {[gets $fifo x] < 0} {
        if {[eof $fifo]} {
            # Do some cleanup here.
            close $fifo
        }
        # No complete line was read, so there is nothing to print.
        return
    }
    puts "x is $x"
}
fileevent $fifo readable [list read_fifo $fifo]
vwait forever; #enter the eventloop
Don't mix event-driven with blocking IO. This does not really work.
Note that you don't have to call vwait in Tk; doing so would reenter the event loop, which is considered bad practice.