How to receive proc return data in fileevent - tcl

I am calling a proc through fileevent. That proc returns a line of data.
How do I receive this data?
I have written the following code to receive data from the pipe whenever data is available. I don't want to block by using a direct gets.
proc GetData1 { chan } {
    if {[gets $chan line] >= 0} {
        return $line
    }
}

proc ReadIO {chan {timeout 2000}} {
    set x 0
    after $timeout {set x 1}
    fconfigure $chan -blocking 0
    fileevent $chan readable [list GetData1 $chan]
    after cancel set x 3
    vwait x
    # Do something based on how the vwait finished...
    switch $x {
        1 { puts "Time out" }
        2 { puts "Got Data" }
        3 { puts "App Cancel" }
        default { puts "Time out2 x=$x" }
    }
    # How to return data from here which is returned from GetData1
}

ReadIO $io 5000
# ** How to get data here which is returned from GetData1 ? **

There are probably as many ways of doing this as there are Tcl programmers. Essentially, though, you shouldn't use return to pass the data back from your fileevent handler: the handler isn't called in a way that lets you get at its return value.
Here are a few possible approaches.
Disclaimer: none of these is tested, and I'm prone to typing mistakes, so treat with a little care!
1) Get your fileevent handler to write to a global variable:
proc GetData1 {chan} {
    if {[gets $chan line] >= 0} {
        append ::globalLine $line \n
    }
}
.
.
.
ReadIO $io 5000
# ** The input line is in globalLine in the global namespace **
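If you also want the vwait in ReadIO to finish as soon as a line arrives, the handler can set the flag variable too. A minimal untested sketch, assuming x is the global variable the vwait watches and reusing the question's codes (2 = data, 3 = closed):

```tcl
proc GetData1 {chan} {
    if {[gets $chan line] >= 0} {
        append ::globalLine $line \n
        set ::x 2    ;# wake the vwait in ReadIO: "Got Data"
    } elseif {[eof $chan]} {
        set ::x 3    ;# channel closed
    }
}
```

ReadIO can then return $::globalLine after its switch.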
2) Pass the name of a global variable to your fileevent handler, and save the data there
proc GetData2 {chan varByName} {
    if {[gets $chan line] >= 0} {
        upvar #0 $varByName var
        append var $line \n
    }
}

fileevent $chan readable [list GetData2 $chan inputFromChan]
.
.
.
ReadIO $chan 5000
# ** The input line is in ::inputFromChan **
A good choice for the variable here might be an array indexed by $chan, e.g. fileevent $chan readable [list GetData2 $chan input($chan)]
3) Define some kind of class to look after your channels that stashes the data away internally and has a member function to return it
oo::class create fileeventHandler {
    variable m_buffer m_chan

    constructor {chan} {
        set m_chan $chan
        fileevent $m_chan readable [list [self object] handle]
        set m_buffer {}
    }

    method data {} {
        return $m_buffer
    }

    method handle {} {
        if {[gets $m_chan line] >= 0} {
            append m_buffer $line \n
        }
    }
}
.
.
.
set handlers($chan) [fileeventHandler new $chan]; # Save object address for later use
ReadIO $io 5000
# Access the input via [$handlers($chan) data]


Reading Hebrew in JSON via native-messaging API and writing to file but Hebrew looks wrong in the file?

I'm using this code to read messages received from a browser extension via the native-messaging API. Fortunately, it works and reads all the data. Unfortunately, I'm doing something wrong or failing to do something right (if there is a difference there), and when the data is written to the file, it appears that the multi-byte characters (Hebrew, at the moment) are being written as individual bytes.
Would you please tell me what needs to be done before writing [dict get $msg doc] to the file? $msg is a big string of HTML. Am I configuring stdin incorrectly, or does the data being written to the file need to be encoded?
Instead of seeing Hebrew, the file shows items like this: ×”Ö¸×ָֽרֶץ×.
According to the MDN documentation, the incoming message is:
On the application side, you use standard input to receive messages
and standard output to send them.
Each message is serialized using JSON, UTF-8 encoded and is preceded
with an unsigned 32-bit value containing the message length in native
byte order.
Thank you.
#!/usr/bin/env tclsh
package require json

proc coread {reqBytes} {
    set ::extMsg {}
    set remBytes $reqBytes
    while {![chan eof stdin]} {
        yield
        append ::extMsg [set data [read stdin $remBytes]]
        set remBytes [expr {$remBytes - [string length $data]}]
        if {$remBytes == 0} {
            return $reqBytes
        }
    }
    throw {COREAD EOF} "Unexpected EOF"
}

proc ExtRead {} {
    chan event stdin readable coro_stdin
    while {1} {
        set ::extMsg {}
        chan puts $::fp_log "starting coread 4"
        if { [coread 4] != 4 || [binary scan $::extMsg iu len] != 1 } {
            exit
        }
        if { [coread $len] != $len } {
            exit
        }
        set msg [::json::json2dict $::extMsg]
        set book [dict get $msg book]
        set path "../${book}.html"
        set fp [open $path w]
        chan puts $fp [dict get $msg doc]
        close $fp
    }
}

proc CloseConnect {} {
    set ::forever 0
}

proc Listen {} {
    # Listening on stdin
    set ::forever 1
    chan configure stdin -buffering full -blocking 0 -encoding iso8859-1 -translation crlf
    coroutine coro_stdin ExtRead
    vwait forever
}

set extMsg {}
Listen
I don't really want to answer my own question, but this is too much for a comment; I'll be happy to delete it if someone provides a better answer.
As @Abra pointed out, I had the encoding configured wrong on stdin. I thought all that then needed to be done was to use [string bytelength $data] instead of [string length $data], but that only worked for a few iterations. I do not understand why [coread 4] ended up reading more than 4 bytes, such that $remBytes went negative; it resulted in misreading the data, and the coroutine then waited for far more data than there was. So I moved the configuring of stdin into proc ExtRead and switched it between binary and utf-8 for each part of the read: binary for the 4 bytes holding the length of the message, and utf-8 for the message itself. I also had to use [string length $data] for the binary read of 4 bytes and [string bytelength $data] for reading the message. With that, $remBytes always equals zero upon completion of reading both the length and the message, so if {$remBytes == 0} works rather than needing the earlier change mentioned in my comment of if {$remBytes <= 0}.
It works and the Hebrew now appears correctly. But it bothers me that I don't understand why [string bytelength $data] would not be 4 when reading 4 bytes. I wrote the values out to a log/debugging file at various steps to verify this.
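The root of the confusion is that [string length] counts characters while the wire protocol counts bytes; once the channel's encoding decodes multi-byte UTF-8 sequences into single characters, the two counts diverge. A small illustration (using encoding convertto to get the byte count, since string bytelength reports Tcl's internal representation and is deprecated):

```tcl
set s "שלום"   ;# four Hebrew characters
puts [string length $s]                             ;# 4 characters
puts [string length [encoding convertto utf-8 $s]]  ;# 8 bytes in UTF-8
```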
I should add here that I had a similar question in the past, which @Schelte Bron answered, but I don't think I followed his advice correctly, or I unknowingly reverted to an old version of my code. He recommended:
The solution is to configure stdin to binary and then apply the utf-8
decoding later:
I don't know why I didn't start with that, but he provided the answer three months ago.
#!/usr/bin/env tclsh
package require json

proc coread {reqBytes readType} {
    set ::extMsg {}
    set remBytes $reqBytes
    while {![chan eof stdin]} {
        yield
        append ::extMsg [set data [read stdin $remBytes]]
        if { $readType } {
            set remBytes [expr {$remBytes - [string bytelength $data]}]
        } else {
            set remBytes [expr {$remBytes - [string length $data]}]
        }
        if {$remBytes == 0} {
            return $reqBytes
        }
    }
    throw {COREAD EOF} "Unexpected EOF"
}

proc ExtRead {} {
    chan event stdin readable coro_stdin
    while {1} {
        chan configure stdin -buffering full -blocking 0 -encoding binary
        set ::extMsg {}
        if { [coread 4 0] != 4 || [binary scan $::extMsg iu len] != 1 } {
            exit
        }
        chan configure stdin -buffering full -blocking 0 -encoding utf-8 -translation crlf
        if { [coread $len 1] != $len } {
            exit
        }
        set msg [::json::json2dict $::extMsg]
        set book [dict get $msg book]
        set path "../${book}.html"
        set fp [open $path w]
        chan puts $fp [dict get $msg doc]
        close $fp
    }
}

proc CloseConnect {} {
    set ::forever 0
}

proc Listen {} {
    # Listening on stdin
    set ::forever 1
    coroutine coro_stdin ExtRead
    vwait forever
}

set extMsg {}
Listen
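The decode step that the quoted advice leads to can be sketched as a small helper (decodeNativeMessage is a hypothetical name; untested against a real browser). With stdin left in binary mode throughout, [read] returns raw bytes, [string length] counts exactly what the length prefix counts, and the UTF-8 decoding happens once, after the whole payload has arrived:

```tcl
# Decode one complete framed message read from a binary channel:
# a 32-bit length prefix followed by that many bytes of UTF-8 JSON.
# (iu = little-endian unsigned, as in the code above; the spec's
# "native byte order" would strictly be the nu format code.)
proc decodeNativeMessage {raw} {
    binary scan $raw iu len
    set payload [string range $raw 4 [expr {3 + $len}]]
    return [encoding convertfrom utf-8 $payload]
}
```

Sending is symmetric: encoding convertto utf-8 first, then prepend [binary format i [string length $bytes]].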

TCL: Redirect output of proc to a file

I need to redirect the output of a proc to a file. The "redirect" command isn't working with the Tcl interpreter that my tool uses, so I'm trying "exec echo [proc_name]" instead, which was suggested in one of the threads on this site. But this doesn't work; the file ${dump_dir}/signal_list.txt comes out empty.
proc dump_signals {{dump_dir "."}} {
    upvar build build
    puts "Dumping signals.."
    set my_file [open ${dump_dir}/signal_list.txt w]
    exec echo [get_signals] > $my_file
}
get_signals is a proc which in turn calls another proc:
proc puts_name_and_value {name} {
    set value [value %h $name]
    puts "$name $value"
}

proc get_signals {} {
    # Get list of signals
    set signal_list {test.myreg test.myreg2}
    foreach signal $signal_list {
        puts_name_and_value $signal
    }
}
My workaround for now is this: writing to the file in the bottom-level proc by upvar'ing the file variable. This works but isn't the cleanest way of doing it. Please let me know how to cleanly redirect the output of a proc to a file.
proc puts_name_and_value {name} {
    upvar my_file my_file
    set value [value %h $name]
    puts $my_file "$name $value"
    #puts "$name $value"
}

proc get_signals {} {
    upvar my_file my_file
    # Get list of signals
    set signal_list {test.myreg test.myreg2}
    foreach signal $signal_list {
        puts_name_and_value $signal
    }
}

proc dump_signals {{dump_dir "."}} {
    upvar build build
    puts "Dumping signals.."
    set my_file [open ${dump_dir}/signal_list.txt w]
    get_signals
}
Your get_signals proc writes to standard output and doesn't return anything. So of course trying to use a shell to redirect its output to a file isn't going to result in anything in the file.
One solution is to use tcl's transchan API with chan push and chan pop to temporarily override what stdout goes to. Example:
#!/usr/bin/env tclsh

proc redirector {fd args} {
    switch -- [lindex $args 0] {
        initialize {
            # Sanity check
            if {[lindex $args 2] ne "write"} {
                error "Can only redirect an output channel"
            }
            return {initialize write finalize}
        }
        write {
            puts -nonewline $fd [lindex $args 2]
        }
        finalize {
            close $fd
        }
    }
}
proc writer_demo {} {
    puts "line one"
    puts "line two"
}

proc main {} {
    chan push stdout [list redirector [open output.txt w]]
    writer_demo
    chan pop stdout
}
main
Running this script will produce a file output.txt with the contents of writer_demo's puts calls instead of having them go to standard output like normal.
You could also just pass the file handle to write to as an argument to your functions (instead of using upvar everywhere):
proc puts_name_and_value {out name} {
    set value [value %h $name]
    puts $out "$name $value"
}

proc get_signals {{out stdout}} {
    # Get list of signals
    set signal_list {test.myreg test.myreg2}
    foreach signal $signal_list {
        puts_name_and_value $out $signal
    }
}

proc dump_signals {{dump_dir "."}} {
    upvar build build
    puts "Dumping signals.."
    set my_file [open ${dump_dir}/signal_list.txt w]
    get_signals $my_file
    close $my_file
}

TCL / Write a tabulated list to a file

I have a variable, let's say xx, holding a list whose index 0 and index 1 elements are lists of values. I want to modify a script (not mine) which defines a function, pptable, i.e.,
proc pptable {l1 l2} {
    foreach i1 $l1 i2 $l2 {
        puts " [format %6.2f $i1]\t[format %6.2f $i2]"
    }
}
so that it displays the output in two columns using
pptable [lindex $xx 1] [lindex $xx 0]
However, I want to write the output directly to a file. Could you tell me how I can send the data to a file instead to the display?
One of the neatest ways of doing this is to stack on a channel transform that redirects stdout to where you want it to go. This works even if the write to stdout happens from C code or in a different thread as it plugs into the channel machinery. The code is a little bit long (and requires Tcl 8.6) but is reliable and actually mostly very simple.
package require Tcl 8.6; # *REQUIRED* for [chan push] and [chan pop]

proc RedirectorCallback {targetHandle op args} {
    # The switch/lassign pattern is the simplest way of doing this in one procedure
    switch $op {
        initialize {
            lassign $args handle mode
            # Sanity check
            if {$mode ne "write"} {
                close $targetHandle
                error "this is just a write transform"
            }
            # List of supported subcommands
            return {initialize finalize write}
        }
        finalize {
            lassign $args handle
            # All we need to do here is close the target file handle
            close $targetHandle
        }
        write {
            lassign $args handle buffer
            # Write the data to the *real* destination; this does the redirect
            puts -nonewline $targetHandle $buffer
            # Stop the data going to the *true* stdout by returning the empty string
            return ""
            # If we returned the data instead, this would do a 'tee'
        }
        default {
            error "unsupported subcommand"
        }
    }
}

# Here's a wrapper to make the transform easy to use
proc redirectStdout {file script} {
    # Stack the transform onto stdout with the file handle to write to
    # (which is going to be $targetHandle in [RedirectorCallback])
    chan push stdout [list RedirectorCallback [open $file "wb"]]
    # Run the script and *definitely* pop the transform after it finishes
    try {
        uplevel 1 $script
    } finally {
        chan pop stdout
    }
}
How would we actually use this? It's really very easy in practice:
# Exactly the code you started with
proc pptable {l1 l2} {
    foreach i1 $l1 i2 $l2 {
        puts " [format %6.2f $i1]\t[format %6.2f $i2]"
    }
}
# Demonstrate that stdout is working as normal
puts "before"
# Our wrapped call that we're capturing the output from; pick your own filename!
redirectStdout "foo.txt" {
pptable {1.2 1.3 1.4} {6.9 6.8 6.7}
}
# Demonstrate that stdout is working as normal again
puts "after"
When I run that code, I get this:
bash$ tclsh8.6 stdout-redirect-example.tcl
before
after
bash$ cat foo.txt
   1.20	  6.90
   1.30	  6.80
   1.40	  6.70
I believe that's precisely what you are looking for.
You can do this with less code if you use Tcllib and TclOO to help deal with the machinery:
package require Tcl 8.6
package require tcl::transform::core
oo::class create WriteRedirector {
    superclass tcl::transform::core
    variable targetHandle

    constructor {targetFile} {
        set targetHandle [open $targetFile "wb"]
    }
    destructor {
        close $targetHandle
    }

    method write {handle buffer} {
        puts -nonewline $targetHandle $buffer
        return ""
    }

    # This is the wrapper, as a class method
    self method redirectDuring {channel targetFile script} {
        chan push $channel [my new $targetFile]
        try {
            uplevel 1 $script
        } finally {
            chan pop $channel
        }
    }
}
Usage example:
proc pptable {l1 l2} {
    foreach i1 $l1 i2 $l2 {
        puts " [format %6.2f $i1]\t[format %6.2f $i2]"
    }
}
puts "before"
WriteRedirector redirectDuring stdout "foo.txt" {
pptable {1.2 1.3 1.4 1.5} {6.9 6.8 6.7 6.6}
}
puts "after"
I assume you don't want to, or can't, modify the existing script and the pptable proc, correct?
If so, there are different options, depending on your exact situation:
Redirect stdout: tclsh yourscript.tcl > your.out
Redefine puts (for a clearly defined scope):
rename ::puts ::puts.orig
proc puts args {
    # open in append mode so successive calls don't truncate the file
    set fh [open your.out a]
    ::puts.orig $fh [lindex $args end]
    close $fh
}
# run pptable, source the script
This theme has been covered before, e.g., tcl stop all output going to stdout channel?
Rewire Tcl's stdout channel (not necessarily recommended; this works because Tcl attaches the next channel opened to the standard channel just closed):
close stdout
open your.out w
# run pptable, source the script
This has also been elaborated on before, e.g. Tracing stdout and stderr in Tcl

In a Tcl script how can I use puts to write a string to the console and to a file at the same time?

# Prints the string in a file
puts $chan "$timestamp - Running test: $test"
# Prints the string on a console
puts "$timestamp - Running test: $test"
Is there a way I can send the output of puts to the screen and to a log file at the same time? Currently I have the above two lines one after the other in my script to achieve this.
Or is there any other solution in tcl ?
Use the following proc instead of puts:
proc multiputs {args} {
    if {[llength $args] == 0} {
        error "Usage: multiputs ?channel ...? string"
    } elseif {[llength $args] == 1} {
        set channels stdout
    } else {
        set channels [lrange $args 0 end-1]
    }
    set str [lindex $args end]
    foreach ch $channels {
        puts $ch $str
    }
}
Examples:
# print on stdout only
multiputs "1"
# print on stderr only
multiputs stderr "2"
set brieflog [open brief.log w]
set fulllog [open detailed.log w]
# print on stdout and in the log files
multiputs stdout $brieflog $fulllog "3"
This isn't something I've used extensively, but it seems to work (Tcl 8.6+ only):
You need the channel transform tcl::transform::observe package:
package require tcl::transform::observe
Open a log file for writing and set buffering to none:
set f [open log.txt w]
chan configure $f -buffering none
Register stdout as a receiver:
set c [::tcl::transform::observe $f stdout {}]
Anything written to the channel $c will now go to both the log file and stdout.
puts $c foobar
Note that it would seem to make more sense to have the channel transformation on top of stdout, with the channel to the log file as receiver, but I haven't been able to make that work.
Documentation: chan, open, package, puts, set, tcl::transform::observe (package)

How to print newly updated lines in tcl

I need to print the lines which are newly added in file.
My code looks as follows:
proc dd {} {
    global line_number
    set line_number 0
    set a [open "pkg.v" r]
    #global count
    while {[gets $a line] >= 0} {
        incr line_number
        global count
        set count [.fr.lst2 size]
        puts "enter $count"
        if {[eof $a]} {
            #.fr.lst2 insert end "$line"
            # set count [.fr.lst2 size]
            close $a
        } elseif {$count > 0} {
            .fr.lst2 delete 0 end
            if {$count+1} {
                .fr.lst2 insert end "$line"
                puts "i am $count"
            }
        } else {
            .fr.lst2 insert end "$line"
            puts "i am not"
        }
    }
    puts "$count"
}
Assuming we're talking about lines written to the end of a log file on any Unix system (Linux, OSX, etc.) then it's trivially done with the help of tail:
# Make the pipeline to read from 'tail -f'; easy easy stuff!
set mypipe [open |[list tail -f $theLogfile] r]
# Make the pipe be non-blocking; usually a good idea for anything advanced
fconfigure $mypipe -blocking 0
# Handle data being available by calling a procedure which will read it
# The procedure takes two arguments, and we use [list] to build the callback
# script itself (Good Practice in Tcl coding)
fileevent $mypipe readable [list processLine $mypipe .fr.lst2]
proc processLine {pipeline widget} {
    if {[gets $pipeline line] >= 0} {
        # This is probably too simplistic for you; adapt as necessary
        $widget insert end $line
    } elseif {[eof $pipeline]} {
        # Check for EOF *after* [gets] fails!
        close $pipeline
    }
}
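If tail isn't available (on Windows, say), a pure-Tcl approximation is to remember the file offset and poll for newly appended lines via the event loop. An untested sketch (pollNewLines is a hypothetical helper, not part of the answer above):

```tcl
# Poll $logfile every $ms milliseconds, starting from its current end,
# and hand each newly appended complete line to $callback.
proc pollNewLines {logfile callback {ms 500} {pos -1}} {
    set f [open $logfile r]
    if {$pos < 0} {
        seek $f 0 end          ;# first call: skip the existing content
    } else {
        seek $f $pos
    }
    while {[gets $f line] >= 0} {
        uplevel #0 [linsert $callback end $line]
    }
    set pos [tell $f]
    close $f
    # Reschedule ourselves, carrying the offset forward
    after $ms [list pollNewLines $logfile $callback $ms $pos]
}
```

Start it once, e.g. pollNewLines $theLogfile myHandler, where myHandler takes the new line as its argument.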