Why does expect-tcl use the semicolon to trim the output?

My script grabs pieces of configuration from network devices over telnet for further modification. Some pieces of the configuration contain encoded data with semicolon characters. However, in a strange way, expect cuts the output after the first semicolon character, so not all of the information makes it into the log. For example, the configuration contains the following lines, and they should also appear in the log:
snmp-agent
snmp-agent local-engineid 000007DB7F00000100000DBD
snmp-agent community write cipher %#%#f:x6"^s,6.L~~BE~%c*0S6NH2#Y_W4I`NP6,W}VF'NN86NKSYoixJc$>;88sTj2yu2*/NTS6%#%#
snmp-agent community read cipher %#%#]$OdG*7WdV#{aSD9vx"DH+]]*_[8D+2\u%7Ozr<,W3zP+]`HBK(\=oJuKL'IT|+w*3o4]iH+%#%#
snmp-agent sys-info version v1 v2c
undo snmp-agent sys-info version v3
I tried two different ways, but the result is unchanged:
1)
expect {
    "Error: Failed to authenticate." {exit}
    ">" {send "disp current-configuration | include snmp\r"}
}
expect "$device>" {
    set results [regexp -all -inline {[^\r\n]+} $expect_out(buffer)]
    puts "Length of output buffer is : [llength $results]"
    for {set i 1} {$i<[llength $results]-1} {incr i} {
        set confline [lindex $results $i]
        puts "\$confline\[$i\] = $confline\r"
    }
}
2)
expect {
    "Error: Failed to authenticate." {exit}
    ">" {send "disp current-configuration | include snmp\r"}
}
expect "$device>" {
    set outfl [open "$SCROOT/$model-$device.out" w]
    puts $outfl $expect_out(buffer)
    flush $outfl
    close $outfl
}
And this is what ends up in the output:
snmp-agent
snmp-agent local-engineid 000007DB7F00000100000DBD
snmp-agent community write cipher %#%#f:x6"^s,6.L~~BE~%c*0S6NH2#Y_W4I`NP6,W}VF'NN86NKSYoixJc$>
Does anyone know how to solve this problem? Please help me.
UPD: I extended the match condition and got the expected result. Thanks to Colin Macleod for the hint.

You are trying to read all the data up to the next prompt by doing
expect ">" {
but the output data contains a ">" character, so expect stops when it sees that. It's just a coincidence that the next character happens to be ";".
You can probably work around this by writing your pattern to match ">" only when it is the first non-white-space character on a line, e.g.
expect "\n\s*>" {

Related

How to clearerr of stdin in Tcl after ctrl+d?

I recently asked a question about reopening stdin in C after passing EOF and now want the same behavior when using Tcl.
I can't seem to find a Tcl command doing what C clearerr would do. How can I pass ctrl+d to stdin at one time and later "reopen" stdin from the Tcl script? (Compiling an external library using C is cheating!)
I am currently using Windows and thus ctrl+z, but I assume they work similarly enough not to make a difference in this case. Here is some sample code:
set var {}; # declare var to hold the line
gets stdin var; # read a line
if {[string length $var]>0} {puts $var}; # print if read
if {[eof stdin]} { # if end-of-file reached
    puts {read from stdin was canceled. reopening just for fun}; # some debug message
    puts -nonewline "eof reached for stdin. enter something more to echo: "; flush stdout
    # clearerr() ???
    gets stdin var
    if {[string length $var]>0} {puts $var}
}
EDIT: Reading about fileevent, I believe I can come up with a solution where the user does not enter EOF at all to transition between stdin and GUI control.
How can I pass ctrl+d to stdin at one time and later "reopen" stdin from the Tcl script?
I am not sure whether this expectation makes sense from a Tcl POV. If [eof] is caught on a channel, the Tcl channel for stdin is not closed (unless done so explicitly using [close], or Tcl shuts down completely), so there is no need to reopen it. Watch:
proc isReadable { f } {
    # The channel is readable; try to read it.
    set status [catch { gets $f line } result]
    if { $status != 0 } {
        # Error on the channel
        puts "error reading $f: $result"
        set ::DONE 2
    } elseif { $result >= 0 } {
        # Successfully read the channel
        puts "got: $line"
    } elseif { [eof $f] } {
        # End of file on the channel
        puts "end of file; just continue working"
        # set ::DONE 1
    } elseif { [fblocked $f] } {
        # Read blocked. Just return
    } else {
        # Something else
        puts "can't happen"
        set ::DONE 3
    }
}
fconfigure stdin -blocking false
fileevent stdin readable [list isReadable stdin]
# Launch the event loop and wait for the file events to finish
vwait ::DONE
This is just a standard snippet from the Tcl documentation, also used in How to check if stdin is readable in TCL?. As an aside, some comments from the answers and comments to your question at How to restart stdin after Ctrl+D? apply to Tcl as well. See Brad's comment using open or seek stdin 0 end, provided that the source of stdin is seekable.
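A minimal sketch of that seek-based workaround, assuming stdin is redirected from a seekable file (e.g. tclsh script.tcl < input.txt); it will not help for an interactive terminal or a pipe:
while {[gets stdin line] >= 0} {
    puts "got: $line"
}
puts "eof before seek: [eof stdin]"
seek stdin 0 end    ;# repositioning the channel resets its end-of-file state
puts "eof after seek:  [eof stdin]"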
I believe I have found a pure-TCL way around this problem: change the EOF character to something other than Ctrl-Z, read a dummy line (to remove the Ctrl-Z from the input buffer) and then reset the EOF character back to Ctrl-Z. Wrapped up in a procedure:
proc clearEOF {} {
    fconfigure stdin -eofchar { "\x01" "" }
    gets stdin dummy
    fconfigure stdin -eofchar { "\x1a" "" }
}
The choice of \x01 is somewhat arbitrary: essentially anything that is not likely to be in the input buffer alongside the Ctrl-Z should do.
Note: This has only been tested on Windows 10 with TCL 8.6.9.
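A hypothetical usage of clearEOF, with the same Windows Ctrl-Z assumptions as the test programs below: read two blocks of lines, each terminated by Ctrl-Z <RETURN>, from the same stdin.
foreach block {1 2} {
    puts "Block $block: enter lines, then Ctrl-Z <RETURN> to finish"
    while {[gets stdin line] >= 0} {
        puts "Read: $line"
    }
    clearEOF    ;# reset the EOF state so the next block can be read
}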
Original Test Program
puts "Enter lines then Ctrl-Z <RETURN> to end"
while { [ gets stdin line ] >= 0 } {
    puts "Read: $line"
}
puts "Reached EOF"
puts "eof=[eof stdin]"
puts "Enter another line"
puts "gets=[gets stdin line]"
puts "Read: $line"
The wish is that after having read a number of lines, terminated by the EOF-marker (Ctrl-Z), you can then read another line. In practice, the EOF-state is not cleared, and the second call to gets does not wait for input and immediately returns -1 (=EOF):
Enter lines then Ctrl-Z <RETURN>
Line1
Read: Line1
Line2
Read: Line2
^Z
Reached EOF
eof=1
Enter another line <-- This does not wait
gets=-1
Read:
Note: despite the TCL documentation including (my emphasis):
read ?-nonewline? fileID
Reads all the remaining bytes from fileID, and returns that string. If -nonewline is set, then the last character will be discarded if it is a newline. Any existing end of file condition is cleared before the read command is executed.
replacing the gets with something like set line [ read stdin ] makes no difference. Both commands return immediately. Having multiple repetitions of either command makes no difference: once TCL (and/or Windows [1]) thinks we've hit EOF, we stay at EOF!
My Solution
After some playing around, trying every file-manipulation command I could find that TCL possesses, I came up with the following:
puts "Enter lines then Ctrl-Z <RETURN>"
while { [ gets stdin line ] >= 0 } {
    puts "Read: $line"
}
puts "Reached EOF"
puts "eof=[eof stdin]"
puts "Reset EOF char"
fconfigure stdin -eofchar { "\x01" "" }
puts "eof=[eof stdin]"
puts "Reading dummy line"
puts "gets=[gets stdin line]"
fconfigure stdin -eofchar { "\x1a" "" }
puts "Enter another line"
puts "gets=[gets stdin line]"
puts "Read: $line"
The output of this version does wait for more input:
Enter lines then Ctrl-Z <RETURN>
Line 1
Read: Line 1
Line 2
Read: Line 2
^Z
Reached EOF
eof=1
Reset EOF char
eof=0 <-- EOF has been cleared
Reading dummy line
gets=1 <-- Read 1 character: the Ctrl-Z
Enter another line
More text <-- Waits for this to be typed
gets=9
Read: More text
My assumption of what's happening is that changing the EOF-character does reset the EOF status (whether this happens "in TCL" or "in Windows" I'm unsure). With a different EOF-marker in place, we can read the line containing the Ctrl-Z that has been left in the input buffer. (Depending on what you entered either side of the Ctrl-Z, this would normally also contain an end-of-line marker). With the Ctrl-Z disposed of, we can reset the EOF-character back to Ctrl-Z and carry on reading from stdin as normal.
[1] This issue on Microsoft's WSL GitHub page suggests that it could be Windows that is at fault: once the Ctrl-Z is in the buffer, it always returns EOF, even when clearerr() is used. My reading of "Another bane for xplat programmers for the last 30 years, Ctrl-D on Unix and Ctrl-Z on Windows don't work the same." is that although the issue is against WSL, the problem is in Windows itself. Interestingly, the final comment (at time of writing) states "Fixed in Windows Insider Build 18890", but one might still need to call clearerr().

Expect Script - How can the first and last line of a file be validated

I'm using Expect to create a CSR file on a remote system. I'm capturing the output from the system and placing it into a file on my local PC (where it is needed).
I need to validate the first and last line of this file to make sure the file looks like the following:
-----BEGIN CERTIFICATE REQUEST-----
.
.
.
-----END CERTIFICATE REQUEST-----
Originally I was only looking for the last line (or so I thought) by looping through the lines of the file looking for -----END CERTIFICATE REQUEST-----
set fp [ open $csrname ]
while {[gets $fp line] != -1} {
    if { $line == "-----END CERTIFICATE REQUEST-----" } {
        puts "The Certificate Signing Request file \"$csrname\" has been successfully created"
    } else {
        puts "The certificate file is invalid."
        puts $line
        exit 41
    }
}
I have a flaw in my logic because I end up in the error leg of that if statement and exit.
How can I validate just the first and last line of the file?
A CSR isn't really all that long; just a few kilobytes at most. We can validate the lot in one go!
# Load *everything* from a file at once
set f [open $csrname]
set contents [read $f]
close $f
# Validate it using this regular expression:
set RE {^-----BEGIN CERTIFICATE REQUEST-----\n.*\n-----END CERTIFICATE REQUEST-----\n*$}
if {![regexp $RE $contents]} {
    puts "The certificate file is invalid."
    exit 1
    # This spot is unreachable, of course...
}
puts "The Certificate Signing Request file \"$csrname\" has been succesfully created"
I'd recommend checking that the bit between the separators is only using valid characters too (it's base64-encoded PKCS#10), but that's rather more complicated once you go beyond the basics. Probably best to just confirm that you've not got truncation or something that just isn't a CSR at all.
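If you do want that stricter check, here is a hedged sketch; the assumption (not a full PKCS#10 parse) is that the body is plain base64 lines with optional "=" padding, which is what PEM files normally contain:
set RE {^-----BEGIN CERTIFICATE REQUEST-----\n([A-Za-z0-9+/]+=*\n)+-----END CERTIFICATE REQUEST-----\n*$}
if {![regexp $RE $contents]} {
    puts "The certificate file is invalid."
    exit 1
}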
Donal has a nice answer. Here are a couple of alternatives:
set first [exec sed {1q} $csrname]
set last [exec sed -n {$p} $csrname]
or
set f [open $csrname]
set lines [split [read -nonewline $f] \n]
close $f
set first [lindex $lines 0]
set last [lindex $lines end]
In either case, you can
if {$first eq "-----BEGIN CERTIFICATE REQUEST-----" &&
$last eq "-----END CERTIFICATE REQUEST-----"} {...}
Also you can refer to http://wiki.tcl.tk/1466
[eof] is used to determine whether the channel has reached the end of input.

Get line number using grep

I would like to get the line number using the grep command, but I am getting the following error message when the search pattern is not a single word:
couldn't read file "Pattern": no such file or directory
What is the proper usage of grep here? The code is below:
set status [catch {eval exec grep -n '$textToGrep' $fileName} lineNumber]
if { $status != 0 } {
    #error
} else {
    puts "lineNumber = $lineNumber"
}
Also, if the search pattern is not matched at all, the returned value is: "child process exited abnormally"
Here is a simple test case:
set textToGrep "<BBB name=\"BBBRM\""
file contents:
<?xml version="1.0"?>
<!DOCTYPE AAA>
<AAA>
<BBB name="BBBRM" />
</AAA>
Well, I also get problems with your code and a single word pattern!
First of all, I don't think you need the eval command, because catch itself does an evaluation of its first argument.
Then, the problem is that you put the $textToGrep variable in exec inside single quotes ', which have no meaning to Tcl.
Therefore, if the content of textToGrep is foo, you are asking grep to search for the string 'foo'. If that string, including the single quotes, is not found in the file, you get the error.
Try to rewrite your first line with
set status [catch {exec grep -n $textToGrep $fileName} lineNumber]
and see if it works. Also, read the exec man page, which explains these problems well.
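For the second symptom ("child process exited abnormally"), one way to tell "no match" apart from a real failure is to look at the error code that exec leaves behind; a hedged sketch, assuming Tcl 8.5+ for the catch options dictionary and a pattern that exec passes through cleanly (grep exits with status 1 when nothing matches):
set status [catch {exec grep -n $textToGrep $fileName} lineNumber options]
if {$status == 0} {
    puts "lineNumber = $lineNumber"
} else {
    set ec [dict get $options -errorcode]    ;# e.g. {CHILDSTATUS 12345 1}
    if {[lindex $ec 0] eq "CHILDSTATUS" && [lindex $ec 2] == 1} {
        puts "no match"        ;# grep ran fine but found nothing
    } else {
        puts "grep failed: $lineNumber"    ;# missing file, bad pattern, ...
    }
}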
If your system has tcllib installed, you can use the fileutil::grep command from the fileutil package:
package require fileutil
set fileName data.xml
set textToGrep {<BBB +name="BBBRM"}; # Update: Add + for multi-space match
set grepResult [::fileutil::grep $textToGrep $fileName]
foreach result $grepResult {
    # Example result:
    # data.xml:4: <BBB name="BBBRM" />
    set lineNumber [lindex [split $result ":"] 1]
    puts $lineNumber
    # Update: Get the line, squeeze the spaces before name=
    set line [lindex [split $result ":"] 2]
    regsub { +name=} $line " name=" line
    puts $line
}
Discussion
When assigning a value to textToGrep, I used curly braces, thus allowing double quotes inside without having to escape them.
The result of the ::fileutil::grep command is a list of strings. Each string contains the file name, line number, and the line itself, separated by colons.
One way to extract the line number is to first split the string (result) into pieces, using the colon as a separator. Next, I use lindex to grab the second item (index=1, since the list is zero-based).
I have updated the code to account for case where there are multiple spaces before name=
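A hedged refinement (my addition, not part of the original answer): since the matched line can itself contain colons, pulling the pieces out with a single regexp is a little more robust than splitting on ":":
foreach result $grepResult {
    # "file:lineNumber:text" -- the trailing (.*) keeps any colons in the text
    if {[regexp {^([^:]+):(\d+):(.*)$} $result -> fname lineNumber line]} {
        puts "$fname line $lineNumber: $line"
    }
}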
There are two problems here:
1. Pattern matching does not work.
2. grep exits with the error "child process exited abnormally" when the pattern is not found.
The first problem is that you are not enclosing textToGrep within double quotes (instead of single quotes). So your code should be:
[catch {exec grep -n "$textToGrep" $fileName} lineNumber]
The second problem is because of the exit status of the grep command. grep exits with an error status when the pattern is not found. Here is a try on a shell:
# cat file
pattern
pattern with multiple spaces
# grep pattern file
pattern
pattern with multiple spaces
# echo $?
0
# grep nopattern file
# echo $?
1
EDIT:
In your case you have special characters such as < and > (which have special meaning to Tcl's exec, where they indicate redirection, much as they do on a shell).
set textToGrep "<BBB name=\"BBBRM\""
regsub -all -- {<} "$textToGrep" "\\\<" textToGrep
regsub -all -- {>} "$textToGrep" "\\\>" textToGrep
set textToGrep {\<BBB name="BBBRM"}
catch {exec grep -n $textToGrep $fileName} status
if {![regexp "child process" $status]} {
puts $status
} else {
puts "no word found"
}
I think you should match the status against "child process" with a regular expression, as in the code above. Just check whether the above code works for you; in the if statement you can process the status however you like.
With the example given in your post, the above code works; you only need to use a backslash before the "<" in the textToGrep variable.

TCL Flush Ignore Backspace

Is there a way to ignore backspaces when performing a flush in Tcl to capture user input?
I am capturing the user input in a variable to be used in another command at a later time, so I perform the following:
puts -nonewline "What is the username? "
flush stdout
set usrnm [gets stdin]
Using that command, as long as I don't use a backspace, everything works the way I expect; however, if I do use a backspace, a "\x7F" is added as a character. Is there a way for the backspace to not be treated as a character?
That seems to depend on your terminal; when I try that code with these key sequences:
BackspaceabcReturn
abcBackspacedReturn
Then I get a length 3 string (measured via string length) in the usrnm variable in both cases. This is what I'd expect when the terminal is properly in cooked mode (the usual default). Since a \x7f is probably not a valid character in a user name anyway, I'd guess that you could filter it out:
set usrnm [string map {\x7f ""} $usrnm]
The only way to be absolutely sure that the character isn't there is to put the terminal into raw mode (and probably no-echo too) and do all the character input processing yourself. That's a huge amount of work relative to the size of the problem; a post-filter seems more sensible to me (and I still wonder what's up with your terminal).
[EDIT]: To put your terminal back into cooked mode, do:
exec stty -raw <@stdin
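If you do go down the raw-mode route, this is roughly the shape of it (a sketch, assuming a Unix-like stty(1); the post-filter above is still the simpler option):
exec stty raw -echo <@stdin    ;# raw, no-echo: keys arrive one character at a time
set ch [read stdin 1]          ;# read and handle each character yourself,
                               ;# including \x7f (DEL) and \x08 (BS)
exec stty -raw echo <@stdin    ;# restore cooked mode with echo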
I just ran into this recently and I wrote a procedure to handle character 127 (backspace). If any other input cleansing needs to happen, you can do it here too, such as removing special characters. I have a feeling this can be more elegant, but it does work.
proc cleanInput {str} {
    set return ""
    for {set i 0} {$i < [string length $str]} {incr i} {
        set char [string index $str $i]
        set asc [scan $char %c]
        if {$asc == 127} {
            # backspace: drop the last character accumulated so far
            if {[string length $return] > 0} {
                set return [string range $return 0 end-1]
            }
        } else {
            append return $char
        }
    }
    return $return
}
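For example, hooking it into the prompt from the question (hypothetical usage):
puts -nonewline "What is the username? "
flush stdout
set usrnm [cleanInput [gets stdin]]
puts "username: $usrnm"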

error: [eof $FILE_NAME]

I'm trying to write a script in TCL, and I get an error on the line: ![eof $logfile_fd]
The error is: invalid command name "!0"
What may cause this, and how can I fix it?
if {[file exists $logfile] != 1} {
    puts "Error existence of $logfile"
    return -1
}
if {[catch {set logfile_fd [open "$logfile" r]} err]} {
    puts "Error opening \"$logfile\" $err"
    return -1
}
seek $logfile_fd 0
![eof $logfile_fd]
I tried to use another solution:
while {[gets $logfile_fd line] >= 0} {
    ...do something with $line...
}
But I got another error:
list element in quotes followed by ")\nindexRecordsRead:" instead of space
where
)\nindexRecordsRead:
is some text inside $logfile_fd ... I think TCL tries to execute it or something... It works fine for every other line until this line...
Thanks!
I'm not sure what you are trying to do. eof is testing for an end of file condition - using it "bare" like that doesn't do anything. Tcl is evaluating the [eof $logfile_fd] to 0, and then trying to execute the command !0, which doesn't exist.
It does work if you have something like:
if {![eof $logfile_fd]} {
    # do something
}
or, if you want to store the results for later, you can do:
set isEndOfFile [expr {![eof $logfile_fd]}]
But, executing like you are, I'm not aware of any side effects you might be wanting to get without using the return value (other than throwing an error if the file descriptor is invalid).
You just need to put this before working with $line:
set bugfree_line [regexp -all -inline {\S+} $line]
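For context, a sketch of how that fits into the read loop, assuming the later code treats $line as a Tcl list (for example with lindex or foreach), which is what raises the "list element in quotes" error:
while {[gets $logfile_fd line] >= 0} {
    # Raw log text can contain unbalanced quotes or braces, so it is not
    # always a well-formed Tcl list.  Splitting it into whitespace-separated
    # words first gives a list that is safe to index.
    set bugfree_line [regexp -all -inline {\S+} $line]
    puts [lindex $bugfree_line 0]
}
close $logfile_fd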