How to get the complete output from expect when the output exceeds the size of expect_out(buffer)? - tcl

I don't know what's happening, but I am not getting the complete output from the remote command, possibly because expect's internal buffer is being exceeded.
proc SendCommands { Commands } {
    global prompt log errlog
    foreach element [split $Commands ";"] {
        expect {
            -re $prompt {send -- "$element\r"}
        }
        set outcome "$expect_out(buffer)"
        foreach line [split $outcome \n] {
            foreach word [regexp -all -inline {\S+} $line] {
                if {[string index [string trimleft $line " "] 0] == "%"} {
                    puts "$outcome"
                    exit 1
                }
            }
        }
        puts "$outcome"
    }
}
set timeout 30
foreach host [split $hosts "\;"] {
    spawn ssh -o "StrictHostKeyChecking no" "$username@$host"
    match_max 10000000
    expect {
        timeout { send_user "\nFailed to get password prompt\n"; exit 1 }
        eof { send_user "\nSSH failure for $host\n"; exit 1 }
        "*?assword:*" {
            send -- "$password\r"
        }
    }
    expect {
        timeout { send_user "\nLogin incorrect\n"; exit 1 }
        eof { send_user "\nSSH failure for $host\n"; exit 1 }
        -re "$prompt" { send -- "\r" }
    }
    set timeout 300
    SendCommands "$Commands"
}
This is how I am executing it:
./sendcommand aehj SWEs-elmPCI-B-01.tellus comnet1 "terminal length 0;show int description" "(.*#)$"
I get the complete output only when I remove log_user 0, but when I use the puts command in the SendCommands function above, I get about 90 percent of it, with the trailing 10 percent of the data missing.
One way I am thinking of is to use a negated regex in expect, but it doesn't seem to work:
expect {
    -re ! $prompt {puts $expect_out(buffer)}
}
EDIT: I get all the output about once in every 5 or 7 executions.

After a little searching I came up with this, and it seems to work, but let me know of any exceptions or better answers. I set match_max 1000, then:
expect {
    -re $prompt {send -- "$element\r"}
    full_buffer {
        append outcome $expect_out(buffer)
        exp_continue
    }
}
append outcome $expect_out(buffer)
puts $outcome
But when I set match_max to 10000 or 100, it fails again.
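For reference: full_buffer only fires when it is listed as a pattern and expect's buffer fills before anything else matches; without it, expect silently discards the oldest data once match_max is exceeded, which is why output goes missing. A self-contained sketch of the accumulation pattern (using a spawned seq to generate bulk output instead of the question's ssh session, so the spawn line and sizes here are illustrative assumptions):

```tcl
#!/usr/bin/expect
# Keep the buffer deliberately small so full_buffer fires several times.
match_max 1000
set outcome ""
spawn seq 1 5000          ;# stand-in for a chatty remote command
expect {
    full_buffer {
        # Buffer filled before any other pattern matched: save it, keep reading.
        append outcome $expect_out(buffer)
        exp_continue
    }
    eof {
        # Grab whatever was left in the buffer when output ended.
        append outcome $expect_out(buffer)
    }
}
puts "captured [string length $outcome] bytes"
```

With a real session you would match -re $prompt instead of eof, as in the snippet above; the key point is that every branch that can end the loop must also append the final $expect_out(buffer).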

Related

Expect script failed with "spawn id not open" when calling "expect eof"

I am trying to call an expect script from bash and have it return an exit code in case of, e.g., a connection timeout.
Basically, the bash script calls expect like this:
if [[ -f $1 ]]; then
    do something
else
    script.exp $device
    # here I want to evaluate the exit code but it is always 0
fi
I found some previous questions about this, and the right way to catch the exit code is:
expect eof
catch wait result
exit [lindex \$result 3]
The expect script:
#!/usr/bin/expect
exp_internal 1
log_user 1
set timeout -1
set hostname [lindex $argv 0]
set prompt "sftp>"
if { $hostname == "blah" } {
    set username "user1"
    set password "pass1"
} else {
    set username "user2"
    set password "pass2"
}
spawn sftp $username@$hostname "-o ConnectTimeout=3"
expect {
    "timeout" {
        puts "A timeout occurred"
        exp_continue
    }
    "password:" {
        send "$password\r"
        exp_continue
    }
    "sftp>" {
        send "cd cfg\n"
        expect $prompt
        send "get * /some/dir/$hostname\n"
        expect $prompt
        send "get running-config /some/dir/$hostname-config\n"
        expect $prompt
        send "exit\r"
        exp_continue
    }
}
expect eof
catch wait result
exit [lindex \$result 3]
When I just test the expect script to see what happens, I get this error:
expect: spawn id exp4 not open
while executing
"expect eof"
I've tried moving expect eof around, but it does not change the behavior. It states that the spawned process id is not open, which I guess is correct, as it has exited with Timed Out/Connection closed?
Remove exp_continue after send "exit\r"; otherwise the EOF will be consumed by the expect { ... } block and the later expect eof will fail.
Change lindex \$result 3 to lindex $result 3, as in the comments.
(Incorporating the previous comments.)
Add eof into the previous expect command, but with no body:
expect {
    timeout {...}
    password: {...}
    sftp> {...}
    eof
}
wait result
if {[lindex $result 2] == 0} {
    exit [lindex $result 3]
} else {
    error "system error: errno [lindex $result 3]"
}

Tcl / Expect script driven by named pipe blocks/buffers output unexpectedly

I am trying to write an expect script that reacts to input read from a pipe. Consider this example in the file "controller.sh":
#!/usr/bin/env expect
spawn bash --noprofile --norc
set timeout 3
set success 0
send "PS1='Prompt: '\r"
expect {
    "Prompt: " { set success 1 }
}
if { $success != 1 } { exit 1 }
proc do { cmd } {
    puts "Got command: $cmd"
    set success 0
    set timeout 3
    send "$cmd\r"
    expect {
        "Prompt: " { set success 1 }
    }
    if { $success != 1 } { puts "oops" }
}
set cpipe [open "$::env(CMDPIPE)" r]
fconfigure $cpipe -blocking 0
proc read_command {} {
    global cpipe
    if {[gets $cpipe cmd] < 0} {
        close $cpipe
        set cpipe [open "$::env(CMDPIPE)" r]
        fconfigure $cpipe -blocking 0
        fileevent $cpipe readable read_command
    } else {
        if { $cmd == "exit" } {
            exp_close
            exp_wait
            exit 0
        } elseif { $cmd == "ls" } {
            do ls
        } elseif { $cmd == "pwd" } {
            do pwd
        }
    }
}
fileevent $cpipe readable read_command
vwait forever
Suppose you do:
export CMDPIPE=~/.cmdpipe
mkfifo $CMDPIPE
./controller.sh
Now, from another terminal try:
export CMDPIPE=~/.cmdpipe
echo ls >> ${CMDPIPE}
echo pwd >> ${CMDPIPE}
In the first terminal the "Got command: ls/pwd" lines are printed immediately as soon as you press enter on each echo command, but there is no output from the spawned bash shell (no file listing and current directory). Now, try it once more:
echo ls >> ${CMDPIPE}
Suddenly the output from the first two commands appears, but the 3rd command (the second ls) is not visible. Keep going and you will notice a "lag" in the displayed output, which seems to be buffered and then dumped all at once later.
Why is this happening and how can I fix it?
According to fifo(7):
Normally, opening the FIFO blocks until the other end is opened also.
So, in the proc read_command, it's blocking on set cpipe [open "$::env(CMDPIPE)" r] and does not get the chance to display the spawned process's output until you echo ... >> ${CMDPIPE} again.
To work it around, you can open the FIFO (named pipe) in non-blocking mode:
set cpipe [open "$::env(CMDPIPE)" {RDONLY NONBLOCK} ]
This is also mentioned in fifo(7):
A process can open a FIFO in nonblocking mode. In this case, opening for read-only will succeed even if no one has opened on the write side yet ...
The following is the simplified version of your code and it works fine for me (tested on Debian 9.6).
spawn bash --norc
set timeout -1
expect -re {bash-[.0-9]+[#$] $}
send "PS1='P''rompt: '\r"
#         ^^^^
expect "Prompt: "
proc do { cmd } {
    send "$cmd\r"
    if { $cmd == "exit" } {
        expect eof
        exit
    } else {
        expect "Prompt: "
    }
}
proc read_command {} {
    global cpipe
    if {[gets $cpipe cmd] < 0} {
        close $cpipe
        set cpipe [open cpipe {RDONLY NONBLOCK}]
        fileevent $cpipe readable read_command
    } else {
        do $cmd
    }
}
set cpipe [open cpipe {RDONLY NONBLOCK}]
fileevent $cpipe readable read_command
vwait forever

Not able to send more than 44 characters in serial port using TCL

I'm trying to send commands longer than 44 characters, for which I am getting some garbage at the serial port.
But if I send commands of fewer than 44 characters, then I am able to see what I send at the serial port.
Any idea why this is happening? And any solution?
The following is my code:
proc open_com {} {
    if {$::GSW_SERIAL_TYPE == "COM"} {
        if { [catch {set ::gComPort [open \\\\.\\COM15 RDWR]}] } {
            LogMesg "Error: Failed to open serial connection $::gComPort"
            exit
        }
        fconfigure $::gComPort -mode $::gSerialPortSpeed,n,8,1 -blocking 1 -buffering none -translation binary -ttycontrol {BREAK 0} -handshake none
        fileevent $::gComPort readable [list rd_chid $::gComPort]
    }
}
proc rd_chid {chid} {
    if { [catch {gets $chid msg} err] } {
        if {[eof $chid]} {
            puts "ERROR:EOF RETURNING"
            close $chid
        }
        return
    }
    if { [string first "\[fal\]:" [string tolower $msg]] != -1 } {
        set ::gPromptRcved 1
    }
}

How to fix 'send: spawn id exp4 not open while executing "exp_send -s "$cmd\r"" ' error

My Tcl file has code like the below:
proc executeCmd {cmd {file ""}} {
    set out ""
    set output ""
    set send_slow {20 0.1}
    set adminFlag 0
    exp_send -s "$cmd\r"
    for {set i1 0} {$i1 < 12} {incr i1} {
        set intimeout 0
        expect {
            # other options to check 'hostname', 'more', 'press any to continue' regexes
            # ...
            -regexp {^(.*)Press any key to continue.*$} {
                set output [cleanOutput $expect_out(buffer)]
                if {[regexp -- {\w+} $file]} {
                    append out $output
                    flush $fo
                    flush $clf
                } else {
                    append out $output
                }
                exp_sleep 0.1
                exp_send -s " "
                exp_continue
            }
            timeout {
                #log_msg INFO "TIMED OUT...."
                puts "TIMED OUT"
                set intimeout 1
                puts "Executing $cmd >>> waiting for response from $hostname"
            }
        }
        if {$intimeout} {
            exp_send -s " "
        } else {
            break
        }
    }
    return $out
}
spawn $plinkLoc -telnet $routerIP -P $routerPort
set out [executeCmd "term width 0"]
After executing this Tcl script from the command prompt, I am facing an error saying:
send: spawn id exp4 not open
while executing
"exp_send -s "$cmd\r""
(procedure "executeCmd" line 28)
invoked from within
"executeCmd "term width 0""
invoked from within
"set out [executeCmd "term width 0"]""
This line 28 in TCL code is 'set' statement which is prior to 'executeCmd' proc, have updated the file for query purpose.
You have to either pass spawn_id to the procedure or declare spawn_id as global.
Add this line inside the procedure:
global spawn_id
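Both approaches can be sketched as follows (a minimal, hypothetical example spawning a local bash instead of the question's plink session; the prompt regex is an assumption):

```tcl
#!/usr/bin/expect
# Option 1: declare the global spawn_id that the top-level spawn created.
proc executeCmd {cmd} {
    global spawn_id            ;# without this, exp_send looks for a local spawn_id
    exp_send -s "$cmd\r"
    expect -re {\$ $}          ;# assumed: the shell prompt ends in "$ "
}

# Option 2: pass the spawn id in explicitly and set it locally.
proc executeCmd2 {sid cmd} {
    set spawn_id $sid          ;# expect/exp_send in this proc now target $sid
    exp_send -s "$cmd\r"
    expect -re {\$ $}
}

spawn bash --norc
executeCmd "echo hello"
executeCmd2 $spawn_id "echo world"
```

Option 2 is the more flexible choice when one script manages several spawned sessions, since each call names the session it talks to.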

How to login to switch and execute commands if the prompt is varying?

I have an expect script:
expect - "$@" << 'END_OF_FILE'
set username [lindex $argv 0]
set hosts [lindex $argv 1]
set Passwordfile [lindex $argv 2]
set Commands "[lindex $argv 3];\r"
set prompt [lindex $argv 4]
# log_user 0
#exp_internal 1
if { [llength $argv] != 5} {
    puts "usage: username hostname password commands prompt"
    exit 1
}
set force_conservative 0  ;# set to 1 to force conservative mode even if
                          ;# script wasn't run conservatively originally
if {$force_conservative} {
    set send_slow {1 .1}
    proc send {ignore arg} {
        sleep .1
        exp_send -s -- $arg
    }
}
#
# COMMENTS
# SendCommands sends all the commands passed to it. The commands must be
# separated by semicolons, with a \r at the end.
set pfile [open "$Passwordfile" "r"]
proc SendCommands { Commands } {
    global prompt log errlog
    foreach element [split $Commands ";"] {
        expect {
            -re $prompt {send -- "$element\r"}
        }
    }
}
set timeout 20
foreach host [split $hosts "\;"] {
    spawn ssh -o "StrictHostKeyChecking no" "$username@$host"
    match_max 1000000
    set expect_out(buffer) {}
    expect {
        timeout { send_user "\nFailed to get password prompt\n"; exit 1 }
        eof { send_user "\nSSH failure for $host\n"; exit 1 }
        "*?assword:*" {
            send -- "[read $pfile]\r"
            seek $pfile 0 start
        }
    }
    expect {
        timeout { send_user "\nLogin incorrect\n"; exit 1 }
        eof { send_user "\nSSH failure for $host\n"; exit 1 }
        -re "$prompt" { send -- "\r" }
    }
    set timeout 60
    close "$pfile"
    SendCommands "$Commands"
}
END_OF_FILE
I can execute it like:
./scriptname username hostname passwordfile "command1;command2;command3" "(.*#|.*>)$"
But if I change modes by executing the enable command, I will get a password prompt instead of
the usual prompt (# or >). How can I make sure that the code below is executed when the command is enable, i.e. when I get a password prompt?
expect {
    timeout { send_user "\nFailed to get password prompt\n"; exit 1 }
    eof { send_user "\nSSH failure for $host\n"; exit 1 }
    "*?assword:*" {
        send -- "[read $pfile]\r"
        seek $pfile 0 start
    }
}
You probably want
expect {
    timeout { send_user "\nFailed to get password prompt\n"; exit 1 }
    eof { send_user "\nSSH failure for $host\n"; exit 1 }
    "*?assword:*" {
        send -- "[read -nonewline $pfile]\r"
        seek $pfile 0 start
        exp_continue
    }
    $prompt
}
I added the -nonewline option to the read command. When you send the password, exp_continue keeps you in this expect "loop" until one of the other conditions is met, including the prompt.