I have been dealing with this problem for almost a month now, and I feel frustrated. Any help would be greatly appreciated.
I am trying to write a widget for my takenote command. The purpose of the widget is to feed all the markdown files in the ~/notes folder into fzf so that the user can select one of them and start editing it.
After the user types takenote and presses <tab>, I expect the widget to run.
Here is the _takenote.zsh widget definition:
#compdef takenote
local file=$( find -L "$HOME/notes/" -print 2> /dev/null | fzf-tmux +m )
zle reset-prompt
compadd $file
return 1
Unfortunately, the above code doesn't work because of zle reset-prompt; if I remove it, the result looks like this:
And after selecting the file it turns into:
Which, as you can see, corrupts the prompt and the command itself.
It appears to me that what I need to do is call zle reset-prompt
before calling compadd, but this only works when I bind the function to a key; otherwise, I get the following error:
widgets can only be called when ZLE is active
I finally found a workaround for the issue. I am not satisfied with the strategy, since it is not self-contained in the widget itself, but it works. The solution involves trapping fzf-completion after it is invoked and calling zle reset-prompt.
To register the trap, add the following snippet to your .zshrc file (see "Zsh menu completion causes problems after zle reset-prompt"):
TMOUT=1
TRAPALRM() {
    if [[ "$WIDGET" =~ ^(complete-word|fzf-completion)$ ]]; then
        # limit the reset-prompt functionality to the `takenote` script
        if [[ "$LBUFFER" == "takenote "* ]]; then
            zle reset-prompt
        fi
    fi
}
The _takenote widget:
#compdef takenote
local file=$( find -L "$HOME/notes/" -print 2> /dev/null | fzf-tmux +m )
compadd $file
return 0
P.S.: I would still love to move the trap inside the widget and avoid registering it in the init script (.zshrc).
After two days, I finally managed to find a hint on how to achieve it thanks to the excellent fzf-tab-completion project:
https://github.com/lincheney/fzf-tab-completion/blob/c91959d81320935ae88c090fedde8dcf1ca70a6f/zsh/fzf-zsh-completion.sh#L120
So actually, all that you need to do is:
#compdef takenote
local file=$( find -L "$HOME/notes/" -print 2> /dev/null | fzf-tmux +m )
compadd $file
TRAPEXIT() {
    zle reset-prompt
}
return 0
And it finally works. Cheers!
I was getting the same error when trying to use bindkey for a widget that uses vim to open the fzf-selected file. It turns out I had to open the file in function1 and then have function2 call function1 and then reset-prompt, to avoid this "widgets can only be called when ZLE is active" error. Like you said, it is really frustrating and it took me almost a day to figure out!
Example code:
## use rg to get file list
export FZF_DEFAULT_COMMAND='rg --files --hidden'
## file open (function1)
__my-fo() (
    setopt localoptions pipefail no_aliases 2> /dev/null
    local file=$(eval "${FZF_DEFAULT_COMMAND}" | FZF_DEFAULT_OPTS="--height ${FZF_TMUX_HEIGHT:-40%} --reverse $FZF_DEFAULT_OPTS --preview 'bat --color=always --line-range :500 {}'" $(__fzfcmd) -m "$@" | while read item; do
        echo -n "${(q)item}"
    done)
    local ret=$?
    if [[ -n $file ]]; then
        $EDITOR $file
    fi
    return $ret
)
## define zsh widget (function2)
__my-fo-widget() {
    __my-fo
    local ret=$?
    zle reset-prompt
    return $ret
}
zle -N __my-fo-widget
bindkey ^p __my-fo-widget
Context: I'm making my own i3-Bar script to read output from other (asynchronous) scripts running in background, concatenate them and then echo them to i3-Bar itself.
The way I'm passing outputs around is in plain files, and I guess (logically) the problem is that the files are sometimes read and written at the same time. The best way to reproduce this behavior is by suspending the computer and then waking it back up - I don't know the exact cause of this; I can only go on what I see in my debug log files.
Main Code: Added comments for clarity
#!/usr/bin/env bash
cd "${0%/*}";
trap "kill -- -$$" EXIT; #The bg. scripts are on a while [ 1 ] loop, have to kill them.
rm -r ../input/*;
mkdir ../input/; #Just in case.
for tFile in ./*; do
    #Run all of the available scripts in the current directory in the background.
    if [ $(basename $tFile) != "main.sh" ]; then ("$tFile" &); fi;
done;
echo -e '{ "version": 1 }\n['; #I3-Bar can use infinite array of JSON input.
while [ 1 ]; do
    input=../input/*; #All of the scripts put their output in this folder as separate text files
    input=$(sort -nr <(printf "%s\n" $input));
    output="";
    for tFile in $input; do
        #Read and add all of the files to one output string.
        if [ $tFile == "../input/*" ]; then break; fi;
        output+="$(cat $tFile),";
    done;
    if [ "$output" == "" ]; then
        echo -e "[{\"full_text\":\"ERR: No input files found\",\"color\":\"#ff0000\"}],\n";
    else
        echo -e "[${output::-1}],\n";
    fi;
    sleep 0.2s;
done;
Example Input Script:
#!/usr/bin/env bash
cd "${0%/*}";
while [ 1 ]; do
    echo -e "{" \
        "\"name\":\"clock\"," \
        "\"separator_block_width\":12," \
        "\"full_text\":\"$(date +"%H:%M:%S")\"}" > ../input/0_clock;
    sleep 1;
done;
The Problem
The problem isn't the script itself but the fact that i3-Bar receives malformed JSON input (a parse error) and terminates; I'll show such a log below.
Another constraint is that the background scripts should run asynchronously, because some need to update every second and some only every minute, etc. So the use of a FIFO isn't really an option, unless I create some ugly, inefficient, hacky workaround.
I know there is a need for IPC here, but I have no idea how to do this efficiently.
Script output from a random crash (the error after waking up looks the same):
[{ "separator_block_width":12, "color":"#BAF2F8", "full_text":"192.168.1.104 "},{ "separator_block_width":12, "color":"#BAF2F8", "full_text":"100%"}],
[{ "separator_block_width":12, "color":"#BAF2F8", "full_text":"192.168.1.104 "},,],
(The error is caused by the second line.)
As you can see, the main script tries to read the file, gets no output, but the comma is still there, so the JSON is malformed.
The immediate error is easy to fix: don't append an entry to output if the corresponding file is empty:
for tFile in $input; do
    [[ $tFile != "../input/*" ]] &&
        [[ -s $tFile ]] &&
        output+="$(<$tFile),"
done
There is a potential race condition here, though. Just because a particular input file exists doesn't mean that the data has been fully written to it yet. I would change your input scripts to look something like this:
#!/usr/bin/env bash
cd "${0%/*}";
while true; do
    # Write to a hidden temp file in the same directory and then rename it into
    # place: the rename is atomic on a single filesystem, and the leading dot
    # keeps the main loop's ../input/* glob from picking up a half-written file.
    o=$(mktemp ../input/.0_clock.XXXXXX)
    printf '{"name": "clock", "separator_block_width": 12, "full_text": "%(%H:%M:%S)T"}\n' -1 > "$o"
    mv "$o" ../input/0_clock
    sleep 1
done
Also, ${output%,} is a safer way to trim a trailing comma when necessary.
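For comparison, a minimal sketch of the difference (bash; the sample value is illustrative):

output='{"a":1},{"b":2},'
echo "[${output%,}]"      # [{"a":1},{"b":2}]
output=""
echo "[${output%,}]"      # [] - no error
echo "[${output::-1}]"    # fails: "substring expression < 0"

${output%,} removes a single trailing comma if one is present and otherwise leaves the string alone, so it never aborts on an empty result.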
I have a Unix script in which I am calling functions.
I want a function to return immediately if any command inside it fails.
But I cannot check $? after every command. Is there any other way to do this?
Maybe run the script from a file line by line (as long, of course, as each of your functions is one line long).
Maybe the following script can be a starting point:
#!/bin/sh
while read l
do
eval "$l || break"
done <<EOF
echo test | grep e
echo test2 | grep r
echo test3 grep 3
EOF
This is another idea, following my previous answer. It works with bash scripts and requires your functions to be quite simple (pipes may cause some issues):
#!/bin/bash
set -o monitor
check() {
    [ $? -eq 0 ] && exit
}
trap check SIGCHLD
/bin/echo $(( 1+1 ))
/bin/echo $(( 1/0 ))
/bin/echo $(( 2+2 ))
Furthermore, the commands need to be external commands (this is why I use /bin/echo rather than echo), so that a SIGCHLD is actually delivered. Regards.
Is there a POSIX-compliant way to limit the scope of a variable to the function it is declared in? I.e.:
Testing()
{
    TEST="testing"
}
Testing
echo "Test is: $TEST"
should print "Test is:". I've read about the declare, local, and typeset keywords, but it doesn't look like they are required POSIX built-ins.
It is normally done with the local keyword, which is, as you seem to know, not defined by POSIX. Here is an informative discussion about adding 'local' to POSIX.
However, even the most primitive POSIX-compliant shell I know of that is used by some GNU/Linux distributions as the default /bin/sh, dash (Debian Almquist Shell), supports it. FreeBSD and NetBSD use ash, the original Almquist Shell, which also supports it. OpenBSD uses a ksh implementation for /bin/sh which also supports it. So unless you're aiming to support non-GNU, non-BSD systems like Solaris, or those using a standard ksh, etc., you can get away with using local. (You might want to put a comment right at the start of the script, below the shebang line, noting that it is not strictly a POSIX sh script, just to not be evil.)
Having said all that, you might want to check the respective man pages of all the sh implementations that support local, since they might have subtle differences in how exactly it works. Or just don't use local:
If you really want to conform fully to POSIX, or don't want to mess with possible issues, and thus not use local, then you have a couple of options. The answer given by Lars Brinkhoff is sound: you can just wrap the function body in a sub-shell. This might have other undesired effects, though. By the way, the shell grammar (per POSIX) allows the following:
my_function()
(
# Already in a sub-shell here,
# I'm using ( and ) for the function's body and not { and }.
)
Although maybe avoid that if you want to be super-portable; some old Bourne shells can even be non-POSIX-compliant. I just wanted to mention that POSIX allows it.
Another option would be to unset variables at the end of your function bodies, but that's not going to restore the old value, of course, so it isn't really what you want, I guess; it will merely prevent the variable's in-function value from leaking outside (a minimal sketch follows). Not very useful, I guess.
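For illustration, a hedged sketch of that unset-at-the-end approach (function and variable names are made up):

my_func() {
    tmp="scratch value"        # clobbers any previous $tmp
    echo "$tmp"
    unset tmp                  # keeps the scratch value from leaking out, but the old value is gone
}

tmp="old value"
my_func
echo "${tmp-unset}"            # prints "unset", not "old value"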
One last, and crazy, idea I can think of is to implement local yourself. The shell has eval, which, however evil, opens the way to some insane possibilities. The following basically implements dynamic scoping à la old Lisps. I'll use the keyword let instead of local for further cool points, although you have to use the so-called unlet at the end:
# If you want you can add some error-checking and what-not to this. At present,
# wrong usage (e.g. passing a string with whitespace in it to `let', not
# balancing `let' and `unlet' calls for a variable, etc.) will probably yield
# very very confusing error messages or breakage. It's also very dirty code, I
# just wrote it down pretty much at one go. Could clean up.
let()
{
    dynvar_name=$1;
    dynvar_value=$2;
    dynvar_count_var=${dynvar_name}_dynvar_count
    if [ "$(eval echo '$'$dynvar_count_var)" ]
    then
        eval $dynvar_count_var='$(( $'$dynvar_count_var' + 1 ))'
    else
        eval $dynvar_count_var=0
    fi
    eval dynvar_oldval_var=${dynvar_name}_oldval_'$'$dynvar_count_var
    eval $dynvar_oldval_var='$'$dynvar_name
    eval $dynvar_name='$'dynvar_value
}
unlet()
for dynvar_name
do
    dynvar_count_var=${dynvar_name}_dynvar_count
    eval dynvar_oldval_var=${dynvar_name}_oldval_'$'$dynvar_count_var
    eval $dynvar_name='$'$dynvar_oldval_var
    eval unset $dynvar_oldval_var
    eval $dynvar_count_var='$(( $'$dynvar_count_var' - 1 ))'
done
Now you can:
$ let foobar test_value_1
$ echo $foobar
test_value_1
$ let foobar test_value_2
$ echo $foobar
test_value_2
$ let foobar test_value_3
$ echo $foobar
test_value_3
$ unlet foobar
$ echo $foobar
test_value_2
$ unlet foobar
$ echo $foobar
test_value_1
(By the way, unlet can be given any number of variables at once, as separate arguments, for convenience; this is not showcased above.)
Don't try this at home, don't show it to children, don't show it to your co-workers, don't show it to #bash on Freenode, don't show it to members of the POSIX committee, don't show it to Mr. Bourne; maybe show it to father McCarthy's ghost to give him a laugh. You have been warned, and you didn't learn it from me.
EDIT:
Apparently I've been beaten: sending the IRC bot greybot on Freenode (it belongs to #bash) the command "posixlocal" makes it give you some obscure code that demonstrates a way to achieve local variables in POSIX sh. Here is a somewhat cleaned-up version, because the original was difficult to decipher:
f()
{
    if [ "$_called_f" ]
    then
        x=test1
        y=test2
        echo $x $y
    else
        _called_f=X x= y= command eval '{ typeset +x x y; } 2>/dev/null; f "$@"'
    fi
}
This transcript demonstrates usage:
$ x=a
$ y=b
$ f
test1 test2
$ echo $x $y
a b
So it lets one use the variables x and y as locals in the then branch of the if. More variables can be added in the else branch; note that each one must be added twice, once as variable= in the initial assignment list and once as an argument to typeset. Note that no unlet or the like is needed (it's a "transparent" implementation), and no name mangling or excessive eval is done, so it seems to be a much cleaner implementation overall.
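For instance, here is a hedged sketch of the same pattern extended with a third local variable, z (the extra z= in the assignment list and the extra z passed to typeset are the only changes; the values are illustrative):

f()
{
    if [ "$_called_f" ]
    then
        x=test1
        y=test2
        z=test3
        echo $x $y $z
    else
        _called_f=X x= y= z= command eval '{ typeset +x x y z; } 2>/dev/null; f "$@"'
    fi
}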
EDIT 2:
It turns out typeset is not defined by POSIX, and implementations of the Almquist Shell (FreeBSD, NetBSD, Debian) don't support it. So the above hack will not work on those platforms.
I believe the closest thing would be to put the function body inside a subshell.
E.g. try this
foo()
{
    ( x=43 ; echo $x )
}
x=42
echo $x
foo
echo $x
This is actually built into the design of POSIX function declarations.
If you would like a variable declared in the parent scope, to be accessible in a function, but leave its value in the parent scope unchanged, simply:
Declare your function using an explicit subshell, i.e., use subshell_function() ( with parentheses ), not inline_function() { with braces ;}.
The behavior of inline grouping vs. subshell grouping is consistent throughout the entire language.
If you want to "mix and match", start with an inline function, then nest subshell functions as necessary. It's clunky, but works.
Here is a function that enables scoping:
scope() {
    eval "$(set)" command eval '\"\$@\"'
}
Example script:
x() {
    y='in x'
    echo "$y"
}
y='outside x'
echo "$y"
scope x
echo "$y"
Result:
outside x
in x
outside x
If you'd like to journey down to Hell with me, I've made a more elaborate implementation of the eval concept.
This one automatically keeps account of your quasi-scoped variables, can be called with a more familiar syntax, and properly unsets (as opposed to merely nulling) variables when leaving nested scopes.
Usage
As shown below, you call push_scope to enter a scope, _local to declare your quasi-local variables, and pop_scope to leave the scope. Use _unset to unset a variable, and pop_scope will re-unset it when you back out into that scope again.
your_func() {
    push_scope
    _local x="baby" y="you" z
    x="can"
    y="have"
    z="whatever"
    _unset z
    push_scope
    _local x="you"
    _local y="like"
    pop_scope
    pop_scope
}
Code
All of the gibberish variable name suffixes are to be extra-safe against name collisions.
# Simulate entering of a nested variable scope
# To be used in conjunction with push_scope(), pop_scope(), and _local()
push_scope() {
SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D=$(( $SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D + 1 ))
}
# Store the present value of the specified variable(s), allowing use in a new scope.
# To be used in conjunction with push_scope(), pop_scope(), and _local()
#
# Parameters:
# $@ : strings; names of the variables to store the values of
scope_var() {
for varname_FB94CFD263CF11E89500036F7F345232 in "$@"; do
eval "active_varnames_FB94CFD263CF11E89500036F7F345232=\"\${SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARNAMES}\""
# echo "Active varnames: ${active_varnames_FB94CFD263CF11E89500036F7F345232}"
case " ${active_varnames_FB94CFD263CF11E89500036F7F345232} " in
*" ${varname_FB94CFD263CF11E89500036F7F345232} "* )
# This variable was already stored in a previous call
# in the same scope. Do not store again.
# echo "Push \${varname_FB94CFD263CF11E89500036F7F345232}, but already stored."
:
;;
* )
if eval "[ -n \"\${${varname_FB94CFD263CF11E89500036F7F345232}+x}\" ]"; then
# Store the existing value from the previous scope.
# Only variables that were set (including set-but-empty) are stored
# echo "Pushing value of \$${varname_FB94CFD263CF11E89500036F7F345232}"
eval "SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARVALUE_${varname_FB94CFD263CF11E89500036F7F345232}=\"\${${varname_FB94CFD263CF11E89500036F7F345232}}\""
else
# Variable is unset. Do not store the value; an unstored
# value will be used to indicate its unset state. The
# variable name will still be registered.
# echo "Not pushing value of \$${varname_FB94CFD263CF11E89500036F7F345232}; was previously unset."
:
fi
# Add to list of variables managed in this scope.
# List of variable names is space-delimited.
eval "SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARNAMES=\"\${SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARNAMES}${varname_FB94CFD263CF11E89500036F7F345232} \""
;;
esac
unset active_varnames_FB94CFD263CF11E89500036F7F345232
done
unset varname_FB94CFD263CF11E89500036F7F345232
}
# Simulate declaration of a local variable
# To be used in conjunction with push_scope(), pop_scope(), and _local()
#
# This function is a convenience wrapper over scope_var().
#
# Can be called just like the local keyword.
# Example usage: _local foo="foofoofoo" bar="barbarbar" qux qaz=""
_local() {
for varcouple_44D4987063D111E8A46923403DDBE0C7 in "$@"; do
# Example string: foo="barbarbar"
varname_44D4987063D111E8A46923403DDBE0C7="${varcouple_44D4987063D111E8A46923403DDBE0C7%%=*}"
varvalue_44D4987063D111E8A46923403DDBE0C7="${varcouple_44D4987063D111E8A46923403DDBE0C7#*=}"
varvalue_44D4987063D111E8A46923403DDBE0C7="${varvalue_44D4987063D111E8A46923403DDBE0C7#${varcouple_44D4987063D111E8A46923403DDBE0C7}}"
# Store the value for the previous scope.
scope_var "${varname_44D4987063D111E8A46923403DDBE0C7}"
# Set the value for this scope.
eval "${varname_44D4987063D111E8A46923403DDBE0C7}=\"\${varvalue_44D4987063D111E8A46923403DDBE0C7}\""
unset varname_44D4987063D111E8A46923403DDBE0C7
unset varvalue_44D4987063D111E8A46923403DDBE0C7
unset active_varnames_44D4987063D111E8A46923403DDBE0C7
done
unset varcouple_44D4987063D111E8A46923403DDBE0C7
}
# Simulate unsetting a local variable.
#
# This function is a convenience wrapper over scope_var().
#
# Can be called just like the unset keyword.
# Example usage: _unset foo bar qux
_unset() {
for varname_6E40DA2E63D211E88CE68BFA58FE2BCA in "$@"; do
scope_var "${varname_6E40DA2E63D211E88CE68BFA58FE2BCA}"
unset "${varname_6E40DA2E63D211E88CE68BFA58FE2BCA}"
done
}
# Simulate exiting out of a nested variable scope
# To be used in conjunction with push_scope(), pop_scope(), and _local()
pop_scope() {
eval "varnames_2581E94263D011E88919B3D175643B87=\"\${SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARNAMES}\""
# Cannot iterate over $varnames by setting $IFS; $IFS does not work
# properly on zsh. Workaround using string manipulation.
while [ -n "${varnames_2581E94263D011E88919B3D175643B87}" ]; do
# Strip enclosing spaces from $varnames.
while true; do
varnames_old_2581E94263D011E88919B3D175643B87="${varnames_2581E94263D011E88919B3D175643B87}"
varnames_2581E94263D011E88919B3D175643B87="${varnames_2581E94263D011E88919B3D175643B87# }"
varnames_2581E94263D011E88919B3D175643B87="${varnames_2581E94263D011E88919B3D175643B87% }"
if [ "${varnames_2581E94263D011E88919B3D175643B87}" = "${varnames_2581E94263D011E88919B3D175643B87}" ]; then
break
fi
done
# Extract the variable name for the current iteration and delete it from the queue.
varname_2581E94263D011E88919B3D175643B87="${varnames_2581E94263D011E88919B3D175643B87%% *}"
varnames_2581E94263D011E88919B3D175643B87="${varnames_2581E94263D011E88919B3D175643B87#${varname_2581E94263D011E88919B3D175643B87}}"
# echo "pop_scope() iteration on \$SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARVALUE_${varname_2581E94263D011E88919B3D175643B87}"
# echo "varname: ${varname_2581E94263D011E88919B3D175643B87}"
if eval "[ -n \""\${SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARVALUE_${varname_2581E94263D011E88919B3D175643B87}+x}"\" ]"; then
# echo "Value found. Restoring value from previous scope."
# echo eval "${varname_2581E94263D011E88919B3D175643B87}=\"\${SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARVALUE_${varname_2581E94263D011E88919B3D175643B87}}\""
eval "${varname_2581E94263D011E88919B3D175643B87}=\"\${SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARVALUE_${varname_2581E94263D011E88919B3D175643B87}}\""
unset "SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARVALUE_${varname_2581E94263D011E88919B3D175643B87}"
else
# echo "Unsetting \$${varname_2581E94263D011E88919B3D175643B87}"
unset "${varname_2581E94263D011E88919B3D175643B87}"
fi
# Variable cleanup.
unset varnames_old_2581E94263D011E88919B3D175643B87
done
unset SCOPE${SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D}_VARNAMES
unset varname_2581E94263D011E88919B3D175643B87
unset varnames_2581E94263D011E88919B3D175643B87
SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D=$(( $SCOPENUM_CEDD88E463CF11E8A72A3F9E5F08767D - 1 ))
}
It's possible to simulate local variables in a POSIX shell using a small set of general functions.
The sample code below demonstrates two functions, called Local and EndLocal, which do the trick.
Use Local once to declare all local variables at the beginning of a routine.
Local creates a new scope, and saves the previous definition of each local variable into a new global variable.
Use EndLocal before returning from the routine.
EndLocal restores all previous definitions saved at the current scope, and deletes the last scope.
Also note that EndLocal preserves the previous exit code.
All functions are short, and use descriptive names, so they should be relatively easy to understand.
They support variables with tricky characters like spaces, single and double quotes.
Caution: they use global variables beginning with LOCAL_, so there's a small risk of collision with existing variables of the same name.
The Test routine recursively calls itself 3 times, and modifies a few local and global variables.
The output shows that the A and B local variables are preserved, contrary to the global N variable.
Code:
#!/bin/sh
#-----------------------------------------------------------------------------#
# Manage pseudo-local variables in a Posix Shell
# Check if a variable exists.
VarExists() { # $1=Variable name
    eval "test \"\${${1}+true}\" = \"true\""
}
# Get the value of a variable.
VarValue() { # $1=Variable name
    eval "echo \"\${$1}\""
}
# Escape a string within single quotes, for reparsing by eval
SingleQuote() { # $1=Value
    echo "$1" | sed -e "s/'/'\"'\"'/g" -e "s/.*/'&'/"
}
# Set the value of a variable.
SetVar() { # $1=Variable name; $2=New value
    eval "$1=$(SingleQuote "$2")"
}
# Emulate local variables
LOCAL_SCOPE=0
Local() { # $*=Local variables names
    LOCAL_SCOPE=$(expr $LOCAL_SCOPE + 1)
    SetVar "LOCAL_${LOCAL_SCOPE}_VARS" "$*"
    for LOCAL_VAR in $* ; do
        if VarExists $LOCAL_VAR ; then
            SetVar "LOCAL_${LOCAL_SCOPE}_RESTORE_$LOCAL_VAR" "SetVar $LOCAL_VAR $(SingleQuote "$(VarValue $LOCAL_VAR)")"
        else
            SetVar "LOCAL_${LOCAL_SCOPE}_RESTORE_$LOCAL_VAR" "unset $LOCAL_VAR"
        fi
    done
}
# Restore the initial variables
EndLocal() {
    LOCAL_RETCODE=$?
    for LOCAL_VAR in $(VarValue "LOCAL_${LOCAL_SCOPE}_VARS") ; do
        eval $(VarValue "LOCAL_${LOCAL_SCOPE}_RESTORE_$LOCAL_VAR")
        unset "LOCAL_${LOCAL_SCOPE}_RESTORE_$LOCAL_VAR"
    done
    unset "LOCAL_${LOCAL_SCOPE}_VARS"
    LOCAL_SCOPE=$(expr $LOCAL_SCOPE - 1)
    return $LOCAL_RETCODE
}
#-----------------------------------------------------------------------------#
# Test routine
N=3
Test() {
    Local A B
    A=Before
    B=$N
    echo "#1 N=$N A='$A' B=$B"
    if [ $N -gt 0 ] ; then
        N=$(expr $N - 1)
        Test
    fi
    echo "#2 N=$N A='$A' B=$B"
    A="After "
    echo "#3 N=$N A='$A' B=$B"
    EndLocal
}
A="Initial value"
Test
echo "#0 N=$N A='$A' B=$B"
Output:
larvoire@JFLZB:/tmp$ ./LocalVars.sh
#1 N=3 A='Before' B=3
#1 N=2 A='Before' B=2
#1 N=1 A='Before' B=1
#1 N=0 A='Before' B=0
#2 N=0 A='Before' B=0
#3 N=0 A='After ' B=0
#2 N=0 A='Before' B=1
#3 N=0 A='After ' B=1
#2 N=0 A='Before' B=2
#3 N=0 A='After ' B=2
#2 N=0 A='Before' B=3
#3 N=0 A='After ' B=3
#0 N=0 A='Initial value' B=
larvoire@JFLZB:/tmp$
Using the same technique, I think it should be possible to dynamically detect whether the local keyword is supported, and if it's not, define a new function called local that emulates it.
This way, performance would be much better in the normal case of a modern shell having built-in locals.
And things would still work on an old POSIX shell without it (a hedged detection sketch follows the list below).
Actually we'd need three dynamically generated functions:
BeginLocal, creating an empty pseudo-local scope, as is done in the beginning of my Local above, or doing nothing if the shell has built-in locals.
local, similar to my Local, defined only for shells not having built-in locals.
EndLocal, identical to mine, or just preserving the last exit code for shells having built-in locals.
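Here is a hedged sketch of just the detection step (the probe function and the HAVE_BUILTIN_LOCAL flag are made-up names; the three functions themselves would then be generated as described in the list above):

# Probe for a working `local' inside a throwaway function, in a subshell so
# any "local: not found" failure cannot affect the running script.
if (
    probe_local_5c1e() { local _x=1; }
    probe_local_5c1e
) 2>/dev/null; then
    HAVE_BUILTIN_LOCAL=true
else
    HAVE_BUILTIN_LOCAL=false
fi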
Define the functions using the function myfunc { syntax, and use typeset myvar to define your variables. If you define functions that way rather than using the myfunc(){ syntax, all of the common Bourne shells (bash, zsh, ksh '88 and '93) will localize variables defined with typeset (and aliases to typeset like integer).
Or reinvent the wheel. Whichever floats your boat. ;)
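A minimal sketch of that ksh-style pattern, under the assumption that you are on bash, zsh, or ksh (the names are illustrative; as the EDIT below notes, this is not POSIX syntax):

function myfunc {
    typeset myvar="inside"     # scoped to myfunc in bash, zsh and ksh
    echo "$myvar"
}

myvar="outside"
myfunc
echo "$myvar"                  # still "outside"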
EDIT: While the question asks for POSIX, and this is not a POSIX-compliant function definition syntax, the person who asked indicates in a later comment that he's using bash. The use of typeset in combination with the alternative function definition syntax is the best solution there, as the true POSIX mechanism requires the additional overhead of forking a new subshell.
The script works well when run manually, but when I schedule it as a cron job it shows:
malformed JSON string, neither array, object, number, string or atom, at character offset 0 (before "<html>\r\n<head><tit...") at /usr/local/lib/perl5/site_perl/5.14.2/JSON.pm line 171.
The script itself:
#rest config variables
use REST::Client;    # modules required by the calls below; not shown in the original excerpt
use JSON;
use Data::Dumper;
$ENV{'PERL_LWP_SSL_VERIFY_NONE'} = 0;
print "test\n";
my $client = REST::Client->new();
$client->addHeader('Authorization', 'Basic YWRtaW46cmFyaXRhbg==');
$client->addHeader('content_type', 'application/json');
$client->addHeader('accept', 'application/json');
$client->setHost('http://10.10.10.10');
$client->setTimeout(1000);
$useragent = $client->getUseragent();
print "test\n";
#Getting racks by pod
$req = '/api/v2/racks?name_like=2t';
#print " rekvest {$req}\n";
$client->request('GET', qq($req));
$racks = from_json($client->responseContent());
$datadump = Dumper (from_json($client->responseContent()));
crontab -l
*/2 * * * * /usr/local/bin/perl /folder/api/2t.pl > /dmitry/api/damnitout 2>&1
I'd appreciate any suggestions.
Thank you,
Dmitry
It is difficult to say what is really happening, but in my experience 99% of the issues with running stuff from crontab stem from differences in environment variables.
A typical way to debug this: at the beginning of your script, add a block like this:
foreach my $key (keys %ENV) {
    print "$key = $ENV{$key}\n";
}
Run it in a console, look at the output, and save it to a log file.
Now repeat the same from crontab and save the output to a log file (you have already done that - this is good).
See if there is any difference in environment variables between the two runs and try to fix it; a quick way to compare them is sketched below. In Perl, probably the easiest fix is to alter the environment by changing %ENV. After all differences are sorted out, there is no reason for this not to work right.
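For the comparison step, a hedged shell sketch (the file paths and the temporary crontab schedule are illustrative):

# interactively:
env | sort > /tmp/env.interactive

# temporary crontab entry, alongside the real job:
# * * * * * env | sort > /tmp/env.cron

# once both files exist:
diff /tmp/env.interactive /tmp/env.cron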
Good luck!