I have the following line:
echo -ne "\033]0;blah\007"
that correctly sets the term name to blah. But if I place that line within a function, as in:
setTermName()
{
echo -ne "\033]0;blah\007"
}
it doesn't work anymore. I guess escape sequences are not treated correctly within the function. So my question could be reformulated as: How do you use escape sequences within a function?
I only want to be able to do setTermName foo from command line.
Do you invoke that echo command from interactive ksh as well? Are you sure it understands -ne? It's not standard; maybe use printf instead.
You could also try an alias instead.
Update: I've checked with AIX ksh; the following function works:
set_tn()
{
printf "\033]0;$1\007"
}
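With that definition you can pass the new title as an argument, which gives you exactly the setTermName foo usage you asked for:
set_tn foo
(foo becomes $1 inside the function and lands in the escape sequence.)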
I have this which works to declare a JSON string in a bash script:
local my_var="foobar"
local json=`cat <<EOF
{"quicklock":"${my_var}"}
EOF`
The above heredoc works, but I can't seem to format it any other way; it literally has to look exactly like that.
Is there any way to get the command to be on one line, something like this:
local json=`cat <<EOF{"quicklock":"${my_var}"}EOF`
That would be so much nicer, but it doesn't seem to work, presumably because that's simply not how heredoc delimiters behave.
I am looking for a shorthand way to declare JSON in a script that:
Does not require a ton of escape chars.
Allows for dynamic interpolation of variables.
Note: The actual JSON I want to use has multiple dynamic variables with many key/value pairs. Please extrapolate.
I'm not a JSON guy and don't really understand the "well-formed" arguments in the discussion above, but you can use a here-string rather than a here-document, like this:
my_var="foobar"
json=`cat <<<{\"quicklock\":\"${my_var}\"}`
Why not use jq? It's pretty good at managing string interpolation, and it lints your structure.
$ echo '{}' >> foo.json
$ declare myvar="assigned-var"
$ jq --arg ql "$myvar" '.quicklock=$ql' foo.json
The text that comes out the other end of that call to jq can then be redirected into a file or whatever you want to do. It would look something like this:
{"quicklock": "assigned-var"}
You can do this with printf:
local json="$(printf '{"quicklock":"%s"}' "$my_var")"
(The nested double quotes may look odd, but POSIX shell command substitution allows nesting one level of quotes.)
A note (thanks to Charles Duffy's comment on the question): I'm assuming $my_var is not controlled by user input. If it is, you'll need to be careful to ensure it is legal for a JSON string. I highly recommend barring non-ASCII characters, double quotes, and backslashes. If you have jq available, you can use it as Charles noted in the comments to ensure you have well-formed output.
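For instance, a minimal sketch of that jq-based approach, which builds the JSON entirely inside jq so all escaping is handled for you:
json=$(jq -n --arg ql "$my_var" '{quicklock: $ql}')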
You can define your own helper function to work around the missing bash syntax:
function begin() { eval echo $(sed "${BASH_LINENO[0]}"'!d;s/.*begin \(.*\) end.*/\1/;s/"/\\\"/g' "${BASH_SOURCE[0]}"); }
Then you can use it as follows.
my_var="foobar"
json=$(begin { "quicklock" : "${my_var}" } end)
echo "$json"
This fragment displays the desired output:
{ "quicklock" : "foobar" }
This is just a proof of concept. You can define your syntax in any way you want (such as ending the input with a custom EOF string, or correctly escaping invalid characters). For example, since Bash allows function identifiers built from characters other than alphanumerics, it is possible to define a syntax such as:
json=$(/ { "quicklock" : "${my_var}" } /)
Moreover, if you relax the first criterion (escape characters), ordinary assignment will nicely solve this problem:
json="{ \"quicklock\" : \"${my_var}\" }"
How about just using the shell's natural concatenation of strings? If you concatenate the variables rather than interpolating them inside a double-quoted string, you can avoid escapes and get everything on one line:
my_var1="foobar"
my_var2="quux"
json='{"quicklock":"'${my_var1}'","slowlock":"'$my_var2'"}'
That said, this is a pretty crude scheme, and as others have pointed out you'll have problems if the variables, say, contain quote characters.
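To make that caveat concrete, here is a hypothetical value that silently breaks the scheme (the assignment itself is safe from word splitting, but the embedded quotes are not escaped):
my_var1='he said "hi"'
json='{"quicklock":"'${my_var1}'"}'
echo "$json"   # {"quicklock":"he said "hi""} is not valid JSON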
Since "no escape chars" is a strong requirement, here is a here-doc based solution:
#!/bin/bash
my_var='foobar'
read -r -d '' json << EOF
{
"quicklock": "$my_var"
}
EOF
echo "$json"
It will give you the same output as the earlier solutions.
Just be careful: if you put the first EOF inside double quotes, like this:
read -r -d '' json << "EOF"
$my_var would not be treated as a variable but as plain text, so you would get this output:
{
"quicklock": "$my_var"
}
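One more gotcha worth noting: read -r -d '' returns a non-zero status when it reaches the end of the here-doc, so under set -e you may want to guard the call, e.g.:
read -r -d '' json << EOF || true
{
"quicklock": "$my_var"
}
EOF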
These work as advertised:
grep -ir 'hello world' .
grep -ir hello\ world .
These don't:
argumentString1="-ir 'hello world'"
argumentString2="-ir hello\\ world"
grep $argumentString1 .
grep $argumentString2 .
Despite 'hello world' being enclosed by quotes in the second example, grep interprets 'hello (and hello\) as one argument and world' (and world) as another, which means that, in this case, 'hello will be the search pattern and world' will be the search path.
Again, this only happens when the arguments are expanded from the argumentString variables. grep properly interprets 'hello world' (and hello\ world) as a single argument in the first example.
Can anyone explain why this is? Is there a proper way to expand a string variable that will preserve the syntax of each character such that it is correctly interpreted by shell commands?
Why
When the string is expanded, it is split into words, but it is not re-evaluated to find special characters such as quotes or dollar signs. This is the way the shell has 'always' behaved, since the Bourne shell back in 1978 or thereabouts.
Fix
In bash, use an array to hold the arguments:
argumentArray=(-ir 'hello world')
grep "${argumentArray[#]}" .
Or, if brave/foolhardy, use eval:
argumentString="-ir 'hello world'"
eval "grep $argumentString ."
On the other hand, discretion is often the better part of valour, and working with eval is a place where discretion is better than bravery. If you are not completely in control of the string that is eval'd (if there's any user input in the command string that has not been rigorously validated), then you are opening yourself to potentially serious problems.
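To make the danger concrete, consider a hypothetical hostile string; eval happily executes the second command:
argumentString="-ir 'hello world' ; rm -rf ~"
eval "grep $argumentString ."   # the rm runs too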
Note that the sequence of expansions for Bash is described in Shell Expansions in the GNU Bash manual. Note in particular sections 3.5.3 Shell Parameter Expansion, 3.5.7 Word Splitting, and 3.5.9 Quote Removal.
When you put quote characters into variables, they just become plain literals (see http://mywiki.wooledge.org/BashFAQ/050; thanks @tripleee for pointing out this link).
Instead, try using an array to pass your arguments:
argumentString=(-ir 'hello world')
grep "${argumentString[#]}" .
In looking at this and related questions, I'm surprised that no one brought up using an explicit subshell. For bash, and other modern shells, you can execute a command line explicitly. In bash, it requires the -c option.
argumentString="-ir 'hello world'"
bash -c "grep $argumentString ."
This works exactly as the original questioner desired. There are two restrictions to this technique:
You can only use single quotes within the command or argument strings.
Only exported environment variables will be available to the command (see the sketch below).
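A quick sketch of the second restriction (the variable names here are just for illustration):
foo="not exported"
export bar="exported"
bash -c 'echo "foo=$foo bar=$bar"'   # prints: foo= bar=exported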
This technique also handles redirection, piping, and other shellisms. You can use bash built-in commands as well as anything else that works at the command line, because you are essentially asking a subshell bash to interpret the string directly as a command line. Here's a more complex example, a somewhat gratuitously complex ls variant.
cmd="prefix=`pwd` && ls | xargs -n 1 echo \'In $prefix:\'"
bash -c "$cmd"
I have built command processors both this way and with parameter arrays. Generally, this way is much easier to write and debug, and it's trivial to echo the command you are executing. OTOH, param arrays work nicely when you really do have abstract arrays of parameters, as opposed to just wanting a simple command variant.
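As a side note not from the original answers: you can sidestep the quoting problem entirely by passing the arguments as positional parameters to the subshell, so nothing gets re-parsed:
bash -c 'grep "$@" .' _ -ir 'hello world'
(The _ fills in $0 for the subshell; the remaining words arrive in "$@" untouched.)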
I have strings in the following pattern: <SOMETHING_1>{<JSON>}<SOMETHING_2>
I want to keep the <JSON> and remove the <SOMETHING_X> blocks. I'm trying to do it with substring removal, but instead of getting
{x:1,action:PLAYING,name:John,description:Some description rSxv9HiATMuQ4wgoV2CGxw}
I keep getting
{x:1,action:PLAYING,name:John,description:Some}
because the whitespace in the description field cuts off the substring.
Any ideas on what to change?
CODE:
string="000{x:1,action:PLAYING,name:John,description:Some description rSxv9HiATMuQ4wgoV2CGxw}401"
string=$1
string=$(echo "${string#*{}")
string=$(echo "${string%}*}")
string={$string}
echo $string
The original code works perfectly, if we accept a direct assignment of the string -- though the following is a bit more explicit:
string="000{x:1,action:PLAYING,name:John,description:Some description rSxv9HiATMuQ4wgoV2CGxw}401"
string='{'"${string#*"{"}" # trim content up to and including the first {, and replace it
string="${string%'}'*}"'}' # trim the last } and all after, and replace it again
printf '%s\n' "$string"
...properly emits:
{x:1,action:PLAYING,name:John,description:Some description rSxv9HiATMuQ4wgoV2CGxw}
I'm guessing that the string is being passed on a command line unquoted, and is thus being split into multiple arguments. If you quote your command-line arguments to prevent string-splitting by the calling shell (./yourscript "$string" instead of ./yourscript $string), this issue will be avoided.
With sed:
string="000{x:1,action:PLAYING,name:John,description:Some description rSxv9HiATMuQ4wgoV2CGxw}401"
sed 's/.*\({.*}\).*/\1/g' <<<"$string"
Output:
{x:1,action:PLAYING,name:John,description:Some description rSxv9HiATMuQ4wgoV2CGxw}
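If your grep supports -o (GNU and BSD grep do, though it is not strictly POSIX), the same extraction is even shorter, since the greedy pattern spans from the first { to the last }:
grep -o '{.*}' <<<"$string"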
Here you go…
string="000{x:1,action:PLAYING,name:John,description:Some description rSxv9HiATMuQ4wgoV2CGxw}401"
echo "original: ${string}"
string="${string#*\{}"
string="${string%\}*}"
echo "final: {${string}}"
By the way, JSON keys should be surrounded with double quotes.
I wrote a small Perl script with regular expressions to get HTML components of a website.
I know it's not a good way of doing this kind of job, but I was trying to test out my regex skills.
When run with either one of the two regex patterns in the while loop, it runs perfectly and displays the correct output. But when I try to check both patterns in the while loop, the second pattern matches every time and the loop runs infinitely.
My script:
#!/usr/bin/perl -w
use strict;
while (<STDIN>) {
while ( (m/<span class=\"itempp\">([^<]+)+?<\/span>/g) ||
(m/<font size=\"-1\">([^<]+)+?<\/font>/g) ) {
print "$1\n";
}
}
I am testing the above script with a sample input:
Link title
<span class="itempp">$150</span>
<font size="-1"> (Location)</font>
Desired output:
$150
(Location)
Thank you! Any help would be highly appreciated!
Whenever a global regex fails to match, it resets the position where the next global regex will start searching. So when the first of your two patterns fails, it forces the second to look from the beginning of the string again.
This behaviour can be disabled by adding the /c modifier, which leaves the position unchanged if a regex fails to match.
In addition, you can improve your patterns by removing the escape characters (" doesn't need escaping and / needn't be escaped if you choose a different delimiter) and the superfluous +? after the captures.
Also, use warnings is much better than the -w flag on the shebang line.
Here is a working version of your code.
use strict;
use warnings;
while (<STDIN>) {
while( m|<span class="itempp">([^<]+)</span>|gc
or m|<font size="-1">([^<]+)</font>|gc ) {
print "$1\n";
}
}
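An alternative is to merge the two patterns into a single alternation, so each line is tested only once and no match position needs to be carried between patterns: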
while (<DATA>) {
if (m{<(?:span class="itempp"|font size="-1")>\s*([^<]+)}i) {
print "$1\n";
}
}
__DATA__
Link title
<span class="itempp">$150</span>
<font size="-1"> (Location)</font>
You did not change $_ after or during matching, so it will always match and run into an infinite loop.
To fix it, you can add $_ = $'; after the print, to run the match again on the rest of the string ($' holds the part of the string after the match).
Hi, I'm creating a shell script, and an example of the code looks like this:
#!/bin/bash
test_func() {
echo "It works!"
}
function_name="test_func"
I want to somehow be able to call test_func() using the variable "function_name"
I know that's possible in PHP using call_user_func($function_name) or by saying $function_name().
Is this also possible in shell scripting?
Huge appreciation for the help! :)
You want the bash built-in eval. From man bash:
eval [arg ...]
The args are read and concatenated together into a single command. This command is then read and executed by the shell, and its exit status is returned as the value of eval. If there are no args, or only null arguments, eval returns 0.
You can also accomplish it with simple variable substitution, as in
#!/bin/bash
test_func() {
echo "It works!"
}
function_name="test_func"
$function_name
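And here is the eval variant of the same script: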
#!/bin/bash
test_func() {
echo "It works!"
}
function_name="test_func"
eval ${function_name}
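If the variable could ever hold arbitrary text, it is safer to confirm that it actually names a function before calling it; a small sketch using bash's declare -F:
if declare -F "$function_name" > /dev/null; then
"$function_name"
else
echo "No such function: $function_name" >&2
fi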