How to import shell functions from one file into another?

I have the shell script:
#!/bin/bash
export LD=$(lsb_release -sd | sed 's/"//g')
export ARCH=$(uname -m)
export VER=$(lsb_release -sr)
# Load the test function
/bin/bash -c "lib/test.sh"
echo $VER
DISTROS=('Arch'
'CentOS'
'Debian'
'Fedora'
'Gentoo')
for I in "${DISTROS[@]}"
do
i=$(echo $I | tr '[:upper:]' '[:lower:]') # convert distro string to lowercase
if [[ $LD == "$I"* ]]; then
./$ARCH/${i}.sh
fi
done
As you can see it should run a shell script, depending on which architecture and OS it is run on. It should first run the script lib/test.sh before it runs this architecture and OS-specific script. This is lib/test.sh:
#!/bin/bash
function comex {
which $1 >/dev/null 2>&1
}
and when I run it on x86_64 Arch Linux with this x86_64/arch.sh script:
#!/bin/bash
if comex atom; then
printf "Atom is already installed!"
elif comex git; then
printf "Git is installed!"
fi
it returned the output:
rolling
./x86_64/arch.sh: line 3: comex: command not found
./x86_64/arch.sh: line 5: comex: command not found
so clearly the comex shell function is not correctly loaded by the time the x86_64/arch.sh script is run. Hence I am confused and wondering what I need to do in order to correctly define the comex function such that it is correctly loaded in this architecture- and OS-dependent final script.
I have already tried using . "lib/test.sh" instead of /bin/bash -c "lib/test.sh" and I received the exact same error. I have also tried adding . "lib/test.sh" to the loop, just before the ./$ARCH/${i}.sh line. This too failed, returning the same error.

Brief answer: you need to import your functions using . or source instead of bash -c:
# Load the test function
source "lib/test.sh"
Longer answer: when you call a script with bash -c, a child process is created. That child process sees all exported variables (including exported functions) from the parent process, but not vice versa: nothing the child defines propagates back, so your script will never see the comex function. Instead you need to include the script's code directly in the current script, and you do that with the . or source commands.
Part 2. After you have sourced lib/test.sh, your main script can use the comex function. But the arch scripts won't see this function, because it is not exported to them. You need to export -f comex:
#!/bin/bash
function comex {
which $1 >/dev/null 2>&1
}
export -f comex
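A quick way to convince yourself that export -f is what carries the function across the process boundary (hypothetical /tmp paths, for illustration only):

```shell
#!/bin/bash
comex() { command -v "$1" >/dev/null 2>&1; }

# A child script that expects comex to be inherited
cat > /tmp/child.sh <<'EOF'
#!/bin/bash
if comex ls; then echo "ls found"; fi
EOF
chmod +x /tmp/child.sh

/tmp/child.sh 2>/dev/null   # comex not exported yet: "command not found"
export -f comex
/tmp/child.sh               # now prints "ls found"
```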

Related

Bash script to run if HTML has changed

I wrote a .sh script that first downloads the source code of a page and then executes an Rscript only if the downloaded source code differs from the previous download. The page is updated once a day and the URL ends with the current date. This all runs on a server, and a cron job would run the .sh every 15 min. So I do this:
#!/bin/bash
lwp-download "https://geodes.santepubliquefrance.fr/GC_indic.php?lang=fr&prodhash=de1751e6&indic=type_hospit&dataset=covid_hosp_type&view=map2&filters=sexe=0,jour="$(date '+%Y-%M-%d') download.html
md5 page.html > last_md5
diff previous_md5 last_md5
if[ "$?" = "!" ] ; then
Rscript myscript.R
fi
mv last_md5 previous_md5
rm page.html
First problem, it carries on running the R script even though download.html is downloaded and unchanged.
Plus, I hit an error after the R script has run "Syntax error: "fi" unexpected"
Some issues:
You need to put a space between if and [ - or you could just do if command; then.
You calculate the MD5 sum on the wrong file.
You remove the wrong file.
Since you're probably not interested in seeing the actual diff in the MD5 sums, I suggest that you use cmp -s instead of diff.
Also note that I quoted the $(date ...) command too. It's not necessary in this particular case, but it makes linters happy.
#!/bin/bash
lwp-download "https://geodes.santepubliquefrance.fr/GC_indic.php?lang=fr&prodhash=de1751e6&indic=type_hospit&dataset=covid_hosp_type&view=map2&filters=sexe=0,jour=$(date '+%Y-%m-%d')" download.html
md5 download.html > last_md5
if ! cmp -s previous_md5 last_md5; then
Rscript myscript.R
mv last_md5 previous_md5
else
rm last_md5
fi
rm download.html
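For reference, cmp -s prints nothing and reports a difference purely through its exit status, which is exactly what the if ! test consumes (throwaway files, for illustration):

```shell
#!/bin/bash
printf 'abc\n' > /tmp/previous_md5
printf 'abc\n' > /tmp/last_md5
# Identical files: cmp -s exits 0, so the negated test is false
if ! cmp -s /tmp/previous_md5 /tmp/last_md5; then echo changed; else echo unchanged; fi

printf 'xyz\n' > /tmp/last_md5
# Differing files: cmp -s exits 1, so the negated test is true
if ! cmp -s /tmp/previous_md5 /tmp/last_md5; then echo changed; else echo unchanged; fi
```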
You should leave a space between if and [.
#!/bin/bash
lwp-download "https://geodes.santepubliquefrance.fr/GC_indic.php?lang=fr&prodhash=de1751e6&indic=type_hospit&dataset=covid_hosp_type&view=map2&filters=sexe=0,jour="$(date '+%Y-%M-%d') download.html
md5 page.html > last_md5
diff previous_md5 last_md5
if [[ "$?" -ne 0 ]] ; then
Rscript myscript.R
fi
mv last_md5 previous_md5
rm page.html
Also, if you don't spot the error yourself, I'd recommend running the script through an online linter such as ShellCheck to guide you to what's wrong:
https://www.shellcheck.net/

Mysql cli not returning data in bash script run by crontab

I have a bash script that is executed via a cron job
#!/bin/bash
# abort on errors
set -e
ABS_DIR=/path/
# extract the creds for the mysql db
DB_USER="USERNAME"
DB_PASS="PASSWORD"
function extract_data() {
file=$2
sql_query=`cat $ABS_DIR/$1`
data=`mysql -u $DB_USER --password="$DB_PASS" -D "database" -e "$sql_query" | tail -n +2`
echo -e "Data:"
echo -e "$data"
}
extract_data "sql_query.sql" "log.csv"
When running it manually with bash extract.sh the mysql cmd fetches the data correctly and I see the echo -e "$data" on the console.
When running the script via a cron job
* 12 * * * /.../extract.sh > /.../cron_log.txt
then I get an empty line saved to the cron_log.txt file!?
This is a common problem: a script behaves differently when run from the user shell and when run from crontab. The cause is typically a difference in environment variables between the user shell and the crontab shell; by default, they are not the same.
To begin debugging this issue, you could direct stderr as well as stdout from crontab, hopefully to capture an error message:
extract.sh &> /.../cron_log.txt
(notice the &)
Also: you have three dots (/.../); that is likely a typo, and could also be the cause.
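One caveat with &>: it is a bash-ism, and crontab lines are usually executed by /bin/sh, so the portable spelling >file 2>&1 is safer there. Both capture the two streams; a quick sketch (throwaway path):

```shell
#!/bin/bash
# A stand-in for the real script: writes to both streams
noisy() { echo "to stdout"; echo "to stderr" >&2; }

# Portable form, works in sh and bash, usable directly in a crontab line:
noisy > /tmp/cron_log.txt 2>&1
wc -l < /tmp/cron_log.txt    # both lines were captured
```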

Shell command - How to run?

When I put a command on terminal, it works just fine, but when I put the same command in a .sh script and then run it, it doesn't give any output. What might be the reason for this?
The command:
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
This is expected, since export sets the environment variable only for that particular shell (and its children).
Docs -
export command is used to export a variable or function to the
environment of all the child processes running in the current shell.
export -f functionname # exports a function in the current shell. It
exports a variable or function with a value.
So when you create a sh script, it runs the specified commands in a separate shell, which terminates once the script exits.
It works with the sh script too -
data.sh
#!/bin/bash
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
echo $HELLO1
echo $SAMPLEKEY
done < <(jq -r '.data | to_entries[] | [(.key|ascii_upcase), .value] | @tsv' data.json)
Output -
$ ./data.sh
"world1"
"world1"
"samplevalue"
Which suggests that your variables are getting exported but for that particular shell env.
If you want to make them persistent, export them from ~/.bashrc or ~/.profile.
Once you put them in ~/.bashrc or ~/.profile, you will see output like the one below.
I used ~/.bash_profile on my macOS machine:
Last login: Thu Jan 25 15:15:42 on ttys006
"world1"
"world1"
"samplevalue"
viveky4d4v@020:~$ echo $SAMPLEKEY
"samplevalue"
viveky4d4v@020:~$ echo $HELLO1
"world1"
viveky4d4v@020:~$
Which clarifies that your env variables will get exported whenever you open a new shell, the logic for this lies in .bashrc (https://unix.stackexchange.com/questions/129143/what-is-the-purpose-of-bashrc-and-how-does-it-work)
Put your script as-is at the end of ~/.bashrc:
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
echo $HELLO1
echo $SAMPLEKEY
done < <(jq -r '.data | to_entries[] | [(.key|ascii_upcase), .value] | @tsv' data.json)
You need to make sure that data.json stays in user's home directory.
Basically: a child process can't change the environment of its parent process.
You need to source the script instead of executing it:
source your_script.sh
source runs the script in the current shell which makes it possible to modify the environment.
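The same contrast in miniature (throwaway /tmp path, illustrative only):

```shell
#!/bin/bash
cat > /tmp/setvar.sh <<'EOF'
export GREETING="hello"
EOF

bash /tmp/setvar.sh                 # runs in a child; GREETING vanishes with it
echo "after executing: '${GREETING-}'"

. /tmp/setvar.sh                    # runs in the current shell; GREETING sticks
echo "after sourcing: '${GREETING-}'"
```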
Alternatively you can create a function in your shell startup files (e.g. ~/.bashrc):
my_function() {
IFS=$'\t'; while read -r k v; do
export "$k=\"$v\""
done < <(jq -r '.data | to_entries[] | [(.key|ascii_upcase), .value] | #tsv' /path/to/data.json)
}
After you've started a new shell you can run
my_function

How to pass arguments from cmd to tcl script of ModelSim

I run ModelSim from the cmd via a Python program.
I use the following code, which calls a Tcl script that runs ModelSim:
os.system("vsim -c -do top_tb_simulate_reg.tcl " )
The tcl script contain the following:
vsim -voptargs="+acc" +UVM_TESTNAME=test_name +UVM_MAX_QUIT_COUNT=1 +UVM_VERBOSITY=UVM_LOW \
-t 1ps -L unisims_verm -L generic_baseblocks_v2_1_0 -L axi_infrastructure_v1_1_0 \
-L dds_compiler_v6_0_12 -lib xil_defaultlib xil_defaultlib.girobo2_tb_top \
xil_defaultlib.glbl
I want the value of +UVM_TESTNAME to be an argument that I pass from the cmd when I execute:
os.system("vsim -c -do top_tb_simulate_reg.tcl " )
How can I do it?
I tried the following with no success:
Python script:
os.system("vsim -c -do top_tb_simulate_reg.tcl axi_rd_only_test" )
Simulation file (tcl script)
vsim -voptargs="+acc" +UVM_TESTNAME=$argv +UVM_MAX_QUIT_COUNT=1 +UVM_VERBOSITY=UVM_LOW \
-t 1ps -L unisims_verm -L generic_baseblocks_v2_1_0 -L axi_infrastructure_v1_1_0 \
-L dds_compiler_v6_0_12 -lib xil_defaultlib xil_defaultlib.girobo2_tb_top \
xil_defaultlib.glbl
I got the following error:
# ** Error: (vsim-3170) Could not find 'C:/raft/raftortwo/girobo2/ver/sim/work.axi_rd_only_test'.
The problem is that the vsim binary does its own processing of the arguments, and that is interfering. While you could probably find a way around this by reading the vsim documentation, the simplest workaround is to pass values via environment variables. They're inherited by a process from its parent process, and are fine for passing most things. (The exception is security tokens, which should always be passed in files with correctly-set permissions, rather than in either environment variables or command-line arguments.)
In your python code:
# Store the value in the *inheritable* environment
os.environ["MY_TEST_CASE"] = "axi_rd_only_test"
# Do the call; the environment gets passed over behind the scenes
os.system("vsim -c -do top_tb_simulate_reg.tcl " )
In your tcl code:
# Read out of the inherited environment
set name $env(MY_TEST_CASE)
# Use it! (Could do this as one line, but that's hard to read)
vsim -voptargs="+acc" +UVM_TESTNAME=$name +UVM_MAX_QUIT_COUNT=1 +UVM_VERBOSITY=UVM_LOW \
-t 1ps -L unisims_verm -L generic_baseblocks_v2_1_0 -L axi_infrastructure_v1_1_0 \
-L dds_compiler_v6_0_12 -lib xil_defaultlib xil_defaultlib.girobo2_tb_top \
xil_defaultlib.glbl
Late to the party, but I found a good workaround for this obstacle. The do command within ModelSim's Tcl instance does accept parameters. See the command reference.
vsim -c -do filename.tcl can't take parameters, but you can use vsim -c -do "do filename.tcl params".
In your case this translates to os.system('vsim -c -do "do top_tb_simulate_reg.tcl axi_rd_only_test"'). Your .tcl script will find the parameter passed through the variable $1.
I hope this helps!

How to Pass Parameters from QSub to Bash Script?

I'm having an issue passing variables to a Bash script using QSub.
Assume I have a Bash script named example. The format of example is the following:
#!/bin/bash
# (assume other variables have been set)
echo $1 $2 $3 $4
So, executing "bash example.sh this is a test" on Terminal (I am using Ubuntu 12.04.3 LTS, if that helps) produces the output "this is a test".
However, when I enter "qsub -v this,is,a,test example.sh", I get no output. I checked the output file that QSub produces, but the line "this is a test" is nowhere to be found.
Any help would be appreciated.
Thank you.
Using PBSPro or SGE, arguments can simply be placed after the script name as may seem intuitive.
qsub example.sh hello world
In Torque, command line arguments can be submitted using the -F option. Your example.sh will look something like this:
#!/bin/bash
echo "$1 $2"
and your command like so:
qsub -F "hello world" example.sh
Alternatively, environment variables can be set using -v with a comma-separated list of variables.
#!/bin/bash
echo "$FOO $BAR"
and your command like so:
qsub -v FOO="hello",BAR="world" example.sh
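Outside a scheduler you can approximate what -v arranges: the variables simply land in the job's environment. A rough local analogy (throwaway /tmp path, not a real qsub run):

```shell
#!/bin/bash
cat > /tmp/example.sh <<'EOF'
#!/bin/bash
echo "$FOO $BAR"
EOF
chmod +x /tmp/example.sh

# Roughly what `qsub -v FOO="hello",BAR="world" example.sh` provides:
FOO="hello" BAR="world" /tmp/example.sh   # prints "hello world"
```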
(This may be better phrased as a comment on @William Hay's answer, but I don't have the reputation to do so.)
Not sure which batch scheduler you are using, but on PBSPro or SGE, submitting with qsub example.sh this is a test should do what you want.
The Torque batch scheduler doesn't (AFAIK) allow passing command line arguments to the script this way. You would need to create a script looking something like this.
#!/bin/bash
echo $FOO
Then submit it with a command like:
qsub -v FOO="This is a test" example.sh