We currently run a migrations script as a container_command, but it does not appear to run when an environment variable is changed with eb setenv. Is there an AL1 way to run a script after the environment variables have been set?
You could run the following from the terminal, where eb is the path to the elasticbeanstalk script:
cat .env | while read -r line ; do /path/to/eb setenv "$line" ; done
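If none of the values in .env contain spaces or quotes, you could also pass the whole file in a single call, so the environment is only updated once. A sketch of that variant:
xargs /path/to/eb setenv < .env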
I'm failing to escape the quote (") character in an AZ CLI command.
I need to remotely execute the command: C:\"Program Files"\Outlook\outlook.exe
So, I use the RunPowerShellScript command of the AZ CLI with Start-Process.
call az vm run-command invoke --command-id RunPowerShellScript --name xxx -g yyy --scripts "Start-Process C:\"Program Files"\Outlook\outlook.exe" --output yaml
I tried many options, like:
C:\"Program Files"\
C:\""Program Files""\
C:\"Program Files\"\
C:\^"Program Files^"\
C:`"Program Files`"\
Nothing works; I always get an error.
Any ideas, please?
In fact, you do not need to escape the quotes; you can directly use the FilePath. So you could use "Start-Process 'C:\Program Files\Outlook\outlook.exe'". For example, I have a Putty.exe at the path C:\Program Files\PuTTY on my remote VM.
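Putting that together with the placeholders from the question (xxx / yyy), the full invocation would look roughly like this:
az vm run-command invoke --command-id RunPowerShellScript --name xxx -g yyy --scripts "Start-Process 'C:\Program Files\Outlook\outlook.exe'" --output yaml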
The problem is that the az vm run-command invoke command executes scripts remotely on the VM: the contents of the --scripts parameter run on the remote VM. Make sure the scripts use a valid path on the remote VM instead of the local VM; if the file referenced in the scripts does not exist on the remote VM, you will probably face this message.
Read more details: Run PowerShell scripts in your Windows VM with Run Command.
I'm trying to delete a file stored in a persistent volume through the CLI. I know the path, but I'm not sure how to delete the file from the CLI.
The reason I want to do it through the CLI is that I'm automating a workflow that triggers a PowerShell script, which runs the OpenShift CLI to delete a file in the volume and then scale down.
How about using the Executing Remote Commands feature to remove the file?
For example,
# oc exec <pod name> -- rm -f /path/to/file.txt
I hope it helps you.
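Since the workflow also needs to scale down afterwards, the two CLI calls could be sketched together like this (the Deployment name my-app is a placeholder):
# remove the file from the volume via the running pod
oc exec <pod name> -- rm -f /path/to/file.txt
# then scale the workload down
oc scale deployment/my-app --replicas=0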
I need to do the following:
Change environment variables according to the published env.
Set up cron jobs according to the env.
I would like to run just one command, such as "eb deploy dev" or something similar.
Use setenv
You can set environment variables with setenv. These will then be remembered for that environment.
More details: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/eb3-setenv.html
Example
For example, suppose you have created an EB environment called 'staging' and you want to set the variable DB to 'localhost', you can use:
eb setenv DB=localhost -e staging
Crons
Now that you have different environment variables, you can check them in a script, etc., to decide whether the cron should be set up.
Note that the crons may not actually have access to your environment variables, so you will need to set those again for the cron while setting it up.
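As a rough illustration, a deploy script could check the variable from the earlier example and only write the cron entry when it matches, re-declaring the variable on the cron line because of the caveat above. The hook and job paths here are hypothetical:
#!/bin/bash
# hypothetical deploy hook: install the nightly cron only when DB points at localhost,
# and pass DB explicitly because the cron job will not inherit the EB environment variables
if [ "$DB" = "localhost" ]; then
  echo "0 2 * * * root DB=$DB /usr/local/bin/nightly-job.sh" > /etc/cron.d/nightly-job
  chmod 644 /etc/cron.d/nightly-job
fi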
This is my solution to the problem. It took some time to set up, but now I can make all the changes with one command.
Make your own folder with all the files for all the environments.
In the .ebextensions folder, set up empty config files for eb.
npm runs a script named "deploy.js" with a flag for the specific env.
The script will do the following:
copy the requested env data to the empty files according to the env
git stash the changes of the .ebextensions folder (eb deploys using git)
eb use env
eb deploy
So now I can run npm run deploy:dev and everything runs.
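Roughly, the sequence that deploy.js automates could be sketched in shell like this (the env-configs folder name is a placeholder, and the git step is left as a comment because how the changes reach eb depends on your setup):
#!/bin/bash
# hypothetical wrapper: ./deploy.sh dev
ENV="$1"
# copy the requested env data into the empty .ebextensions config files
cp "env-configs/$ENV/"*.config .ebextensions/
# handle the .ebextensions changes in git here (the author stashes them; eb deploys using git)
eb use "$ENV"
eb deploy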
I wanted clarification on the possible scripts that can be added in the .s2i/bin directory in my project repo.
The docs say that when you add these files they will override the default files of the same name when the project is built. For example, if I place my own "assemble" file in the .s2i/bin directory, will the default assemble file also run, or will it be totally replaced by my script? What if I want some of the behavior of the default file? Do I have to copy the default "assemble" contents into my file so both will be executed?
You will need to call the original "assemble" script from your own, similar to this:
#!/bin/bash -e
# The assemble script builds the application artifacts from a source and
# places them into appropriate directories inside the image.
# Execute the default S2I script
source ${STI_SCRIPTS_PATH}/assemble
# You can write S2I scripts in any programming language, as long as the
# scripts are executable inside the builder image.
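If you also want steps of your own, you can place them before or after the source line; a minimal sketch (the echo lines are just placeholders):
#!/bin/bash -e
# placeholder for custom steps that should run before the default build
echo "---> running custom pre-assemble steps"
# run the builder image's default assemble logic
source ${STI_SCRIPTS_PATH}/assemble
# placeholder for custom steps that should run after the default build
echo "---> running custom post-assemble steps"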
Using OpenShift, I want to execute my own run script (run).
So, in the src of my application I added a file at ./s2i/run
that slightly changes the default run file:
https://github.com/sclorg/nginx-container/blob/master/1.20/s2i/bin/run
Here is my run file
#!/bin/bash
source /opt/app-root/etc/generate_container_user
set -e
source ${NGINX_CONTAINER_SCRIPTS_PATH}/common.sh
process_extending_files ${NGINX_APP_ROOT}/src/nginx-start ${NGINX_CONTAINER_SCRIPTS_PATH}/nginx-start
if [ ! -v NGINX_LOG_TO_VOLUME -a -v NGINX_LOG_PATH ]; then
/bin/ln -sf /dev/stdout ${NGINX_LOG_PATH}/access.log
/bin/ln -sf /dev/stderr ${NGINX_LOG_PATH}/error.log
fi
#nginx will start using the custom nginx.conf from configmap
exec nginx -c /opt/mycompany/mycustomnginx/nginx-conf/nginx.conf -g "daemon off;"
Then I changed the Dockerfile to execute my run script as follows.
Only one CMD instruction takes effect, and it dictates which script is executed when the Deployment's pod starts.
FROM registry.access.redhat.com/rhscl/nginx-120
# Add application sources to the directory that the assemble script expects them to be in
# and set permissions so that the container runs without root access
USER 0
COPY dist/my-portal /tmp/src
COPY --chmod=0755 s2i /tmp/
RUN ls -la /tmp
USER 1001
# Let the assemble script install the dependencies
RUN /usr/libexec/s2i/assemble
# Run script uses standard ways to run the application
#CMD /usr/libexec/s2i/run
# here we override the script that will be executed when the deployment pod starts
CMD /tmp/run
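From there the image is built like any other Dockerfile-based image, for example (the tag is a placeholder):
docker build -t my-portal-nginx .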
(Doctrine-Ubuntu) I run the $ doctrine-cli.php command and get doctrine-cli.php: command not found.
What's the exact command you're executing? It should be something like this:
php doctrine-cli.php COMMAND
Also, php should be available via the PATH system variable, but on Ubuntu (if you installed PHP with the package manager) that's done by default.
Do this in the command line if you just want to execute it:
chmod +x doctrine-cli.php
and optionally
cp doctrine-cli.php doctrine
this way you can do
./doctrine-cli.php [arguments]
and if you did the optional step you can do
./doctrine [arguments]
and if you don't want the ./, do this:
export PATH=$PATH:/full/path/to/doctrine/folder
so now you can execute the doctrine command from any folder
What this does is add the 'execute' permission to the file; if you have a proper shebang at the top of the file, everything should execute fine (I imagine it does). The last step optionally adds the doctrine folder to your PATH so you can execute it from anywhere.
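For reference, a typical shebang for a PHP CLI script (assuming php is on your PATH) is the following, placed as the very first line of doctrine-cli.php:
#!/usr/bin/env php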