I am using OpenShift with Tomcat 7. I am trying to scp a directory into webapps. I searched online and the recursion option for scp is -r. However, when I type rhc help scp, the -r option is listed as
-r, --remote-path file_path Remote filesystem path
So is there a recursion option for scp in OpenShift? How can I upload a directory to OpenShift?
Any help would be much appreciated. Thank you.
scp localfile gearnumber@myapp.rhcloud.com:app-root/data
scp -r localdir gearnumber@myapp.rhcloud.com:app-root/data
The only directory where your user has write permission is app-root/data; you can reference it from your code with the environment variable OPENSHIFT_DATA_DIR.
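For example, a minimal round trip might look like this (the gear id, app name, and local directory below are placeholders):
# upload a whole directory into the writable data dir
scp -r mylocaldir 1234abcd@myapp-mydomain.rhcloud.com:app-root/data/
# then check it from an SSH session
rhc ssh myapp
ls -l $OPENSHIFT_DATA_DIR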
I am trying to deploy an app using the following:
az webapp deployment source config --branch master --manual-integration --name myapp --repo-url https://$GITUSERNAME:$GITUSERPASSWORD@dev.azure.com/<Company>/Project/_git/<repo> --resource-group my-windows-resources --repository-type git
The git repo contains two .sln solution files, which causes an error when attempting to deploy. Is there any way I can specify which solution file to use? I can't seem to find a way in the docs, but wondered if there might be a workaround.
I found a solution where you create a .deployment file in the root of the solution with these contents:
[config]
project = <PATHTOPROJECT>
command = deploy.cmd
Then a deploy.cmd containing:
nuget.exe restore "<PATHTOSOLUTION>" -MSBuildPath "%MSBUILD_15_DIR%"
The -MSBuildPath argument may be optional for you.
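As a concrete sketch, assuming the project you want is at src/MyWebApp/MyWebApp.csproj and the solution is MySolution.sln (both names are placeholders for your own paths), the two files could look like:
.deployment:
[config]
project = src/MyWebApp/MyWebApp.csproj
command = deploy.cmd
deploy.cmd:
nuget.exe restore "MySolution.sln" -MSBuildPath "%MSBUILD_15_DIR%"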
With scp I can add the -r flag to download directories to my local machine via ssh.
When using:
gcloud compute scp -r
it says that '-r' is not an available option.
Without -r I get an error saying that my source path is a directory. (Implying I can only download single files.)
Is there an equivalent to -r flag for gcloud compute scp command?
Found it!
GCE offers an equivalent and it is --recurse.
My final command looks like this:
gcloud compute scp --recurse username@instance_name:./* "local_dir"
For some reason I also needed the * behind the source folder to avoid some security issue.
Your gcloud tooling already has the right credentials, so simply do
gcloud compute scp --recurse [the_instance_name]:[the_path_on_gcp_instance_folder] [the_path_on_your_machine]
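For example (instance name, zone, and paths here are placeholders):
gcloud compute scp --recurse my-instance:/home/me/app-logs ./app-logs --zone us-central1-a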
I am attempting to restrict access to a few backend folders in an elastic beanstalk environment, but I cannot figure out how to set chmod permissions in an existing environment.
I am aware that there may be a way to do this through the .ebextensions file, but I have been unable to find a resource outlining this process.
How do I restrict folder access to folders and files in my elastic beanstalk environment?
Thank you!
There is a setting you can use in the .ebextensions file called "files". I haven't tested this with folders, though, and I am not sure if you can change permissions on already existing files and folders with it.
https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/customize-containers-ec2.html#linux-files
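For reference, the files key from that page looks roughly like this (the path, mode, owner, and content below are placeholders); as far as I can tell it is mainly meant for creating files with a given mode rather than changing existing ones:
files:
  "/etc/myapp/secrets.conf":
    mode: "000600"
    owner: root
    group: root
    content: |
      placeholder=value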
You could just add a command that does it though.
commands:
  01_set_file_permissions:
    command: "chmod 600 /path/to/file"
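And since the question is about folders, a recursive variant in the same style should do it (the path is a placeholder):
commands:
  02_set_folder_permissions:
    command: "chmod -R 700 /path/to/folder"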
How do I download, back up, or save a copy of a file from a remote OpenShift folder to a local folder using the rhc client tool? Or is there any way other than the rhc client tool to make a backup of it to my local system?
Also, is there a way to copy an entire folder from remote (OpenShift) to local?
First, tar and gzip your folder on the server within an SSH session; the syntax is:
rhc ssh <app_name>
tar czf <name_of_new_file>.tar.gz <name_of_directory>
Second, after you have disconnected from the OpenShift server (with CTRL-D), download this file to your local system:
rhc scp <app_name> download <local_destination> <absolute_path_to_remote_file>
Then on your local machine you can extract the file and perform your actions.
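Put together, the round trip might look like this (app name, source directory, and archive name are placeholders; the archive is written into app-root/data since that is the writable directory):
rhc ssh myapp
tar czf app-root/data/backup.tar.gz app-root/repo
exit
rhc scp myapp download . app-root/data/backup.tar.gz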
Use WinSCP (if on Windows) to SSH into your OpenShift app. Navigate to your folder. Drag and drop folders or files to your local machine.
FileZilla - using FileZilla and SFTP with OpenShift
Now you can do it like this:
copy a pod directory to a local directory:
oc rsync <pod-name>:/opt/app-root/src /c/source
https://docs.okd.io/3.11/dev_guide/copy_files_to_container.html
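The same command should also work in the other direction if you need to push local files back into the pod (pod name and paths are placeholders):
oc rsync /c/source <pod-name>:/opt/app-root/src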
Steps To Replicate
On Windows 8.
In shell (with SSH connection active):
rhc snapshot save [appname]
Error
No system SSH available. Please use the --ssh option to specify the path to your SSH executable, or install SSH.
Suggested Solution
From this post:
Usage: rhc snapshot-save <application> [--filepath FILE] [--ssh path_to_ssh_executable]
Pass '--help' to see the full list of options
Question
The path to the keys on my PC is:
C:\Users\[name]\.ssh
How do I define this in the rhc snapshot command?
Solution
rhc snapshot save [appname] --filepath FILE --ssh "C:\Users\[name]\.ssh"
This will show the message:
Pulling down a snapshot of application '[appname]' to FILE ...
... then after a while
Pulling down a snapshot of application '[appname]' to FILE ... DONE
Update
That saved the backup in a file called "FILE" without an extension, so I'm guessing in the future I should define the filename as something like "my_app_backup.tar.gz", i.e.:
rhc snapshot save [appname] --filepath "my_app_backup.tar.gz" --ssh "C:\Users\[name]\.ssh"
It will save in the repo directory, so make sure you move it out of this directory before you git add, commit, push, etc.; otherwise you will upload your backup too.
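For completeness, restoring from that snapshot later should be the mirror image of the save (app name, file name, and key path are placeholders, and --ssh may again be needed on Windows):
rhc snapshot restore [appname] --filepath "my_app_backup.tar.gz" --ssh "C:\Users\[name]\.ssh"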