I want to copy a file from my laptop to my Compute Engine instance. Can I copy the file to a folder which does not exist but is created during the copy?
Example:
I have a file index.php and I want to copy it to /var/www/test.
The folder test is not present.
When i run a command:
gcloud compute copy-files index.php user@instance-1:/var/www/test
It does not give any error. But when I ssh into the instance, test shows up under /var/www, but
cd /var/www/test
gives me:
-bash: cd: test: Not a directory
How can I create the directory and then copy the file?
This is an interesting set of circumstances leading to surprising results. To understand what's going on, it's useful to know that the gcloud compute copy-files command runs the scp command under the covers.
In this case, the scp command is interpreting /var/www/test as a destination path. Because /var/www exists, but /var/www/test does not, it's interpreting the test portion as the name of the file you want to save on the remote machine. So it's dutifully copying the contents of index.php into a file called /var/www/test on the remote machine.
To get the results you want, you should remove the file (with rm /var/www/test), and create a directory (with mkdir /var/www/test). If you were starting with a fresh machine, you could achieve the desired result like this:
gcloud compute ssh user@instance-1 --command='mkdir /var/www/test'
gcloud compute copy-files index.php user@instance-1:/var/www/test
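To double-check from your local machine, the same ssh wrapper can run a quick listing; a minimal sketch using the names above:

# confirm /var/www/test is now a directory and index.php is inside it
gcloud compute ssh user@instance-1 --command='ls -ld /var/www/test && ls /var/www/test'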
Related
With scp I can add the -r flag to download directories to my local machine via ssh.
When using:
gcloud compute scp -r
it says that '-r' is not an available option.
Without -r I get an error saying that my source path is a directory. (Implying I can only download single files.)
Is there an equivalent to -r flag for gcloud compute scp command?
Found it!
gcloud offers an equivalent flag, and it is --recurse.
My final command looks like this:
gcloud compute scp --recurse username@instance_name:./* "local_dir"
For some reason I also needed the * behind the source folder to avoid some security issue.
Your gcloud setup already has the right credentials, so just do
gcloud compute scp --recurse [the_instance_name]:[the_path_on_gcp_instance_folder] [the_path_on_your_machine]
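The --recurse flag works in the upload direction as well; a minimal sketch with placeholder names:

# recursively copy a local directory up to the instance's home directory
gcloud compute scp --recurse local_dir username@instance_name:~/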
I'm learning data analysis in Zeppelin; I'm a mechanical engineer, so this is outside my expertise.
I am trying to download two csv files using a file that contains the URLs, test2.txt. When I run it I get no output, and no error message either. I've included a link to a screenshot showing my code and the results.
When I go into the Ambari Sandbox I cannot find any files created. I'm assuming the directory the file is in is where the csv files will be downloaded to. I've tried using -P as well with no luck. I've checked man wget but it did not help.
So I have several questions:
How do I show the output from running wget?
Where is the default directory that wget stores files?
Do I need additional data in the file other than just the URLs?
Screenshot: Code and Output for %sh
Thanks for any and all help.
%sh
wget -i /tmp/test2.txt
%sh
# list the current working directory
pwd # output: /home/zeppelin
# make a new folder, created in "tmp" because it is temporary
mkdir -p /home/zeppelin/tmp/Folder_Name
# change directory to new folder
cd /home/zeppelin/tmp/Folder_Name
# transfer the file from the sandbox to the current working directory
hadoop fs -get /tmp/test2.txt /home/zeppelin/tmp/Folder_Name/
# download the URL
wget -i test2.txt
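If you mainly want the downloads to land in a specific directory, wget's -P (--directory-prefix) flag does that directly; a minimal sketch reusing the paths above (wget writes its progress messages to stderr, which is what the %sh paragraph displays):

%sh
# download every URL in the list into the named folder (wget creates it if needed)
wget -i /home/zeppelin/tmp/Folder_Name/test2.txt -P /home/zeppelin/tmp/Folder_Name
# confirm the downloads landed there
ls -l /home/zeppelin/tmp/Folder_Name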
I want to upload a file to the disk attached to my google compute vm from my local machine.
abhigenie92_gmail_com@instance-1:~$ pwd
/home/abhigenie92_gmail_com
abhigenie92_gmail_com@instance-1:~$ gcloud compute copy-files C:\Users\sony\Desktop\Feb\Model\MixedCrowds28 Runge kutta 2nd order try.nlogo: ./
abhigenie92_gmail_com@instance-1:~$ gcloud compute copy-files C:\Users\sony\Desktop\Feb\Model\MixedCrowds28 Runge kutta 2nd order try.nlogo: /home/abhigenie92_gmail_com
ERROR: (gcloud.compute.copy-files) All sources must be
edit2: I get the following error now:
RE: edit2
Since gcloud's copy-files is a custom implementation of scp, you need to specify the complete path on your VM where you want to copy the files to. In your specific case:
LOCAL-FILE-PATH> gcloud compute copy-files [FILENAMES] [VM-NAME]:[FULL-REMOTE-PATH]
In your specific example:
C:\Users\sony\Desktop> gcloud compute copy-files copy.nlogo instance-1:/home/abhigenie92_gmail_com/
This command will then place the file(s) into your user's home directory root. Just make sure the remote path exists and that your user has write rights to the destination.
From the looks of what you posted, you're trying to copy things from your local machine to a cloud instance from inside the instance. I'm afraid you can't do that.
I take it you have already installed the gcloud compute tool? If not, install it on your local machine (follow the link), open up the Windows command line, and type gcloud auth login to authenticate; then you should be able to do what you want with the following command:
gcloud compute copy-files C:\Users\sony\Desktop\Feb\Model\MixedCrowds28\ Runge\ kutta\ 2nd\ order\ try.nlogo <VM Name>:~/
Note that I have escaped the spaces in your filename - it's a good idea to get out of the habit of using spaces in filenames - and made a couple of assumptions:
Your VM is running Linux
You are okay with copying up to your home directory on the VM
If any of these assumptions is incorrect, you may have problems. To copy somewhere else, change the path in the <VM Name>:~/ part
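Alternatively, on Windows you can quote the whole path instead of escaping each space; a sketch of the same command:

gcloud compute copy-files "C:\Users\sony\Desktop\Feb\Model\MixedCrowds28 Runge kutta 2nd order try.nlogo" <VM Name>:~/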
Edit: I mangled a file extension in the original, fixed now!
I mistakenly installed the Google Cloud SDK to the wrong directory on my local machine (I installed it to my Google Drive folder, which is not ideal). What is the preferred method of moving the folder? I haven't tried anything yet for fear of creating issues with environment variables that may have been set during installation. I'm running OS X on my local machine.
The Cloud SDK is self contained, and so the google-cloud-sdk directory can generally be moved to wherever you like. The only thing that is configured outside that directory is your ~/.bash_profile file (only if you said yes during the installation process) which adds the SDK to your PATH and installs command tab completion. If you had the installer update that, probably the easiest thing to do is just delete the google-cloud-sdk directory entirely and reinstall in the location you want. The installer will re-update your ~/.bash_profile with the new location.
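If you would rather move the directory than reinstall, a minimal sketch of the manual fix on OS X (the paths are placeholders; the two sourced lines are the ones the installer normally adds to ~/.bash_profile):

# move the SDK out of the Google Drive folder
mv ~/Google\ Drive/google-cloud-sdk ~/google-cloud-sdk

# then, in ~/.bash_profile, point the installer-added lines at the new location:
# The next line updates PATH for the Google Cloud SDK.
if [ -f '/Users/you/google-cloud-sdk/path.bash.inc' ]; then . '/Users/you/google-cloud-sdk/path.bash.inc'; fi
# The next line enables shell command completion for gcloud.
if [ -f '/Users/you/google-cloud-sdk/completion.bash.inc' ]; then . '/Users/you/google-cloud-sdk/completion.bash.inc'; fi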
Here is the magic script; just change the PREV_DIR and NEW_DIR variables:
PREV_DIR=/Users/some_user/Downloads/google-cloud-sdk
NEW_DIR=/Users/some_user/google-cloud-sdk
# rewrite any occurrence of the old SDK path in a config file, if it exists
function z() {
  if test -f "$1"; then sed -i "" -e "s#$PREV_DIR#$NEW_DIR#g" "$1"; fi
}
z ~/.zshrc
z ~/.zprofile
z ~/.bashrc
z ~/.bash_profile
z ~/.kube/config
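After running it, open a new terminal or run source ~/.zshrc (or the matching file for your shell) so the updated paths take effect.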
For zsh, just move the folder to the desired location and update .zshrc: check for the following lines and set the new path:
# The next line updates PATH for the Google Cloud SDK.
... path.zsh.inc ...
# The next line enables shell command completion for gcloud.
... completion.zsh.inc ...
Steps To Replicate
On Windows 8.
In shell (with SSH connection active):
rhc snapshot save [appname]
Error
No system SSH available. Please use the --ssh option to specify the path to your SSH executable, or install SSH.
Suggested Solution
From this post:
Usage: rhc snapshot-save <application> [--filepath FILE] [--ssh path_to_ssh_executable]
Pass '--help' to see the full list of options
Question
The path to keys on PC is:
C:\Users\[name]\.ssh
How do I define this in the rhc snapshot command?
Solution
rhc snapshot save [appname] --filepath FILE --ssh "C:\Users\[name]\.ssh"
This will show the message:
Pulling down a snapshot of application '[appname]' to FILE ...
... then after a while
Pulling down a snapshot of application '[appname]' to FILE ... DONE
Update
That saved the backup in a file called "FILE" without an extension, so I'm guessing that in the future I should define the filename as something like "my_app_backup.tar.gz", i.e.:
rhc snapshot save [appname] --filepath "my_app_backup.tar.gz" --ssh "C:\Users\[name]\.ssh"
It will save in the repo directory, so make sure you move the file out of this directory before you git add, commit, and push; otherwise you will upload your backup too.
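For example (the destination directory here is just an illustration):

# move the backup out of the repo before committing
mv my_app_backup.tar.gz ~/backups/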