gsutil doesn't run in the mounted drive directory - google-compute-engine

I'm trying to run gsutil in a shared environment and I'm seeing some really weird behaviour.
When I run it from the root of the filesystem, or from anywhere else, everything is fine, but when I cd into the mounted shared-drive directory it fails with this:
$: gsutil
cannot open path of the current working directory: Permission denied
The shared-drive folder itself is a Google Filestore NFS share with drwxrwxr-x permissions, and the user is in the group that has rwx.
Any help appreciated, thanks!
Update: the issue was with the snap installation of the gcloud SDK. I'm not sure of the exact nature of the problem, but reinstalling it with apt-get, following the Google Cloud SDK installation manual, solved the issue.
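The snap-packaged gcloud runs under snap confinement, which can deny access to NFS-mounted working directories and would explain the "cannot open path of the current working directory" error. A sketch of the swap to the apt package, assuming the snap is named google-cloud-sdk on your system (these are the commands from Google's apt installation instructions at the time of writing; check the current manual):

```shell
# Remove the snap package first (package name assumed)
sudo snap remove google-cloud-sdk

# Add Google's apt repository and signing key, then install
sudo apt-get update && sudo apt-get install -y apt-transport-https ca-certificates gnupg curl
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg \
  | sudo gpg --dearmor -o /usr/share/keyrings/cloud.google.gpg
echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] https://packages.cloud.google.com/apt cloud-sdk main" \
  | sudo tee /etc/apt/sources.list.d/google-cloud-sdk.list
sudo apt-get update && sudo apt-get install -y google-cloud-cli
```

After this, gsutil is a normal binary with no snap confinement, so it can run from the NFS mount.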

Related

Is there a way to mount Google Drive on my local machine like what could be done in Colab?

In Colab, the following code snippet is used for mounting Google Drive.
from google.colab import drive
drive.mount('/test', force_remount=True)
And I'm wondering if it could work on my local machine. When I run this locally, it says "No module named google", even after executing pip install google.
Is there another package that should be installed, or can this just not be achieved? I've searched for a while, but it seems the only solution is to install Google Drive for Desktop to get access to the remote files.
Although the google.colab Python library can be found here, that library is a collection of tools meant to work in conjunction with the Google Colab product, so drive.mount() will not work outside Colab.
Indeed, Google Drive for Desktop is your best option to "mount" your Google Drive on your local machine.
Alternatively, there are several third-party Google Drive clients available.
Use google-drive-ocamlfuse.
Here are the step-by-step details: https://medium.com/#enthu.cutlet/how-to-mount-google-drive-on-linux-windows-systems-5ef4bff24288
Instead of mounting it to a home folder (named googledrive in the tutorial), I suggest mounting it so that the folder structure is the same on both Colab and your local machine. To do that:
Create your mount folder at the root of the filesystem (not a recommended practice, but there is no harm). You need sudo, i.e. at /, run sudo mkdir test.
Then create MyDrive inside test.
Change the owner of test or MyDrive to yourself: sudo chown <your username> MyDrive/
Mount to MyDrive with: google-drive-ocamlfuse MyDrive/
Enjoy!
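Assuming google-drive-ocamlfuse is already installed and authorized, the steps above condense to something like this (the /test/MyDrive layout mirrors Colab's mount point from the question):

```shell
# Recreate Colab's /test/MyDrive layout locally (sudo needed at /)
sudo mkdir -p /test/MyDrive
sudo chown "$USER" /test/MyDrive

# Mount Google Drive there via FUSE
google-drive-ocamlfuse /test/MyDrive/
```

Code written for Colab that reads from /test/MyDrive/... should then run unchanged on the local machine.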

Access denied when using go get

I was using go get to fetch the go-sql-driver from GitHub. When I ran the command go get -u github.com/go-sql-driver/mysql as prompted by the repo, I encountered an error saying "access is denied":
go: writing stat cache: mkdir C:\Program Files\GoPath\pkg: Access is denied.
go: downloading github.com/go-sql-driver/mysql v1.5.0
go get github.com/go-sql-driver/mysql: mkdir C:\Program Files\GoPath\pkg: Access is denied.
I am using Windows 10. This happened after I changed the %GOPATH% environment variable. Any suggestions for solving this?
It seems Go doesn't have the right to write to the "Program Files" folder. I created a GoPath folder somewhere else and go get seemed to work. However, it created a folder named "pkg" in "GoPath", but from the compilation file I have, it looks like the package should be installed under an "src" folder. Could somebody please explain how this happened?
mkdir C:\Program Files\GoPath\pkg: Access is denied.
That is an "access denied" error while trying to create a directory. Either change GOPATH to a writable location or grant access to the path mentioned.
If it is your development machine or laptop, you can also open a Command Prompt as administrator and retry the installation.
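C:\Program Files is writable only by administrators, so any GOPATH under it will fail. A sketch of the usual fix (requires Go 1.12+ for go env -w; %USERPROFILE%\go is Go's conventional default GOPATH):

```shell
rem Run in cmd.exe: point GOPATH at a user-writable directory
go env -w GOPATH=%USERPROFILE%\go

rem Retry the download
go get -u github.com/go-sql-driver/mysql
```

As for the pkg-vs-src question: in module mode (the default since Go 1.16), go get downloads module sources into GOPATH\pkg\mod rather than GOPATH\src; the src layout belongs to the older GOPATH mode, so seeing a pkg folder appear is expected.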

Permissions for external HDD nextcloud container

Good day.
I'm trying to migrate my NC19 server to a container.
So far I can install the container and map a persistent volume to the host's drive, but when I try to use an external HDD I get the following errors in the log:
**Initializing nextcloud 19.0.2.2
rsync: chown "/var/www/html/data" failed: Operation not permitted (1)
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1207) [sender=3.1.3]**
This is more than likely a permissions issue, but whether I mount the HDD as root and grant access to the external drive /media/ncd as root:root or as www-data:www-data, I get the above error.
Now, the external HDD's filesystem is exFAT (not sure if this matters inside the container).
Any ideas how I can get past this?
I'm close to formatting my HDD as ext4 to see if that fixes it.
Thanks in advance
Just got this fixed by changing the filesystem from exFAT to ext4.
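That result makes sense: exFAT stores no Unix ownership metadata, so any chown on it fails with "Operation not permitted", which is exactly what rsync reported. If reformatting had not been an option, one hedged alternative is to assign the apparent owner at mount time instead (the device path below is an assumption, and uid/gid 33 assumes www-data maps to 33 in the container, as in the official nextcloud image; check both on your system):

```shell
# exFAT cannot store Unix owners; uid=/gid= set the apparent
# owner of every file at mount time (33 = www-data, assumed)
sudo mount -t exfat -o uid=33,gid=33 /dev/sdb1 /media/ncd
```

With the whole mount owned by www-data, the container's chown becomes unnecessary, though ext4 remains the cleaner fix.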

How do I access my Google Drive (G: ) Windows Mount from the Windows Subsystem for Linux (WSL)?

I'm working on a project that I have stored on my Google Drive mount on Windows, and I would like to use Linux for portions of that project. The Windows Subsystem for Linux has served me well for most of my projects, but I've never had the need to mount a network drive. While it's not imperative that I use my Google Drive mount for this project (I could easily place it in my /downloads or /documents folder), I was curious as to how I could access my Google Drive from WSL.
I attempted to create a new mount via:
sudo mkdir /mnt/googledrive
This successfully created the directory, and then I used the command:
sudo mount -t drvfs G: /mnt/googledrive
This too seemed to be successful.
I was able to cd into the /mnt/googledrive directory, but I couldn't access any of my files (it reported that the '.' location was unavailable).
Perhaps I've simply misunderstood what I was doing?
Any help would be greatly appreciated!
I found a workaround: not the "Google Drive" application but "Backup and Sync" for individuals (https://www.google.com/drive/download/).
Basically it does the same thing for me, but in a different way. Backup and Sync lets you back up your drive to Google, but also sync your Google Drive locally.
By choosing to sync your drive locally (you can even select specific folders), the files are synced to the C: drive under your user profile, at the same level as your "My Documents" folder.
That way, you can access your files from Linux through the working /mnt/c/... path.
If this answer is too late for you, it might still be in time for others ;-)
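With Backup and Sync in place, the files are reachable through WSL's existing C: mount rather than a separate drvfs mount; a sketch (the profile path and folder name are assumptions — substitute your Windows username, and note the sync folder may be named differently in your setup):

```shell
# <username> is hypothetical; adjust to your Windows profile
ls "/mnt/c/Users/<username>/Google Drive"
```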

With the Sublime Text 2 SFTP plugin, when I try to save the file I get a failure (Permission denied) message. Why?

I logged in fine, as I can browse the directories fine and open the files, but when I go to save I get a permission denied error. What am I doing wrong?
Maybe you simply don't have permission to write to the directory.
Mac or PC? I had the same problem on OS X when using SFTP with XAMPP's built-in FTP server (so I could update local files instead of remote ones).
Try modifying the permissions on your XAMPP folder (Cmd+I) and set EVERYONE to Read & Write. Then click the cog (gear icon) and choose apply. This applies the permissions to all subfolders as well.
Try uploading your files again and see if you still get permission-denied errors.
If you're doing this for a remote server, you'll have to make sure your SFTP user account is in an allowed group or owns the files on your server.
Hope this helps.
Yes, make sure you have write permissions. The easiest way to check and update them is the following:
ls -al — see which permissions are set on your file
sudo chmod 777 file_name — give full permissions to the desired file (note that 777 lets everyone read, write, and execute; a narrower mode such as 664 is usually safer)
See more about establishing permissions for an entire group of files: http://www.rackspace.com/knowledge_center/article/how-to-add-linux-user-with-document-root-permissions
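A quick, self-contained way to see what chmod actually changes, using a throwaway file (stat -c '%a' prints the octal mode; this flag is GNU coreutils, so on macOS use stat -f '%Lp' instead):

```shell
# Create a temp file, then watch its mode bits change
f=$(mktemp)
chmod 644 "$f"
stat -c '%a' "$f"   # prints 644 (owner rw, group r, others r)
chmod 777 "$f"
stat -c '%a' "$f"   # prints 777 (everyone rwx)
rm "$f"
```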
Take two steps:
check your server permissions
make sure your local sftp-config.json is correct:
{
    "type": "sftp" // not "ftp", if the port is the SFTP default (22)
}