I noticed that any time a file is put into the Google Drive File Stream folder, the entire file is uploaded even if it is deleted from the user's computer mid-sync. Is there a way to change that behavior, or at least to stop an upload mid-sync? I've tried going into the cache folder (%LocalAppData%\Google\DriveFS) but had no luck finding anything there.
Thanks for the help.
Signing out did the trick for me. I got a prompt saying:
Sign out of Drive File Stream? There are 2 files (600.0GB) that have not yet been uploaded to Google Drive. If you sign out, these files, as well as files marked "Available offline," will be removed from this computer.
The only drawback is that it sounds like you may need to re-download any files that you want to be available offline.
Backstory:
I tried to upload a couple of 300GB files and they (understandably) hung. I deleted the files on my computer, but Drive still continued trying to upload them. Every time I restarted my computer or Drive File Stream, they would show up as pending or syncing.
Yesterday I went to download MySQL and managed to install it; when the download finished, I double-clicked the file and the program opened normally. I created an account just for testing and didn't save the password, so I deleted MySQL to create another account. However, when I go to download it now, only the option to download it as a Compressed (zipped) Folder appears, and I have no idea how to open a download in that format; I've tried following the steps shown in videos and couldn't do it either.
Is there a way to fix this? Yesterday MySQL downloaded normally (as in the image below) and I had no difficulty opening it.
But now the download looks like this (as in the image), and I can no longer open the program.
Note: I use the 64-bit version of Windows 10.
I assume you're downloading MySQL Community Server from its official page here: https://dev.mysql.com/downloads/mysql/
If you are, then you can see that there are 3 blue buttons on the page.
I assume that you are downloading from the "Other Downloads" section (no. 2 in the image).
Could you share the link to the source where you downloaded MySQL?
Usually you should be able to unzip the contents of any .zip file by right-clicking the archive in Explorer and choosing Extract (I'm not sure what the exact buttons are called in Windows, since I don't use it).
Maybe your file got corrupted while you were downloading it (connection issues); have you tried re-downloading it?
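If the Extract option doesn't show up, a small Python sketch can do the same job; the archive name below is a placeholder for whatever your download is actually called:
import zipfile

# extract every entry of the downloaded archive into the current folder
with zipfile.ZipFile("mysql.zip") as archive:
    archive.extractall()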
I've just searched for the download and started it from the official website
(https://cdn.mysql.com//Downloads/MySQLInstaller/mysql-installer-community-8.0.32.0.msi). This starts the download of an .msi file in my case. Can you confirm that this still starts the download of a .zip file?
Using Google Takeout usually gives you .zip or .tar.gz archives in Google Drive.
Is there any way, via the API or other programming methods, to decompress those files and put the contents back in Drive without downloading them locally?
You can't decompress a file in Drive, but there are some workarounds. Using the API, you could easily develop a solution that downloads the file, decompresses it locally, and finally uploads the contents again. As an alternative, you could speed up that process by using a sync folder that allows you to extract the file locally. Feel free to ask if you have any doubts about those approaches.
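Here is a minimal sketch of that download-decompress-reupload flow using google-api-python-client; it assumes you already have an authorized credentials object (creds), and FILE_ID and the folder name are placeholders:
import io
import os
import zipfile

from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload

service = build("drive", "v3", credentials=creds)  # creds: your OAuth credentials

# 1. download the Takeout archive (FILE_ID is a placeholder) into memory
request = service.files().get_media(fileId=FILE_ID)
buffer = io.BytesIO()
downloader = MediaIoBaseDownload(buffer, request)
done = False
while not done:
    _, done = downloader.next_chunk()

# 2. decompress it locally
buffer.seek(0)
with zipfile.ZipFile(buffer) as archive:
    archive.extractall("takeout")

# 3. upload the extracted files back to Drive one by one
for name in os.listdir("takeout"):
    media = MediaFileUpload(os.path.join("takeout", name))
    service.files().create(body={"name": name}, media_body=media).execute()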
I was trying to upload a big image folder to Google Drive and GitHub, but GitHub doesn't allow it and Google Drive is taking too long. How can I upload the local folder to Colab?
Sorry, I don't think there's a solution to your issue. If your fundamental problem is limited upload capacity from the machine with the images, you'll just need to wait.
A nice property of uploading to Drive is that you can use programs like Backup and Sync to retry the transfer until it's successful. And, once the images have been uploaded to Drive once, you'll be able to access them quickly in Colab thereafter without uploading again. (See this example notebook showing how to connect your Google Drive files to Colab as a filesystem.)
Convert the folder to a zip file and then upload it to Colab. You can then unzip the folder with the following command:
!unzip "your path"
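Equivalently in Python, assuming the uploaded archive is called images.zip (a placeholder name):
from google.colab import files
import zipfile

uploaded = files.upload()  # opens a file picker in the browser
with zipfile.ZipFile("images.zip") as archive:
    archive.extractall("images")  # contents land in a folder named "images"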
The unzip method only works for csv files.
If you use a Kaggle dataset, use:
import os  # needed to set the Kaggle API credentials as environment variables

os.environ['KAGGLE_USERNAME'] = 'enter_username_here' # username
os.environ['KAGGLE_KEY'] = 'enter_key_here' # key
!kaggle datasets download -d dataset_api_command_here
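The Kaggle CLI saves the dataset as a zip archive in the working directory, so it can be extracted the same way (the archive name below is a placeholder):
!unzip -q dataset-name.zip -d data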
If you have the images in Google Drive, use:
from google.colab import drive
drive.mount('/content/drive')
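After mounting, Drive shows up as an ordinary folder under /content/drive, so you can extract an archive straight from it (the path below is a hypothetical example):
import zipfile

with zipfile.ZipFile("/content/drive/MyDrive/images.zip") as archive:
    archive.extractall("/content/images")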
I uploaded a PDF file to the IPFS decentralised network. My question: when I close the IPFS console, I can no longer view the PDF file through the https://ipfs.io gateway. Why is that? My understanding is that once a file is uploaded to the IPFS network, it will be distributed to the nodes.
Adding a file to IPFS via ipfs add <file> does not distribute it to the network (that would be free hosting!); it only puts the file into the standard format (IPLD) and makes it accessible over the network (IPFS) as long as someone connected to the network has the file. When you first add something, that someone is only you. So if you close your laptop, the file is suddenly no longer available, UNLESS someone else has downloaded it since then, because then they can distribute it while your computer is off. There are many "pinning services" which do just that, for a small fee.
Hi, your understanding is correct, but can you tell me how you are uploading files to the IPFS network? There are a number of ways to add data to it.
If you are able to add data to IPFS, you will get the hash of the data. The condition is that the daemon is running locally, so that your data can be broadcast to the peers you are attached to. You can check this with the command: ipfs swarm peers
If the above conditions are fulfilled, you can view/get the data from https://ipfs.io/ipfs/<replace with hash you will get after adding>
If the daemon is not running, you will still be able to add your file and get the hash, but the file will be saved locally and you won't be able to access it from the web.
Please let me know if you need any other information.
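To make that flow concrete, a minimal sketch of the commands (assuming a local IPFS installation; <file> and <hash> are placeholders):
ipfs daemon          # must stay running so peers and the gateway can fetch your data
ipfs add <file>      # prints "added <hash> <file>"; so far the data is only local
ipfs swarm peers     # confirm your daemon is connected to peers
# then open https://ipfs.io/ipfs/<hash> in a browser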
I use XAMPP for localhost. I would like to have my files on localhost synced across different computers. I figure that I could just install Google Drive into the XAMPP directory, e.g. "localhost/Google Drive" (of course, I will have to do this on all computers).
Before I do so, I wonder: would there be any disadvantages to doing so?
I also wonder how to get "localhost/Google Drive/some-website/index.php" to work (note the space in "Google Drive")?
The best way to do this is to use a local service that fetches your Google Drive files by URL; check the Google Drive documentation to see how to do this integration.
Couldn't wait.
Works like a CHARM! Did not experience any disadvantages.
I solved the "Google Drive space problem" like this:
Quit Google Drive
Move the folder to your "htdocs" folder and rename it, e.g. to "google-drive" (my location and folder name is "C:\xampp\htdocs\google-drive")
Restart Google Drive (it will say that the folder could not be found)
Choose the option to relink to the new folder.
Since I installed the Google Drive folder in "C:\xampp\htdocs\google-drive", I also have the option to work on localhost without syncing files, e.g. files in "C:\xampp\htdocs\my-unsynced-folder".
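Alternatively, if you'd rather keep the Google Drive folder in its default location (space and all), Apache's Alias directive can map a clean URL onto it; a sketch for httpd.conf with example paths (adjust to your setup):
# map http://localhost/google-drive onto the folder that contains a space
Alias "/google-drive" "C:/Users/you/Google Drive"
<Directory "C:/Users/you/Google Drive">
    Require all granted
</Directory>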