I am working on neural networks in Keras and I use Colab to train my network. Unfortunately, any time I stop the training, one of the following problems occurs:
Colab unmounts my gdrive folder. So I must remount it to restart the training.
My gdrive folder on Colab partially empties (I lose my dataset). In this case I also need to restart the session in order to remount gdrive.
Does anyone know the reason?
By stopping the training, do you mean stopping the kernel?
If you stop or restart the kernel, the drive will be unmounted.
If you want to be able to resume training after a restart, save your model to checkpoints.
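For example, here is a minimal checkpointing sketch. The Drive path is just an example, and the tiny model and data are placeholders for illustration:

import os
import numpy as np
import tensorflow as tf

# Placeholder data and model, purely for illustration.
x_train = np.random.rand(256, 10).astype('float32')
y_train = np.random.randint(0, 2, size=(256,)).astype('float32')

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# Example checkpoint folder on the mounted Drive; adjust to your own layout.
ckpt_dir = '/content/drive/MyDrive/checkpoints'
os.makedirs(ckpt_dir, exist_ok=True)

# Write a full checkpoint after every epoch.
checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    filepath=ckpt_dir + '/model-{epoch:02d}.keras',
    save_weights_only=False,
)
model.fit(x_train, y_train, epochs=5, callbacks=[checkpoint_cb])

# After a disconnect: remount Drive, reload the last checkpoint, and resume.
# model = tf.keras.models.load_model(ckpt_dir + '/model-03.keras')
# model.fit(x_train, y_train, initial_epoch=3, epochs=5, callbacks=[checkpoint_cb])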
I'm currently using Filebase - a paid IPFS pinning service - to pin my IPFS files. Filebase is still in beta, so prices are low but could rise in the future.
Therefore I'd like to keep copies of all my files on a drive on my home computer too, just in case Filebase has problems, outages, etc. in the future. My home computer will also run an IPFS node.
Should I pin these local files on my computer as well? Or will duplicate pinning cause problems?
You can totally pin the local files on the IPFS node running on your home computer. However, it's more likely that the node on your home computer will crash, or the machine itself will fail, than that Filebase will go down. It's true Filebase is a centralized service with a single point of failure, but they are likely running several IPFS nodes pinning your files and storing them long-term on the Sia network.
The safest option is to explore storing your data on Filecoin, which is equivalent to Sia in the sense that it is built for long-term persistence (which IPFS is not). Think of IPFS as the decentralized version of Redis: it can cache and store information, but it's not designed to persist data the way a database is.
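For what it's worth, pinning a local copy is essentially one command per file or directory. A rough sketch, assuming the IPFS CLI (kubo) is installed and a daemon is running on the home machine; the dataset path is a placeholder:

import subprocess

# Placeholder path; point this at the local copy of your Filebase content.
local_copy = 'my_dataset/'

# 'ipfs add' pins the content on this node by default; -Q prints only the root CID.
result = subprocess.run(['ipfs', 'add', '-r', '-Q', local_copy],
                        capture_output=True, text=True, check=True)
cid = result.stdout.strip()
print('Pinned locally as', cid)

# Pinning a CID that another service (e.g. Filebase) also pins is harmless;
# repeating the command just reports that it is already pinned.
subprocess.run(['ipfs', 'pin', 'add', cid], check=True)

# Verify what this node is keeping.
subprocess.run(['ipfs', 'pin', 'ls', '--type=recursive'], check=True)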
I'm using Colab with a mounted Google Drive to unpack zips and consolidate the csvs that come out of them. But this, for example:
import os
import zipfile

for z in zip_list:
    with zipfile.ZipFile(z, 'r') as zf:
        zf.extractall()
    os.remove(z)
runs about 60x slower in Colab/Drive compared to when I run it on my local computer. Why is this so much slower and how can I fix it?
A typical strategy is to copy the .zip file from Drive to the local disk first.
Unzipping involves lots of small operations like file creation, which are much faster on a local disk than Drive, which is remote.
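Something like this sketch, for instance (the Drive and local paths are just examples; adjust them to your own layout):

import os
import shutil
import zipfile

drive_dir = '/content/drive/MyDrive/zips'   # example Drive folder with the zips
local_dir = '/content/zips'                 # Colab's local (ephemeral) disk
out_dir = '/content/extracted'
os.makedirs(local_dir, exist_ok=True)
os.makedirs(out_dir, exist_ok=True)

for name in os.listdir(drive_dir):
    if not name.endswith('.zip'):
        continue
    # One big sequential copy from Drive to local disk is fast.
    local_zip = shutil.copy(os.path.join(drive_dir, name), local_dir)
    with zipfile.ZipFile(local_zip, 'r') as zf:
        zf.extractall(out_dir)              # many small writes, but now on local disk
    os.remove(local_zip)

# Optionally copy the consolidated output back to Drive in one go:
# shutil.copytree(out_dir, '/content/drive/MyDrive/extracted', dirs_exist_ok=True)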
I want to download files from a Google Drive account to a server for backup purposes. The account holds about 40GB of files, which are mostly not owned by the user (so Google Takeout won't work).
I'd like to download the files in parallel to speed up the process.
You can use the Google Drive Linux client, which is conveniently called drive. It's under development, but works pretty well.
It's got some dependencies (seemingly Go 1.2+), which can be hard to satisfy in a server environment. But it's possible to install.
$ drive init
$ drive pull
This will pull your whole Drive account down, but it's fairly slow.
$ drive list | sed -e 's/^\///' | xargs -P 10 -I{} drive pull -quiet -no-prompt '{}'
This will download your top-level folders in parallel, which may or may not be what you want.
It is possible to download in parallel; however, you will hit quotas designed to prevent abuse of the system. In the Developers Console you can raise the rate limit so that a single user (you) can consume the entire quota, but with too many files downloading in parallel you will still eventually get rate-limit exceptions. Basically, Google makes sure you don't exceed a per-second limit, given that it's a free service (or fixed price, like Google Apps).
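If you go the API route instead of the drive client, the usual pattern is to cap the number of workers and back off when you get 403/429 responses. A rough sketch with google-api-python-client; obtaining credentials and listing the files are left out, and the names here are illustrative:

import io
import random
import time
from concurrent.futures import ThreadPoolExecutor

from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
from googleapiclient.http import MediaIoBaseDownload

def download_one(creds, file_id, dest_path, max_retries=5):
    # Build a service per call to avoid sharing one HTTP object across threads.
    service = build('drive', 'v3', credentials=creds)
    for attempt in range(max_retries):
        try:
            request = service.files().get_media(fileId=file_id)
            with io.FileIO(dest_path, 'wb') as fh:
                downloader = MediaIoBaseDownload(fh, request)
                done = False
                while not done:
                    _, done = downloader.next_chunk()
            return
        except HttpError as err:
            # 403/429 usually indicate a rate-limit/quota error: back off and retry.
            if err.resp.status in (403, 429):
                time.sleep(2 ** attempt + random.random())
            else:
                raise

def download_all(creds, files, workers=5):
    # files: iterable of (file_id, dest_path) pairs listed beforehand.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for file_id, dest_path in files:
            pool.submit(download_one, creds, file_id, dest_path)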
Is there a way to create a VM instance on Google Cloud using a VHD? It looks pretty straightforward and well documented on the Azure cloud, but there is no information I can find for Google Cloud.
It seems like I'm wasting my time recreating my servers on Google Cloud, and I'm quickly losing sight of the value of Google Cloud.
You can bring your own image to GCE. You can follow the instructions in the public GCE documentation. There's also this video that explains the process.
VHD and VHDX virtual disk files can be converted to GCE-compatible disks using the steps here. For more sophisticated use cases, the image import workflows in the compute-image-tools repository can be useful.
I changed "hdd1\Content" folder with JTAG console, how to restore and update this folder, because xbox hangs up.
Thank you!
If you have a development kit Xbox 360, try accessing the retail emulation partition from neighbourhood (assuming the kernel boots) and renaming it there. If that fails, do a recovery.
If it's a retail Xbox 360, you might need to somehow connect your hard drive to a computer and find a FATX tool that will let you rename the folder back to Content. Your console should boot fine if you detach the hard drive, so until you manage to fix the problem, you can probably still play by detaching the HDD and using a flash drive for storage.
EDIT: I know this question is a few months old now (and the issue is probably resolved), but hopefully this information may be useful to others in the same situation.