I am currently exploring Google Drive APIs and am able to upload an individual file using Postman successfully.
I tried to create/upload a folder using the Google Drive API but failed. In fact, I don't see any specific endpoint for dealing with folders, so I was giving it a try with the files.create API.
Reference - https://developers.google.com/drive/api/v3/reference
Any help will be appreciated.
If you check the documentation for files.create you will notice at the top it states
Creates a file.
This is singular: each request creates a single file. If you want to upload a directory, you will need to first create the folder and then upload each file into it one at a time.
You could try batching, but I don't think you can batch the file uploads themselves; you could probably batch the metadata creation (see performance#details in the docs). A sketch of the two-step approach is below.
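A minimal sketch with the Python client (google-api-python-client), assuming an already-authorized Drive service object and placeholder folder/file names; a folder is just a file whose MIME type is application/vnd.google-apps.folder:

from googleapiclient.http import MediaFileUpload

# Step 1: create the folder (a file with the folder MIME type)
folder_metadata = {
    "name": "MyFolder",  # placeholder name
    "mimeType": "application/vnd.google-apps.folder",
}
folder = service.files().create(body=folder_metadata, fields="id").execute()

# Step 2: upload one file into that folder; repeat this call for each file in the directory
file_metadata = {"name": "report.csv", "parents": [folder["id"]]}
media = MediaFileUpload("report.csv", mimetype="text/csv")
service.files().create(body=file_metadata, media_body=media, fields="id").execute()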
We use shared google drives and will often link files to one another. E.g., a pdf saved in drive may be discussed in a spreadsheet/doc with a link to drive included. We usually grab these links in a browser or using the "Copy link to clipboard" option on Google Drive for Mac (right click in Finder).
I'm looking to automate some processes and would like to include the Copy Link step as part of the automation.
Digging through the documentation for the google drive api, I've found some options to update permissions, but can't find anything to just copy an existing share link like we do in our workflows.
https://developers.google.com/drive/api/guides/manage-sharing
Does anyone have suggestions on this? We have automation set up in R and Python, so options within those would be preferable.
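Not a full answer, but one hedged pointer: in Drive API v3 the link you copy in the UI should correspond to the webViewLink field on the file, so a files.get call can read it without changing permissions. A minimal Python sketch, assuming an authorized service object and a placeholder file ID:

# Read the existing browser link for a file by its ID (FILE_ID is a placeholder)
meta = service.files().get(fileId="FILE_ID", fields="webViewLink").execute()
print(meta["webViewLink"])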
Would appreciate it if anyone can suggest ideas:
I currently have a Puppeteer script which automates CSV downloads into Google Drive, which then allows me to programmatically access the data and move it into Google Sheets.
Currently, I run this locally on my computer in VS Code. I don't yet understand very well how to handle file downloads through Puppeteer, but I have a workaround: I set the default Chrome download path to a folder on my computer that is synced by Drive for desktop, which is an easy way to get files directly into Drive.
// Use the Chrome DevTools Protocol (page._client is Puppeteer's private CDP session) to set the download directory
await page._client.send('Page.setDownloadBehavior', {behavior: 'allow', downloadPath: path.resolve('G:/My Drive/DriveData')});
The thing is, I'd like this to be some type of scheduled cloud function rather than something I run manually on my computer. If I were to do that, my Drive for desktop workaround would not be viable. So is there a way I could pass the file downloaded by Puppeteer directly into Google Apps Script somehow? Or use the Drive API to receive the file directly?
My main issue is that I don't understand how to actually handle file downloads in Puppeteer; I'm currently just editing the default download location for the Chrome instance Puppeteer creates. I ultimately just need to get the data somewhere accessible by Google Apps Script.
Any suggestions would be uber-appreciated!
I was also facing the same situation. Unfortunately, we can't install or use Puppeteer in the Google Apps Script environment.
The alternative is to do the file download using UrlFetchApp, handling the session and cookies yourself.
The following code will help you save the file into Google Drive:
var folder = DriveApp.getFolderById("FOLDER_ID"); // destination folder in Drive
var file = folder.createFile(UrlFetchApp.fetch("url of the downloaded file fetched dynamically"));
You need to check the network logs in Chrome DevTools to see which URL request actually returns the downloaded file; that is the URL you pass to the fetch call above.
Another alternative that will help you download the file to Drive is the Drive API: https://developers.google.com/drive
I have used Python to save files in Drive: https://www.thepythoncode.com/article/using-google-drive--api-in-python
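As a rough Python sketch of that Drive API route (the export URL, file name, and the already-authorized service object are placeholders/assumptions): fetch the CSV over HTTP and push the bytes straight into Drive.

import io
import requests
from googleapiclient.http import MediaIoBaseUpload

# Fetch the CSV that the site would normally "download" (placeholder URL)
resp = requests.get("https://example.com/export.csv")
resp.raise_for_status()

# Upload the bytes directly to Drive without writing to local disk
media = MediaIoBaseUpload(io.BytesIO(resp.content), mimetype="text/csv")
service.files().create(body={"name": "export.csv"}, media_body=media, fields="id").execute()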
I have a bunch of screenshots (probably in thousands) that I want to run OCR on. I tried different services but Google Drive results are far better. But the process is a little tedious. First I have to upload a photo to Google Drive and then right click the uploaded file and select "Open with -> Google Docs". The resultant doc file contains all the text that was in the screenshot. However, it works for only one file at a time.
So is there a way to tell Google Drive to automatically create the Docs file for every photo at the time I upload them?
I know there is the Google Drive API that could help in this regard, but I have very limited experience with coding and APIs. I could use a program if one is available.
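For what it's worth, a minimal sketch of automating that with the Drive API's Python client: uploading an image while asking Drive to convert it to a Google Doc triggers the same OCR as "Open with -> Google Docs". The image path and the authorized service object are assumptions; loop over your screenshots to handle them in bulk.

from googleapiclient.http import MediaFileUpload

# Upload one screenshot and ask Drive to convert it to a Google Doc (OCR runs during conversion)
file_metadata = {
    "name": "screenshot-ocr",  # placeholder name
    "mimeType": "application/vnd.google-apps.document",  # target type: Google Doc
}
media = MediaFileUpload("screenshot.png", mimetype="image/png")  # placeholder path
doc = service.files().create(body=file_metadata, media_body=media, fields="id").execute()
print("Created Doc with extracted text, id:", doc["id"])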
I have read the notebook about how to open Drive. I already did as instructed, using:
from google.colab import drive
drive.mount('/content/drive')
After this, I can use !ls to list the contents of my drive but I cannot read or open any file. I already tried:
with open("/content/drive/My Drive/filename.ext", "r") as file:
file = open("/content/drive/My Drive/filename.ext", "r")
!cp "/content/drive/My Drive/filename.ext" "filename.ext"
and also
import pandas as pd
file = pd.read_csv("/content/drive/My Drive/filename.ext")
But none of the above worked. I always get "operation not supported" or "cannot open file for reading".
I have seen some suggestions to use PyDrive, but that works by copying files from Google Drive into the Colab environment. I don't get why you would have to copy files back and forth, since I need to iterate over all the files in the folder.
Why can't Google Colab just read the file stored on Drive? Or am I doing something wrong? Another thing is that I uploaded a bunch of CSV files, but Google Drive lists them as ".csv.gsheet" (using glob). Could that be the problem? I have no other ideas.
It is straightforward.
from google.colab import drive
drive.mount('/content/drive')
This will ask you to open a URL that authorizes the mount once you copy and paste the token back into the notebook.
If you are not able to read files even now, then prefix your file path with this: 'drive/My Drive' and you are good to go.
For example: file = 'drive/My Drive/data/file.txt'
Where data is a directory in my Google Drive containing file.txt file.
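For example, reading that same file from the mounted drive (the path is just the hypothetical example above, relative to Colab's /content working directory):

# Open the example file through the mount point
with open('drive/My Drive/data/file.txt', 'r') as f:
    print(f.read())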
I ran into a similar issue last night. As some of the previous responders posted, there are two concerns that influence your ability to read the file: first, making certain that your file is accessible from your Colab notebook via Google Drive, and second, making certain that your file is in the correct format.
I will explain the steps and include a screenshot.
Open Google Colab. Open the file browser.
Click the icon that says Mount Drive when hovered. This inserts a new cell in your notebook with the code:
from google.colab import drive
drive.mount('/content/drive')
Run the cell. You are prompted to accept permissions and get a token to use to mount the drive. Grant the permissions and copy and paste the code into the text input. Hit enter.
The drive now appears in the file browser. Right click the folder /drive/My Drive or click the three dots action menu and select Upload.
Locate your file on disk and Upload.
The file appears in the File Browser. Right click the File (or use the three dots action menu) and select Copy Path.
Paste that file path into your pd.read_csv() call.
Run the cell with the pd.read_csv function call.
You should now have the file uploaded to your Google Drive, accessible to Google Colab, with the file formatting preserved because it has not been touched by any other program that might munge the format.
Below is the example sans Permission tab because I previously granted permissions.
I just tried mounting and creating a Drive file as you described and couldn't reproduce the error.
https://colab.research.google.com/drive/17iiKJPQOPv1eW5-Ctf707mPHXDtipE5G
Perhaps try resetting your backend using the Runtime -> Reset all runtimes menu. Or, can you share a notebook illustrating the problem?
I (partially) found out what was going on based on Bob Smith and Ami F's answers.
I believe Google Drive blocks read access to files converted to Drive formats (gsheet, gdoc, etc.), and so, whenever I tried to use !cat or open, I got an "operation not supported" error. When I tried Bob's example, creating a file and then reading it, it worked in my notebook.
So I managed to prevent Google from converting the files, deleted the old files, and uploaded everything to Drive again. Now all my CSVs are kept unchanged (no .gsheet extension) and I am able to access them using open.
The fact that you see ".csv.gsheet" filenames even though you upload ".csv" filenames makes me think that you're uploading your CSVs to sheets.google.com instead of drive.google.com. Can you confirm that uploading to drive.google.com makes things work?
I do suspect RenatoSz's answer is correct: I can open XLSX files fine, but even just file = open('name_of_file.gsheet') fails for me with an "Operation not supported" error. It is annoying that you cannot do the simple action of opening a Google Sheet in Google Colab - this seems like basic functionality.
A workaround for me was:
from google.colab import auth
auth.authenticate_user()
import pandas as pd
import gspread
from oauth2client.client import GoogleCredentials
# authorise
gc = gspread.authorize(GoogleCredentials.get_application_default())
# open
gsheets = gc.open_by_url('some_fun_URL')
# read
sheets = gsheets.worksheet('List of all experts').get_all_values()
# parse
df = pd.DataFrame(sheets[1:], columns=sheets[0])
Note that gc.open(...) did not work for me.
You can avoid this problem with the following steps:
Upload the dataset (.csv) to Google Drive.
Now select the uploaded dataset on Google Drive and choose the Share option from the file's context menu.
A pop-up window appears on the screen; change the sharing setting to Editor.
In the bottom left of the pop-up window, change "Restricted (only people added can edit)" to "Anyone with the link can edit".
After these settings are saved, copy the generated shareable link.
Now go to the website mentioned below and convert your link: paste the previously copied link and generate the Google Drive direct-download link.
https://sites.google.com/site/gdocs2direct/
Copy the generated Google Drive direct-download link.
We now have a link that can be read directly as the dataset's path.
import pandas as pd
df = pd.read_csv("Paste here the generated link which we copied")
This would sort the issue perfectly.
Note that the built-in open() cannot read a URL, which is why the direct link is passed to pandas here; the same approach works for a .txt file as well (adjust the read call accordingly).