Power Automate Error moving CSV from SharePoint to Google Drive

I'm running into an issue with my Power Automate Flow. I'm trying to push a CSV file from a SharePoint folder to a folder in Google Drive. Creating the CSV is fine, but when I try to push it there seems to be an (unreasonable) limit on the size of the CSV I can push.
As the file already exists and I am simply PUSHING the file, not recreating it, surely it shouldn't hit any limits? This file is only 52MB!
[Screenshot: the Flow I have (the previous step is just a scheduler)]
[Screenshot: the error message I get]

Take a look at this link and try the copy-to-Google-Drive approach described there:
https://sharepains.com/2019/05/31/copy-large-files-with-microsoft-flow/

Related

Is it possible to upload an entire folder to Google Drive using Google Drive API?

I am currently exploring Google Drive APIs and am able to upload an individual file using Postman successfully.
I tried to create/upload a folder using the Google Drive API but failed. In fact, I don't see any specific API for dealing with folders. I was giving it a try with the files.create API.
Reference - https://developers.google.com/drive/api/v3/reference
Any help will be appreciated.
If you check the documentation for files.create, you will notice that at the top it states:
Creates a file.
This is singular: each request creates a single file. If you want to upload a directory, you will need to first create the directory, then upload each file one at a time.
You could try batching, but I don't think you can batch the file upload itself; you could probably batch the metadata creation (see the performance details in the Drive API documentation).
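A minimal sketch of that approach, assuming the google-api-python-client library and an already-authorized service object (the function names and local paths here are hypothetical, not part of the API): first create the folder with the folder MIME type, then create each file with the new folder's ID in its parents list.

```python
import os

# MIME type Drive uses to mark an entry as a folder
FOLDER_MIME = "application/vnd.google-apps.folder"

def folder_metadata(name, parent_id=None):
    """Build the files.create request body for a new folder."""
    meta = {"name": name, "mimeType": FOLDER_MIME}
    if parent_id:
        meta["parents"] = [parent_id]
    return meta

def upload_directory(service, local_dir, parent_id=None):
    """Create one Drive folder, then upload each local file into it."""
    # googleapiclient is a non-stdlib dependency (google-api-python-client)
    from googleapiclient.http import MediaFileUpload
    folder = service.files().create(
        body=folder_metadata(os.path.basename(local_dir), parent_id),
        fields="id",
    ).execute()
    for name in os.listdir(local_dir):
        path = os.path.join(local_dir, name)
        if os.path.isfile(path):
            service.files().create(
                body={"name": name, "parents": [folder["id"]]},
                media_body=MediaFileUpload(path),
                fields="id",
            ).execute()
    return folder["id"]
```

Subfolders would need the same create-then-recurse treatment, since every entry in Drive is created one request at a time.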

Loading CSV file from Google Drive to BigQuery produces zero rows with no errors

A vendor provided me with a set of CSV files over Google Drive. When I load it to a new table in BigQuery using the URL to its location in Google Drive it seems to work without any errors, but there are zero rows in the table. The schema loads correctly. When I download the file to my hard drive and upload the same file it works fine - all of the data is there.
Any thoughts on what the problem might be or how to diagnose it?
"Loading data into BigQuery from Google Drive is not currently supported, but you can query data in Google Drive by using an external table."

Read file from drive in google colab

I have read the notebook about how to open Drive. I already did as instructed, using:
from google.colab import drive
drive.mount('/content/drive')
After this, I can use !ls to list the contents of my drive but I cannot read or open any file. I already tried:
with open("/content/drive/My Drive/filename.ext", "r") as file:
file = open("/content/drive/My Drive/filename.ext", "r")
!cp "/content/drive/My Drive/filename.ext" "filename.ext"
and also
import pandas as pd
file = pd.read_csv("/content/drive/My Drive/filename.ext")
But none of the above worked. I always get "operation not supported" or "cannot open file for reading".
I have seen some suggestions to use PyDrive, but they work by copying files from Google Drive into the Colab runtime. I don't get why you would have to copy files back and forth, since I need to iterate over all the files in the folder.
Why can't Google Colab just read the file stored on Drive? Or am I doing something wrong? Another thing is that I uploaded a bunch of CSV files, but Google Drive lists them as ".csv.gsheet" (using glob). Could that be the problem? I have no other ideas.
It is straightforward.
from google.colab import drive
drive.mount('/content/drive')
This will ask you to open a URL, which will authorize the mount after you copy and paste the token.
If you are not able to read files even now, then prefix your file path with this: 'drive/My Drive' and you are good to go.
For example: file = 'drive/My Drive/data/file.txt'
Where data is a directory in my Google Drive containing file.txt file.
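Since the question mentions iterating over all the files in a folder, here is a self-contained sketch of that pattern. It globs a temporary local directory so it runs anywhere; on Colab, after mounting, the pattern would be something like '/content/drive/My Drive/data/*.csv' (the directory name is hypothetical).

```python
import csv
import glob
import os
import tempfile

def read_all_csvs(pattern):
    """Read every CSV matching the glob pattern into a dict of row-lists."""
    tables = {}
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            tables[os.path.basename(path)] = list(csv.reader(f))
    return tables

# Demo against a temporary directory; on Colab you would call
# read_all_csvs('/content/drive/My Drive/data/*.csv') after mounting.
tmp = tempfile.mkdtemp()
for name, rows in [("a.csv", [["x", "y"], ["1", "2"]]),
                   ("b.csv", [["x", "y"], ["3", "4"]])]:
    with open(os.path.join(tmp, name), "w", newline="") as f:
        csv.writer(f).writerows(rows)

tables = read_all_csvs(os.path.join(tmp, "*.csv"))
print(sorted(tables))  # → ['a.csv', 'b.csv']
```

Note this only works for real .csv files in Drive; files converted to the .gsheet format fail with the "operation not supported" error discussed below in this thread.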
I ran into a similar issue last night. As some of the previous responders posted, two concerns influence your ability to read the file: one, making certain that your file is accessible via Google Drive from your Colab notebook, and two, making certain that your file is in the correct format.
I will explain the steps and include a screenshot.
Open Google Colab. Open the File Browser.
Click the icon that says Mount Drive when hovered. This inserts a new cell in your notebook with the code:
from google.colab import drive
drive.mount('/content/drive')
Run the cell. You are prompted to accept permissions and get a token to use to mount the drive. Grant the permissions and copy and paste the code into the text input. Hit enter.
The drive now appears in the file browser. Right click the folder /drive/My Drive or click the three dots action menu and select Upload.
Locate your file on disk and Upload.
The file appears in the File Browser. Right click the File (or use the three dots action menu) and select Copy Path.
Paste that file path into your pd.read_csv() call.
Run the cell with the pd.read_csv function call.
You should now have the file in your Google Drive, accessible to Google Colab, with its formatting preserved because it has not been opened by any other program that could munge the format.
[Screenshot: the example, without the Permission tab, because I previously granted permissions.]
I just tried mounting and creating a Drive file as you described and couldn't reproduce the error you describe.
https://colab.research.google.com/drive/17iiKJPQOPv1eW5-Ctf707mPHXDtipE5G
Perhaps try resetting your backend using the Runtime -> Reset all runtimes menu. Or, can you share a notebook illustrating the problem?
I (partially) found out what was going on, based on Bob Smith's and Ami F's answers.
I believe Google Drive blocks read access to files converted to Drive formats (gsheet, gdoc, etc.). So whenever I tried to use !cat or open, I got an "operation not supported" error. When I tried Bob's example of creating a file and then reading it, it worked in my notebook.
So I managed to prevent Google from converting the files, deleted the old files, and uploaded everything to Drive again. Now all my CSVs are kept unchanged (no .gsheet extension) and I am able to access them using open.
The fact that you see ".csv.gsheet" filenames even though you upload ".csv" filenames makes me think that you're uploading your CSVs to sheets.google.com instead of drive.google.com. Can you confirm that uploading to drive.google.com makes things work?
I do suspect RenatoSz's answer is correct: I can open XLSX files fine, but even just file = open('name_of_file.gsheet') fails for me with Operation not supported error. Annoying that you cannot do the simple action of opening a Google Sheet in Google Colab - this seems like basic functionality.
A workaround for me was:
from google.colab import auth
auth.authenticate_user()

import gspread
import pandas as pd
from oauth2client.client import GoogleCredentials

# authorise
gc = gspread.authorize(GoogleCredentials.get_application_default())
# open the spreadsheet by its sharing URL
gsheets = gc.open_by_url('some_fun_URL')
# read all values from the named worksheet
sheets = gsheets.worksheet('List of all experts').get_all_values()
# parse into a DataFrame, using the first row as the header
df = pd.DataFrame(sheets[1:], columns=sheets[0])
Note that gc.open(...) did not work for me.
You can avoid this problem with the following steps.
Upload the dataset (.csv) to Google Drive.
Now select the uploaded dataset on Google Drive and choose the Share option from the context menu (right-click the selected file).
A pop-up window appears on the screen; change the sharing permission to Editor.
In the bottom left of the pop-up window, change "Restricted (only people added can edit)" to "Anyone with the link can edit".
After these settings are saved, copy the generated shareable link.
Now go to the website below and convert your link by pasting in the previously copied link to generate a direct Google Drive download link.
https://sites.google.com/site/gdocs2direct/
Copy the generated Google Drive direct download link.
We now have a usable address for the dataset.
Note that Python's built-in open() cannot read from a URL, so fetch the link instead, for example:
import pandas as pd
df = pd.read_csv("Paste here the generated link which we copied")
This would sort the issue. The same approach works even for a .txt file (download it with urllib.request.urlretrieve and then open it locally).
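The conversion that site performs can also be done by hand; a sketch, where build_direct_link is a helper introduced here (not part of any library) and the file ID is whatever appears after id= in the shareable link:

```python
def build_direct_link(file_id):
    """Rewrite a Drive file ID into the direct-download URL form."""
    return "https://drive.google.com/uc?export=download&id=" + file_id

# To actually fetch the file you would use urllib rather than open(), e.g.:
# import urllib.request
# urllib.request.urlretrieve(build_direct_link("FILE_ID"), "data.csv")
print(build_direct_link("abc123"))  # → https://drive.google.com/uc?export=download&id=abc123
```

This only works for files shared as "Anyone with the link", as described in the steps above.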

Download a file from a link on another site?

I am attempting to use Google Apps Script to download an Excel file that is linked to on a CRM. That is, instead of having a person log in to the CRM each day, download the new data as an Excel file, and copy and paste the new data into our existing Google Sheet, I would like my script to do so.
Is it possible to do this? I have been looking at UrlFetchApp; however, I'm having trouble because there is no direct link to the file I need (only a link that generates and downloads the file).

How to delete file content in Google Drive?

Is it possible to delete only the file content in Google Drive? i.e. I want to delete the content stream of a particular file, not the file itself.
If possible, can someone explain the process?
Thanks in advance.
You can sync your Google Drive to a local folder and edit the file locally.
How you sync depends on your OS; download the application from https://www.google.com/drive/download/. Once the folder is synced locally, you can update the file (in your case, open it and delete its content) and it will be empty in Drive after the next sync.