Any alternative to uploading files to Drive without using service account keys?

Context:
I have a script that uploads zip files to Drive using Python and the Drive API. To authorize the API calls I'm using a client-secret.json service account key. Because of internal changes I need to stop using service account keys. Is there an alternate way to use the Drive API with a normal account instead of a service account?

You can use a desktop-application credential instead of a service account.
If you check the Python quickstart, under Prerequisites you will see:
Authorization credentials for a desktop application. To learn how to create credentials for a desktop application, refer to Create credentials.
Then you will need to change the way the script authorizes its credentials. You can use the quickstart code for that as well:
import os.path

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# Scope for creating/uploading files; adjust to your needs.
SCOPES = ['https://www.googleapis.com/auth/drive.file']

creds = None
# The file token.json stores the user's access and refresh tokens, and is
# created automatically when the authorization flow completes for the first
# time.
if os.path.exists('token.json'):
    creds = Credentials.from_authorized_user_file('token.json', SCOPES)
# If there are no (valid) credentials available, let the user log in.
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file(
            'credentials.json', SCOPES)
        creds = flow.run_local_server(port=0)
    # Save the credentials for the next run
    with open('token.json', 'w') as token:
        token.write(creds.to_json())

service = build('drive', 'v3', credentials=creds)
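With the authorized service in hand, the upload itself looks the same as it did with a service account. A minimal sketch (the file name and MIME type are placeholders for your zip):

from googleapiclient.http import MediaFileUpload

# 'backup.zip' is a placeholder; substitute your real file.
file_metadata = {'name': 'backup.zip'}
media = MediaFileUpload('backup.zip', mimetype='application/zip')
uploaded = service.files().create(
    body=file_metadata,
    media_body=media,
    fields='id'
).execute()
print(f"File ID: {uploaded.get('id')}")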

Related

Google auth credentials error. Requires a JSON file, provided a TOML file

I am trying to host a project that uses Google Cloud Vision on Streamlit Cloud. Streamlit requires the Google auth credentials to live in a .toml file, while Google Cloud expects them in .json.
import io
import os

from google.cloud import vision


def detect_document(path):
    cred = ".streamlit/secrets.toml"
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = cred
    client = vision.ImageAnnotatorClient()
    with io.open(path, 'rb') as image_file:
        content = image_file.read()
    image = vision.Image(content=content)
I cannot put my GCP credentials on GitHub since that is a security risk. Is there any workaround or suggestion for this issue?
As @ferdy suggested, read Connect Streamlit to Google Cloud Storage.
The steps you are probably missing:
Copy your app secrets to the cloud.
As the secrets.toml file above is not committed to GitHub, you need to pass its content to your deployed app (on Streamlit Community Cloud) separately.
This avoids storing the credentials on GitHub, which, as you said, you (rightfully) don't want to do.
Use the newly stored credentials:
import streamlit as st
from google.oauth2 import service_account
from google.cloud import storage

# Create API client.
credentials = service_account.Credentials.from_service_account_info(
    st.secrets["gcp_service_account"]
)
client = storage.Client(credentials=credentials)
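Since your code uses Vision rather than Storage, the same pattern carries over. A minimal sketch, assuming your service account JSON content is stored under a gcp_service_account section of your app's secrets, as in the guide:

import streamlit as st
from google.oauth2 import service_account
from google.cloud import vision

# Build the Vision client directly from the in-memory secrets,
# so no .json key file ever touches the repository.
credentials = service_account.Credentials.from_service_account_info(
    st.secrets["gcp_service_account"]
)
client = vision.ImageAnnotatorClient(credentials=credentials)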

How to access a Cloud Storage bucket from a script running in a Cloud Run job

I'm able to create an image from sources and create and run a Cloud Run job through the gcloud CLI:
gcloud builds submit --pack image=gcr.io/<my-project-id>/logger-job
gcloud beta run jobs create helloworld --image gcr.io/<my-project-id>/logger-job --region europe-west9
gcloud beta run jobs execute helloworld --region=europe-west9
From the CLI, I can also upload files to a bucket by running the following Python script:
import sys
from google.cloud import storage


def get_bucket(bucket_name):
    storage_client = storage.Client.from_service_account_json(
        'my_credentials_file.json')
    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    found = False
    for bucket in buckets:
        if bucket_name in bucket.name:
            found = True
            break
    if not found:
        print("Error: bucket not found")
        sys.exit(1)
    print(bucket)
    return bucket


def upload_file(bucket, fname):
    destination_blob_name = fname
    source_file_name = fname
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)
    print(
        f"File {source_file_name} uploaded to {destination_blob_name}."
    )


if __name__ == "__main__":
    bucket = get_bucket('MyBucket')
    upload_file(bucket, 'testfile')
Before I can even figure out how authentication towards the bucket works when running this script through a job, I'm getting errors on the target:
ModuleNotFoundError: No module named 'google'
That kind of makes sense, but I don't know how to include the google.cloud module when running the script in the job. Or should I access the bucket in another way?
There are multiple questions and you would benefit from reading Google's documentation.
When (!) the Python code that interacts with Cloud Storage is combined into the Cloud Run deployment, it will run using the identity of its Cloud Run service. Unless otherwise specified, this will be the default Cloud Run Service Account. You should create and reference a Service Account specific to this Cloud Run service.
You will need to determine which IAM permissions you wish to confer on the Service Account. Generally you'll reference a set of permissions using a Cloud Storage IAM role.
You're (correctly) using a Google-provided client library to interact with Cloud Storage, so you will be able to benefit from Application Default Credentials. These make it easy to run your Python code on, for example, Cloud Run as a Service Account.
Lastly, you should review Google's Cloud Storage client library (for Python) documentation to see how to use e.g. pip to install the client library.
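Putting the ADC point concretely: once the job runs as a service account with a suitable Storage role, the explicit key file can simply be dropped. A minimal sketch:

from google.cloud import storage

# No key file needed: on Cloud Run, Application Default Credentials
# resolve to the job's service account automatically.
storage_client = storage.Client()
for bucket in storage_client.list_buckets():
    print(bucket.name)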
Place a requirements.txt file, containing the 3rd party modules that are necessary for running the code, in the folder that is being built when executing
gcloud builds submit --pack image=gcr.io/<my-project-id>/logger-job
The buildpack will then automatically pip install those modules when the image is built.
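For the script above, a minimal requirements.txt only needs the Cloud Storage client library:

google-cloud-storage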

What client_secret.json file is required for Google Drive file access?

I am getting this error:
File "/usr/local/lib/python3.7/site-packages/pydrive/auth.py", line 388, in LoadClientConfigFile
raise InvalidConfigError('Invalid client secrets file %s' % error)
pydrive.settings.InvalidConfigError: Invalid client secrets file ('Error opening file', 'client_secrets.json', 'No such file or directory', 2)
All I want is to list files and folders using PyDrive!
I tried creating credentials, but on the consent page it's showing unverified status.
From the credentials page, under the OAuth 2.0 Client IDs tab, I created credentials and downloaded the credential file secret_json_[.....].json.
Looking into the documentation for PyDrive:
Click ‘Download JSON’ on the right side of Client ID to download client_secret_<really long ID>.json.
The downloaded file has all authentication information of your application. Rename the file to client_secrets.json and place it in your working directory.
So probably you are lacking the last step of renaming the file and placing it in the working directory.
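Once client_secrets.json is in the working directory, listing files and folders is short. A minimal sketch, assuming the standard local-webserver flow:

from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

# Expects client_secrets.json in the current working directory.
gauth = GoogleAuth()
gauth.LocalWebserverAuth()  # opens a browser for the consent screen
drive = GoogleDrive(gauth)

# List files and folders in the Drive root.
for f in drive.ListFile({'q': "'root' in parents and trashed=false"}).GetList():
    print(f['title'], f['id'])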
I tried creating credentials, but on the consent page it's showing unverified status.
This should not matter at all. Unverified just means that you are trying to access sensitive scopes, and until Google verifies your application there will be an extra screen indicating that it is an unverified app. That is okay for personal/development use.

How to pass the gactions token to gactions when it is called by an exec node (without an operator)

It is impossible to pass the token, because I have no way to paste any code.
The gactions CLI command is issued by the application itself.
When gactions is launched, it replies with a URL for accessing the account and getting the token.
But as gactions is started by the code itself (an exec node inside Node-RED), I have no way to paste the token returned by Google.
So, is there any other method to register the action.json file with the app?
I already tried pasting the token into the same Node-RED exec node, unsuccessfully, because the PID changes each time a new command is passed.
I also tried pasting the token using a daemon node, again unsuccessfully (error).
Here is how my gaction file is called:
gactions test -preview_mins 9999999 -action_package action.json -project my-test-app-11111
Actually, the Node-RED node returns the URL, which has to be pasted into a browser.
Then I must sign in to my account on the Google Actions website, and then I get a token.
What I need is a way to start gactions with the parameters and the token on the same command line, or to have gactions (on Linux) register by itself with my login credentials.
On a client, when you log into gactions, it will create a creds.data file in that directory, which contains information about your login credentials to automatically authenticate later. If this file is not found, you will be asked to authenticate to get these credentials. If your system has the creds.data file located in the same directory, you should be able to run commands without the explicit need to login each time.

Reauthenticate or remove PyDrive authentication during runtime in Google Colab

I want to reauthenticate my Google Drive credentials so that I can log in from a different Google account and access it using PyDrive. How can I do that?
I am using the following code to authenticate:
from google.colab import auth
from oauth2client.client import GoogleCredentials
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
I want to get a fresh token and the option to log in again with a different Google account.
Thanks for helping!
Try deleting the credential file produced by auth.authenticate_user() (see the auth module):
rm /content/adc.json
I have just found a solution to this issue. Go to your Google Account Settings -> Security -> Signing in to other sites (all the way down) -> YOUR PROCESS NAME and simply remove it. If you then try listing the files in the authenticated drive again, it should fail with "Token has been expired or revoked."
Just make sure, when you create and enter your correct account credentials, to delete the credentials/client_secret.json file, as settings.yaml does not seem to overwrite it.
To reset the credentials, just remove the gdrive config folder with a shell command from a code cell:
rm ~/.config/Google/DriveFS/ -rf
I ran into a similar problem where I clicked on the wrong account for Google Drive authentication. None of the files I needed were available, and I couldn't figure out a way to revoke the token from the comments above. I tried both restarting the runtime and switching from GPU to TPU. With the latter, the authorization prompt returned. However, I'm not sure whether the switch itself worked or the token refresh time had simply been reached.
I factory reset my runtime and was able to get it to accept a different credential.
Runtime -> factory reset runtime