Service Account and PyDrive - google-drive-api

I would like to use the Google Drive API to store some backups via a cronjob, but I don't understand how to use PyDrive with a service account. I generated the service account key file and put it in the same directory as my script, named client_secret.json.
Using this code :
#!/usr/bin/env python
# -*- coding: utf8 -*-
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

def main():
    gauth = GoogleAuth()
    drive = GoogleDrive(gauth)
    f = drive.CreateFile({'parent': 'toto'})
    f.SetContentFile('test.drive.py')
    f.Upload()

if __name__ == '__main__':
    main()
Result:
pydrive.settings.InvalidConfigError: Invalid client secrets file Invalid file format.
Well, OK. Then I looked at other posts on SO and found these two:
Automate Verification Process
The code from the first answer on this question returns this:
Traceback (most recent call last):
File "test.drive.py", line 4, in <module>
from oauth2client.client import SignedJwtAssertionCredentials
ImportError: cannot import name SignedJwtAssertionCredentials
And then this one:
Automating pydrive verification process
...which just doesn't help me much.
Where should I start? What should I do? Could someone give me an example of using PyDrive with service account authentication just to upload a file?
EDIT:
After some more research, it seems I needed to install pycrypto to fix the import error described above. I don't know why, as it is not mentioned in the error message.

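For reference, here is a minimal sketch of a service-account upload with PyDrive. It assumes the JSON key file sits next to the script as client_secret.json and that oauth2client >= 2.0 is installed, where ServiceAccountCredentials replaces the removed SignedJwtAssertionCredentials:

from oauth2client.service_account import ServiceAccountCredentials
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

# Build credentials straight from the service account JSON key file,
# bypassing PyDrive's client_secrets flow entirely.
scope = ['https://www.googleapis.com/auth/drive']
credentials = ServiceAccountCredentials.from_json_keyfile_name('client_secret.json', scope)

gauth = GoogleAuth()
gauth.credentials = credentials  # skip the browser-based OAuth flow

drive = GoogleDrive(gauth)
f = drive.CreateFile({'title': 'backup.tar.gz'})  # 'backup.tar.gz' is a made-up example file
f.SetContentFile('backup.tar.gz')
f.Upload()

Since a cronjob has no browser, injecting the credentials this way avoids any interactive verification step.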

Related

Flask Application (MySQL) - KeyError: 'migrate'

When performing a migration with the commands flask db init and flask db migrate, I receive the following error: directory = current_app.extensions['migrate'].directory KeyError: 'migrate'.
I have created a Migrate object within my __init__.py file, but I still receive the error stated above:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from config import app_config
from flask_login import LoginManager
from flask_migrate import Migrate

# Creating Flask app.
app = Flask(__name__)

# Database variable initialisation.
db = SQLAlchemy()
login_manager = LoginManager()

def create_app(config_name):
    app = Flask(__name__, instance_relative_config=True)
    app.config.from_object(app_config[config_name])
    app.config.from_pyfile('config.py')
    db.init_app(app)

    # Creating login manager object and initialising it.
    # Login view and message prevent a user from accessing a page they are not authorised to.
    login_manager.init_app(app)
    login_manager.login_message = "Please login to access this page."
    login_manager.login_view = "auth.login"

    migrate = Migrate(app, db)

    from app import models
    return app
Any advice on what I am doing wrong? I have already viewed quite a few pages related to this error and implemented the suggested alterations.
I found the problem. The above __init__.py file was completely fine.
My config.py file contained the following line:
SQLALCHEMY_DATABASE_URI = 'mysql+mysqlconnector://{db_user}:{my_password}@localhost/{database}'
... which had a .format() extension appended to it that was not needed. After removing it, I successfully updated my database.
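For illustration, a guess at what the corrected line looks like once the placeholders are filled in directly and the trailing .format() call is dropped (user, password, and database name here are made up):

# config.py -- hypothetical corrected line; literal values, so no .format() needed
SQLALCHEMY_DATABASE_URI = 'mysql+mysqlconnector://flask_user:secret@localhost/flask_app'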

Apache2 No permission to write file [Errno 13] Permission denied Flask Python

A few details first
So I made a little web application with Flask.
In theory it should get the IP whenever someone requests or visits the website.
I have everything done (on Windows my code runs perfectly), but I installed Flask and moved my project over to a Linux server where I have Apache2 installed. I've configured Apache so it handles the requests for the Flask web app.
Everything else is fine; my templates load just fine, but the part that logs the IP doesn't work.
I think getting the IP is no problem, though storing it in, say, a JSON file is.
Every time I try, I get a 500 error on my website.
Apache error log: [Errno 13] Permission denied: '/opt/iplogs/iplog.json'
The Python Code
import json
from datetime import datetime
from flask import Flask, request, render_template

app = Flask(__name__)

def writeToJSONFile(path, fileName, data):
    filePathNameWExt = path + fileName + '.json'
    # 'a' appends a new JSON object on every visit
    with open(filePathNameWExt, 'a') as fp:
        json.dump(data, fp, indent=2)

@app.route("/")
def getIP():
    visit = {}
    ip_visit = request.remote_addr
    now = datetime.now()
    request_time = now.strftime("%d/%m/%Y %H:%M:%S")
    visit["IP"] = str(ip_visit)
    visit["date"] = str(request_time)
    writeToJSONFile("/opt/iplogs/", "iplog", visit)  # when I comment this call out, there is no 500 error
    return render_template("home.html")
The Main Problem
So on Windows, in a development environment, it works fine, and also on Linux when I just let Flask run without Apache handling its requests.
Only when I run the website through Apache do I get the error "Permission denied".
So it must have something to do with Apache and its permission to write?
Note that the folder where my Flask (Python) code lives is completely different from where the IPs are logged.
Also, I use Ubuntu and I didn't change anything regarding file permissions; heck, I'm even running as root (I know I shouldn't do that, but it's only for testing a very small project).
That's all I can give you.
Thanks for all the responses.
Try this:
sudo chown -R www-data:www-data /opt/iplogs/
The Apache2 user www-data has no permission to manipulate this file.
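If you want to confirm which user actually handles the request before changing ownership, a quick throwaway sketch (assuming a Unix host and that app is your Flask instance) prints the effective user of the WSGI process:

import os
import pwd

@app.route("/whoami")  # hypothetical debug route; remove after testing
def whoami():
    # Under Apache/mod_wsgi this is typically www-data, not the user who owns /opt/iplogs
    return pwd.getpwuid(os.geteuid()).pw_name

When run through Flask's built-in server as root it will print root, which is why the same code works there but fails under Apache.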

Azure Batch :Elevating the user privileges during Pool Creation using Azure CLI

I need to mount Azure File Storage to Linux pools when they are being spun up. I am following the instructions given here to achieve that: mounting Azure-File Storage to Batch. Specifically, in my Azure CLI script, under the pool's start task commands, I am inserting something which looks like this:
--start-task-command-line="apt-get update && apt-get install cifs-utils && mkdir -p {} && mount -t cifs {} {} -o vers=3.0,username={},password={},dir_mode=0777,file_mode=0777,serverino".format(_COMPUTE_NODE_MOUNT_POINT, _STORAGE_ACCOUNT_SHARE_ENDPOINT, _COMPUTE_NODE_MOUNT_POINT, _STORAGE_ACCOUNT_NAME, _STORAGE_ACCOUNT_KEY)
but when I run the tasks with the auto-user that Batch uses by default, I get an error in the stderr.txt file mentioning that it was unable to create the "/mnt/MyAzureFileshare" directory, so my guess is that the mounting didn't occur during the pool creation process. I saw a very similar question to the one I am facing: setting custom user identity for tasks, and even the official Microsoft documentation covers this in detail: Run Tasks under User accounts in Batch, but none of them shed light on how to achieve this using the Azure CLI.
Installing the specific packages needed to mount Azure File Storage requires sudo privileges, and I am unable to elevate through the Azure CLI. To recreate the error, I would recommend having a look at this: app to replicate the issue.
What I want to achieve is:
1) Create a pool with Azure File Storage mounted on it, and elevate the privileges of the auto-user to admin level, using the Azure CLI
2) Run tasks as the same auto-user with admin privileges, using the Azure CLI
Update 1:
I was able to mount Azure File Storage with Batch using the Azure CLI. I am still not able to populate the Azure File Storage with the output files of the app that I deployed on the Batch nodes. I get no error in the stderr.txt files.
The output of the stderr.txt file is:
WARNING: In "login" auth mode, the following arguments are ignored: --account-key
Alive[################################################################] 100.0000%
Finished[#############################################################] 100.0000%
pdf--->png: 0%| | 0/1 [00:00<?, ?it/s]
pdf--->png: 100%|##########| 1/1 [00:00<00:00, 1.16it/s]WARNING: In "login" auth mode, the following arguments are ignored: --account-key
WARNING: uploading /mnt/batch/tasks/workitems/pdf-processing-job-2018-10-29-15-36-15/job-1/mytask-0/wd/png_files-2018-10-29-15-39-25/akronbeaconjournal_20180108_AkronBeaconJournal_0___page---0.png
Alive[################################################################] 100.0000%
Finished[#############################################################] 100.0000%
The Python App that was deployed on the Batch Nodes is:
import os
import fitz
import subprocess
import argparse
import time
from tqdm import tqdm
import sentry_sdk
import sys
import datetime

def azure_active_directory_login(azure_username, azure_password, azure_tenant):
    try:
        azure_login_output = subprocess.check_output(["az", "login", "--service-principal", "--username", azure_username, "--password", azure_password, "--tenant", azure_tenant])
    except subprocess.CalledProcessError:
        sentry_sdk.capture_message("Invalid Azure Login Credentials")
        sys.exit("Invalid Azure Login Credentials")

def download_from_azure_blob(azure_storage_account, azure_storage_account_key, input_azure_container, file_to_process, pdf_docs_path):
    file_to_download = os.path.join(input_azure_container, file_to_process)
    try:
        subprocess.check_output(["az", "storage", "blob", "download", "--container-name", input_azure_container, "--file", os.path.join(pdf_docs_path, file_to_process), "--name", file_to_process,
                                 "--account-key", azure_storage_account_key, "--account-name", azure_storage_account, "--auth-mode", "login"])
    except subprocess.CalledProcessError:
        sentry_sdk.capture_message("unable to download the pdf file")
        sys.exit("unable to download the pdf file")

def pdf_to_png(input_folder_path, output_folder_path):
    pdf_files = [x for x in os.listdir(input_folder_path) if x.endswith((".pdf", ".PDF"))]
    pdf_files.sort()
    for pdf in tqdm(pdf_files, desc="pdf--->png"):
        doc = fitz.open(os.path.join(input_folder_path, pdf))
        page_count = doc.pageCount
        for f in range(page_count):
            page = doc.loadPage(f)
            pix = page.getPixmap()
            if pdf.endswith(".pdf"):
                png_filename = pdf.split(".pdf")[0] + "___" + "page---" + str(f) + ".png"
                pix.writePNG(os.path.join(output_folder_path, png_filename))
            elif pdf.endswith(".PDF"):
                png_filename = pdf.split(".PDF")[0] + "___" + "page---" + str(f) + ".png"
                pix.writePNG(os.path.join(output_folder_path, png_filename))

def upload_to_azure_blob(azure_storage_account, azure_storage_account_key, output_azure_container, png_docs_path):
    try:
        subprocess.check_output(["az", "storage", "blob", "upload-batch", "--destination", output_azure_container, "--source", png_docs_path,
                                 "--account-key", azure_storage_account_key, "--account-name", azure_storage_account, "--auth-mode", "login"])
    except subprocess.CalledProcessError:
        sentry_sdk.capture_message("Unable to upload file to the container")

def upload_to_fileshare(png_docs_path):
    try:
        subprocess.check_output(["cp", "-r", png_docs_path, "/mnt/MyAzureFileShare/"])
    except subprocess.CalledProcessError:
        sentry_sdk.capture_message("unable to upload to azure file share ")

if __name__ == "__main__":
    # Credentials
    sentry_sdk.init("<Sentry Creds>")
    azure_username = <azure_username>
    azure_password = <azure_password>
    azure_tenant = <azure_tenant>
    azure_storage_account = <azure_storage_account>
    azure_storage_account_key = <azure_account_key>
    try:
        parser = argparse.ArgumentParser()
        parser.add_argument("input_azure_container", type=str, help="Location to download files from")
        parser.add_argument("output_azure_container", type=str, help="Location to upload files to")
        parser.add_argument("file_to_process", type=str, help="file link in azure blob storage")
        args = parser.parse_args()
        timestamp = time.time()
        timestamp_humanreadable = datetime.datetime.fromtimestamp(timestamp).strftime('%Y-%m-%d-%H-%M-%S')
        task_working_dir = os.getcwd()
        file_to_process = args.file_to_process
        input_azure_container = args.input_azure_container
        output_azure_container = args.output_azure_container
        pdf_docs_path = os.path.join(task_working_dir, "pdf_files" + "-" + timestamp_humanreadable)
        png_docs_path = os.path.join(task_working_dir, "png_files" + "-" + timestamp_humanreadable)
        os.mkdir(pdf_docs_path)
        os.mkdir(png_docs_path)
    except Exception as e:
        sentry_sdk.capture_exception(e)
    azure_active_directory_login(azure_username, azure_password, azure_tenant)
    download_from_azure_blob(azure_storage_account, azure_storage_account_key, input_azure_container, file_to_process, pdf_docs_path)
    pdf_to_png(pdf_docs_path, png_docs_path)
    upload_to_azure_blob(azure_storage_account, azure_storage_account_key, output_azure_container, png_docs_path)
    upload_to_fileshare(png_docs_path)
The upload_to_fileshare() call in the Python app above should initiate the upload, but in my case nothing happens, and there is no error from the copy operation in the stderr.txt files.
Please let me know a way to troubleshoot this issue.
It does not look like the elevation level is exposed as a dedicated command-line argument in the CLI. You can, however, pass a JSON file to the --json-file argument, formatted as the REST API request body, to access the full set of features.
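For illustration, a sketch of what such a JSON file could look like. The startTask.userIdentity.autoUser.elevationLevel property comes from the Batch REST API pool definition; the pool id, VM image, and command line here are placeholders:

{
  "id": "mypool",
  "vmSize": "STANDARD_A1",
  "virtualMachineConfiguration": {
    "imageReference": {
      "publisher": "Canonical",
      "offer": "UbuntuServer",
      "sku": "18.04-LTS",
      "version": "latest"
    },
    "nodeAgentSKUId": "batch.node.ubuntu 18.04"
  },
  "targetDedicatedNodes": 1,
  "startTask": {
    "commandLine": "/bin/bash -c 'apt-get update && apt-get -y install cifs-utils && ...'",
    "waitForSuccess": true,
    "userIdentity": {
      "autoUser": {
        "scope": "pool",
        "elevationLevel": "admin"
      }
    }
  }
}

With that saved as, say, pool.json, the pool would be created with az batch pool create --json-file pool.json, and the start task then runs with admin elevation.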

cx_freeze to access .json files

I have created an application for Windows using Python's cx_Freeze module. The application uses the openpyxl module, which runs fine as a plain script, but when frozen it fails to find the .constants.json files. The following error is displayed.
FileNotFoundError: [Errno 2] No such file or directory: 'C:....\exe.win-amd64-3.4\library.zip\openpyxl.constants.json'
I have found a fix for this (https://cx-freeze.readthedocs.org/en/latest/faq.html#using-data-files), detailed below:
import os
import sys

def find_data_file(filename):
    if getattr(sys, 'frozen', False):
        # The application is frozen
        datadir = os.path.dirname(sys.executable)
    else:
        # The application is not frozen
        # Change this bit to match where you store your data files:
        datadir = os.path.dirname(__file__)
    return os.path.join(datadir, filename)
The question I have is: where do I paste this code? Does it go in the setup.py file, or somewhere else?
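For what it's worth, the helper is runtime code, so it belongs in the application script you freeze rather than in setup.py (setup.py only runs at build time). A hedged usage sketch, with settings.json standing in for whatever data file you ship; you would also list that file under include_files in the build_exe options so it gets copied next to the executable:

# main.py -- the script you freeze, not setup.py
import json

# find_data_file defined as above; settings.json is a made-up example data file
with open(find_data_file('settings.json')) as fp:
    settings = json.load(fp)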

web2py function not triggered on user request

Using web2py (Version 2.8.2-stable+timestamp.2013.11.28.13.54.07) on 64-bit Windows, I have the following problem:
There is an exe program that is started on user request (first a txt file is created, then p is triggered).
p = subprocess.Popen(['woshi_engine.exe', scriptId], shell=True, stdout = subprocess.PIPE, cwd=path_1)
While the exe file is running, it creates a txt file.
The program is stopped on user request by deleting the file the program needs as input.
While the exe is running, there are other requests the user can trigger. The request does reach the server (I used Microsoft Network Monitor to check that), but the function is not triggered.
I tried using the scheduler, but with no success; same problem.
I am really stuck on this problem.
Thank you for your help.
With the help of the web2py Google group, here is the solution.
I used the scheduler. I created a scheduler.py file with the following code:

def runWoshiEngine(scriptId, path):
    import os, sys
    import time
    import subprocess
    p = subprocess.Popen(['woshi_engine.exe', scriptId], shell=True, stdout=subprocess.PIPE, cwd=path)
    return dict(status=1)

# db is the web2py DAL instance defined in the model files
from gluon.scheduler import Scheduler
scheduler = Scheduler(db)
In my controller function
task = scheduler.queue_task(runWoshiEngine, [scriptId, path])
You also have to import the scheduler in the controller (from gluon.scheduler import Scheduler).
Then I run the scheduler from the command prompt with the following (so, if I understood correctly, you have two instances of web2py running: one for the web server, one for the scheduler):
web2py.py -K woshiweb -D 0 (-D 0 is for verbose logging, so it can be removed)
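If you need to confirm that the queued task actually ran, one option (a sketch, assuming the scheduler's default table names) is to look the task up in the scheduler_task table from a controller:

# 'task' is the row returned by scheduler.queue_task() above
record = db.scheduler_task[task.id]
print(record.status)  # e.g. QUEUED, RUNNING, COMPLETED, FAILED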