I've been developing an Apps Script project for my company that tracks our time and expenses. I've structured the project like so:
The company has a paid G Suite account that owns all the spreadsheets hosted on the company's Google Drive.
Each employee has their own "user" spreadsheet, which is shared from the company G Suite account with the employee's personal Gmail account.
Each of the user spreadsheets has a container-bound script that accesses a central library script.
The library script allows us to update the script centrally and the effects are immediate for each user. It also prevents users from seeing the central script and meddling with it.
Each of the user container-bound scripts has installable triggers that are authorized by the company account, so the code being run has full authority to do what it needs to do to the spreadsheets.
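To illustrate, each container-bound file is little more than a thin wrapper along these lines (the library identifier "TimeLib" is just a placeholder, not our real name):

```javascript
// Thin wrapper living in each user spreadsheet; all the real logic lives
// in the central library ("TimeLib" is a placeholder identifier).
function onUserEdit(e) {
  // Called by an installable onEdit trigger that was authorized by the
  // company account, so the library runs with that account's permissions.
  TimeLib.handleEdit(e);
}
```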
This setup has been working quite well for us with about 40 users. The drawback is that since all the script activity is run by the company account via the triggers, the activity of all our users is logged under the single company account and is therefore capped by the Apps Script server quotas for a single user. This hasn't been much of an issue for us yet, as long as our script is efficient in how it runs. I have looked into deploying this project as a web app for our company, but there doesn't seem to be a good way to control/limit user access to the central files. In other words, if this project were running as a web app installed by each user, each user would need access to all the central spreadsheets that the project uses behind the scenes. And we don't want that.
So with that background, here is my question: how do I efficiently track Apps Script activity to see how close we are to hitting our server quota, and identify which of my functions need to be optimized?
I started doing this by writing an entry into an "activity log" spreadsheet every time the script was called. It tracked which function was called and who the user was, and it had a start-time entry and an end-time entry so I could see how long individual executions took and which ones failed. This was great because I had a live view into the project activity and could graph it using the spreadsheet's charting tools. Where this began to break down was the fact that every execution of the script required two write-actions: one for initialization and another for completion. Since the script is executed every time a user makes an edit to their spreadsheet, during times of high traffic the activity log spreadsheet became inaccessible and errors were thrown all over the place.
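The logging helper looked something like this simplified sketch (the spreadsheet ID and column layout here are placeholders):

```javascript
// Simplified sketch of the old spreadsheet-based activity log.
// 'LOG_SPREADSHEET_ID' and the column layout are placeholders.
function logStart(functionName, userEmail) {
  var sheet = SpreadsheetApp.openById('LOG_SPREADSHEET_ID').getSheetByName('Activity');
  sheet.appendRow([new Date(), functionName, userEmail, 'STARTED']); // write-action #1
  return sheet.getLastRow(); // row to update when the execution finishes
}

function logEnd(row, status) {
  var sheet = SpreadsheetApp.openById('LOG_SPREADSHEET_ID').getSheetByName('Activity');
  sheet.getRange(row, 5, 1, 2).setValues([[new Date(), status]]); // write-action #2
}
```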
So I have since transitioned to tracking activity by connecting each script file to a single Google Cloud Platform (GCP) project and writing to Cloud Logging. Writing logs is a lot more efficient than writing an entry to a spreadsheet, so the high-traffic errors are all but gone. The problem now is that the GCP Logs Explorer isn't as easy to use as a spreadsheet, and I can't graph the logs or sum up the activity to see where we stand with our server quota.
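Each execution now just emits a couple of log entries instead, roughly like this simplified sketch (the function name is a placeholder):

```javascript
// Simplified sketch: with the script attached to a standard GCP project,
// console.log/console.error output lands in Cloud Logging instead of a sheet.
function handleEdit(e) {
  var start = Date.now();
  var user = Session.getActiveUser().getEmail();
  console.log(JSON.stringify({event: 'start', fn: 'handleEdit', user: user}));
  try {
    // ... the actual work ...
    console.log(JSON.stringify({event: 'end', fn: 'handleEdit', user: user,
                                ms: Date.now() - start, status: 'ok'}));
  } catch (err) {
    console.error(JSON.stringify({event: 'end', fn: 'handleEdit', user: user,
                                  ms: Date.now() - start, status: 'error',
                                  message: String(err)}));
    throw err;
  }
}
```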
I've spent some time now trying to figure out how to automatically export the logs from GCP so I can process them in real time. I can see how to download the logs as CSV files, which I can then import into a Google spreadsheet and do the calculations and graphing I need, but this is a manual process and doesn't show live data.
I have also figured out how to stream the logs from GCP by setting up a "sink" that transfers the logs to a "bucket", which can theoretically be read by other services. This got me excited to try out Google Data Studio, which I saw is able to use Google Cloud Storage "buckets" as a data source. Unfortunately, though, Google Data Studio can only read CSV files in Cloud Storage, not the JSON files that my "sink" is generating in my "bucket" for the logs.
So I've hit a wall. Am I missing something here? I'm just trying to get live data showing current activity on our apps script project so I can identify failed executions, see total processing time, and sort the logs by user or function so I can quickly identify where I need to optimize my script.
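For what it's worth, the closest I've come to a programmatic pull is calling the Cloud Logging entries.list REST endpoint from Apps Script and dumping the results into a spreadsheet. This is an untested sketch: the project ID is a placeholder, the filter needs to be adjusted to match how the entries appear in the Logs Explorer, and the manifest has to request a logging-read (or cloud-platform) OAuth scope:

```javascript
// Untested sketch: pull recent log entries via the Cloud Logging REST API
// (entries.list) so they can be written into a spreadsheet and graphed.
// 'MY_GCP_PROJECT_ID' is a placeholder, and the filter below is approximate.
function fetchRecentLogEntries() {
  var oneHourAgo = new Date(Date.now() - 60 * 60 * 1000).toISOString();
  var payload = {
    resourceNames: ['projects/MY_GCP_PROJECT_ID'],
    filter: 'timestamp >= "' + oneHourAgo + '"',
    orderBy: 'timestamp desc',
    pageSize: 500
  };
  var response = UrlFetchApp.fetch('https://logging.googleapis.com/v2/entries:list', {
    method: 'post',
    contentType: 'application/json',
    headers: {Authorization: 'Bearer ' + ScriptApp.getOAuthToken()},
    payload: JSON.stringify(payload)
  });
  var entries = JSON.parse(response.getContentText()).entries || [];
  return entries; // each entry carries a timestamp, severity, and payload
}
```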
You've already referenced using the GCP side of your Apps Script.
Have a look at the Metrics Explorer; it lets you see quota usage per resource and auto-generates graphs for you.
But long term I think rebuilding your solution may be a better idea. At a minimum, switching to submitting data via Google Forms will save you operations.
Related
I intend to sell copies of a Google Sheet with bound scripts that trigger two OAuth scopes, https://googleapis.com/auth/spreadsheets and https://googleapis.com/auth/script.send_mail. Customers will receive a URL that opens a copy of my original sheet. I am told that verification will not travel with the copy. But is the same true of the auth cap? Can a thousand customers open a copy and answer the auth questions without hitting the cap?
The 100-user cap is a limit for unverified apps. In this case, an "unverified app" refers to the GCP project used as a medium to connect to the Google APIs.
As mentioned in the comments by @Ruben, each Apps Script file is connected to its own unique GCP project. Each project has a 100-user cap, so if you're just distributing copies of the file then you can give out as many as you want and they will all have their own limit. They would only run into an issue if they happened to share the same file with 100 other users, though this is kind of a moot point since once they have a copy they can keep creating their own copies anyway.
Also of note is that the number of scopes doesn't matter here. Going through the consent screen counts as one authorization regardless of how many scopes it has.
In short, if all users are using the same script, or different scripts with the same standard project attached, then you have to verify the app or take the 100 auth limit into account. If they're just getting copies of the script with default projects then there's no problem.
I created a small app on top of a spreadsheet (with GAS, HTML, and CSS) and I deployed it.
Users can access it without having to open the spreadsheet.
It works really well, but I'm not able to see even basic analytics (e.g. the number of viewers).
Thanks
Go to the project overview page: in the Google Apps Script web IDE, in the leftmost side panel, click on Overview.
Also, if you have starred your project, go to https://script.google.com/home/starred
Rather than "viewers" you will see "users". If you have set your web-app as execute as you by anyone even anonymous, you will see only one user, you, as this page show the users that exectuted the scripts like the doGet function and the server-side functions called through google.script.run.
Note: https://script.google.com keeps execution logs for the last 7 days. If you need to keep the logs longer, you have to store them somewhere else, e.g. Cloud Logging (requires a Google Cloud standard project) or a Google Sheets spreadsheet.
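For example, you could record each view yourself inside doGet. A minimal, untested sketch (the spreadsheet ID and the 'Index' HTML file name are placeholders):

```javascript
// Minimal sketch: log every web app view so you aren't limited to the
// 7-day execution history. 'ANALYTICS_SPREADSHEET_ID' and 'Index' are
// placeholders; console.log also reaches Cloud Logging when a standard
// GCP project is attached to the script.
function doGet(e) {
  console.log('Web app viewed with parameters: ' + JSON.stringify(e.parameter));
  SpreadsheetApp.openById('ANALYTICS_SPREADSHEET_ID')
      .getSheetByName('Views')
      .appendRow([new Date(), JSON.stringify(e.parameter)]);
  return HtmlService.createHtmlOutputFromFile('Index');
}
```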
Resources
https://developers.google.com/apps-script/guides/logging
Related
Effective way to debug a Google Apps Script Web App
I have a container-bound Apps Script project bound to a form-response Google Sheet, triggered on form submit. The script runs as me. I'm dealing with execution runtimes 6-8x the nominal run time during peak hours of the day, which seems largely correlated with increased traffic of form submissions. The program follows this series of steps:
Form response collects a string of information and triggers the Apps Script Project to execute
Project creates a copy of a complex Google Sheet with a few tabs and a lot of complex formulas
Project pastes the string collected from the form response into this Google Sheet copy, flushes the sheet, and then returns a string of results calculated within the sheet
The Google Sheet file is named, and the Project creates a unique Drive folder where this sheet eventually gets moved
Run complete
The Project performs a wide variety of getValue and setValue calls throughout the run as it updates cell values or reads calculated results. Over the last year, I've improved optimization in many ways (i.e. batch calls to getValues or setValues, batch API calls, etc.). Its normal run time is 25-45 seconds, but it increases to 200+ seconds during my company's peak business hours. Using logs, there is no one particular step that gets hung up; rather, the script lags in all aspects (when it creates the file copy, during SpreadsheetApp.flush(), when appending or deleting rows in other Google Sheets it references by sheet ID, etc.). Although, I'll say a common reason for a failed execution is the error message "Service Spreadsheets timed out while accessing document with id...". But most of the executions complete successfully, just after a lengthy run.
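For context, the batching I mean looks roughly like this (sheet names and ranges are placeholders, not my actual layout):

```javascript
// Illustration of the batching already in place: one setValues write for the
// form response and one getValues read for the calculated results, instead of
// per-cell calls. Sheet names and ranges are placeholders.
function writeResponseAndReadResults(copySpreadsheet, responseValues) {
  var inputSheet = copySpreadsheet.getSheetByName('Inputs');
  inputSheet.getRange(1, 1, 1, responseValues.length).setValues([responseValues]);
  SpreadsheetApp.flush(); // force recalculation before reading the results
  return copySpreadsheet.getSheetByName('Results').getRange('A1:F1').getValues()[0];
}
```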
It seems there is a clear correlation between execution time and traffic. So my questions are:
Can anyone confirm that theory?
Is there a way I can increase the service's bandwidth for my project?
Is there a better alternative to having the script run as me? (mind you I'm performing the rest of my job using Chrome throughout the day while this program continues to automatically run in the background)
This is an Apps Script managed project; it is not tied to a personal GCP Project. Is that a potential factor at play here?
In my company we use a team drive, where we store a template spreadsheet which our API server copies many times. The spreadsheet contains a few custom Google Apps Script functions which should be copied as well.
Right now, when the API copies the spreadsheet, the scripts stop working in the copy. According to this issue, https://issuetracker.google.com/issues/36762799, service accounts can't execute scripts. However, in a Team Drive there are no owners. I even checked the permissions with Permissions.list from the Drive API and I can see that everyone is an organizer, and the API does not allow me to specify another user as an owner, but the scripts still won't work (Error: "Please try saving the project again"). On the other hand, if I go and copy the generated sheet by hand, the scripts execute just fine.
I tried moving everything to an add-on; however, to my displeasure, it appears that service-account-owned files can't execute add-on scripts either (if the file is created by the bot, custom function names simply fail to resolve).
Does anyone know of a way to deal with this situation? Is there a more elegant solution besides using an actual account for the copying if that's even possible to do programmatically?
The goal is to set up a simple relational database in Google Cloud SQL (or BigQuery) that automatically receives/retrieves daily emailed reports. The data source in this case is DoubleClick, but regardless of the source, I'd like to better understand how scheduled email reports (sent as attachments) can be sent to or ingested by Google Cloud SQL.
Is there some other app or service out there to make this connection? Is there a Google product like MS Visual Studio to run jobs? Sorry for the very beginner questions but none of the Google support articles are very helpful!
You will need to use Google Apps Script to tie the process together.
Apps Script can access DoubleClick data and can read from a Gmail account.
Once you have access to the data, you should be able to use the BigQuery service in the script to load the data.
You can then create a scheduled project trigger within the script to poll periodically for new files.
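A rough, untested sketch of that glue (the project, dataset, table, and Gmail search query are placeholders, and the BigQuery advanced service must be enabled for the script):

```javascript
// Rough sketch: find the emailed report, load its CSV attachment into
// BigQuery with the BigQuery advanced service, and run it on a schedule.
function importDailyReport() {
  var threads = GmailApp.search('subject:"Daily DoubleClick Report" newer_than:1d has:attachment', 0, 1);
  if (threads.length === 0) return; // no new report yet
  var messages = threads[0].getMessages();
  var attachment = messages[messages.length - 1].getAttachments()[0];

  var job = {
    configuration: {
      load: {
        destinationTable: {projectId: 'MY_PROJECT', datasetId: 'reports', tableId: 'doubleclick_daily'},
        sourceFormat: 'CSV',
        skipLeadingRows: 1,
        writeDisposition: 'WRITE_APPEND'
      }
    }
  };
  BigQuery.Jobs.insert(job, 'MY_PROJECT', attachment.copyBlob());
}

function createDailyTrigger() {
  // Time-driven trigger that polls once a day for a new report email.
  ScriptApp.newTrigger('importDailyReport').timeBased().everyDays(1).create();
}
```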