GmailApp limitations with Google Apps Script

I have a script running in Google Sheets that collects data via a Google Form. An "On form submit" trigger runs the script, which creates a PDF file and sends it out as an attachment to a list of 15-30 email addresses on my team. Several of these are sent out per day as progress reports so the whole team can see daily progress.
The script eventually fails with an error saying the Gmail service has been invoked too many times.
I am using GmailApp.sendEmail.
I have looked extensively through the Google documentation on limits and quotas, but I cannot seem to understand why it is invoked too many times. It's my understanding that I should be allotted 1,500 recipients per day with my business G Suite account.
This account sends through SMTP and also sends an attachment with every email. The script runs several times per day.
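For reference, the send is roughly like the sketch below (the function and variable names are placeholders, not my exact code). If the real code calls sendEmail once per recipient, each call counts as a separate Gmail service invocation; sending one email to a comma-separated recipient list still counts every address against the daily recipient quota, but only invokes the service once per report.

```javascript
// Minimal sketch, assuming recipients is an array of addresses and pdfBlob is the
// generated report. Names here are placeholders.
function sendProgressReport(recipients, pdfBlob) {
  GmailApp.sendEmail(
    recipients.join(','),              // one send for the whole team
    'Daily progress report',
    'The latest progress report is attached.',
    { attachments: [pdfBlob] }
  );
}
```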

Related

Accessing the Gmail mailbox by bot written in Apps Script

I am creating a Google Chat bot using Apps Script.
Every x minutes it has to check my team's mailbox for incoming messages. I used the Gmail service to achieve this functionality.
Is there any option so that the function responsible for checking can be run as a specific user (e.g. myteamsmalibox#gmail.com)? For now I have all my code working, but the function runs through my own mailbox.
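For illustration, a minimal sketch of the kind of periodic check I mean is below (the search query and interval are placeholders). An installable time-driven trigger always runs as the account that created it, which is why everything currently goes through my own mailbox; running it as the shared mailbox would mean creating and authorizing the trigger while logged in as that account (or using the Gmail API with delegation instead of GmailApp).

```javascript
// Sketch only: poll the mailbox for recent unread threads every few minutes.
function createCheckTrigger() {
  ScriptApp.newTrigger('checkTeamMailbox')
    .timeBased()
    .everyMinutes(5)                       // "every x minutes"
    .create();
}

function checkTeamMailbox() {
  // Runs as whichever account created and authorized the trigger.
  const threads = GmailApp.search('is:unread newer_than:1d', 0, 10);
  threads.forEach(function (thread) {
    Logger.log(thread.getFirstMessageSubject());
  });
}
```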

Apps Script Activity Reporting/Visualization

I've been developing an Apps Script project for my company that tracks our time/expenses. I've structured the project like so:
The company has a paid G Suite account that owns all the spreadsheets hosted on the company's Google Drive.
Each employee has their own "user" spreadsheet which is shared from the company G Suite account with the employee's personal Gmail account.
Each of the user spreadsheets has a container-bound script that accesses a central library script.
The library script allows us to update the script centrally and the effects are immediate for each user. It also prevents users from seeing the central script and meddling with it.
Each of the user container-bound scripts has installable triggers that are authorized by the company account, so the code being run has full authority to do what it needs to do to the spreadsheets (a sketch of this pattern follows below).
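A minimal sketch of that trigger-plus-library pattern, with placeholder identifiers (CentralLib and the function names are not our real names):

```javascript
// Container-bound script in each user spreadsheet. All real logic lives in the
// central library, added to the project under the identifier "CentralLib".
function installEditTrigger() {
  // Run once while logged in as the company account so the trigger
  // executes with that account's authority.
  ScriptApp.newTrigger('onSheetEdit')
    .forSpreadsheet(SpreadsheetApp.getActiveSpreadsheet())
    .onEdit()
    .create();
}

function onSheetEdit(e) {
  CentralLib.handleEdit(e);   // forwarded to the central library
}
```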
This setup has been working quite well for us with about 40 users. The drawback to this setup is that since all the script activity is run by the company account via the triggers, the activity of all our users is logged under the single company account and is therefore capped by the Apps Script server quotas for a single user. This hasn't been much of an issue for us yet as long as our script is efficient in how it runs. I have looked into deploying this project as a web app for our company, but there doesn't seem to be a good way to control/limit user access to the central files. In other words, if this project were running as a web app installed by each user, each user would need access to all the central spreadsheets that the project uses behind the scenes. And we don't want that.
So with that background, here is my question: how do I efficiently track Apps Script activity to see how close we are to hitting our server quota, and identify which of my functions need to be optimized?
I started doing this by writing an entry into an "activity log" spreadsheet every time the script was called. It tracked which function was called and who the user was, and it had a start time entry and an end time entry so I could see how long individual executions took and which ones failed. This was great because I had a live view into the project activity and could graph it using the spreadsheet's graphing tools. Where this began to break down was the fact that every execution of the script required two write actions: one for initialization and another for completion. Since the script is executed every time a user makes an edit to their spreadsheet, during times of high traffic the activity log spreadsheet became inaccessible and errors were thrown all over the place.
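For context, the spreadsheet log worked roughly like the sketch below (the spreadsheet ID, sheet name, and column layout are placeholders). The two write calls per execution are what caused the contention under heavy traffic.

```javascript
// Sketch only: wrap a unit of work in start/end log entries.
function loggedRun(functionName, user, work) {
  const sheet = SpreadsheetApp.openById('ACTIVITY_LOG_ID').getSheetByName('Log');
  sheet.appendRow([functionName, user, new Date(), '', 'started']);  // write 1: initialization
  const rowIndex = sheet.getLastRow();   // fragile when several users write at once
  try {
    work();
    sheet.getRange(rowIndex, 4, 1, 2).setValues([[new Date(), 'ok']]);  // write 2: completion
  } catch (err) {
    sheet.getRange(rowIndex, 4, 1, 2).setValues([[new Date(), 'failed: ' + err]]);
    throw err;
  }
}
```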
So I have since transitioned to tracking activity by connecting each script file to a single Google Cloud Platform (GCP) project and using the Logger API. Writing logs is a lot more efficient than writing an entry to a spreadsheet, so the high-traffic errors are all but gone. The problem now is that the GCP log browser isn't as easy to use as a spreadsheet, and I can't graph the logs or sum up the activity to see where we stand with our server quota.
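The logging itself is now roughly like this sketch (the field names are placeholders). With the scripts attached to a standard GCP project, console output lands in Cloud Logging, and writing one JSON string per execution keeps the function, user, and duration easy to filter on in the Logs Explorer.

```javascript
// Sketch only: one JSON log line per execution.
function logExecution(functionName, user, startMs, status) {
  console.log(JSON.stringify({
    fn: functionName,
    user: user,
    durationMs: Date.now() - startMs,
    status: status
  }));
}
```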
I've spent some time now trying to figure out how to automatically export the logs from GCP so I can process them in real time. I see how to download the logs as CSV files, which I can then import into a Google spreadsheet to do the calculations and graphing I need, but this is a manual process and doesn't show live data.
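The kind of automated pull I'm imagining would look roughly like the sketch below (untested; it assumes the script's manifest requests the https://www.googleapis.com/auth/cloud-platform scope and that the project and spreadsheet IDs are replaced with real values), using the Cloud Logging API's entries:list endpoint to copy recent entries into a sheet.

```javascript
// Sketch only: pull recent Cloud Logging entries into a spreadsheet.
function importRecentLogs() {
  const response = UrlFetchApp.fetch('https://logging.googleapis.com/v2/entries:list', {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() },
    payload: JSON.stringify({
      resourceNames: ['projects/GCP_PROJECT_ID'],   // placeholder project ID
      orderBy: 'timestamp desc',
      pageSize: 100
    })
  });
  const entries = JSON.parse(response.getContentText()).entries || [];
  const rows = entries.map(function (e) {
    return [e.timestamp, e.severity, e.textPayload || JSON.stringify(e.jsonPayload)];
  });
  if (rows.length) {
    SpreadsheetApp.openById('LOG_SHEET_ID').getSheetByName('Logs')  // placeholder IDs
      .getRange(2, 1, rows.length, rows[0].length).setValues(rows);
  }
}
```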
I have also figured out how to stream the logs from GCP by setting up a "sink" that transfers the logs to a "bucket" which can theoretically be read by other services. This got me excited to try out Google Data Studio, which I saw is able to use Google Cloud Storage "buckets" as a data source. Unfortunately, though, Google Data Studio can only read CSV files in Cloud Storage, not the JSON files that my "sink" is generating in my "bucket" for the logs.
So I've hit a wall. Am I missing something here? I'm just trying to get live data showing current activity on our Apps Script project so I can identify failed executions, see total processing time, and sort the logs by user or function so I can quickly identify where I need to optimize my script.
You've already referenced using the GCP side of your Apps Script.
Have a look at Metrics Explorer; it lets you see quota usage per resource and auto-generates graphs for you.
But long term I think rebuilding your solution may be a better idea. At a minimum, switching to submitting data via Google Forms will save you operations.

How to continue using google scripts?

Google sent me an email to do app verification to be able to continue using the Gmail API. My app is basically a bunch of scripts running in spreadsheets that send an email notification when a change occurs.
Having to get a G Suite account and go through app verification just so my scripts can survive seems like a very heavy process.
Does anyone know if all these spreadsheet scripts will now be stripped of their ability to send emails in the new year? Or can they continue as long as you are running them only under your own account?
I have not been able to see any documentation on this to confirm one way or another.
Seeing the process described for verification here (Google app script verify) scares me!

Google script UrlFetch quota does not reset/replenish

I have been having this issue on multiple Google accounts (all consumer accounts, not G Suite). Whenever I try to run any requests that use the UrlFetch service, I keep getting the error:
"Service invoked too many times for one day: urlfetch."
I went to myaccount.google.com/permissions?pli=1 and revoked permissions for all scripts that could use external services. In fact, I shut down ALL access for all the scripts. That was last week!
And yet I still don't have the quota replenished/restored.
I have read everything the Google Developers website has to say about the quota.
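To at least see how quickly the quota is being consumed, something like the sketch below could count calls per day in script properties (the property key format is arbitrary); as far as I can tell, revoking a script's permissions does not reset the daily usage counter on Google's side.

```javascript
// Sketch only: wrap UrlFetchApp.fetch and keep a per-day call count.
function countedFetch(url, params) {
  const props = PropertiesService.getScriptProperties();
  const key = 'urlfetch-' + Utilities.formatDate(new Date(), 'UTC', 'yyyy-MM-dd');
  const count = Number(props.getProperty(key) || 0) + 1;
  props.setProperty(key, String(count));
  Logger.log('UrlFetch call #' + count + ' today');
  return UrlFetchApp.fetch(url, params || {});
}
```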

Google Apps Script Failed

A couple of years back, I installed a script called "Monthly Stats for Gmail" from another Gmail user. It would give me my own analytics on when I send and receive email, and who I send to and receive from. I am now getting this error email every day at the same, seemingly arbitrary time of 1:13 AM. I have combed through the Google Apps Script documentation as well as the links given in the email. Is there a way to "uninstall" time-based triggers when I was not the originator or author? I am fairly frustrated and would appreciate your help. I am getting the error in the form of the email below. Thx.
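If the old project still shows up at script.google.com and can be opened, a one-off function like the sketch below would remove every trigger that project created under your account (triggers can also be deleted individually from the dashboard's Triggers page without any code).

```javascript
// Sketch only: run once inside the old project to delete its triggers.
function removeAllTriggers() {
  ScriptApp.getProjectTriggers().forEach(function (trigger) {
    ScriptApp.deleteTrigger(trigger);
  });
}
```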