How to reduce the "URL Fetch calls" refresh rate in Google Sheets? - google-apps-script

I have a spreadsheet that collects data from an external JSON API to build a database. I know this JSON API updates once a day. My spreadsheet makes about 500 "URL Fetch calls" to fill the whole database, because it is one call per row and the database has about 500 rows. In addition, I also publish the sheet via "File > Share > Publish to web", and I know there is an audience of 50-100 people who access that page.
According to Google's quotas, URL Fetch calls are limited to 20,000 per day, as stated in the "URL Fetch calls" row on this page:
https://developers.google.com/apps-script/guides/services/quotas
If I check the option "Automatically republish when changes are made", I notice that the bottom of the published page shows "Updated automatically every 5 minutes"; but with this option checked, the "URL Fetch calls" limit is exceeded by the middle of the day.
So I have some questions:
Does the spreadsheet make 500 calls every 5 minutes even if the data has not changed? If so, that is 144,000 calls per day, so is there any way to change this interval from 5 minutes to, say, 12 hours?
By changing ImportJSON.gs, is there any way to reduce this refresh rate to just 1 or 2 times a day?
Here is the function I use to collect the JSON data:
https://github.com/bradjasper/ImportJSON/blob/master/ImportJSON.gs
At the moment I haven't tried anything to solve it, as I don't have much knowledge about ImportJSON. I expect the solution to be a code change in the ImportJSON function, a Google Sheets configuration, or something like that.

Use the Cache Service to make your custom function call external services less often. See Apps Script Best Practices for sample code. I am not affiliated with any product, website or book related to the Cache Service or the sample code, and I am not employed by any company that is so affiliated.
If all data rows can be retrieved in one go, get them all with one call instead of duplicating your spreadsheet formula 500 times.
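As an illustration of that advice, here is a minimal sketch, assuming a placeholder API URL and cache key, of how CacheService can wrap a single bulk fetch so the external API is hit at most once per cache window, no matter how often the sheet recalculates:

```javascript
// Sketch only: wraps one bulk UrlFetchApp call in CacheService so the
// external API is queried at most once per cache window (up to 6 hours).
// API_URL and the cache key are placeholders for your own values.
var API_URL = 'https://example.com/data.json'; // hypothetical endpoint

function fetchJsonCached() {
  var cache = CacheService.getScriptCache();
  var cached = cache.get('api-data');
  if (cached != null) {
    return JSON.parse(cached);        // reuse the cached payload
  }
  var text = UrlFetchApp.fetch(API_URL).getContentText();
  // Cached values must be strings and at most 100 KB per key.
  cache.put('api-data', text, 21600); // 21600 s = 6 h, the maximum
  return JSON.parse(text);
}
```

Since the API only changes once a day, you could also pair this with a time-driven trigger that writes the parsed rows into the sheet once or twice a day, so the published page never triggers fetches on its own.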

Related

Using BigQuery as a data storage/cache system for a Google Sheets add-on

I have a Google Sheets add-on with custom formulas that fetch data from my API to show in their results.
The problem for many users is that the add-on frequently reaches the UrlFetch quota. So I'm trying to use another source of data for my formulas, and I've been trying to set up BigQuery for that (I know it is not meant to be used like this).
My approach would be something like this:
When a user executes a formula, I first look in BigQuery to see if the data is already there; if not, I fetch it from the API and then store the result in BigQuery.
I tried a proof of concept, where I added a custom function to my add-on with the code sample in https://developers.google.com/apps-script/advanced/bigquery,
replaced projectId with my own, and queried a sample table.
When I executed the formula I got this error:
GoogleJsonResponseException: API call to bigquery.jobs.query failed with error: Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.
I also tried executing the same code from a function called from the sidebar frontend, where I got this error instead:
User does not have bigquery.jobs.create permission in project
From these errors I'm guessing I have to assign a BigQuery role in IAM to each user, but I have hundreds of them.
Is there any way for any add-on user to access the BigQuery project, or is my whole approach wrong?
How about using the Cache service instead of BigQuery? The Cache service is simple to use, does not require authentication, and also works in a custom function context.
Note these limitations:
The maximum length of a key is 250 characters.
The maximum amount of data that can be stored per key is 100KB.
The maximum expiration time is 21,600 seconds (6 hours), but cached data may be removed before this time if a lot of data is cached.
The cap for cached items is 1,000. If more than 1,000 items are written, the cache stores the 900 items farthest from expiration.
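If it helps, here is a rough sketch of the cache-first pattern in a custom function; the function name, API URL, and key prefix are placeholders. Anything not found in the cache falls through to UrlFetchApp and is stored for later calls:

```javascript
// Sketch of a cache-first custom function. The API URL is hypothetical;
// the cache key is derived from the formula argument and must stay under
// the 250-character key limit.
function MYFORMULA(id) {
  var cache = CacheService.getScriptCache();
  var key = 'myapi:' + id;
  var cached = cache.get(key);
  if (cached != null) {
    return cached;                     // served from cache, no UrlFetch call
  }
  var value = UrlFetchApp
      .fetch('https://api.example.com/items/' + encodeURIComponent(id))
      .getContentText();
  cache.put(key, value, 21600);        // keep for up to 6 hours
  return value;
}
```

If the data has to survive longer than the 6-hour cache ceiling, PropertiesService or a hidden sheet can act as a slower second tier, though each has its own size limits.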

Google Drive API /files slow response

I want to ask for help/ideas on the issue I will describe below.
Our iOS app allows users to access their Google Drive files.
We use Changes API (https://developers.google.com/drive/api/v3/reference/changes). The main pre-condition to using this API is to build a local DB that holds the snapshot of the user's Drive file tree and the token. To initially fill the DB we must request the list of all files from user's Drive. Getting the list of all files (with metadata) takes too long for many of our users. This is the issue I want to address.
We request files with the series of Files requests (https://developers.google.com/drive/api/v3/reference/files/list). Most requests are plain files?q=trashed%20%3D%20false.
For example, at my own private Google Drive:
69K files
initial request of all files takes 5+ minutes with my current network speed (Download 527 Mbps, Upload 417 Mbps; ping www.googleapis.com – 40–45 ms)
~150 requests
each request brings information about ~460 files
each request takes around 2-2.5 seconds
Sometimes I observed requests taking up to 6 seconds, which means that getting the full file list took 15 minutes for my account.
If I look at the Developer Console, the latency is below 0.1s
Many of our users have Drives far bigger than mine. A typical iOS app user session is not long enough to complete the initial request. We do save every intermediate page token, so data received during a single app session is not lost if the user leaves the app; in the next session we keep downloading from the last saved token. But there are still cases when our app needs the DB to be filled before starting some operations; in those cases our users see a "Pending..." progress indicator and complain that our app is slow.
So, questions:
is it possible to improve the described request speed/latency?
maybe there's some quota that we are missing and it can be changed?
maybe someone can advise a more effective way of getting the full file list?
P.S. We could potentially reduce the number of requests. We have to perform some double checks for Shared with Me folders, as we observed that sometimes a request for all files doesn't list all files from shared folders. That's a bit of a side story, and I don't think it will dramatically improve the situation for us. I can provide more details on the actual set of requests we perform if necessary.
Are you returning all the fields? I would assume so, since the only query parameter provided is trashed=false. Do you need all of them? Can you try reducing the query to return only the fields you really care about (using a field mask) and see if that improves your performance?
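As an illustration of the field-mask suggestion (shown in Apps Script for consistency with the rest of this page, but the same query parameters apply to any HTTP client, including an iOS app), here is a sketch where the chosen fields are only an example of what a file-tree snapshot might need:

```javascript
// Sketch: the same files.list call, but with a fields mask so only the
// metadata needed for the local snapshot is returned. Adjust the field
// list to whatever your DB actually stores. Requires a Drive OAuth scope.
function listFilesWithFieldMask(pageToken) {
  var params = {
    q: 'trashed = false',
    pageSize: '1000',
    fields: 'nextPageToken,files(id,name,mimeType,parents,modifiedTime,size)'
  };
  if (pageToken) params.pageToken = pageToken;
  var query = Object.keys(params)
      .map(function (k) { return k + '=' + encodeURIComponent(params[k]); })
      .join('&');
  var response = UrlFetchApp.fetch(
      'https://www.googleapis.com/drive/v3/files?' + query,
      { headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() } });
  return JSON.parse(response.getContentText()); // { files: [...], nextPageToken }
}
```

A smaller response per page tends to reduce transfer time, and requesting the files.list maximum pageSize of 1000, if you are not already, can cut the number of round trips compared to ~460 files per page.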

How to reload a Gmail add-on through a background process using Apps Script

May I get some help on the points below? I am using Apps Script to develop a Gmail add-on.
How can we refresh a Gmail add-on with a background process?
=> Here is my case: I need to display a card with multiple sections, which involves hitting multiple APIs to fetch the data for the card. Initially we show a card with minimal information to the user; once I get the information from the APIs, I need to update the basic card with the complete information.
How can we trigger a function every time a mail thread is opened?
=> Currently it works only once per mail. As explained in the point above, we need to refresh the card once we have fetched the data; otherwise the user will see the same basic-information card every time they open the mail.
For the first point, we are trying to find a solution where we can hit the service at a certain interval to check data availability and, if the data exists, fetch it and update the cards. In other words, we need something like a setTimeout function, but unfortunately we did not find this in Apps Script. We found sleep/wait lock functions, but our services may take a little time to fetch the data, as they connect through multiple services, so we can't make the user wait until the whole process completes. Instead we show a card with the basic information first, and then we need to auto-refresh the card once we have fetched the data. We tried keeping a refresh button for the user to click to fetch the updated data, but that hurts the user experience; we want to auto-refresh, without user intervention, to show the updated information.
We need a process/solution to auto-refresh the card, without user intervention, once the data is available on our end, instead of making the user wait until the process is complete.
An early reply would be very helpful.
Thanks.
If a data status on a third-party backend changes as the result of a user interaction with your add-on UI, it is recommended that the add-on set a 'state changed' bit to true so that any existing client side cache is cleared. See the ActionResponseBuilder.setStateChanged() method description for additional details.
The card-based interface in Gmail add-ons is an Apps Script service.
You can interlink it with other Apps Script services as well as implement API calls, everything within the same Apps Script file.
Gmail add-on contents automatically update every time the user opens a different e-mail or refreshes the browser.
Within your Apps Script code you can install time-driven triggers to run the data-availability check with a customized frequency.
Consider installing an auto-refresh browser extension for your users if you do not want them to refresh the card themselves.
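To make the setStateChanged() point concrete, here is a minimal sketch of an action handler (for example, bound to a "Load details" button on the basic card) that swaps in the complete card once the slow data is available; buildCompleteCard() is a hypothetical helper that does the multi-API work:

```javascript
// Sketch: replace the basic card with the complete one and flag the state
// as changed so Gmail does not keep serving a stale cached card.
// buildCompleteCard() is a hypothetical helper, not part of CardService.
function onLoadDetails(e) {
  var completeCard = buildCompleteCard(e); // your own card-building code
  return CardService.newActionResponseBuilder()
      .setNavigation(CardService.newNavigation().updateCard(completeCard))
      .setStateChanged(true)
      .build();
}
```

Note that this still runs in response to a user action or a contextual trigger; a time-driven trigger can pre-fetch and cache the data so the card builds quickly the next time the thread is opened, but it cannot push a redraw into an already open card by itself.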

Best way to refresh the thumbnail/base url of Google Photos/Google Drive API

Google suggests retrieving the base URL again when needed, 60 minutes after the original query, because the URLs expire.
So far, so good. But what if I'm developing a photo gallery and I'm displaying 5000 of them in a grid? Should I query the API again and again? Google Photos uses a maximum page size of 100 (instead of 1000 for Google Drive), so that means many requests if that's the case.
I'm already caching the photos locally, but when the user scrolls to another section, the URL will have expired after one hour.
What is the best solution for that?
Google has a batch request for that use case:
https://developers.google.com/photos/library/reference/rest/v1/mediaItems/batchGet
However, the maximum number of items you can request per call is 50, so you will still need to queue several of those requests.
Personally, I made it automatic in my code: when the baseUrl I'm loading returns a 403, it automatically gets the updated mediaItem object and retries.
Also, you should not cache base URLs, or media objects in general, across different app launches (assuming you're writing an app). You should retain only the mediaItem id and reload it, or the entire collection, when needed.
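For reference, here is a rough sketch of refreshing expired baseUrls in batches through that endpoint, written with UrlFetchApp for consistency with the rest of this page (the same request shape applies to any HTTP client). The 50-id chunking matches the batchGet limit, and getAccessToken() is a hypothetical stand-in for however your app obtains an OAuth token:

```javascript
// Sketch: refresh stale baseUrls, at most 50 media item ids per batchGet
// call. getAccessToken() is a placeholder for your own token handling.
function refreshBaseUrls(mediaItemIds) {
  var refreshed = {};
  for (var i = 0; i < mediaItemIds.length; i += 50) {
    var chunk = mediaItemIds.slice(i, i + 50);
    var url = 'https://photoslibrary.googleapis.com/v1/mediaItems:batchGet?' +
        chunk.map(function (id) {
          return 'mediaItemIds=' + encodeURIComponent(id);
        }).join('&');
    var response = UrlFetchApp.fetch(url, {
      headers: { Authorization: 'Bearer ' + getAccessToken() }
    });
    JSON.parse(response.getContentText()).mediaItemResults.forEach(function (r) {
      if (r.mediaItem) refreshed[r.mediaItem.id] = r.mediaItem.baseUrl;
    });
  }
  return refreshed; // map of id -> fresh baseUrl, valid for roughly 60 minutes
}
```

In a gallery you would typically call something like this lazily, for the ids that are about to scroll into view, rather than re-requesting all 5000 items at once.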

Google Drive Email Recipient Quota

Referencing this limit: https://script.google.com/dashboard
We have a few forms through our Google Drive, and I use the FormEmailer script (placed in the script editor) to send out a summary notification to two recipients once someone fills it out.
Occasionally, we will have a meeting/gathering where a whole bunch of people will fill out these forms at once.
I did a good amount of research on this but I am getting conflicting information. Some webpages say you can't increase that limit, some say you can (how? We would be willing to pay). It's not a whole lot of extra e-mail, maybe 200 a day instead of the 100 limit.
I see this page but don't see a way to increase it: https://developers.google.com/apps-script/guides/services/quotas
Thanks!
There's no way to increase the limit for Google Apps Script.
One possible workaround is to have multiple scripts that perform the same function and send script1 to some people and script2 to others, or something similar. Alternatively, if the two recipients are always the same two people, you can consider placing them in a group and sending the email to the group instead, which counts as sending to only one person (thus doubling your current limit).
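For the group suggestion, a minimal sketch, assuming summary@yourdomain.com is a hypothetical Google Group that contains both recipients; each call then counts as one recipient against the daily quota:

```javascript
// Sketch: send the form summary to a Google Group instead of to two
// individual addresses, so each submission consumes only one recipient
// from the daily email quota. Address and contents are placeholders.
function sendSummary(summaryText) {
  MailApp.sendEmail(
      'summary@yourdomain.com',  // hypothetical group address
      'New form submission',     // subject
      summaryText);              // plain-text body
}
```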