Does this risk exceeding the Google Sheets API call quota? - google-apps-script

I am using a Google spreadsheet to collaborate on some common data, processing this data using spreadsheet macros and saving the results there, in the cells themselves. This approach is error prone, as the macro functions that process the data depend on inputs other than what is given as parameters.
Example of how the common data looks:
Content Sheet
Sections Sheet
Pages Sheet
Explanation of this common data
The three sheets are filled by various collaborators
The Content sheet defines the base elements of a page; they are referenced (by their UUIDs) in the Sections sheet (using macros), and finally all sections are combined to make a publishable HTML page.
The output HTML format varies depending on the destination, of which there are several: static HTML, Google Docs, Google Slides, WordPress, MediaWiki, etc. The formatting is done using GAS macros.
I tried various solutions, but nothing worked properly. Finally, I have decided to keep the Google spreadsheet as a data source only and move the entire computation to my local machine. This means I have to download the data of the three sheets, which is some 1,000 rows now but may become voluminous with time.
Earlier, when the entire computation was on the Google spreadsheet, I only had to fetch the final data and use it, which amounted to far fewer API calls. Referring to the example above, it means I would only fetch the output HTML of the "Pages" sheet.
Q1) So my question is: given that I plan to move the entire computation to my local machine, if I make only 3 API calls to bulk-fetch the data of the three sheets, does that still count as just 3 API calls, or do large requests have a different API cost? Are there any hidden risks of overrunning the quota with this approach?
Q2) And let's say I use hooks, both spreadsheet hooks and Drive hooks, to sync with real-time changes; what are my risks of running out of quota? Does each hook call constitute an API call? Any general advice on best practices for doing this?
Thanks

As the documentation says, by default the Google Sheets API has a limit of 500 requests per 100 seconds per project and 100 requests per 100 seconds per user, unless you change it on your project.
A1) If you make a single request, it counts as 1 request no matter how large the data is. For that reason, I'd make a single request to GET the entire spreadsheet rather than making 3 API calls, and then do the whole process you mentioned before.
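For instance, a minimal sketch using the advanced Sheets service (it must be enabled for the script); the spreadsheet ID and range names here are placeholders, not from the question:

function fetchAllSheets() {
  // One batchGet call retrieves all three sheets and counts as a single request.
  var response = Sheets.Spreadsheets.Values.batchGet('SPREADSHEET_ID', {
    ranges: ['Content!A:Z', 'Sections!A:Z', 'Pages!A:Z'] // hypothetical ranges
  });
  response.valueRanges.forEach(function (vr) {
    Logger.log('%s: %s rows', vr.range, vr.values ? vr.values.length : 0);
  });
}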
A2) If you've considered using push notifications, keep in mind that you will need some extra configuration, such as a domain.
As a workaround, I recommend using Drive File Stream.
Since you want to move the computation to your local machine, this option would be better but will need some changes. For example, you could change the output to CSV in order to handle the data more easily (it is still a valid Google Sheets file type), and then host your app on the same machine or make your local machine accessible.

Related

How can I optimize my Google Sheet's importXML() calls with an Apps Script to avoid loading errors?

I've been trying to use importXML in Google Sheets to import specific data (in this case, only the player name) for several players via the Steam Web API.
I encountered what seems to be a limit on the number of importXML calls I can make in my sheet, because I get loading errors:
Loading data may take a while because of the large number of requests. Try to reduce the amount of IMPORTHTML, IMPORTDATA, IMPORTFEED or IMPORTXML functions across spreadsheets you've created.
This list will likely grow (it's currently at about 170 rows) and I need a way for the sheet to handle the calls. I don't need the data to update very frequently (even 2-3 times a day is sufficient).
I've tried the code I found in another SO post, but that seems to refresh all the importXML calls at once, so I still got loading errors.
From what I've researched so far, it seems like I'll need to use an Apps Script to optimize my sheet by spacing the calls out in intervals. Is there a way I could have a script do the following (a rough sketch follows at the end of this question)?
Call 25 rows (or whichever limit is optimal)
Wait some amount of time
Call next 25 rows
Continue till the end of the sheet, then restart the loop
I'm not too savvy with writing functions, so I don't know how to edit the code to achieve that. Any help would be appreciated.
If you'd like to take a look at the spreadsheet I'm working with, here it is. For now, only column B has the importXML calls, and the URLs are concatenated using cells in column H. So there's one importXML call per row.
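A minimal sketch of that batch-and-wait approach, replacing importXML with UrlFetchApp and a time-driven trigger. The sheet name, the column layout, and the <personaname> tag being extracted are assumptions for illustration, not taken from the actual spreadsheet:

var BATCH_SIZE = 25;

function fetchNextBatch() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1'); // hypothetical name
  var props = PropertiesService.getScriptProperties();
  var lastRow = sheet.getLastRow();
  if (lastRow < 2) return; // nothing but the header row

  var start = Number(props.getProperty('nextRow')) || 2; // row 1 is the header
  if (start > lastRow) start = 2; // end of sheet reached: restart the loop
  var end = Math.min(start + BATCH_SIZE - 1, lastRow);

  // Read the batch of URLs from column H, fetch each one, extract the name.
  var urls = sheet.getRange(start, 8, end - start + 1, 1).getValues();
  var names = urls.map(function (row) {
    if (!row[0]) return [''];
    try {
      var xml = UrlFetchApp.fetch(row[0]).getContentText();
      // Crude stand-in for importXML's XPath; adjust the pattern to the real API response.
      var match = xml.match(/<personaname>(.*?)<\/personaname>/);
      return [match ? match[1] : ''];
    } catch (e) {
      return ['error: ' + e.message];
    }
  });

  // Write the whole batch to column B in one call, then remember our position.
  sheet.getRange(start, 2, names.length, 1).setValues(names);
  props.setProperty('nextRow', String(end + 1));
}

Installing a time-driven trigger on fetchNextBatch (for example, every hour) provides the "wait some amount of time" step without keeping a script running.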

Apps Script formula in Google spreadsheet cell gets invalidated and needs forced reload

I'm managing a student registration system based on Google Docs. Students enter their registration data in a Google Form, the results are gathered in a Google spreadsheet ("Préinscriptions") that uses additional sheets to normalize the data, and the data is then imported by another spreadsheet ("Inscriptions") to handle the registrations.
In two normalization sheets in the Google spreadsheet, I use custom Apps Script functions to normalize the data from the response sheets. The functions are used in a vertical concatenation array to gather normalized data from several response sheets into the normalization sheets. The formulas look like this:
On tab "Nouveaux":
={normalizeNouveau("FR", 'FR - Nouveau'!$A$1:$AB);normalizeNouveau("EN", 'EN - Nouveau'!$A$1:$AB); normalizeNouveau("ES", 'ES - Nouveau'!$A$1:$AB)}
On tab "Anciens":
={normalizeAncien("FR", 'FR - Ancien'!$A$1:$AB); normalizeAncien("EN", 'EN - Ancien'!$A$1:$AB); normalizeAncien("ES", 'ES - Ancien'!$A$1:$AB)}
normalizeNouveau and normalizeAncien are the custom functions that normalize the data line-wise. "FR - Nouveau", "EN - Nouveau", ..., "ES - Ancien" are the response sheets.
Now the problem is that, while this all globally works great, the formulas that gather the normalized data, on both the "Anciens" and "Nouveaux" sheets, sometimes get "disconnected" (for lack of a better word) and display error messages for each formula -- either a single one or three, corresponding, I expect, to the three blocks of data produced by the various function calls. When that happens (usually several times a day), what I need to do is (simply :-| ) cut the formulas from their original cells, validate the change, then paste them back. The data then reappears normally.
[Screenshot: when the problem happens]
[Screenshot: after cut-pasting]
That is very problematic because, as I said, another file ("Inscriptions") uses that data to handle many aspects of the registration process, and when this happens, that whole file becomes empty and useless. This can even lead to errors if the problem occurs at a time when I'm running an Apps Script function, because the data is then effectively absent. So when that happens in "Inscriptions", I need to go back to "Préinscriptions" to perform the little trick I just described. Doing it myself is annoying enough, but other people might need to use that file too and would simply get stuck or, worse, break something if the data simply disappears without warning.
Sometimes the connection seems to break between "Préinscriptions" and "Inscriptions", which loads the data from yet a third sheet where "Nouveaux" and "Anciens" get merged. The same trick applies, although it may need several attempts to successfully show the data.
Can anyone tell me why this happens and if there's anything I can do to prevent it? I'm relying heavily on a stable dataflow between the forms, the response sheets and the registration spreadsheet for the system to work, and this is breaking the workflow way too often to be negligible.
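A small sketch that automates the manual cut-and-paste trick described above; the spreadsheet ID, sheet names, and formula cells are placeholders, not from the original files:

function reconnectFormulas() {
  var ss = SpreadsheetApp.openById('PREINSCRIPTIONS_ID'); // hypothetical ID
  ['Nouveaux', 'Anciens'].forEach(function (name) {
    var cell = ss.getSheetByName(name).getRange('A2'); // wherever the array formula lives
    var formula = cell.getFormula();
    if (!formula) return;
    cell.clearContent();
    SpreadsheetApp.flush(); // commit the deletion, mimicking the manual "cut and validate"
    cell.setFormula(formula); // paste it back to force a recalculation
  });
}

Run from a time-driven trigger or a custom menu, this at least spares collaborators the manual fix, even if it doesn't address the root cause.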

Can one iterate through bound scripts and edit their manifest files via a script?

I am building an application that will have many users, each of whom will have many Google documents. Each doc will have a custom menu and that custom menu will invoke a library script. I may need or want to change the coding in that library script from time to time.
As changes to a library script must be "saved" as a new version in order for the changed version to be passed on to client scripts (in my case, the scripts bound to Google Docs), I need a way for users to "batch" update the version number in their docs' bound-script appsscript.json file.
I have researched this issue and there seem to be two general alternatives: set the client scripts' library mode to "Development" or use an add-on.
The problem with the former is that it won't work unless the users are all granted edit access to the library script (which seems like a particularly bad idea, as the users may well not even be known to me).
The problem with the latter is essentially complication and cost. If I make the add-on private, it only works for users in the same domain, which means I have to create a G Suite domain (and pay, as of this writing, at least $72 per year per user -- a non-starter for this project).
If I make the add-on public, in addition to the complication, I have to sign up for the Google Cloud Platform, and the costs for that require one to navigate a veritable maze of choices and alternatives, such that at this point I really have no idea what the cost per service or user would be.
Below I present some "mock-up" code that should at least indicate the direction I am trying to go.
function upDate() {
  // Mock-up only: there is no real API for iterating bound scripts (see the answer below).
  var version = 23;
  // Imaginary iterator over the scripts one sees at https://script.google.com/u/0/home,
  // analogous to DriveApp.getFiles().
  var scripts = getBoundScripts(); // hypothetical
  while (scripts.hasNext()) {
    var script = scripts.next();
    // Note: all of the scripts have the same name, as they commence life bound to a
    // template, which is duplicated to create the rest of the user's docs.
    if (script.getName() === TARGET_SCRIPT_NAME) {
      // Desired effect: set dependencies.libraries[].version in appsscript.json to `version`.
    }
  }
}
I don't even know if it's possible to step through bound scripts the way one steps through files in Google Drive, so that is the first question. The second question is whether, assuming you can step through the scripts one by one, you can change a manifest value -- in this case, the version number.
One cannot step through container-bound scripts, as they are (no longer) located in one's Google Drive. Moreover, despite Google's documentation about using a "stable" value in the version section of the manifest, that documentation appears erroneous. Finally, one cannot programmatically edit standalone scripts.
However, there is a workaround. What I ended up doing was writing a script that steps through all of the involved Google Docs and copies them to a blank template (i.e., in effect, duplicates them all). That blank template has the bound script installed in it with the new version number of the library. Then the same script deletes the original docs and voilà, a batch update to all of the target docs is accomplished. (One drawback: if Google Doc revision history is important to you, be advised that this gambit jettisons it, unless you keep the original versions.)
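A sketch of that workaround; the template and folder IDs are placeholders, and it assumes all target docs live in a single folder:

function rebindDocs() {
  var template = DriveApp.getFileById('TEMPLATE_DOC_ID'); // carries the bound script with the new library version
  var folder = DriveApp.getFolderById('TARGET_FOLDER_ID'); // hypothetical
  var files = folder.getFilesByType(MimeType.GOOGLE_DOCS);
  while (files.hasNext()) {
    var original = files.next();
    if (original.getId() === template.getId()) continue;
    // Duplicate the template (and thus its bound script), then copy the content over.
    var copy = template.makeCopy(original.getName(), folder);
    var srcBody = DocumentApp.openById(original.getId()).getBody();
    var dstDoc = DocumentApp.openById(copy.getId());
    var dstBody = dstDoc.getBody();
    dstBody.clear();
    for (var i = 0; i < srcBody.getNumChildren(); i++) {
      var el = srcBody.getChild(i).copy();
      var type = el.getType();
      if (type === DocumentApp.ElementType.PARAGRAPH) dstBody.appendParagraph(el.asParagraph());
      else if (type === DocumentApp.ElementType.TABLE) dstBody.appendTable(el.asTable());
      else if (type === DocumentApp.ElementType.LIST_ITEM) dstBody.appendListItem(el.asListItem());
    }
    dstDoc.saveAndClose();
    original.setTrashed(true); // discards the original's revision history, as noted above
  }
}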

Delay in changing Google Spreadsheet content via Google apps Script?

I am totally new to Google Apps Script, and this may be a very poor question.
I am making a basic setup using Google Forms and Google Apps Script.
Based on the responses to the form, I change the content of a Google spreadsheet accordingly.
For example, suppose my query from the form needs 10,000 records to be selected and written into a whole other spreadsheet.
I just wanted to know: is there some kind of delay introduced when I set and get values of spreadsheet cells on such a large scale? If so, what does it depend on, and how, as a programmer, can I remove or optimize it?
Thanks in advance!
The Best Practices article by Google is the primary reference for this. The most important advice is to minimize the number of calls to spreadsheet methods by batching operations. So, select a range that contains all the records you need, get them all at once with getValues, process them without further interaction with the spreadsheet, and write the result out using setValues.
If you follow this advice, 10,000 records is still a reasonable amount of data for a script to process.
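A minimal sketch of that batched read-process-write pattern, with hypothetical sheet names and a trivial transformation standing in for the real processing:

function processRecords() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var source = ss.getSheetByName('Responses'); // hypothetical
  var target = ss.getSheetByName('Output');    // hypothetical

  // One read for all 10,000 rows instead of 10,000 getValue() calls.
  var rows = source.getDataRange().getValues();

  // Process entirely in memory: no spreadsheet calls inside the loop.
  var output = rows.map(function (row) {
    return [row[0], String(row[1]).toUpperCase()];
  });

  // One write for the whole result.
  target.getRange(1, 1, output.length, output[0].length).setValues(output);
}

Each getValue/setValue pair is a round trip to the spreadsheet, so the delay grows roughly linearly with the number of calls; the batched version makes exactly two.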

Autocompletion based on large sheet (2000+ rows)

When using Google Spreadsheets, and you want to use Validation on a cell based on a range of values, you get a pretty nice autocompletion feature that makes data entry much nicer.
My most common application is an inventory-like situation, where I reference inventory items through some kind of hash code or part number. My frequently used hashes are committed to my brain, but when I need a new part or a variation on an old one, I want a little help making sure I have the correct part# selected.
I always find that I want additional row context with my autocompletion, so now I think I want to make a sidebar add-on that has smarter searching rules and also includes more contextual data to ensure that I have the part# I meant. Once I am sure of the part#, one button can push the selected result over to the currently active row.
This solution is a bit "heavier" than data validation, but it does exactly what I want.
Assuming that my inventory source is another spreadsheet, what is a good way to set up my Addon-Script Project?
I was thinking that my sidebar would call an HtmlService function that uses the Cache Service to hold my "hash list" and a few bits of context in memory. I don't think I am looking at a heavy jQuery solution (only enough to build the autocomplete dialog as I type), but that is really the whole purpose of this question!
Any thoughts on a high-level project overview? I am fairly new to Apps Script in general, especially since newer APIs have been coming out since 2013.
I did exactly that with my budget sheets: I moved from data validation to jQuery's Autocomplete in a sidebar when the number of compositions jumped from 500 to 2,500, and it is a LOT faster; the search is faster than the validation autocomplete was with 100 items. The logic I use:
Database:
Its base data is in a spreadsheet. Each time the data is updated, an onEdit function schedules a DB update several minutes later; this is so that the function won't run unnecessarily several times consecutively for the same edit.
The DB is then stored as plain text in JSON format on Google Drive. It is a 2 MB file generated from the spreadsheet data using DriveApp.getFileById(id).setContent(JSON.stringify(myJsonDataFromSpreadsheet)); generating and saving the file takes up to 30 seconds, and reading it takes around 4 seconds.
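A sketch of that debounced update, assuming hypothetical sheet and file names and a script bound to the spreadsheet. The on-edit handler must be installed as an installable trigger, since simple triggers can't create other triggers; each edit replaces any pending rebuild, so a burst of edits produces a single rebuild a few minutes later:

function onEditInstallable(e) {
  // Debounce: drop any rebuild already scheduled, then schedule a fresh one.
  ScriptApp.getProjectTriggers().forEach(function (t) {
    if (t.getHandlerFunction() === 'rebuildDb') ScriptApp.deleteTrigger(t);
  });
  ScriptApp.newTrigger('rebuildDb').timeBased().after(5 * 60 * 1000).create();
}

function rebuildDb() {
  var data = SpreadsheetApp.getActiveSpreadsheet()
      .getSheetByName('Inventory') // hypothetical sheet name
      .getDataRange().getValues();
  DriveApp.getFileById('DB_FILE_ID') // hypothetical file ID
      .setContent(JSON.stringify(data));
}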
Sidebar:
Build a normal HTML file -- remember to use the IFRAME sandbox option -- and serve it; this is all in the docs. You'll have to send data from the HTML to Apps Script (e.g. the part# to insert) via google.script.run, and get data back (e.g. the file with all the part numbers) with a success handler, also via google.script.run.
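A skeletal version of that round trip, with hypothetical file and function names; the server side reads the JSON DB file described above and writes the chosen part# into the active cell:

// Code.gs
function showSidebar() {
  var html = HtmlService.createHtmlOutputFromFile('Sidebar')
      .setSandboxMode(HtmlService.SandboxMode.IFRAME)
      .setTitle('Part lookup');
  SpreadsheetApp.getUi().showSidebar(html);
}

function getParts() {
  var text = DriveApp.getFileById('DB_FILE_ID').getBlob().getDataAsString(); // hypothetical ID
  return JSON.parse(text); // must be JSON-serializable to cross to the client
}

function insertPart(partNumber) {
  SpreadsheetApp.getActiveSheet().getActiveCell().setValue(partNumber);
}

<!-- Sidebar.html -->
<script>
  // Ask the server for the part list, then build the autocomplete from it.
  google.script.run.withSuccessHandler(buildAutocomplete).getParts();
  function buildAutocomplete(parts) { /* e.g. feed parts into a jQuery UI autocomplete */ }
  // Push the chosen part# back to the active row.
  function pick(partNumber) { google.script.run.insertPart(partNumber); }
</script>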
Main function references:
https://developers.google.com/apps-script/guides/html/reference/run
https://developers.google.com/apps-script/guides/dialogs#custom_sidebars
https://developers.google.com/apps-script/reference/drive/file -> first get the file with DriveApp.getFileById(id).