Latency from Google Drive v3's modifiedTime attribute

To avoid updating a Google Spreadsheet (via the Sheets v4 API) when it's unchanged, I maintain a cache of the spreadsheet's data. I use the Google Drive v3 API's modifiedTime property to determine whether my cache is out of date.
I've noticed, though, that modifiedTime doesn't update immediately after something (be it an API call or a user in the web interface) actually modifies the spreadsheet. It often takes several minutes. That makes the cached data somewhat unreliable: it might be a minute or two out of date, but because of the latency I have no way to know that at the time.
Am I missing something? If not, is there another way to get this information without latency?
I think this answer may be partially responsive but I'm not sure where to go with it.
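For context, a minimal sketch of the kind of cache check described above, using the Node.js googleapis client (the client library, the auth object, and the cachedModifiedTime value are assumptions for illustration, not part of the original question):

    // Sketch: decide whether the local cache of a spreadsheet is stale
    // by comparing Drive v3's modifiedTime against the cached timestamp.
    const {google} = require('googleapis');

    async function isCacheStale(auth, spreadsheetId, cachedModifiedTime) {
      const drive = google.drive({version: 'v3', auth});
      // Request only the modifiedTime field of the spreadsheet file.
      const res = await drive.files.get({
        fileId: spreadsheetId,
        fields: 'modifiedTime',
      });
      // Caveat from the question above: modifiedTime can lag the actual
      // edit by several minutes, so "not stale" may be a false negative.
      return new Date(res.data.modifiedTime) > new Date(cachedModifiedTime);
    }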

Related

Assuming I have edit access to a Google Sheet, how can I write to it programmatically?

A friend has given me edit access to a Google Sheet he owns. I want to write to it via web server code hosted elsewhere (at Wix, actually), where I've built a page that needs to update the spreadsheet from time to time, perhaps a dozen updates per day.
I have found several technologies that might solve the problem. Google Apps Script is one, the Sheets API another. The latter of these is (as far as I can tell) really three distinct options depending on authentication scheme: API key, service account, or OAuth2.
The question is this: Given my specific situation, which of these four approaches (or others I haven't found) is feasible and most appropriate? I'm not asking for opinions; I just don't want to go down one path only to learn later that it's an unworkable dead end (as preliminary research suggests that API key might be) or absurd overkill (as preliminary research suggests that OAuth2 with a Google-approved app might be). Note in particular that I have edit access to the spreadsheet in question and can give that access to others if necessary. If the choice depends on factors I haven't mentioned, what are those factors?
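For what it's worth, the service-account option mentioned above might look roughly like this with the Node.js googleapis client; the key-file path, spreadsheet ID, and range are placeholders, and the sheet would first have to be shared (with edit access) with the service account's email address:

    // Sketch: append a row to a shared sheet using a service account.
    // Prerequisite: the sheet's owner shares the sheet (with edit access)
    // with the service account's client_email.
    const {google} = require('googleapis');

    async function appendUpdate(spreadsheetId) {
      const auth = new google.auth.GoogleAuth({
        keyFile: 'service-account.json', // placeholder path to the key file
        scopes: ['https://www.googleapis.com/auth/spreadsheets'],
      });
      const sheets = google.sheets({version: 'v4', auth});
      await sheets.spreadsheets.values.append({
        spreadsheetId,
        range: 'Sheet1!A1', // placeholder range
        valueInputOption: 'USER_ENTERED',
        requestBody: {
          values: [[new Date().toISOString(), 'update from the Wix page']],
        },
      });
    }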

Evaluate Google Sheet offline programmatically

What's the best way to evaluate a Google Sheet without using the API? By evaluate I mean, change some cells and recompute the dependent cells.
You can use Google Sheets offline, so they have to be stored somewhere and the evaluation code also needs to run offline. How can I control this functionality from JS/Python?
I need to run thousands of computations as quickly as possible; the API would be too slow for this, and I would get rate-limited.
For offline support the data must be stored by the browser.
You can use the browser's developer console to inspect the data stores supported by that browser (e.g. IndexedDB, Local Storage, Web SQL, etc.). From what I can see in Google Chrome (open the developer console and select the Application tab), you can find some Google Docs data mirrored in IndexedDB databases.
Unfortunately, I don't believe you can interact with that data in any meaningful way outside of the dev console. Plus some of that data appears to be encrypted.
If your goal is to use Google Sheets as an analytics platform to perform heavy computation, then you're in for a lot of disappointment. As you're no doubt discovering, Google Sheets is bound by a number of limitations and isn't meant to be used in that capacity. You probably need something more robust, and that means moving to more traditional database solutions, but those won't be free.
You should consider using normal formulas; this documentation will show you the basics of using them for regular data management in the Sheet.
After that, see this list of functions to learn what tools are available for building the functionality you want.
Lastly, see this tutorial on creating your own formulas; it can help you extend what the normal functions can do and arrive at a solution more tailored to your needs.
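As a hedged illustration of that last suggestion, a custom formula is just an Apps Script function carrying a @customfunction JSDoc tag; the name and logic below are made up, and the function becomes callable from a cell like any built-in one:

    /**
     * Returns the input value multiplied by two.
     * Usage in a cell: =DOUBLE(A1)
     *
     * @param {number} value The value to double.
     * @return {number} The doubled value.
     * @customfunction
     */
    function DOUBLE(value) {
      return value * 2;
    }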

Architecture of a company-wide Google Apps Script

In our company we use Google spreadsheets heavily. I would like to automate some things, for instance every minute I would like to call our bank's API and save all the new transactions into a spreadsheet.
What I'm missing in the official documentation is some major architecture best practices.
First part of the question: who should be the owner/runner of such a user-neutral, company-wide script? I would not like to bind it to my personal account; I want it to run seamlessly even after I leave the company some day. Is some artificial technical user needed?
If it makes any difference we plan on moving to team drives in future.
Second part of the question: I would like the scripts to have persistent storage (in the example above, for the record of transactions). I'm considering making it a rule that every script is bound to a spreadsheet which acts as its data store. Is that a good idea?
I generally recommend that clients provide me with an "artificial technical user", as you say, to keep that owner alive and away from my personal account. That also avoids using up my daily quota, especially for regular triggers.
I'd avoid binding the scripts to a GSheet, as you'll then get issues if you later want to use GitHub or create add-ons. I generally put all scripts in a separate library (Bruce McPherson ran various tests and never found a performance hit) and then link them, via a very small stub script, to a script contained in a GSheet if need be. This eases separate development while the script is being used live.
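A rough sketch of that pattern, assuming the standalone script has been added to the container-bound project as a library under the (made-up) identifier Lib:

    // Container-bound script: nothing but thin wrappers.
    // The real logic lives in the standalone library "Lib", which can be
    // developed, versioned and tested separately from the live sheet.
    function importTransactions() {
      Lib.importTransactions(SpreadsheetApp.getActiveSpreadsheet());
    }

    function onOpen() {
      SpreadsheetApp.getUi()
        .createMenu('Bank import')
        .addItem('Import now', 'importTransactions')
        .addToUi();
    }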
GSheets do provide a fair amount of storage (2 million cells), but if you approach that limit (and the sheets do tend to slow down well before it), take a look at Firebase, for which you get 1 GB free, and there is a great Firebase library for Apps Script.
There are various general recommendations for Apps Script development. I link a few here.
Happy scripting!

Referrer limit per Google Maps API key

We provide websites/CMS solutions for more than 2,500 customers. Almost all of these websites have a Google Maps module, so when Google changed its Maps usage policy, from one day to the next all of those sites showed an error in their map modules. We needed to come up with some quick (and dirty) solution. We decided to use multiple API keys and divide the domains between them alphabetically, and we registered all those 2,500+ domains under these keys manually, one by one.
The solution worked until last week. Now we have somehow reached some kind of limit, as we cannot register any new domains/referrers under one of those API keys. The actual count of domains/referrers on this particular API key: 1537. The saving process yields an error with a tracking code (which is different every time I try).
Is there really some kind of limit? Has anyone experienced the same problem? Does a time-efficient solution exist?
Thanks for any help or suggestions. Peace!
There is indeed a limit of (at the time of writing) about 1,000 referrer restrictions per API key. You can create about 100 keys per project, so you can authenticate 100,000 domains with a single project. To go further, you can create multiple projects (note that multiple projects can be combined under the same billing account, so you would still receive a single bill).
As a short term fix, you can temporarily remove all restrictions on the key, so that apps relying on that key are functional again. Then you can take the time to release a new key sharding pattern that follows these guidelines.
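As one possible (illustrative) sharding pattern, you can map each domain deterministically to a key so no one has to track counts by hand; the hash and the key list below are placeholders:

    // Sketch: deterministically shard referrer domains across API keys.
    // With roughly 1,000 referrer restrictions per key, 2,500+ domains
    // need at least three keys; more keys leave headroom for growth.
    const API_KEYS = ['KEY_A', 'KEY_B', 'KEY_C', 'KEY_D']; // placeholders

    function keyForDomain(domain) {
      let hash = 0;
      for (const ch of domain.toLowerCase()) {
        hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple rolling hash
      }
      return API_KEYS[hash % API_KEYS.length];
    }

    // Every page served from the same domain always gets the same key,
    // so the referrer restriction for that domain lives on exactly one key.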
I just created a feature request for this use case so that the situation can be improved ("star" it to be notified of updates).
Google has recently released an alpha version of an API that allows managing API keys programmatically.
The best way to handle thousands of authorized domains is to use an API to programmatically manage your API Keys and their restrictions, and we have recently launched a new service that allows you to do this.
This API is still in Alpha. If you are interested in becoming a Trusted Tester for this service, you can use the following form to sign up, please read the instructions carefully:
https://forms.gle/qx2SMcarWCAsbWVp7
Please note that this API is not part of the Google Maps Platform. After you fill out the form, you will be contacted by the API Keys API team with instructions on how to get started, and how to receive support.
API Keys API is currently free of charge. However, please note that use of Cloud Endpoints may be subject to charges at high traffic volume. You can check the pricing sheet here:
https://cloud.google.com/endpoints/pricing-and-quotas
source: https://issuetracker.google.com/issues/35829646#comment12
Hope this helps!

Google Apps Script limits on the number of times doGet can be called

The Google Apps Script limitations link shows that we can make 20,000 URL Fetch calls per day. That looks quite ambiguous to me. Inside a script, you can use UrlFetchApp to make GET/POST requests to external URLs. But what if we are calling a deployed Google Apps Script from an external, non-script client (e.g. a web browser or mobile device)?
Does that imply that we can only call the script (with a URL like abc/exec) 20,000 times a day from outside of Google Apps Script, where 20,000 is the total across all client devices?
I don't see the relationship between fetching a URL from within a script and running a web app from a browser. I have never seen any mention of a limit on how many times a web app can be called, but there are probably limits on the total processing time a script can use. The quota dashboard specifies the maximum processing time used by triggers; it does not, however, specify a limit on processing time driven by a human user.
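To make that distinction concrete, here is a hedged sketch of the two things being compared: the URL Fetch quota counts outbound UrlFetchApp.fetch calls made from inside a script, while an external browser or mobile client hitting the deployed .../exec URL simply invokes the web app's doGet handler (the handler below is only an illustration):

    // Counts against the "URL Fetch calls" quota: an outbound request
    // made from inside an Apps Script.
    function callExternalApi() {
      const response = UrlFetchApp.fetch('https://example.com/api'); // placeholder URL
      Logger.log(response.getContentText());
    }

    // Does not consume the URL Fetch quota: this runs whenever an external
    // client (browser, mobile app, ...) requests the deployed .../exec URL.
    function doGet(e) {
      return ContentService.createTextOutput(JSON.stringify({ok: true}))
        .setMimeType(ContentService.MimeType.JSON);
    }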
If Google does not specify it, that means either they don't care or they don't want us to know; in both cases the result is the same: we have no way to get the information.
That said, I have never encountered any issue with an app being called too often, even though I know some of them are heavily used at times.
Was your question purely rhetorical, or did you run into a real situation?