What's the best way to evaluate a Google Sheet without using the API? By evaluate I mean, change some cells and recompute the dependent cells.
You can use Google Sheets offline, so they have to be stored somewhere and the evaluation code also needs to run offline. How can I control this functionality from JS/Python?
I need to run thousands of computations as quickly as possible and the API would be too slow for this / I would get rate limited.
For offline support the data must be stored by the browser.
You can use the browser's developer console to inspect the data stores supported by that browser (i.e. IndexedDB, Local Storage, Web SQL, etc.). From what I can see in Google Chrome (open the developer console and select the Application tab), you can find some Google Docs data mirrored in IndexedDB databases.
Unfortunately, I don't believe you can interact with that data in any meaningful way outside of the dev console. Plus some of that data appears to be encrypted.
If your goal is to use Google Sheets as an analytics platform to perform heavy computation, then you're in for a lot of disappointment. As you're no doubt discovering, Google Sheets is bound by a number of limitations and isn't meant to be used in that capacity. You probably need something more robust, and that means moving to more traditional database solutions, but they won't be free.
You should consider using normal formulas; this documentation will show you the basics of how to use them for regular data management in the Sheet.
After that, you should review this list of functions to know what tools you have available to create the functionality that you want.
Lastly, you should see this tutorial on creating your own formulas; it can help you expand what the normal functions can do and build a solution more tailored to your needs.
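On that last point, a custom function is just an Apps Script function that you then call from a cell like any built-in formula. A minimal sketch (the name and the calculation are made up for illustration):

```javascript
/**
 * Hypothetical custom function; use it in a cell as =NET_PRICE(A1, B1).
 * When the input cells change, the sheet recalculates this like any
 * built-in formula, with no API round trip.
 *
 * @customfunction
 */
function NET_PRICE(price, taxRate) {
  return price * (1 + taxRate);
}
```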
To avoid updating a Google Spreadsheet (via Sheets v4 API) if it's unchanged, I maintain a cache of the spreadsheet's data. I use the Google Drive v3 API's modifiedTime property to establish whether or not my cache is out of date.
I've noticed, though, that modifiedTime doesn't update immediately after something (be it an API call or a user in the web interface) actually modifies the spreadsheet. It often takes several minutes. That makes the cache data somewhat unreliable (because it might be a minute or two out of date, but the latency means I can't know that right now).
Am I missing something? If not, is there another way to get this information without latency?
I think this answer may be partially responsive but I'm not sure where to go with it.
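For context, a minimal Apps Script sketch of the check being described, calling the Drive v3 REST endpoint directly (the file ID and cached timestamp are placeholders, and it assumes the script is already authorized for a Drive scope):

```javascript
// Sketch only: compare Drive's reported modifiedTime to a cached value.
function isCacheStale(fileId, cachedModifiedTime) {
  var url = 'https://www.googleapis.com/drive/v3/files/' + fileId +
      '?fields=modifiedTime';
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
  });
  var modifiedTime = JSON.parse(response.getContentText()).modifiedTime;
  // As described above, this value may lag the actual edit by minutes.
  return new Date(modifiedTime) > new Date(cachedModifiedTime);
}
```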
I have about 700 Google sheets, running scripts.
For too many years now, I have had to re-authorize these scripts over and over.
Of course, I searched the net, and I simplified my scripts to minimize the number of services called. Recently, I was hoping that the new V8 Apps Script runtime (then in alpha) could help me, but no, definitely not. Still no solution.
My hundreds of sheets are opened by the employees of my company, so hundreds of scripts linked to these sheets run automatically thanks to triggers (both simple and installable triggers).
It seems obvious that there is a limit on the number of scripts authorized at the same time. Since this limit has apparently been reached for years, I have to re-authorize some scripts every day. But by doing this, I guess I am revoking just as many other scripts. How Google decides which script to revoke, I don't know; I guess the least used.
Why this limitation? Is it possible to increase it?
If not, what is the solution?
I'll add that I found this page https://developers.google.com/apps-script/guides/services/authorization , which talks about OAuth application user limits (at the very bottom of the page). I wonder if this could help me.
Thank you.
I kept working on my problem, reading a lot of English docs... I read them painstakingly and slowly understood them. I hope so!
As I could have expected, I ran into restrictions, placed by my organization, I imagine. I am unable to create a project. See 1
At the same time, and strangely, I own several projects. 2. I don't know why they exist, even though I recognize their names, which are those of several of my scripts.
Anyway if I want to publish my script as an Add-on, I have to "link" my script to a "standard GCP project".
As it is impossible to create specific projects, I tried to follow a Google procedure consisting of linking my script to an existing project. But as expected, that does not work either. 3
If someone had a good suggestion, it would be welcome.
Thx.
JM.
Convert your scripts to an add-on. Then these add-ons can be shared across users and spreadsheets, and you can actually reduce the number of places you need to maintain code. You can view the publishing guide here: https://developers.google.com/gsuite/add-ons/how-tos/publishing-editor-addons
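As a rough sketch of what that conversion looks like (the menu label and handler name are made up), a single add-on project exposes its features to every spreadsheet instead of copying code into each one:

```javascript
// Sketch of a single add-on entry point replacing many bound scripts.
function onOpen(e) {
  SpreadsheetApp.getUi()
      .createAddonMenu()
      .addItem('Run report', 'runReport')  // illustrative menu item
      .addToUi();
}

function runReport() {
  // Shared logic lives here once, instead of in hundreds of bound scripts.
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  sheet.getRange('A1').setValue('Last run: ' + new Date());
}
```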
A second solution is to create a standard Google Cloud Platform project and change (some/all of) your scripts to use it, rather than their own hidden GCP project. More info about this step: https://developers.google.com/apps-script/guides/cloud-platform-projects
Note that both of these solutions will result in sharing certain quota limits, which may mean you need to redesign or update how you do things.
However, both of these changes provide increased visibility into the errors that your scripts encounter, as you no longer have 700+ places to check for error reports, just one or a handful.
I've just learned that Google Realtime API is now deprecated. It is suggested to migrate to Firestore instead. However, Firestore's model is not built around Google Drive so it will not be possible to manage and share real-time documents via Google Drive.
Is there any alternative migration path that would keep files stored in Google Drive?
For example, real-time documents may have a simple API endpoint that would allow getting and saving them as pure JSON. This would mean that we can keep using Google Drive to store our data and only use Firestore to handle real-time editing sessions (if that is needed).
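To make the "plain JSON in Drive" idea concrete, here is a minimal Apps Script sketch of the storage half only (the file is assumed to already exist and hold a JSON-serializable model; any real-time editing session would be handled separately, e.g. in Firestore):

```javascript
// Sketch: persist a document model as plain JSON in a Drive file.
function saveModel(fileId, model) {
  DriveApp.getFileById(fileId).setContent(JSON.stringify(model));
}

function loadModel(fileId) {
  var file = DriveApp.getFileById(fileId);
  return JSON.parse(file.getBlob().getDataAsString());
}
```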
I don't think you'll find any off-the-shelf solutions natively supporting Drive as a backing source, but if you can export your data model as JSON, Convergence can probably handle it. (In full transparency, I am a founder)
It is almost certainly the quickest way to get back to feature parity with the Realtime API, and probably beyond, as we have first-class support for UX features you probably want as well, beyond just the data synchronization.
Yes, you can export your Realtime data via a separate REST API detailed here:
Exporting Realtime Data. This ability should remain available after the Realtime API is turned down on January 15th, 2019, though Google has not stated for how long, AFAIK.
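If I recall the migration guide correctly, the export is a plain GET on the file's realtime resource; a hedged sketch (the URL is reproduced from memory, so verify it against the linked documentation before relying on it):

```javascript
// Sketch, assuming the export endpoint described in the migration guide
// (drive/v2/files/{fileId}/realtime); returns the Realtime data model as JSON.
function exportRealtimeDocument(fileId) {
  var url = 'https://www.googleapis.com/drive/v2/files/' + fileId + '/realtime';
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
  });
  return JSON.parse(response.getContentText());
}
```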
(Firestore doesn't seem to offer any Operational-Transform-like conflict resolution, so is not a viable alternative for applications that currently have good reasons to use the realtime API, though that is Google's only suggestion so far.)
In our company we use Google spreadsheets heavily. I would like to automate some things, for instance every minute I would like to call our bank's API and save all the new transactions into a spreadsheet.
What I'm missing in the official documentation is some major architecture best practices.
First part of the question: who should be the owner/runner of such a user-neutral, company-wide script? I would not like to bind it to my personal account; I want it to run seamlessly even after I leave the company some day. Is some artificial technical user needed?
If it makes any difference we plan on moving to team drives in future.
Second part of the question: I would like the scripts to have persistent storage (in the example above, for the record of transactions). I'm considering making it a rule that every script is bound to a spreadsheet which acts as its data store. Is this a good idea?
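To make the scenario concrete, a minimal sketch of such a script; the bank API URL, the response fields, and the sheet layout are all assumptions, not a real bank API:

```javascript
// Illustrative placeholders only.
var BANK_API_URL = 'https://example.com/api/transactions';
var SPREADSHEET_ID = 'YOUR_SPREADSHEET_ID';

function fetchTransactions() {
  var response = UrlFetchApp.fetch(BANK_API_URL);
  var transactions = JSON.parse(response.getContentText());
  var sheet = SpreadsheetApp.openById(SPREADSHEET_ID).getSheetByName('Transactions');
  transactions.forEach(function(tx) {
    sheet.appendRow([tx.date, tx.amount, tx.description]);  // assumed fields
  });
}

// Run once to install the time-driven trigger; it runs as whoever installs
// it, which is why the owning account matters.
function installTrigger() {
  ScriptApp.newTrigger('fetchTransactions')
      .timeBased()
      .everyMinutes(1)
      .create();
}
```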
I generally recommend that clients provide me with an "artificial technical user", as you say, to keep the owner account alive and separate from my personal account. This also avoids using up my daily quota, especially for regular triggers.
I'd avoid binding the scripts to a GSheet, as you'll then run into issues if you want to use GitHub or create add-ons in the future. I generally put all scripts in a separate library (Bruce McPherson did various tests and never found a performance hit) and then, if need be, link them via a very small script contained in a GSheet. This also allows separate development while the script is being used live.
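A hedged sketch of that pattern (the library identifier "MyLib" is whatever name you choose when adding the library to the bound project):

```javascript
// Thin bound script inside the GSheet; all real logic lives in the shared
// library, included in this project under the identifier MyLib (illustrative).
function onOpen(e) {
  MyLib.onOpen(e);
}

function onEdit(e) {
  MyLib.onEdit(e);
}
```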
GSheets do provide a fair amount of storage (2 million cells), but if you approach that limit - and such sheets do tend to slow down - take a look at Firebase, for which you get 1 GB free, and there is a great Firebase library.
There are various general recommendations for Apps Script development. I link a few here.
Happy scripting!
If I write an entire web app in GAS and it gets popular and starts receiving a thousand requests per second, is there any way to either pay Google to handle it, or host my GAS app on non-Google infrastructure, in the same way that here does for Google App Engine?
Your GAS script scales up automatically. The only thing you should worry about is your own code: if you use locks, thousands of users waiting for a lock will experience delays. Other than that, scaling up shouldn't be a problem. After all, there are possibly millions of scripts being run by different users.
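A minimal sketch of the locking pattern being warned about (the timeout and the guarded work are illustrative):

```javascript
// Under heavy load, concurrent requests queue up at the lock acquisition.
function doPost(e) {
  var lock = LockService.getScriptLock();
  if (!lock.tryLock(10000)) {  // wait up to 10 seconds for the lock
    return ContentService.createTextOutput('Busy, try again later');
  }
  try {
    // ... critical section: read and update shared state ...
  } finally {
    lock.releaseLock();
  }
  return ContentService.createTextOutput('OK');
}
```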
I've never seen Google suggest Apps Script is the tool for that kind of scale. Go to App Engine, do not pass Go, etc.
There is no third-party implementation of the Google Apps Script services. Apps Script is, however, a JavaScript implementation running on Java (think Rhino), which you could run yourself - or you could run it on App Engine and use the Java GData APIs to replace the Apps Script services.
If the app is deployed to run under the user's account ("execute the app as the user accessing the web app"), it will consume each user's quotas and use their own account resources (Docs, Sheets, etc.). This would allow for "unlimited" scalability.
This is just my own view, am I right or totally wrong?