I am in the process of building a custom Data Studio Community Connector for my web app using Google Apps Script. In my case, the Data Studio report is embedded as an iframe, its visibility is "public" for users of my application, and I do not require any additional Google authorization (it is running as me).
Given that Apps Script has a maximum of 30 simultaneous executions, does that mean that if 30 users try to view the report at the same time, I will hit the simultaneous-execution limit?
If that's the case, how would I go about having the script execute as someone other than me, so the executions don't all count against my account? Thank you.
About the number of concurrent accesses to Web Apps: in my experience, when 30 users accessed a Web App concurrently, there were cases where an error occurred. You can see the experimental data at this answer.
In a recent experiment, I reported on "Concurrent Writing to Google Spreadsheet using Form". In that case, in order to test concurrent access to the Web App, I used the LockService in the Web App's script. By this, I found that concurrent access by more than 30 users can be achieved. You can see the experimental data in that report.
But this result will depend on the Web App's script. For example, when the script is complicated and the processing cost is high, the number of concurrent accesses it can handle may become lower, so please be careful about this. Still, it was found that the LockService is important for concurrently accessing Web Apps.
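As a minimal sketch of that pattern (standalone Web App; the spreadsheet ID, sheet name, and payload handling are placeholder assumptions, not the script from the experiment):

```javascript
// Sketch: a Web App doPost that serializes concurrent writes with LockService.
function doPost(e) {
  var lock = LockService.getScriptLock();
  // Wait up to 30 seconds for other executions to finish their writes.
  if (!lock.tryLock(30000)) {
    return ContentService.createTextOutput('busy'); // client should retry
  }
  try {
    var sheet = SpreadsheetApp.openById('SPREADSHEET_ID') // placeholder ID
        .getSheetByName('log');                           // placeholder name
    sheet.appendRow([new Date(), e.parameter.value]);
    return ContentService.createTextOutput('ok');
  } finally {
    lock.releaseLock();
  }
}
```

Without the lock, simultaneous writes can interleave or fail under load; with it, each execution waits its turn instead of erroring out.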
In my Community Connector data source settings, I had to set the data credentials to run as Viewer.
However, this prompts the user to authorize the community connector, as well as to grant OAuth access to their Google account so the script can run as them. I submitted my OAuth consent screen for verification, and it was approved within 48 hours.
I am creating new Google Apps Script projects for clients of mine. These projects are all owned and managed by my single Google Account. The Apps Script projects are duplicates of each other, with changes only in document IDs. I have a total of 6 Apps Script projects, and I am looking to add 20 more clients (so 20 more projects).
Will this slow down execution time on my end as well as my clients' end? Does each project run on different servers? Does the number of projects on one Google Account play a role in execution speed?
As pointed out by Alan Wells, execution time is not affected by the number of projects in your account; however, your Apps Script quota limits apply to the sum of all your projects.
And if the code were running as an Add-On, it would probably do so on different servers, as the users would be running the code from their own accounts, and likely in different geographic locations.
But to summarize:
If the code is running from your account, then it goes against your quota limits.
Apps Script isn't scalable. You can't somehow get more quota.
If your code were published as an Add-On, then the code would run as if it were on the user's account, and go against their quota.
Add-On users can't see the project code even though it's using their quota.
You can view the duration of the different script executions in the Apps Script dashboard's execution tab.
I'm developing a Google Apps Script Web App. This web app is the next version of FlipVideo, a script that I previously developed in a Google Sheet (see the current version at https://sites.google.com/view/flipvideo/).
This is a school-oriented web app. A lot of students will access it at the same time from around the world (probably thousands). I'm worried about scalability and quota limits.
I will publish the web app using "Execute the app as: user accessing the web app" and "Who has access to the app: anyone". Could this configuration allow thousands of concurrent executions, or is there a hard quota that makes scaling impossible?
Unfortunately, what you require is not currently possible. GAS Web Apps have a max of 30 concurrent users (see "Simultaneous executions" under "Current limitations" on the quotas page). You'll need to leverage a different solution to operate at such a large scale. If you want to stick with Google Cloud Platform, I would recommend Cloud Functions. However, it's not a free service.
There are too many scripts running simultaneously for this Google user account. This indicates that you have too many scripts executing at once, although not necessarily the same script. Like the exception above, this most commonly occurs for custom functions that are called repeatedly in a single spreadsheet.
This limitation means a single user cannot have more than 30 executions of a script running at once. For example, you can reach this limit by opening the same web app in 30 different tabs. So I guess there is no separate limit on the total number of users; otherwise Apps Script would lose all meaning as a platform.
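To make the custom-function case from the quoted passage concrete, a trivial (hypothetical) example:

```javascript
// A custom function like this, filled down a few hundred rows
// (=DOUBLE(A1), =DOUBLE(A2), ...), runs once per cell on recalculation
// and can trip the simultaneous-executions cap for a single user.
function DOUBLE(value) {
  return value * 2;
}
```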
I am in the process of architecting a small application for a G Suite customer. The app will be used by all employees at the customer and during peak times could see as many as 5,000 concurrent users.
They do not have App Maker and do not plan to enable App Maker anytime soon.
Will an Apps Script Web App be able to handle this many concurrent users?
Database:
I was originally thinking of using a Google Sheet as the database. There are no documented Apps Script limits around reading or writing data to a Google Sheet. So as long as I stay within the Google Sheets API quota limits I should be good.
With that said, I am also debating using Cloud SQL as the database. App Maker forces you to use Cloud SQL, so I assume it is a superior option to a Google Sheet. The only way I see to connect to Cloud SQL from Apps Script is via the JDBC service. However, there is a hard limit of 50,000 connections per day. https://developers.google.com/apps-script/guides/services/quotas#current_quotas
Does anyone know if this limit is per Apps Script project or per user?
If it's per project, then 5,000 users would only get 10 calls each per day. That would be unusable for my needs.
Side note, Google Cloud SQL has a maximum of 4,000 connections. I am banking on the fact that reads and writes will be extremely fast so max connections at a single point in time will be less than 4,000.
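For reference, the JDBC connection pattern in question looks roughly like this (instance connection name, database, credentials, and table are placeholders):

```javascript
// Sketch: connecting Apps Script to Cloud SQL via the JDBC service.
// Each call like this counts against the 50,000-connections/day quota.
function queryCloudSql() {
  var url = 'jdbc:google:mysql://my-project:us-central1:my-instance/mydb';
  var conn = Jdbc.getCloudSqlConnection(url, 'dbuser', 'dbpassword');
  try {
    var stmt = conn.createStatement();
    var results = stmt.executeQuery('SELECT id, name FROM employees LIMIT 10');
    while (results.next()) {
      Logger.log(results.getString('id') + ': ' + results.getString('name'));
    }
    results.close();
    stmt.close();
  } finally {
    conn.close();
  }
}
```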
As TheMaster noted in the comment above, you have a max of 30 simultaneous executions, which limits you to a max of 30 concurrent users for a GAS Web App.
As an alternative you might be able to leverage Cloud Functions (basically just a Node.js/Express.js module). It's not a free platform, but it supports Cloud SQL and may be cheaper than getting a Google for Business account (US$11.00 per month per user) as required by App Maker.
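For a sense of what the alternative looks like, a minimal HTTP Cloud Function is just a request handler (the function name and response here are placeholders):

```javascript
// Sketch of an HTTP-triggered Cloud Function (Node.js runtime).
// Requests get Express-style req/res objects; concurrency is handled
// by the platform rather than by an Apps Script execution cap.
exports.serveData = (req, res) => {
  res.status(200).json({ user: req.query.user || 'anonymous', ok: true });
};
```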
I'm developing a web application that will start with 200 GB of data to be stored. Over the years, the application could reach 1 TB, perhaps 2 TB in 5 years.
What I want from this application is for clients to upload files to the server, and for the server to then upload the files to Google Drive, persisting the webViewLink in the database. It's working this way on localhost.
I know two options for authentication with the Google Drive API: client account and service account.
The Service Account option fits me better because I want the server, not the client, to have control of the files.
But a Service Account can store very little data, and its storage limit can't be increased. The limit is somewhere around 15 GB, I guess, but I'm not sure.
If a Service Account will not help me, what options would I have to store 2 TB of data or more? Should I find another way to store the files?
I'd like to stay with Google. If there's no option using the Google Drive API, please suggest anything else for this scenario.
You have a couple of options.
Use a regular account instead of a Service Account. You will still need to pay for the storage, but it will work and you'll have everything in a single account. From your question ("I want the server, not the client, to have control of the files") I suspect you have looked at the OAuth quickstart examples and concluded that only end users can grant access. That's not the case. It's perfectly valid, and really quite simple, for your server app to grant access to an account it controls. See How do I authorise an app (web or installed) without user intervention? for how to do this; a minimal sketch also appears at the end of this answer.
Use multiple Service Accounts and shard your data across them. The various accounts could all share their folders with a kind of master account, which would then have a coherent view of the entire corpus.
Personally I'd go with option 1 because it's the easiest to set up and manage.
Either way, make sure you understand how Google will want to charge you for the storage. For example, although each Service Account has a free quota, it is ultimately owned by the regular user that created it and the standard user quota limits and charges probably apply to that user.
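As promised, a minimal sketch of the token handling behind option 1, written in Apps Script style to match the rest of this page (the client ID, client secret, and refresh token are placeholders you obtain once through a normal consent flow and then store server-side):

```javascript
// Sketch: exchange a stored refresh token for a short-lived access token.
// The server then uses the access token in its Drive API upload calls.
function getAccessToken_() {
  var response = UrlFetchApp.fetch('https://oauth2.googleapis.com/token', {
    method: 'post',
    payload: {
      client_id: 'YOUR_CLIENT_ID',         // placeholder
      client_secret: 'YOUR_CLIENT_SECRET', // placeholder
      refresh_token: 'YOUR_REFRESH_TOKEN', // placeholder
      grant_type: 'refresh_token'
    }
  });
  return JSON.parse(response.getContentText()).access_token;
}
```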
First, the system architecture:
Server: Running IIS ASP and delivering data to a hundred or so WinXP+ clients in the field upon automated requests from those clients. The data sent from the server consists of large graphic or video files. If a file is placed on the server by a user, these remote clients will "discover" it and download it.
Clients: As stated above, the clients are remote unattended boxes that fetch content from the server. The end purpose is digital signage.
Problem: All clients hitting the server at the same time makes for slow transfers of large files - not enough bandwidth.
Solution (I think): Use Google Cloud Storage or Google Drive to hold the files and have the clients request (automated and unattended) those files. I think Google would have a higher available bandwidth (at least the NSA thinks so).
Questions:
Which is a better solution between Google Cloud Storage and Google Drive?
Is it possible to use Windows PowerShell or WScript to run scripts to interact with Google? Reason is that I need to avoid installing new software on the client machines that might require user interaction.
Yes, you can use PowerShell as long as you can fetch HTTPS data. The OAuth flow might be tricky to get working; follow the examples for installed apps.
100% use Cloud Storage instead of Drive. Drive is not meant to scale with simultaneous downloads and has several quotas, so you would need to implement exponential backoff etc. with Drive.
Yes, you can use Drive or Cloud Storage. I would go for Drive over Cloud Storage, because:
It's free; Cloud Storage will cost you, so you have to worry about your credit card expiring.
It's easier to program, since retrieving your files is a simple HTTP GET.
You need to think about your security model. With Drive you could (NB: not should) make the files public. Provided your clients can be informed of the URL, there is no OAuth to worry about. If you need better security, install a Refresh Token on each client. Before each download, your client makes a call to Google to convert the refresh token to an access token and then fetches the file (sketched after this list). I suggest prototyping without OAuth to begin with; then if (a) it fits, and (b) you need more security, add OAuth.
The Drive web app gives you your management console for the downloadable files. If you use Cloud Storage, you'll need to write your own.
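As a sketch of the per-download call from the security-model point above (the file ID and token are placeholders; the same HTTP request works from PowerShell, it's just shown here in Apps Script style to match the rest of this page):

```javascript
// Sketch: fetch a Drive file's bytes using an access token obtained
// from a refresh-token exchange like the one sketched earlier.
function downloadFile(fileId, accessToken) {
  var url = 'https://www.googleapis.com/drive/v2/files/' + fileId + '?alt=media';
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + accessToken }
  });
  return response.getBlob(); // the client would write these bytes to disk
}
```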
The quota issue is discussed here: Google Drive as a video hosting/streaming platform?
Because the quota isn't documented, we can only guess at what the restrictions are. It seems to be bandwidth for a given file, so the larger the file, the fewer the downloads. A simple workaround is to use the copy API (https://developers.google.com/drive/v2/reference/files/copy) to make multiple copies of the file.
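A sketch of that workaround using the Advanced Drive service (Drive API v2) from Apps Script, assuming the service is enabled; the file ID and copy count are placeholders:

```javascript
// Sketch: make n copies of a file so downloads spread across copies,
// working around the apparent per-file bandwidth quota.
function makeCopies(fileId, n) {
  var copyIds = [];
  for (var i = 0; i < n; i++) {
    // 'Drive' is the Advanced Drive service (v2), enabled under
    // Resources > Advanced Google services.
    var copy = Drive.Files.copy({ title: 'copy-' + i }, fileId);
    copyIds.push(copy.id);
  }
  return copyIds;
}
```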
You have other options too. Since these are simply static files, you could host them on Google Sites or Google App Engine. You could also store them within the App Engine datastore, which has a free quota.
Finally, you could even consider a BitTorrent approach.