Is it possible to open 2000 streaming connections by impersonating 2000 mailboxes? - exchangewebservices

I am in the process of creating an Exchange Service account to listen for EWS notifications for up to 2000 mailboxes. I have been reading through the documentation and it states that
Sa1 can open the connection in the following ways: ... By impersonating any of the users — m1 for example — so that the connection is charged against a copy of m1's budget. (M1 itself can open ten connections by using Exchange Online, and all service accounts impersonating m1 can open ten connections by using the copied budget.)
If the connection limit is hit, the following workarounds are available: If option 1 is used, the administrator can create multiple service accounts to impersonate additional users.
The Microsoft documentation is here: https://msdn.microsoft.com/EN-US/library/office/dn458789(v=exchg.150).aspx
Can someone tell me if it is possible to open up 2000 streaming connections to EWS using the same service account by impersonating 2000 mailboxes?
Thanks.

Can someone tell me if it is possible to open up 2000 streaming connections to EWS using the same service account by impersonating 2000 mailboxes?
Yes. I have apps that work with 3000+ users, though it can be environment dependent. As the link you posted suggests, you should use grouping to maintain affinity in Exchange 2013 and later. There is a maximum of 200 users per group (which essentially means per connection). If you use grouping and impersonation, the concurrent connection charge should be made against the mailbox you anchor the group connection to (generally the first user in the group), not against the service account. Because each group should have a different anchor mailbox, you shouldn't run into the 10-concurrent-connections-per-user limit.
If you are using Exchange Online, your users will be spread across a large number of servers and most probably several data centres, so as long as you implement grouping and impersonation correctly you shouldn't have any issues.
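A minimal sketch of the grouping logic described above: partition the 2000 mailboxes into groups of at most 200, each anchored to its first member (the anchor is what gets charged for the connection, typically communicated to EWS via the X-AnchorMailbox header). The mailbox addresses and the dict layout here are illustrative, not part of any EWS API.

```python
# Sketch: partition mailboxes into EWS streaming-notification groups.
# Each group holds at most 200 mailboxes and is anchored to its first
# member, so the connection is charged to the anchor's budget rather
# than to the single service account doing the impersonation.

MAX_GROUP_SIZE = 200  # EWS limit of mailboxes per streaming connection

def build_groups(mailboxes, max_size=MAX_GROUP_SIZE):
    """Split mailboxes into groups of at most max_size, each with an anchor."""
    groups = []
    for i in range(0, len(mailboxes), max_size):
        members = mailboxes[i:i + max_size]
        groups.append({
            "anchor": members[0],   # typically sent as the X-AnchorMailbox header
            "members": members,
        })
    return groups

mailboxes = [f"user{n}@contoso.com" for n in range(2000)]
groups = build_groups(mailboxes)
print(len(groups))          # 2000 mailboxes / 200 per group = 10 groups
print(groups[0]["anchor"])  # user0@contoso.com anchors the first group
```

Because each of the 10 groups has a different anchor mailbox, no single user's 10-connection budget is ever consumed by this service account.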

Related

Alternative access to application files when server is down

I have an application that generates reports every hour. These reports are very critical (and sensitive) for the users, and the only access is through the application (Excel/PDF generated in memory from the database) after user/password/role validation.
Last week the server that hosts the application was down for several hours (hardware failure), so the users could not retrieve those reports (and I could not access the database immediately).
My client needs access to at least the most recently generated reports. For example, if the failure occurs at 5 pm, he needs the 4 pm report.
So I thought about storing the reports somewhere else. Server/network administration is not my responsibility. I don't have another server (and I can't prevent network or hardware failures forever), but I do have a hard drive connected to the same network as the server (a NAS).
I am also considering storing the reports in Google Drive (the client has G Suite; with some encryption) or some other cloud service, but I am aware that this requires permanent internet access.
What do you recommend?
Have a nice day.
One approach is to put Nginx in front of multiple instances of the application: if one instance goes down, another instance keeps serving requests and the app stays live.
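A minimal sketch of such a failover setup, assuming two copies of the application listening on local ports (the ports and upstream name are placeholders):

```
upstream reports_app {
    server 127.0.0.1:8001;          # primary instance
    server 127.0.0.1:8002 backup;   # used only while the primary is down
}

server {
    listen 80;
    location / {
        proxy_pass http://reports_app;
    }
}
```

Note that this only protects against one *process* failing; it will not survive a hardware failure of the single server, so also keeping copies of the generated reports on the NAS is still worthwhile.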

Support 5000 concurrent users of Apps Script Web App

I am in the process of architecting a small application for a G Suite customer. The app will be used by all employees at the customer and during peak times could see as many as 5,000 concurrent users.
They do not have App Maker and do not plan to enable App Maker anytime soon.
Will an Apps Script Web App be able to handle this many concurrent users?
Database:
I was originally thinking of using a Google Sheet as the database. There are no documented Apps Script limits around reading or writing data to a Google Sheet. So as long as I stay within the Google Sheets API quota limits I should be good.
With that said, I am also debating on using Cloud SQL as the database. App Maker forces you to use Cloud SQL so I assume it is a superior option to a Google Sheet. The only way I see to connect to Cloud SQL from Apps Script is via the JDBC service. However there is a hard limit of 50,000 connections per day. https://developers.google.com/apps-script/guides/services/quotas#current_quotas
Does anyone know if this limit is per app script or per user?
If per app script, then 5,000 users would only have 10 calls each per day. That would be unusable for my needs.
Side note, Google Cloud SQL has a maximum of 4,000 connections. I am banking on the fact that reads and writes will be extremely fast so max connections at a single point in time will be less than 4,000.
As TheMaster noted in the above comment, you have a maximum of 30 simultaneous executions, which limits you to roughly 30 concurrent users for a GAS Web App.
As an alternative you might be able to leverage Cloud Functions (basically just a Node.js/Express.js module). ~~Max number of concurrent users is 1000 though.~~ It's not a free platform, but it supports Cloud SQL and may be cheaper than getting a Google for Business account (US$11.00 per month per user) as required by App Maker.

Google Drive API - Service account storage limitation - Alternatives

I'm developing a web application that will start with 200 GB of data to store. Over the years it could reach 1 TB, perhaps 2 TB in 5 years.
What I want is for clients to upload files to the server, and for the server to then upload the files to Google Drive, persisting the webViewLink in the database. It works this way on localhost.
I know two authentication options for the Google Drive API: client account and service account.
The Service Account option fits me better because I want the server, not the client, to have control of the files.
But a Service Account can store very little data, and the storage limit can't be increased. The limit is around 15 GB, I believe.
If a Service Account will not help me, what options do I have to store 2 TB of data or more? Should I find another way to store the files?
I'd like to keep using Google. If there's no option using the Google Drive API, please suggest something else for this scenario.
You have a couple of options.
Use a regular account instead of a Service Account. You will still need to pay for the storage, but it will work and you'll have everything in a single account. From your question "I want the server to have control of the files, not the client have control" I suspect you have looked at the OAuth quickstart examples and concluded that only end users can grant access. That's not the case. It's perfectly valid, and really quite simple, for your server app to grant access to an account it controls. See How do I authorise an app (web or installed) without user intervention? for how to do this.
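To illustrate option 1: once the server holds a refresh token for the account it controls (obtained via a one-time consent flow, as described in the linked answer), it can mint fresh access tokens with no user present. A sketch of the token-refresh request against Google's standard OAuth 2.0 token endpoint; the client ID/secret and token values are placeholders.

```python
from urllib.parse import urlencode
from urllib.request import Request

GOOGLE_TOKEN_URI = "https://oauth2.googleapis.com/token"

def build_refresh_request(client_id, client_secret, refresh_token):
    """Build the POST request that exchanges a stored refresh token
    for a fresh access token, with no user interaction required."""
    body = urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    }).encode()
    return Request(GOOGLE_TOKEN_URI, data=body, method="POST")

# The server would pass this to urllib.request.urlopen() and read the
# access_token out of the JSON response.
req = build_refresh_request("my-client-id", "my-secret", "stored-refresh-token")
print(req.full_url)  # https://oauth2.googleapis.com/token
```

In practice you would use a client library (e.g. google-auth) rather than hand-rolling the request, but this shows why no end user needs to be involved after the initial grant.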
Use multiple Service Accounts and shard your data across them. The various accounts could all share their folders with a kind of master account, which would then have a coherent view of the entire corpus.
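If you do go the sharding route, the main thing to get right is a deterministic mapping from file to Service Account, so the same file always lands on (and is later found on) the same account. A minimal sketch; the account names are hypothetical placeholders.

```python
import hashlib

# Hypothetical pool of Service Accounts, each with its own storage quota.
SERVICE_ACCOUNTS = [
    "sa-shard-0@project.iam.gserviceaccount.com",
    "sa-shard-1@project.iam.gserviceaccount.com",
    "sa-shard-2@project.iam.gserviceaccount.com",
]

def account_for(file_id):
    """Map a file ID to a shard so the same file always goes to the
    same Service Account, regardless of which server handles it."""
    digest = hashlib.sha256(file_id.encode()).hexdigest()
    return SERVICE_ACCOUNTS[int(digest, 16) % len(SERVICE_ACCOUNTS)]

# Same input always maps to the same account.
print(account_for("report-2020-01.pdf") == account_for("report-2020-01.pdf"))
```

One caveat with hash-based sharding: adding a shard later remaps most files, so you would also need to persist the file-to-account assignment (e.g. alongside the webViewLink in your database).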
Personally I'd go with option 1 because it's the easiest to set up and manage.
Either way, make sure you understand how Google will want to charge you for the storage. For example, although each Service Account has a free quota, it is ultimately owned by the regular user that created it and the standard user quota limits and charges probably apply to that user.

How many open connections does a REST API consume concurrently?

Let's say I have a REST API for an accounting system and 10000 customers are accessing it concurrently using web and mobile clients that I have also developed. There are also other web services that access the API. I only have a single MySQL database instance for the application. I would like to understand how many MySQL connections the application will use.
Is it n clients = 1 connection or n clients = n connections?
Please refer to the figures below: (A) n clients sharing one database connection, or (B) each client opening its own connection. Which of these figures is correct? Thank you for your responses!
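Which diagram applies depends on the API server, not on the number of clients: most REST backends sit between the clients and MySQL and reuse a fixed set of database connections via a connection pool, so n clients do not imply n MySQL connections. A minimal pool sketch (the connection factory here is a stand-in for something like `mysql.connector.connect`):

```python
import queue

class ConnectionPool:
    """Minimal sketch: n API clients share a fixed set of database
    connections (diagram A) instead of each opening its own (diagram B)."""

    def __init__(self, factory, size):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(factory())  # open `size` connections up front

    def acquire(self):
        return self._pool.get()   # blocks when all connections are in use

    def release(self, conn):
        self._pool.put(conn)      # return the connection for reuse

# Stand-in for a real connection factory such as mysql.connector.connect.
make_fake_connection = object

pool = ConnectionPool(make_fake_connection, size=5)
conn = pool.acquire()
# ... run queries with conn ...
pool.release(conn)
print(pool._pool.qsize())  # 5: the pool size, regardless of client count
```

So with 10000 concurrent clients you would typically see only as many MySQL connections as your pool allows; requests beyond that wait for a free connection rather than opening new ones.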

Destination SMTP server does not want to receive message body

I am using software called SendBlaster with Gmail's SMTP server, but it does not allow me to send more than 500 mails. Does anyone know how I can overcome this problem? Is there any SMTP server or software that can send unlimited mails?
This is an issue with your charset; select the correct charset when composing email in SendBlaster, namely "windows-1252".
Google deliberately limits the number of emails that can be sent in a single session and over various timeframes. Since you reached 500 mails, I suspect you are already using one of their premium products, such as Google Apps for Enterprise.
Still, beware that email traffic is measured across the whole domain, so you may be restricting email for other users at your company who share the Google Apps account.
To send batches of thousands of emails you will need a dedicated mailing list manager; search Google for the many available products. You will also have to make changes to your DNS records to prove that you are not sending spam.
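For example, a typical SPF TXT record authorizing Google's servers to send mail for your domain looks like this (the domain is a placeholder; mailing services usually also ask you to publish DKIM keys they generate for you):

```
example.com.  IN TXT  "v=spf1 include:_spf.google.com ~all"
```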
To send unlimited numbers of messages, you will need access to a dedicated marketing service.
In my experience the best free mailing solution is SparkPost, which gives you 100k emails for free. But remember that when you use free third-party services you are on shared servers, so if other free users send spam from the shared IP, your mail will probably land in spam as well. Hence I recommend implementing your own SMTP server or going for a paid plan.
UPDATE
After some research it seems SparkPost no longer offers this free package, and there are few good alternatives. The closest I can find is Mailgun; the best-case scenario would be sending from an AWS EC2 server, in which case SES gives you 62,000 free mails per month.
P.S.: These figures are subject to change and are only valid at the time of this update. Down-voting this post will not help change company policy.