What are firebase analytics rate limits? - firebase-analytics

I know Google Analytics has a way of handling rate limits; how does Firebase handle this (is there perhaps a maximum number of hits per second)?

There is no rate limit. However, unless you are in debug mode, events are generally bundled and sent about once per hour.

Related

N2D/C2 quota for Europe

I am a Google Cloud user who has built a cloud HPC system. The web application is very demanding in terms of hardware resources: it's quite common for me to have 4-5 N1 instances allocated with 96 cores each, for a total of more than 400 cores.
I found that in certain zones it is currently possible to use N2D and C2 instances, the former offering more CPU power and the latter being dedicated to compute workloads. Unfortunately I can't use either of them because, for some reason, I have trouble increasing the N2D_CPUS and C2_CPUS quotas above the default value of 24 (which is nothing considering my needs).
Can anyone help me?
The only way to increase the quota is to submit a Quota Increase request. Once you submit it, you should receive an email confirming that the request has been received and is being reviewed.

What are the rate limits that throw the "Rate Limit Exceeded" error during an upload?

Case
I have a server-to-server file upload implementation that uses Google Drive on the other end. All of a sudden I've been seeing an intermittent "Rate Limit Exceeded" error during scheduled file uploads.
Refactor and test
I know the error can be handled by batching the uploads and/or by doing exponential backoff, per the official documentation. My concern is the actual rate limits, so I ran a test.
I restructured the code to upload only one file every 3 minutes.
It didn't work: I still get the same errors, and they still happen intermittently.
Questions
Are there official figures for the maximum rate limits? How many requests per hour? Something like a size-to-period or requests-to-period ratio would really help.
What are the actual rate limits that trigger the "Rate Limit Exceeded" error during a file upload?
You can check your current traffic usage from https://console.developers.google.com.
Some operations like create and update operations have additional internal limits that may be lower than your allowed QPS.
Depending on your use case, there are more specific things you can do (e.g. slow down on per-user operations, but compensate by doing more users in parallel to maximize throughput).
Also, the "403 Rate Limit Exceeded" errors you are seeing may be caused by the number of read/write requests your application performs per user per second. Consider the following steps to reduce the errors:
Optimize your code to reduce the number of API calls made simultaneously per user per second.
Batch the requests.
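The exponential backoff mentioned above can be sketched as follows. This is a minimal, hedged example, not the official client library's retry logic: `upload_fn` is a hypothetical zero-argument callable standing in for your actual Drive upload call, and the error is detected by matching the "Rate Limit Exceeded" message seen in the question.

```python
import random
import time

def upload_with_backoff(upload_fn, max_retries=5, sleep=time.sleep):
    """Retry `upload_fn` with exponential backoff plus random jitter.

    `upload_fn` is a hypothetical zero-argument callable that performs
    the upload and raises an exception whose message contains
    "Rate Limit Exceeded" when the API rejects the request.
    """
    for attempt in range(max_retries):
        try:
            return upload_fn()
        except Exception as exc:
            if "Rate Limit Exceeded" not in str(exc):
                raise  # a different error: do not retry
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error
            # Wait 2^attempt seconds plus up to 1 s of jitter, so
            # parallel clients don't all retry in lockstep.
            sleep(2 ** attempt + random.random())
```

The injectable `sleep` parameter keeps the function testable without real delays; in production you would leave it at the default.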

What are the specific quotas for Properties Service in Google Apps for an Add-On?

We have an Add-On that uses DocumentProperties and UserProperties for persistence. We've recently had some users complain about receiving the error: "Service invoked too many times for one day: properties". I spent some time optimizing usage of the Properties Service when developing the Add-On to ensure this doesn't happen, but it still seems to happen in some situations.
According to https://developers.google.com/apps-script/guides/services/quotas, there is a quota of 50,000 "set" calls per day per user or document. There does not appear to be a limit on the number of reads.
Here's how our app utilizes properties:
We only set properties when values change. Even with very heavy usage, I still can't imagine more than 500 property "set" calls per day per user. When we set properties, we also write to the Cache Service with a 6 hour timeout.
We read the properties while the user is using the add-on, and also every 10 seconds while the add-on is idle. That comes out to 8,640 reads per document per day. However, we use the Cache Service for reads, so very few of those reads should hit the Properties Service.
I did discover a bug that was present when the most recent report came in: after the cache expires, we don't re-write to it until an explicit change is made. By my calculations, that leads to 6 hours with 1 read, and then 18 hours at 6 reads/min * 60 min/hr, or 6,480 reads. Factor in some very heavy usage and we're still at about 7,000 reads per day per document. The user claims he had two copies of the document open, so 14,000 reads. That's still far below the 50,000 quota, not to mention that the quota appears to apply to setting properties, not reading them.
Would anyone from Google be able to offer some insight into any limits on Properties reads, or give any advice on how to avoid this situation going forward?
Thanks much!
This 50,000-writes-per-day quota is shared across all scripts. This means that if one script makes heavy use of writes, it can negatively impact other scripts installed by the same user.
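The cache-expiry bug the questioner describes (reads stop being served from cache once the 6-hour TTL lapses, until an explicit change re-populates it) is avoided by a read-through cache that re-writes the cache on every miss, including misses caused by expiry. Apps Script itself is JavaScript; the following is a language-agnostic sketch of the pattern in Python, where `fetch` stands in for a Properties Service read and the clock is injectable for testing.

```python
import time

class ReadThroughCache:
    """Read-through cache with a TTL: any miss, including one caused
    by expiry, re-fetches from the backing store and re-writes the
    cache, so an expired entry never sends every subsequent read to
    the quota-limited store."""

    def __init__(self, fetch, ttl_seconds=6 * 3600, clock=time.time):
        self.fetch = fetch        # stands in for a Properties read
        self.ttl = ttl_seconds
        self.clock = clock        # injectable for testing
        self._entries = {}        # key -> (value, expiry_timestamp)

    def get(self, key):
        now = self.clock()
        entry = self._entries.get(key)
        if entry is not None and entry[1] > now:
            return entry[0]                       # fresh cache hit
        value = self.fetch(key)                   # miss or expired
        self._entries[key] = (value, now + self.ttl)  # re-write
        return value
```

With this shape, an expired entry costs exactly one backing read before being cached again, rather than one read per poll until the next write.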

Google Compute Engine Quotas silently changing?

I've been using GCE for several weeks with free credits and have repeatedly found that the quota values keep changing. The default CPU quota is 24 per region, but depending on which other APIs I enable, and in what order, that can silently change to 2 or 8. Today the CPU quota had again changed from 24 to 2, even though I hadn't changed which APIs were enabled. Disabling and then re-enabling Compute Engine put the quota back to 24, but that is not a very satisfactory solution. This seems like a bug to me. Has anyone else had this problem, and perhaps found a solution? I know about the quota increase request form, but it says that if I request an increase, that is the end of my free credits.
The GCE free trial has limitations, such as a maximum of 2 concurrent cores at a time, so if for some reason you were able to change it to 24 cores, it's expected that it will go back to 2.

Latency for Geocoding API

I'm planning to use the Google Geocoding API and was wondering what latency I should expect for a response. I can't find these details on the website.
Is anyone aware of the actual latency when using the Google Geocoding API, i.e. how much time it takes to get a response back?
We have a live app on the Play Store that gets roughly 120-150 hits per hour. Our median latency is around 210 ms, and the 98th-percentile latency is 510 ms.
We have an application running 24x7 with ~2 requests per second:
Median: 197.08 ms
98th percentile (slowest 2%): 490.54 ms
Geocoding latency could be a significant bottleneck for your application, so consider some strategies to mitigate it:
In-memory cache
Secondary (persistent) cache
Batch persistence
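The strategies above can be combined into a simple two-level lookup. This is a hedged sketch, not a real client: `geocode_fn` is a placeholder for the actual API call, and the dict-backed persistent store stands in for whatever the second level would really be (Redis, a database table, etc.).

```python
class GeocodeCache:
    """Two-level cache for geocoding results: a fast in-process dict
    in front of a slower persistent store, with the API called only
    when both levels miss."""

    def __init__(self, geocode_fn, persistent=None):
        self.geocode_fn = geocode_fn   # placeholder for the API call
        self.memory = {}               # level 1: in-memory cache
        # Level 2: secondary cache; a plain dict here, but in
        # practice Redis, a database table, etc.
        self.persistent = persistent if persistent is not None else {}

    def lookup(self, address):
        key = address.strip().lower()  # normalize to raise hit rate
        if key in self.memory:
            return self.memory[key]
        if key in self.persistent:
            result = self.persistent[key]
            self.memory[key] = result  # promote to level 1
            return result
        result = self.geocode_fn(address)   # both levels missed
        self.memory[key] = result
        # In real code, writes to the persistent store could be
        # queued and flushed together ("batch persistence") instead
        # of written on every lookup.
        self.persistent[key] = result
        return result
```

Since addresses repeat heavily in most workloads, even this simple scheme can remove the API round-trip (and its ~200-500 ms latency) from the majority of lookups.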