So, I am using gadash to build a Google Analytics dashboard into a system I am working on. I had it working earlier with no problem, but now it is throwing a 403 "Access Not Configured" error. Why would it do this when nothing in the code has changed since it was working? The requests are being sent from a web server.
You have probably exceeded the quota allowed by Google, or exceeded the requests-per-second limit.
403 means that the server understood the request but refuses to fulfill it, so the problem is not an authentication problem, as described here.
You can find the Google Analytics absolute limits here.
So please verify that your requests per second and overall quota don't exceed the Google limits.
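If it turns out to be the per-second limit rather than the daily quota, backing off and retrying usually recovers on its own. Here is a minimal sketch of that pattern, purely as an illustration (it uses the Google API Python client rather than gadash, and profile_id and service are placeholders for your own setup):

import random
import time

from googleapiclient.errors import HttpError

def sessions_with_backoff(service, profile_id, retries=5):
    # Retry a simple Core Reporting query, backing off on quota/rate errors.
    for attempt in range(retries):
        try:
            return service.data().ga().get(
                ids='ga:' + profile_id,
                start_date='7daysAgo',
                end_date='today',
                metrics='ga:sessions').execute()
        except HttpError as err:
            # 403/429 typically signal quota or rate limiting; only retry those.
            if err.resp.status in (403, 429, 500, 503) and attempt < retries - 1:
                time.sleep((2 ** attempt) + random.random())
            else:
                raise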
We have a daily Python script that pulls Search Console information from the Google Search Console API. It has run for over a year with no problems. Then suddenly, on January 6th, 2020, we started getting 403 errors stating that we exceeded our load quota:
HttpError: https://www.googleapis.com/webmasters/v3/sites/http%3A%2F%2Fwww..com/searchAnalytics/query?alt=json returned "Search Analytics load quota exceeded. Learn about usage limits: https://developers.google.com/webmaster-tools/v3/limits.">
I've played around with the requests to get them as basic as possible and I still get the 403 error saying I've exceeded my load quota. Even a request as simple as this fails:
request = {
    'startDate': '2020-01-20',
    'endDate': '2020-01-20',
    'dimensions': ['date']
}
I've confirmed that this is not a credentials issue or a refresh-token issue, as we can hit other Google APIs with our credentials just fine. I can even pull sitemaps from the Search Console API without issue (webmasters.sitemaps().list(siteUrl=site_url).execute()). However, any attempt to use webmasters.searchanalytics().query() results in a 403, regardless of how simple or basic the query request is.
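For reference, this is roughly how I'm issuing the failing call (a simplified sketch; credentials and site_url come from our existing setup):

from googleapiclient.discovery import build

webmasters = build('webmasters', 'v3', credentials=credentials)

request = {
    'startDate': '2020-01-20',
    'endDate': '2020-01-20',
    'dimensions': ['date']
}

# Listing sitemaps works fine with the same credentials...
sitemaps = webmasters.sitemaps().list(siteUrl=site_url).execute()

# ...but this raises the 403 "Search Analytics load quota exceeded" error.
response = webmasters.searchanalytics().query(siteUrl=site_url, body=request).execute()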
I also checked our quotas and we're nowhere near the maximum. For today, I show 20 requests against a 100,000,000-query limit. Source: https://console.cloud.google.com/apis/api/webmasters.googleapis.com/quotas
I also can't find any indication of API version issues or changes in the features supported for Search Console queries. Everything just stopped working on the 6th and I can't figure out why.
Has anyone else had this happen? Any ideas on where I can look for solutions?
NOTE: I am posting to stack overflow with these tags per Google's recommended way to get their attention: https://developers.google.com/explorer-help/support
I am using the Google Maps JavaScript API v3, and everything works well up to the point where the requests for the map images are forbidden with a status of 403. Usually the map stops loading after the page/session has been open for some period of time: it may be 24 hours, it may be more than 48; I couldn't pin down a more accurate period.
Because we want a live website and a testing one on different domains, I generated two different keys and load them conditionally, but the rendered HTML is the one expected.
var mapKey = VanillaRate.Domain.Settings.AppSettings.GoogleMapsApiKey;
and the script tag is:
<script src="https://maps.googleapis.com/maps/api/js?key=#(mapKey)&libraries=places" async defer></script>
The usage limits were not exceeded, the referrer is well set.
The error appears when the map is zoomed and it's:
Failed to load resource: the server responded with a status of 403 () - maps.googleapis.com/maps/api/js/StaticMapService.GetMapImage?....
Since I couldn't find this exact situation posted anywhere, nor any documentation about it, could there be a timeout on Google's servers for security reasons, and that is why the requests are forbidden once a session has been open for longer than a day?
EDIT: I forgot to mention that after refreshing the tab, everything works well again. If it were indeed the usage limit, would the server respond with success after a refresh? I've read that in that case the map wouldn't work for the rest of the day. Is that right?
If the response is still an HTTP 403 (Forbidden) error, the signature was not necessarily the problem; it may be related to usage limits instead.
This typically means your access to the web service has been blocked on the grounds that your application has been exceeding usage limits for too long or otherwise abused the web service.
I found this answer in the Google developer documentation. There is no simple way to resolve this problem. Google recommends two solutions:
Reduce requests to the server; or
Purchase "additional allowance for your Google Maps APIs for Work license."
You can also try to access the Google Cloud Support Portal to report your problem.
I found this information in the Google developer documentation here. That link covers the solutions described above and explains the cause of your problem.
"The usage limits were not exceeded"
Are you sure? You're loading the places library, in which case this applies:
Google Places API Web Service
Default 1,000 free requests per day, increased to 150,000 free requests per day after identity verification.
https://developers.google.com/maps/pricing-and-plans/
See also:
https://developers.google.com/places/web-service/usage
https://developers.google.com/maps/documentation/javascript/places#UsageLimits
I am using Google Maps services (Places, Directions). After using them for a while, I am receiving this response:
{ "error_message" : "You have exceeded your daily request quota for this API.", "predictions" : [], "status" : "OVER_QUERY_LIMIT"}.
I found this link, Usage limits for services used with the Google Maps JavaScript API v3. It mentions that the "Places API allows 1,000 or 100,000 (if you're verified) requests per 24 hours." But I have not made even 200 requests.
Is there some other reason for this response, or is there any alternative way to get rid of this problem? Thanks in advance.
Usage Limits for Google Maps API Web Services
Yes. There is another possible reason for getting the "over query" message, and it's a common problem: if you make subsequent query requests too quickly, Google will respond with the over-query-limit message you posted.
From the Google API spec
Usage limits exceeded
If you exceed the usage limits you will get an OVER_QUERY_LIMIT status code as a response.
This means that the web service will stop providing normal responses and switch to returning only status code OVER_QUERY_LIMIT until more usage is allowed again.
This can happen:
Within a few seconds, if the error was received because your application sent too many requests per second.
Some time in the next 24 hours, if the error was received because your application sent too many requests per day. The time of day at which the daily quota for a service is reset varies between customers and for each API, and can change over time.
Add a delay between requests to eliminate the error
You need to insert a delay between requests to the Google Maps API to get rid of the message and get the API to respond. How you implement the delay depends upon the platform you're using; a rough example is sketched below.
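For instance, in Python against the Directions web service (purely as an illustration; the API key and the origin/destination values are placeholders, and the same idea applies to whichever client you actually use):

import time

import requests

DIRECTIONS_URL = 'https://maps.googleapis.com/maps/api/directions/json'

def get_directions(origin, destination, key, retries=3, delay=0.5):
    # Back off and retry whenever the service answers OVER_QUERY_LIMIT.
    for attempt in range(retries):
        result = requests.get(DIRECTIONS_URL, params={
            'origin': origin,
            'destination': destination,
            'key': key,
        }).json()
        if result.get('status') != 'OVER_QUERY_LIMIT':
            return result
        time.sleep(delay * (2 ** attempt))
    return result

# A small fixed pause between successive calls keeps the per-second rate down.
for origin, destination in [('Boston, MA', 'New York, NY'),
                            ('New York, NY', 'Philadelphia, PA')]:
    get_directions(origin, destination, key='YOUR_API_KEY')
    time.sleep(0.2)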
Since last Friday, without having changed any code on my website or anything in the Google API Console, we started getting the following message:
"This site has exceeded its daily quota for maps. If you are the creator of this site, please visit the documentation to learn more".
What we can't explain is the following:
The Google API Console reports that we are well under the Maps daily quota.
Users get this message intermittently. If you get the message and reload the page, for example, you probably won't get the message again.
We also started getting an unrelated issue saying the current site wasn't authorized to use the API. But the whole domain was authorized via the API Console, has been authorized for over a year, and we haven't touched that setting; this issue is also intermittent.
Any clue as to why is this happening and/or how to fix it?
I'm seeing a 403 "Access to the webpage was denied" error on one specific file being accessed via the Drive SDK. It was working earlier, the app permissions are set correctly, and we're having success with other files using different tokens against the same app.
We're getting the downloadUrl from the SDK successfully, then seeing the error message only after users are redirected to the downloadUrl. Because of that it's hard to track, but we've confirmed that it's working for some, but not for others — it hasn't fully stopped.
The full error text is:
Access to the webpage was denied
You are not authorized to access the webpage at [...] You may need to sign in.
HTTP Error 403 (Forbidden): The server refused to fulfill the request.
We're including the GET download and (valid) access_token parameters, all that.
My question is this: could this be related to the reported Google Drive outage that's currently happening, or is there some sort of throttle/limit on access to a single file over the Drive API? I've never seen this behavior before, and this response isn't listed among the standard 403 responses.
I have just seen something similar. I was using a freshly acquired access token, so I don't think it's OAuth related. My working theory is that the downloadUrl link was stale. When I fetched fresh metadata, which had a different value in downloadUrl, the download worked using the same access token that had previously failed.
This is only a theory, since it isn't documented anywhere, and I would actually expect a 410 (or even a 301) to be a much more appropriate status than 403.
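If you want to test that theory, re-fetch the file's metadata right before downloading instead of reusing a stored downloadUrl. A rough sketch with the Drive v2 Python client (credentials, access_token and FILE_ID are assumed placeholders here):

import requests

from googleapiclient.discovery import build

drive = build('drive', 'v2', credentials=credentials)

# Fresh metadata; the downloadUrl returned here may differ from a stored copy.
meta = drive.files().get(fileId=FILE_ID).execute()
download_url = meta.get('downloadUrl')

if download_url:
    # Download with the same access token that was used for the metadata call.
    resp = requests.get(download_url,
                        headers={'Authorization': 'Bearer ' + access_token})
    resp.raise_for_status()
    with open(meta['title'], 'wb') as fh:
        fh.write(resp.content)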