I have read in the GCP Cloud Functions documentation that the maximum uncompressed HTTP request size for HTTP Cloud Functions is 10 MB. But when we pass a mere 200 KB of JSON as input, which consists of an image encoded in base-64 format, we get error 413, Request Entity Too Large. Please find the error screenshot attached.
[Screenshot: error 413]
Can anyone explain the reason behind this, and also suggest a remedy?
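A quick way to rule out a measurement mistake on the client is to log the exact byte size of the request body before sending it. Below is a minimal sketch (TypeScript, Node.js 18+ so `fetch` is built in); the function URL and the `image` field name are placeholders, not taken from the question:

```typescript
import { readFileSync } from "fs";

// Placeholder URL: substitute your own Cloud Function endpoint.
const FUNCTION_URL = "https://REGION-PROJECT.cloudfunctions.net/myFunction";

async function postImage(path: string): Promise<void> {
  // Base64 inflates binary data by roughly 33%, so a ~150 KB image
  // becomes ~200 KB of JSON -- still far below the 10 MB limit.
  const base64 = readFileSync(path).toString("base64");
  const body = JSON.stringify({ image: base64 });

  console.log(`Actual payload size: ${Buffer.byteLength(body)} bytes`);

  const res = await fetch(FUNCTION_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  console.log(res.status, await res.text());
}

postImage("photo.jpg").catch(console.error);
```

If the logged size really is around 200 KB, a 413 usually points at a proxy or load balancer in front of the function rather than at the Cloud Functions limit itself.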
On 01/11/2021 we noticed that most requests to :urn/metadata/:guid/properties with forceget began responding with a 406 error, 'diagnostic: "Too large the size of properties JSON, please call this API with query by object id"', although the models are not big.
Everything was fine before.
I tried adding the objectid query parameter, but the result was the same.
I didn't find anything about this in the changelog. Maybe something changed?
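For reference, this is the shape of the call the diagnostic asks for, querying properties for a single object instead of the whole model (a TypeScript sketch; the URN, GUID, object id, and token are placeholders):

```typescript
const urn = "<base64-encoded-urn>"; // placeholder
const guid = "<metadata-guid>";     // placeholder
const token = "<access-token>";     // placeholder

// Request properties for one object id, with forceget still enabled.
async function getProperties(objectId: number): Promise<unknown> {
  const url =
    "https://developer.api.autodesk.com/modelderivative/v2/designdata/" +
    `${urn}/metadata/${guid}/properties?objectid=${objectId}&forceget=true`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Model Derivative returned ${res.status}`);
  return res.json();
}
```

In our case even this per-object form returned the same 406, which is why it looked like a service-side change.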
As part of a Revit add-in that I am running in Design Automation, I need to extract some data from the file, send it in JSON format to an external server for analysis, and get the result back to update my Revit file with new features. I was able to satisfy my requirement by following https://forge.autodesk.com/blog/communicate-servers-inside-design-automation, which worked as I needed. The problem arises when the size of the data to send for the analysis grows; it results in the following error:
[11/12/2020 07:54:08] Error: Payload for "onProgress" callback exceeds 5120 bytes limit.
When checking my data, it turns out that the payload is around 27,000 bytes. Are there other ways to send data from Design Automation for payloads larger than 5120 bytes?
I was unable to find documentation related to the use of ACESAPI: acesHttpOperation.
There is no other way at the moment to send data from your work item to another server.
So you would either have to split the data into multiple 5120-byte parts and send them like that (a sketch of the splitting follows below), or use two work items: one for getting the data from the file before doing the analysis and one for updating the file afterwards.
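The splitting itself is straightforward; here is a language-agnostic sketch in TypeScript (the add-in itself would be C#). `sendChunk` is a hypothetical stand-in for whatever transport carries each part:

```typescript
const LIMIT = 5120; // bytes allowed per onProgress payload

// Split a serialized payload into parts no longer than LIMIT.
// Note: slicing by character index equals slicing by byte only for
// single-byte (ASCII) data; measure with Buffer.byteLength otherwise.
function splitPayload(payload: string): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < payload.length; i += LIMIT) {
    chunks.push(payload.slice(i, i + LIMIT));
  }
  return chunks;
}

// Send the parts in order, tagging each with its index so the
// receiving server can reassemble the original payload.
async function sendInParts(
  payload: string,
  sendChunk: (part: string, index: number, total: number) => Promise<void>
): Promise<void> {
  const chunks = splitPayload(payload);
  for (let i = 0; i < chunks.length; i++) {
    await sendChunk(chunks[i], i, chunks.length);
  }
}
```

In practice you would reserve a few bytes of the limit for the index metadata, so that each transmitted part, metadata included, still fits under 5120 bytes.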
Hi, I am trying to send about 66.7 MB of compressed (599 MB uncompressed) JSON data from a Java REST API. However, in Angular 7 I am getting the response as
null. If I try the same API with less data, it works fine. The data being sent is a byte array. Please find below snapshots of the request and response. Is there any workaround, or is this a limitation?
We ran into the same problem. It seems there has been a regression since Chrome 80.
https://bugs.chromium.org/p/chromium/issues/detail?id=1097457&q=unexpected%20end%20of%20json%20input&can=2
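While the regression stood, one workaround on the Angular side is to bypass HttpClient's built-in JSON parsing (where the "unexpected end of JSON input" surfaces) and parse manually. A sketch, not the original poster's code; the service name is an assumption:

```typescript
import { HttpClient } from "@angular/common/http";
import { Injectable } from "@angular/core";

@Injectable({ providedIn: "root" })
export class LargePayloadService {
  constructor(private http: HttpClient) {}

  async load(url: string): Promise<unknown> {
    // responseType "arraybuffer" hands us the raw bytes; the browser
    // has already undone the transport compression (Content-Encoding).
    const buf = await this.http
      .get(url, { responseType: "arraybuffer" })
      .toPromise();
    const text = new TextDecoder("utf-8").decode(buf);
    return JSON.parse(text);
  }
}
```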
I have a RESTful server that is supposed to return a large JSON object, more specifically an array of objects, to browsers. For example, 30,000 points will have a size of 6.5 MB.
But I get a content mismatch error in the browser when the connection is slow. I suspect it is because the large data going through the REST API breaks something. Even in Postman it sometimes fails to render, even though I can see that 6.5 MB of data was received.
My server is in Node.js, and the returned Content-Type header is application/json.
My question is:
Would it make more sense if I returned a .json file? Would the browser be able to handle it? If yes, then I will download the file and make the front-end changes.
Old URL - http://my-rest-server/data
Proposed URL - http://my-rest-server/data.json
What would the Content-Type be for the proposed URL?
Your client can't realistically expect to receive all of the data at once and still get it fast...
...but you might want to look into sending the data in chunks and streams (a sketch follows the link below):
https://medium.freecodecamp.org/node-js-streams-everything-you-need-to-know-c9141306be93
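Here is a minimal sketch of the streaming idea in plain Node.js (TypeScript): write the array element by element instead of building one 6.5 MB string in memory. The endpoint and the generated points are made up for illustration. Note that the Content-Type stays application/json whether the URL is /data or /data.json; the header, not the URL extension, decides it:

```typescript
import { createServer } from "http";

const server = createServer((req, res) => {
  if (req.url !== "/data" && req.url !== "/data.json") {
    res.statusCode = 404;
    res.end();
    return;
  }

  // Same header for both URLs: the extension does not change it.
  res.writeHead(200, { "Content-Type": "application/json" });

  // Stream the array one element at a time; the browser starts
  // receiving bytes immediately instead of waiting for the whole body.
  res.write("[");
  for (let i = 0; i < 30000; i++) {
    if (i > 0) res.write(",");
    res.write(JSON.stringify({ id: i, x: Math.random(), y: Math.random() }));
  }
  res.write("]");
  res.end();
});

server.listen(3000);
```

A production version would respect backpressure (the return value of res.write, or stream.pipeline) rather than writing in a tight loop.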
When I called the Microsoft Translator Text API's TranslateArray, error 413 (Request Entity Too Large) occurred.
I am aware of the documented API limitations:
The total of all texts to be translated must not exceed 10000 characters.
The maximum number of array elements is 2000.
When the request's Content-Length header is greater than 30721, the request fails with a 413 error even though the above API limitations are observed.
Is there any other limitation?
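Whatever the undocumented threshold is, one pragmatic workaround is to batch the texts so every TranslateArray request body stays safely under it. A sketch in TypeScript; the 30,000-byte budget is derived from the 30,721-byte failure observed above, not from documentation:

```typescript
const BYTE_BUDGET = 30000; // stay below the observed 30721-byte threshold
const MAX_ELEMENTS = 2000; // documented array-element cap

// Group texts into batches that respect both the element cap and the
// byte budget, so each batch can go out as its own TranslateArray call.
function batchTexts(texts: string[]): string[][] {
  const batches: string[][] = [];
  let current: string[] = [];
  let size = 0;
  for (const text of texts) {
    const bytes = Buffer.byteLength(text, "utf8");
    if (current.length > 0 &&
        (size + bytes > BYTE_BUDGET || current.length >= MAX_ELEMENTS)) {
      batches.push(current);
      current = [];
      size = 0;
    }
    current.push(text);
    size += bytes;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

The budget is deliberately lower than the failure point because the request body also carries envelope overhead beyond the raw texts.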
If anyone is still running into this issue, upgrading to the latest google-cloud-translate client should fix it. For more information: the PR here fixed the problem for me, which was that the client was using GET requests instead of POST requests.
Note: this should also fix the related error of getting 411 (Length Required) when cutting off a piece of text to translate only the first N characters.
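For reference, a minimal call with the Node.js variant of the client (@google-cloud/translate, v2 API surface), shown here only to keep this thread's examples in one language; upgrade whichever client library you actually use. Long inputs are what exposed the GET-vs-POST bug, since GET carries the text in the URL:

```typescript
import { v2 } from "@google-cloud/translate";

const translate = new v2.Translate(); // credentials come from the environment

async function run(): Promise<void> {
  // Large enough that a GET request's URL would overflow.
  const longText = "lorem ipsum dolor sit amet ".repeat(500);
  const [translation] = await translate.translate(longText, "de");
  console.log(translation.slice(0, 80));
}

run().catch(console.error);
```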