File size limit when uploading to Custom Translator - microsoft-translator

I'm having problems uploading an approximately 70 MB TMX file as training data. Could this be a file size problem? Is there a file size limit?
The error I'm getting is not very descriptive:
Error
Sorry, operation failed. Error details:
{
"isTrusted": true
}

I had more success after individually compressing (zipping) my TMX files.
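If it helps, here is a minimal sketch of that workaround in Python, using only the standard library's zipfile module (the file names are placeholders):

import zipfile

# Compress each TMX file into its own zip archive before uploading it.
with zipfile.ZipFile('training-data.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.write('training-data.tmx')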

The upload size limit is set at 100 MB, so that file should not have been a problem. The script error you mentioned, {"isTrusted": true}, is usually seen when your authentication token has expired; refreshing the browser should load a new token.

Related

Firebase: Exporting JSON Unable to export The size of data exported at a single location cannot exceed 256 MB

I used to download a node of my Firebase Realtime Database every day to monitor some outputs, by exporting that node as a JSON file. The JSON file itself is about 8 MB.
Recently, I started receiving an error:
"Exporting JSON Unable to export The size of data exported at a single location cannot exceed 256 MB.Navigate to a smaller part of the database or use backups. Read more about limits"
Can someone please explain why I keep getting this error, since the JSON file I exported just yesterday was only 8.1 MB?
I probably solved it! I disabled the CORS add-on in Chrome and suddenly the export worked :)
To work around this, you can use Postman's Import feature, because downloading a large JSON file from the Firebase dashboard in a browser often fails partway through. You can paste a traditional cURL command into it; you just need to click Save Response once the response arrives. To avoid the authentication complexity, you can set the database rule to read: true until the download is complete, though you need to weigh the security implications of doing so. Postman may also freeze its UI while previewing the JSON, but you don't need to be bothered by that.
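Alternatively, a short sketch of the same idea without Postman: pull the node directly from the Realtime Database REST endpoint and stream it to disk. The project id and node path below are placeholders, and this assumes the node is readable (otherwise append an auth token as a query parameter):

import requests

# Stream the node's JSON export straight to a file instead of the browser.
url = 'https://my-project-id.firebaseio.com/my-node.json'
with requests.get(url, stream=True) as r:
    r.raise_for_status()
    with open('export.json', 'wb') as out:
        for chunk in r.iter_content(chunk_size=1024 * 1024):
            out.write(chunk)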

Not able to export the data from web form in .csv format via using chrome browser

Our application is deployed on Google, and we are struggling to download a report in Excel format. The application holds around 5000 rows of data in total. When we apply a filter, the data downloads fine and the file size is around 2 MB.
Is there a Chrome restriction that prevents downloads larger than 2 MB? Below is the error we get in return.
Failed to load resource: the server responded with a status of 400 (OK)
Can anybody help me resolve this issue, please?

403 Forbidden when downloading files (Download quota exceeded for this file, so you can't download it at this time)

We're trying to download all the files from a Drive account. The issue can be reproduced with the following code:
import requests

# Fetch the file's metadata to get its download URL.
f = requests.get('https://www.googleapis.com/drive/v2/files/%s' % file_id,
                 params={'access_token': access_token})
download_url = f.json()['downloadUrl']

# Stream the file contents to disk.
r = requests.get(download_url, params={'access_token': access_token}, stream=True)
with open('/path/to/file', 'wb') as f:
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:
            f.write(chunk)
            f.flush()
The third statement (the GET on download_url) returns 403 Forbidden.
Here are the interesting points:
This only seems to happen with large files, in the multi-gigabyte range. The smallest affected file is 6 GB, and most of the affected files are around 100 GB.
We used the download URL of the file resource returned by the Drive API, but it also happens with the download URL of the revision resource.
When retrying after a few days, some of the affected files were downloaded successfully, but others still had the same error.
edit: Trying to download the file manually shows the error "Download quota exceeded for this file, so you can't download it at this time". Searching for this in Google doesn't show any helpful results, so I still don't know how to proceed.

CKAN : Upload to datastore failed; Resource too large to download

When I try to upload a large CSV file to the CKAN datastore it fails with the following message:
Error: Resource too large to download: 5158278929 > max (10485760).
I changed the maximum resource upload size (in megabytes) to
ckan.max_resource_size = 5120
in
/etc/ckan/production.ini
What else do I need to change to upload a large CSV to CKAN?
That error message comes from the DataPusher, not from CKAN itself: https://github.com/ckan/datapusher/blob/master/datapusher/jobs.py#L250. Unfortunately, the DataPusher's maximum file size is hard-coded to 10 MB: https://github.com/ckan/datapusher/blob/master/datapusher/jobs.py#L28. Pushing larger files into the DataStore is not supported.
Two possible workarounds might be:
Use the DataStore API to add the data yourself (see the sketch below).
Change the MAX_CONTENT_LENGTH on the line in the DataPusher source code that I linked to above to something bigger.
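For the first workaround, here is a rough sketch of pushing a CSV into the DataStore through the action API. The CKAN URL, API key, and resource id are placeholders, and it assumes plain text columns; datastore_create and datastore_upsert are the documented action endpoints:

import csv
import json
import requests

CKAN_URL = 'http://my-ckan-instance'   # placeholder
API_KEY = 'my-api-key'                 # placeholder
RESOURCE_ID = 'my-resource-id'         # placeholder: an existing resource

def action(name, payload):
    # Call a CKAN action API endpoint and fail loudly on errors.
    r = requests.post('%s/api/3/action/%s' % (CKAN_URL, name),
                      data=json.dumps(payload),
                      headers={'Content-Type': 'application/json',
                               'Authorization': API_KEY})
    r.raise_for_status()
    return r.json()

with open('big.csv') as f:
    reader = csv.DictReader(f)
    # Create the DataStore table once, with a text column per CSV header.
    action('datastore_create', {
        'resource_id': RESOURCE_ID,
        'fields': [{'id': name, 'type': 'text'} for name in reader.fieldnames],
        'force': True})
    # Insert rows in batches so no single request grows too large.
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == 10000:
            action('datastore_upsert', {'resource_id': RESOURCE_ID,
                                        'records': batch,
                                        'method': 'insert', 'force': True})
            batch = []
    if batch:
        action('datastore_upsert', {'resource_id': RESOURCE_ID,
                                    'records': batch,
                                    'method': 'insert', 'force': True})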

Upload file in chunks with unknown length

Can you give me an example (HTML or Java) of how to upload a file in chunks? How can I upload a file if its length is unknown at the time of the request? I found this guide, https://developers.google.com/drive/manage-uploads#resumable, but I can't understand how to tell the server that the stream has terminated.
Thank you in advance.
I can't give you an example with the Google Drive SDK, but I can tell you how it's done with raw HTTP requests.
First you have to initiate a resumable upload session, after which you can make partial uploads of unknown total size by setting the Content-Range header to something like bytes 0-262143/* for the first 256 KB, or whatever the size of your chunks is (they have to be multiples of 256 KB, i.e. 262144 bytes, as per the documentation).
You keep making requests like this to upload intermediate chunks until you reach the final chunk, which should be smaller than those fixed intermediate chunks. This last chunk completes the file upload, because its Content-Range field is set to bytes <byte interval of the chunk>/<final size of file> instead of using *.
The final size of the file can be calculated before the last request: number of intermediate chunks * 262144 bytes + size of the last chunk. For example, three full chunks followed by a 1000-byte final chunk give 3 * 262144 + 1000 = 787432 bytes.
The algorithm is explained in more detail here: http://pragmaticjoe.blogspot.ro/2015/10/uploading-files-with-unknown-size-using.html
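And a rough Python sketch of that raw-HTTP flow against the Drive v2 upload endpoint. The access token and input file are placeholders, and it assumes read() returns full 256 KB chunks until the end of the stream:

import json
import requests

CHUNK = 256 * 1024  # chunks must be multiples of 256 KB (262144 bytes)

access_token = 'my-oauth-access-token'  # placeholder
source = open('payload.bin', 'rb')      # any stream; total size not needed up front

# 1. Initiate the resumable session; the session URI comes back in Location.
init = requests.post(
    'https://www.googleapis.com/upload/drive/v2/files?uploadType=resumable',
    headers={'Authorization': 'Bearer %s' % access_token,
             'Content-Type': 'application/json; charset=UTF-8'},
    data=json.dumps({'title': 'upload-of-unknown-size'}))
session_uri = init.headers['Location']

# 2. PUT fixed-size chunks with a '*' total until the stream runs out.
offset = 0
data = source.read(CHUNK)
while data:
    next_data = source.read(CHUNK)
    if next_data:
        # Intermediate chunk: total size still unknown, so use '*'.
        content_range = 'bytes %d-%d/*' % (offset, offset + len(data) - 1)
    else:
        # Final chunk: state the total size, which tells the server we're done.
        content_range = 'bytes %d-%d/%d' % (offset, offset + len(data) - 1,
                                            offset + len(data))
    requests.put(session_uri, headers={'Content-Range': content_range}, data=data)
    offset += len(data)
    data = next_data
# Intermediate PUTs are answered with 308 (Resume Incomplete); the final one
# returns 200/201 with the completed file's metadata.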