Ranger: restrict the Ranger file manager from previewing/opening larger files

Is it possible to restrict Ranger from previewing larger files? For example, any file larger than 10 MB should be skipped for preview.
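Ranger has a built-in setting for this. A minimal sketch, assuming a ranger version whose rc.conf supports the preview_max_size option (the value is in bytes; 0 disables the limit):

```
# ~/.config/ranger/rc.conf
# Skip previews for any file larger than 10 MB (10 * 1024 * 1024 bytes).
set preview_max_size 10485760
```

You can also try it out for a single session by typing :set preview_max_size 10485760 inside ranger.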

Related

Cannot move huge files from sql server to adls gen2 in json.gz using adf

I am trying to move the files I have in SQL DW to ADLS Gen2 storage. I want the files in json.gz format in blob storage. I tried compressing and moving them, but the files are still too big: the minimum file size in ADLS Gen2 is 2.1 MB, and some of my files are around 4 GB. How can I solve this? (Screenshot: what I see when I open these files in ADLS Gen2.)
The error is due to a 2.1 MB maximum (not minimum) file size in the storage browser viewer/editor in the Azure Portal, not in ADLS itself. Currently, ADLS supports block blobs up to 190.7 TB in size. You could download the files via the storage browser to view them.
See Azure Blob Storage Limits
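If the goal is just to inspect the contents, a programmatic download sidesteps the portal viewer limit entirely. A minimal sketch using the azure-storage-blob Python SDK; the account URL, credential, container, and blob names are all placeholders:

```python
from azure.storage.blob import BlobClient

# Placeholders: substitute your own account, credential, container, and blob.
blob = BlobClient(
    account_url="https://<account>.blob.core.windows.net",
    container_name="my-container",
    blob_name="exports/data-001.json.gz",
    credential="<account-key-or-sas-token>",
)

# Stream the (multi-GB) blob to a local file instead of viewing it in the portal.
with open("data-001.json.gz", "wb") as f:
    blob.download_blob().readinto(f)
```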

VSCode -- Preview (i.e. head) a CSV file as the default open option?

I was working with a large CSV file (> 1 GB) and wanted to preview what the columns were and what the data looked like by exploring the top ~100-1000 rows alongside the headers.
When I open the file, my computer hangs for quite a while due to the size of the file. However, I noticed there is a "Preview large file: head" option (I believe from the Rainbow CSV extension?) in the context menu that loads much faster and provides the functionality I would want in the vast majority of cases (previewing large files for structure rather than opening them entirely).
Is there any way to set this context menu option as the default behavior when clicking on a .csv file in the File Explorer? I would prefer this so I do not accidentally make my computer hang when clicking on a file.
Bonus points if anyone can point me to an extension that lets you paginate CSV files by default.
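Until something like that is the default, one workaround for a quick structural peek is to read only the first rows programmatically. A sketch in Python with pandas; the file name is a placeholder:

```python
import pandas as pd

# Read the header plus the first 1000 rows only; the rest of the
# multi-GB file is never loaded into memory.
preview = pd.read_csv("big_file.csv", nrows=1000)

print(preview.columns.tolist())  # column names
print(preview.head(100))         # top rows for a quick look
```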

How can I find which Revit files are linked in BIM 360 via Forge?

In certain circumstances, BIM 360 will serve a zip file of a Revit document along with its links, as explained here: https://forums.autodesk.com/t5/bim-360-document-management/linked-revit-files-in-bim-360-docs/td-p/8774004
In these circumstances, however, when interacting with GET projects/:project_id/folders/:folder_id/contents, the file is still shown as a regular file (potentially the isCompositeDesign attribute distinguishes it) with a .rvt file extension. In addition, the file size shown in the object's storageSize is the sum of the main Revit file and all of its links. Checking the details in GET buckets/:bucketKey/objects/:objectName/details likewise shows the object's size attribute to be the sum of the main Revit file and all of its links.
I cannot seem to find functionality in Forge that:
Distinguishes a zip file from a lone file (potentially the isCompositeDesign attribute does this)
Provides a list of which other files are linked into the main file, or a list of the zip file contents and their URNs.
Provides the true file size of the main Revit file itself, not just the sum of all linked files in the zip.
Ideas?
Revit, when using worksharing, publishes a file to BIM 360.
This file is named as a .rvt file (i.e. 'mybigrevitproject.rvt'), but in fact it's really a zip file in disguise. If you rename it to .zip, download it, and unzip it, you'll find lots of .rvt files inside.
There's a neat trick to figuring this out, without downloading the entire file.
Use a range GET on the first 16 bytes, and check for the magic header.
For full details, check out this repo: https://github.com/wallabyway/bim360-zip-extract
Here's a snippet of the code that will help:
https://github.com/wallabyway/bim360-zip-extract/blob/master/server.js#L167
I think it's related to this question: Forge Data management returns zip file
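The range-GET idea from the linked server.js can be sketched in Python as follows; the signed URL and token are placeholders, and the check simply looks for the ZIP magic bytes at the start of the object:

```python
import requests

def is_zip_in_disguise(url: str, token: str) -> bool:
    """Fetch only the first 16 bytes of the object and check for the ZIP
    magic header, without downloading the whole (possibly huge) file."""
    resp = requests.get(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            "Range": "bytes=0-15",  # range GET: first 16 bytes only
        },
    )
    resp.raise_for_status()
    # ZIP archives begin with the magic bytes 'PK\x03\x04'.
    return resp.content[:4] == b"PK\x03\x04"
```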

Maximum file upload size in a single request for Google Drive

In the Google Drive documentation for the create-file API, I can see "Maximum file size: 5120 GB".
So my question is: is this the maximum file size that can be uploaded in a single request? Or can a file be at most 5120 GB, but it has to be uploaded in chunks of some maximum size (say, at most 2 GB per chunk in a single request)?
It depends on your connection speed. It is possible to upload a 5 TB file; however, the chance that the upload eventually fails is quite high. It is much wiser to create compressed archives (e.g. .rar files) and upload those.
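For what it's worth, the Drive API does support chunked, resumable uploads, which are the usual way to get very large files up reliably (each chunk must be a multiple of 256 KB). A minimal sketch with the google-api-python-client library; the file name is a placeholder and the OAuth access token is assumed to be obtained elsewhere:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

# Placeholder: a real OAuth2 access token obtained via your auth flow.
creds = Credentials(token="<oauth2-access-token>")
service = build("drive", "v3", credentials=creds)

# Resumable upload: the file is sent in 10 MB chunks (a multiple of
# 256 KB, as required), not in one giant request.
media = MediaFileUpload("huge_file.bin",
                        chunksize=10 * 1024 * 1024,
                        resumable=True)
request = service.files().create(body={"name": "huge_file.bin"},
                                 media_body=media)

response = None
while response is None:
    status, response = request.next_chunk()
    if status:
        print(f"Uploaded {int(status.progress() * 100)}%")
print("Upload complete, file id:", response.get("id"))
```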

Box API large file upload results in corrupt file

I'm trying to upload a large file (> 200 MB) using the new Box API.
Can I upload it in chunks?
Currently the Box API does not support uploading a file in chunks. You may include a "Content-MD5" header with your request that contains the SHA-1 hash of the file. Box will check this against the uploaded contents to ensure the file was not corrupted in transit.
See: http://developers.box.com/docs/#files-upload-a-file
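A sketch of that integrity check in Python with requests; the endpoint shape follows the v2 upload API (adjust if you are on an older version), and the token and parent folder id are placeholders. Note that, per the docs quoted above, the Content-MD5 header carries the file's SHA-1 hash:

```python
import hashlib
import json
import os
import requests

def upload_with_integrity_check(path: str, token: str, parent_id: str = "0"):
    # Per Box's docs, Content-MD5 carries the file's SHA-1 hash; Box
    # compares it against the uploaded bytes to detect corruption in transit.
    sha1 = hashlib.sha1()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            sha1.update(block)

    with open(path, "rb") as f:
        resp = requests.post(
            "https://upload.box.com/api/2.0/files/content",
            headers={
                "Authorization": f"Bearer {token}",
                "Content-MD5": sha1.hexdigest(),
            },
            files={
                "attributes": (None, json.dumps(
                    {"name": os.path.basename(path),
                     "parent": {"id": parent_id}})),
                "file": f,
            },
        )
    resp.raise_for_status()
    return resp.json()
```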