Handling 10 MB JSON files with Go in a container

I am currently working on a project that requires processing JSON files (json.Marshal(), json.Unmarshal(), and so on). We are working with files of around 10 MB, and we are facing an issue: the Go json library is not able to handle the files and crashes. Since we are running in a Docker container, memory is limited to 25 MB. Can anyone suggest a way to handle 10 MB JSON files efficiently in Go? And, if possible, a faster way to Marshal and Unmarshal?
We tried increasing the memory to 50 MB, but it still crashes. 50 MB is the maximum we can use.

Related

How is it possible to save a file which contains only one or a few bytes (e.g. 20 bytes) without it occupying 4 KB of disk space?

I'm trying to save log data, and each log entry (for example, a transaction number in a financial system) is only a few bytes. I do not want to use a database, and I already know the facts about clusters/sectors/inodes on hard disks and operating systems.
However, I think there should be a way of saving a file of only 20 bytes so that it occupies only 20 bytes (or 20 + n% bytes, e.g. 25 bytes), and not 4 KB on disk. Yes, I know the problems that may arise from using millions of one-byte or very small files, particularly with indexing and search speed. In my case, the benefits of saving such small files outweigh those problems. So I'm wondering whether there is any practical way to do so (even special hardware, or a particular hard disk made for it, that I don't know about). I appreciate any kind of information and help.
I've run many tests on Windows and macOS, but they were all limited to just saving some one-byte or few-byte files, and nothing more. I have no information about a practical way to do what I'm looking for.

Three.js freezes Chrome completely, huge texture in GLTF model

I want to load a ludicrously large binary glTF object with only a few polygons (~250) and a huge texture of 10,000 x 5,500 pixels. The file is "only" 20 MB in size.
When I load it using Three.js, Chrome hangs entirely for nearly 15 seconds. Looking at the profiler, pretty much nothing is going on during the freeze.
If you want to load the file yourself, you can download it at https://phychi.com/uni/threejs/models/freezing-monster.glb, and the whole scene can be visited at https://phychi.com/uni/threejs/ (until I've found a solution or given up).
The behavior stays the same whether I call GLTFLoader.load(), GLTFLoader.loadAsync(), or create my own Promise and call .then(addToScene) without any awaits.
Does somebody have a magical solution? Or, if not, how could I profile it more effectively and see the internal calls? Or should I just open a bug report for Chrome/Three.js?
PS: Windows 10 Personal, Ryzen 5 2600, 32 GB RAM, RX 580 8GB.
The issue should be resolved by upgrading the library to r135 (the current release).
Releases r133 and r134 contained a change that introduced a performance regression on Windows when using sRGB-encoded textures.

Offline Data Augmentation in Google Colab - Problems with RAM

What would be most efficient way to perform OFFLINE data augmentation in Google Colab?
Since I am not from the US, I cannot purchase Colab Pro for more RAM, so I am trying to be "smart" about it. For example, when I finish loading 11,000 images, first as NumPy arrays and then creating a pandas DataFrame from them, it occupies around 7.5 GB of RAM. The problem is, I tried to del every object (NumPy arrays, tf.data objects, etc.) to check whether RAM usage changes, and it does not change.
Is it better to try to be smart about RAM, or should I write to disk every time I augment an image and keep nothing in RAM? If the latter, is using TFRecords a sensible approach for this?

AIR FileStream larger than 10 GB

I am developing a program that must load very large files, larger than 10 GB.
AIR has no problem opening them, reading bytes, etc.
However, it seems I can't process past the 10 GB point or so by setting the FileStream.position property... any idea why?
Is there any way to read/process bytes from a file of arbitrary size (as long as the OS supports it) in AIR?

Memory management in AIR for iOS

I am building an app in Flash CS6 that is very video-based... I am using a lot of videos and images. When I debug it, I get a 68 MB file size, and in Task Manager it uses 70 MB to 150 MB of RAM, which is a lot. Can anybody give me suggestions on how to better manage the app's memory? I have a lot of assets embedded.
These are the things I am already doing:
- I am using FLV videos at 1024 x 768; there are about 15-20 of them, each about 1 MB to 6 MB.
- I am using addChild and removeChild, so each time a video finishes playing I get rid of it; I also remove its event listeners.
- I am using all PNG images.
- I am also removing sounds after they finish playing.
Try it on your device; it may use less RAM.