Deep Zoom tile generation for a very large image - zooming

I need to generate Deep Zoom tiles for an image of size 50,000 x 50,000 pixels. I tried using the Deep Zoom Composer software, but it just keeps loading the file without ever succeeding.
How can I generate Deep Zoom tiles for such a large image? Thanks.

There are a number of tools you can use:
http://openseadragon.github.io/examples/creating-zooming-images/
For large images I'm a fan of VIPS:
https://libvips.github.io/libvips/
https://libvips.github.io/libvips/API/current/Making-image-pyramids.md.html
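For example, with the pyvips Python binding, generating a Deep Zoom pyramid takes only a few lines. This is just a sketch (the file name and tile options are placeholders), but the `dzsave` call is the key part: `access="sequential"` lets libvips stream the image instead of holding all 50,000 x 50,000 pixels in memory.

```python
import pyvips

# Stream the image rather than decoding it all at once;
# this keeps memory use low even for a 50,000 x 50,000 source.
image = pyvips.Image.new_from_file("huge.tif", access="sequential")

# Write a Deep Zoom pyramid: huge.dzi plus a huge_files/ directory
# containing one folder of JPEG tiles per zoom level.
image.dzsave("huge", tile_size=256, overlap=1, suffix=".jpg[Q=85]")
```

The resulting .dzi file can then be pointed at directly from an OpenSeadragon viewer.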

Related

Slicing up large heterogeneous images with binary annotations

I'm working on a deep learning project and have run into a problem. The images I'm using are very large and extremely detailed, and they contain a huge amount of necessary visual information, so it's hard to reduce the resolution. I've gotten around this by slicing my images into 'tiles' with a resolution of 512 x 512. There are several thousand tiles for each image.
Here's the problem: the annotations are binary and the images are heterogeneous, so an annotation can be applied to a tile of the image that has no bearing on the actual classification. How can I lessen the impact of tiles that are 'improperly' labeled?
One thought is to cluster the tiles with something like a t-SNE plot and compare the ratio of the binary annotations across the different regions (or 'classes'). I could then assign each tile a weight based on where it falls and use that as an extra input to my training. I'm very new to all of this, so I wouldn't be surprised if that's an awful idea! Just thought I'd take a stab.
For background, I'm using transfer learning on Inception v3.
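The clustering-and-weighting idea is reasonable; here is one way it could be made concrete, sketched with scikit-learn. Note the assumptions: KMeans stands in for t-SNE (t-SNE is mainly a visualization tool, not a clustering method), `features` is assumed to be per-tile embeddings (e.g. from Inception v3's pooling layer), and all names here are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def tile_weights(features, labels, n_clusters=20):
    """Down-weight tiles whose binary label disagrees with their cluster.

    features: (n_tiles, d) array of per-tile embeddings (hypothetical)
    labels:   (n_tiles,) binary annotations inherited from the parent image
    """
    clusters = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
    weights = np.empty(len(labels), dtype=float)
    for c in range(n_clusters):
        mask = clusters == c
        pos_ratio = labels[mask].mean()  # fraction of positive labels in cluster
        # A tile's weight is the fraction of its cluster sharing its label:
        # tiles that look like mostly-opposite-labeled tiles get low weight.
        weights[mask] = np.where(labels[mask] == 1, pos_ratio, 1.0 - pos_ratio)
    return weights
```

The returned weights could then be passed as per-sample weights during training (e.g. `sample_weight` in Keras's `fit`), so that likely-mislabeled tiles contribute less to the loss.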

3ds Max BlendedBoxMap support for the Forge Viewer

We are really interested in adding BlendedBoxMaps to certain objects in our model (such as terrain and larger geometry, to avoid obvious repetition in the texture).
However, all our tests have failed: objects containing a BlendedBoxMap (see image below) turn black after being translated to SVF. Any guidance would be highly appreciated.
Update:
If the above doesn't work: is there any alternative to BlendedBoxMapping for achieving good-looking textures on larger terrain? We are aware that baking the texture onto a large mesh gives very blurry results, as the SVF translation reduces all larger texture resolutions to 1024x1024 (which seems to be impossible to avoid) and stretches that 1024x1024 texture as much as needed to fit the large object.
If materials using a BlendedBoxMap fail to appear correctly in the viewer, I would suggest, as a workaround, baking your material into a single bitmap.
Here is an example of how to do so using bake to texture:
https://knowledge.autodesk.com/support/3ds-max/learn-explore/caas/CloudHelp/cloudhelp/2016/ENU/3DSMax/files/GUID-37414F9F-5E33-4B1C-A77F-547D0B6F511A-htm.html

Map application image size issue?

I have created a map application that contains only one image. The image is 13 MB, which leads to long loading times when I deploy to the production server.
If I resize the image, all the mapped coordinates in the application change, and the image quality also gets worse.
Is there a lazy-loading or partial-rendering technique I can use to load the image in the application?
When you resize the image, define only ONE dimension: either the height or the width. The aspect ratio should then take care of itself automatically. From the details you have given, that is all I can help you with.
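As an illustration, here is how that looks with Pillow in Python (the file names and target width are placeholders):

```python
from PIL import Image

img = Image.open("map.png")                  # the original 13 MB image
target_w = 2048                              # choose ONE dimension only
target_h = round(img.height * target_w / img.width)  # derive the other

img.resize((target_w, target_h), Image.LANCZOS).save(
    "map_resized.png", optimize=True)
```

Since the whole image scales by a single factor (`target_w / img.width` here), your mapped coordinates don't have to be re-digitized; they can be multiplied by that same factor.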

What sort of approach should I take for scaling sprites?

What sort of approach should I take when I'm writing a game that uses sprites?
Say, for example, my phone runs at 1080p. If I wanted to run my game on my phone without any weird stretching going on, would I have to use a large sprite sheet with huge sprites, or could I just write the game with a small sprite sheet, using the original size of each sprite (without upscaling), and let LibGDX scale everything automatically?
Thank you!
I would recommend storing the images at a larger size. You can then enable mipmapping and tweak the texture filters (see the LibGDX documentation on texture filters and mipmaps).
This way, the image is automatically scaled down into a variety of sizes at load time, and the appropriate version is selected at runtime depending on the size at which the image is drawn.
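This isn't LibGDX code, but the idea behind a mipmap chain can be sketched in a few lines of Python with Pillow (the file name is a placeholder): each level is half the size of the previous one, and at draw time the level closest to the on-screen size is used.

```python
from PIL import Image

def build_mipmaps(path):
    """Return the mipmap chain: full size, half, quarter, ... down to 1x1."""
    img = Image.open(path)
    levels = [img]
    while min(img.size) > 1:
        img = img.resize((max(1, img.width // 2), max(1, img.height // 2)),
                         Image.LANCZOS)
        levels.append(img)
    return levels

def pick_level(levels, on_screen_width):
    """Pick the smallest level that is still at least as wide as needed."""
    for level in reversed(levels):       # smallest level first
        if level.width >= on_screen_width:
            return level
    return levels[0]                     # asked for more than we have
```

In LibGDX itself you don't write any of this yourself; as far as I know, passing the mipmap flag when creating a Texture and choosing one of the MipMap minification filters has the GPU do the equivalent work.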

Image compression reducer

I am trying to have high-quality images on my site, like the ones in the slider at http://www.viewbug.com/, but when I use the full-size picture it won't load fast enough due to its large size. I tried resizing it with Photoshop, but the quality of my photo decreased a lot. For example, the following picture on that site, http://www.viewbug.com/media/featured/2892642_large.jpg, is high quality but small in size (377 KB); they then resize it with HTML (height=900 and width=640) without ruining the aspect ratio, and it looks just fine inside the slider. I googled and didn't find any JavaScript or HTML that does this. How can I compress my images without losing quality?
I'm a photographer, so I do this a lot. I export my images with compression settings that are invisible to the eye but make the files far smaller than the originals. Unfortunately, Photoshop uses a different compression scale than most JPEG programs (and JPEG is the only efficient, widely compatible photographic format for the web), so for Photoshop you need specific instructions.
Try this tutorial:
http://inobscuro.com/tutorials/optimizing-images-for-web-35/
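If you'd rather script the export than do it in Photoshop, the same kind of save can be done with Pillow (a sketch; the file names are placeholders, and a quality around 75-85 is usually indistinguishable by eye for photos):

```python
from PIL import Image

img = Image.open("photo.png").convert("RGB")   # JPEG has no alpha channel
img.save("photo.jpg",
         quality=82,        # 75-85 usually looks identical to the original
         optimize=True,     # extra pass that shrinks the Huffman tables
         progressive=True)  # image sharpens gradually while it downloads
```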
Alternatively, you can use PunyPNG. PunyPNG is a free website optimization tool that dramatically reduces the file size of your images without any loss of quality.