Container image patches

I have the following use case: building and delivering a product composed of multiple container images. The product is delivered and deployed to air-gapped environments that have no access to external systems such as image registries.
Delivering a patch for the product currently means rebuilding the container images and shipping them in full. Although the changes themselves are small, a patch comes to more than 10 GB in total, because most of the images are rebuilt (e.g. to address new vulnerabilities).
Are there standard ways of delivering only image patches (i.e. only the binary content that changed) instead of the full images?
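For reference, my current thinking: since OCI/Docker images are made of content-addressed layers, a patch could in principle contain only the layers whose digests changed between releases. A rough sketch of how I would identify those layers, assuming archives produced by `docker save` (the file names are placeholders), whose top-level manifest.json lists each image's layer files:

```python
import json
import tarfile

def layer_paths(archive):
    """Return the set of layer paths listed in a `docker save` manifest."""
    with tarfile.open(archive) as tar:
        manifest = json.load(tar.extractfile("manifest.json"))
    return {layer for image in manifest for layer in image["Layers"]}

# Layers present in the new archive but not in the old one are the only
# content that would actually need to ship in a patch.
old = layer_paths("product-v1.tar")
new = layer_paths("product-v2.tar")
print("changed layers:", sorted(new - old))
```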

Related

Should I load images from my server or fetch them from a URL?

I am fairly new to web development. I want to create a Vue.js app for Dota 2: a grid of cards, each card showing a hero profile. I have all the hero data locally in a JSON file, with image paths included. The problem is that there are a lot of images: about 100 heroes, each with 4 abilities, so 100 hero avatars plus 400 ability images. Should I deploy these images to my hosting server, or change the image paths to fetch them from dota2.com? Which is better and faster, and what is the effect on the client side? My problem is that I don't know whether the client loads images on the go, or needs to download all 500 images before my website works.
If I fetch from URLs, does that make my website static or dynamic? Can I publish it to GitHub, or do I need Firebase to host it?
Static or dynamic refers to whether the data is constant or changes given a set of conditions. It seems that you have a list of images based on a field of Valve's API result set. I would suggest using those images, for at least the following reasons:
1. A site as popular as Valve's most likely has caching across multiple regions. This means that if you are in Montana and the user is in Thailand, there may be a copy of the image on a server closer to them than your single server, unless you implement something similar yourself, of course.
2. You will not have to pay for the bandwidth, which you are presumably authorized to use given that you are using their API and it gives you a specific URL. You should read their terms and conditions.
3. If the image of a character changes, it will update accordingly.
4. It is possible to serve multiple versions of an image based on the device. For instance, a different image may be served to a mobile device than to a desktop computer.
Note that the 3rd reason could be thought of as being dynamic, but it is not dynamic on your side: it is static in the sense that you will always use the image URL retrieved from the API. Also, consider the downside of your JSON file going stale if Valve changes an image; you should consider recreating the JSON file every so often (a sketch of that follows at the end of this answer).
The client side will be the same either way - a request will be issued for each image to either your server or the Valve server. The speed at which the image is served will be the determining factor.
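Regarding recreating the JSON file: a minimal sketch of what that regeneration could look like. The endpoint and field names below are placeholders, not the real Valve API; substitute whichever endpoint you actually use, and mind their terms:

```python
import json
import urllib.request

# Placeholder endpoint -- substitute the actual Dota 2 API you have access to.
API_URL = "https://api.example.com/dota2/heroes"

with urllib.request.urlopen(API_URL) as resp:
    heroes = json.load(resp)

# Keep only the fields the card grid needs, including the Valve-hosted image
# URLs, so the local file stays small and the images stay up to date.
cards = [
    {"name": h["name"], "img": h["img"], "abilities": h["abilities"]}
    for h in heroes
]

with open("heroes.json", "w") as f:
    json.dump(cards, f, indent=2)
```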

Long load times of webpage assets

I have a website that runs just fine on my local server. It's quick, responsive, and overall runs great.
But when I put it on my domain host's server, it sometimes takes excessively long to load assets. For example, Chrome's Network developer tool shows one 1 MB PNG file taking 2.31 seconds to load.
So is this likely due to poor implementation on my part, or is it possibly a crappy server? (The company subscribes at the lowest tier possible to host their content.) My internet connection is fast, so I doubt it's that.
I think it is probably a problem with your host. An image is an image :) there aren't a hundred ways to implement one!
Oversized images always take longer to load, so you should keep your images as small as possible. To reduce the download time, you can optimize/compress an image without degrading its visual quality (see the sketch at the end of this answer). If you are using graphics software to prepare the images, use the "Save for Web" option; this reduces the file size and hence the image load time.
Furthermore, you can use a CDN to serve the static assets of your website: images, CSS, JS, videos, etc. A CDN replicates your website's files across a geographically distributed network of servers called POPs (points of presence) and serves each resource from the location nearest to the visitor, which means your assets load faster.
Finally, use an SSD-based host. SSDs have far better read/write rates than traditional HDDs, especially for random access, so static files come off disk noticeably faster.
Here's a question for you: is the image you're trying to load a CSS background image, or an image tag?
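To make the optimize/compress suggestion concrete, here's a small sketch using the Pillow library; the quality setting is a typical starting point, not a rule, and the file names are illustrative:

```python
from PIL import Image

img = Image.open("large-photo.png")

# Photographic content usually compresses far better as JPEG than PNG;
# quality=85 with optimize=True is a common starting point.
img.convert("RGB").save("large-photo.jpg", quality=85, optimize=True)

# For images that must stay PNG (transparency, sharp edges),
# optimize=True alone can still shave some bytes off.
img.save("large-photo-opt.png", optimize=True)
```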

Adding new BackgroundTransferRequests once the app is in the background

Adding BackgroundTransferRequests to the BackgroundTransferService once the app is in the background succeeds, but the new requests don't actually run until the app comes back to the foreground. Not so great for my scenario of downloading lots of small files that may take a fair amount of time to complete.
I imagine Microsoft has probably implemented this behavior by design(?), but does anyone know a way around this or an alternative approach?
A bit of background to the problem:
I'm developing a Windows Phone 8 map app that allows sections of maps to be downloaded and cached for offline use. This process can lead to thousands of map tiles needing to be downloaded.
I've created a process that spawns the full limit of 25 BackgroundTransferRequests, then adds more to the BackgroundTransferService as requests complete. This all works fine until the app actually goes into the background.
I have considered doing some processing on the web server to bundle tiles into a zip and download them as a single request, but this adds complexity, and it requires twice the space on the phone: the download has to complete and the files be extracted before the original package can be deleted. Ideally, though, I'd like to find a way to force new BackgroundTransferRequests to start running in the background.
It's not clear what your actual question is, but I'd definitely recommend bundling the tiles into a zip file and downloading that. It's almost always easier to work with a single file than with thousands.
If disk space is really a genuine issue (not just a theoretical one - I've fit thousands of map tiles in under 20 MB before, but it will depend on image complexity and quality), then you could make a few zip files. That way you'd avoid the BackgroundTransferRequest issue and not take up as much disk space, even temporarily.
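If you go the bundling route, the server-side step can be as simple as this sketch using Python's zipfile module; the tile directory and batch size are illustrative:

```python
import zipfile
from pathlib import Path

TILES = sorted(Path("tiles").glob("*.png"))  # illustrative tile directory
BATCH = 500  # tiles per archive; tune against disk-space headroom

# Write tiles in batches so no single download (or extraction) needs
# the full package's worth of temporary space on the device.
for i in range(0, len(TILES), BATCH):
    with zipfile.ZipFile(f"tiles-{i // BATCH:03d}.zip", "w",
                         compression=zipfile.ZIP_DEFLATED) as zf:
        for tile in TILES[i:i + BATCH]:
            zf.write(tile, arcname=tile.name)
```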

Create many images on the server or do resizing in the browser?

At the moment I'm designing a new gallery, and I have quite a lot of different image sizes for different kinds of presentation in the frontend. Now I'm wondering whether I should create every size on the server, or just create a few and then use browser resizing and HTML/CSS "cropping".
So it's many images on the server, a lot of disk space, and a lot of requests from the client, versus fewer requests, less space, and resizing/cropping in the browser.
I tend to think the second solution is more advanced and modern, but perhaps also more complicated?
Thanks for any advice!
It's really a trade-off, as you've said yourself. Performance is increasingly important, though, and even affects page ranking, which is a good reason to keep the various sizes on the server, i.e. only send thumbnails when that's all the user needs to see.
If you wanted to go to the programming effort, you could use a library like ImageMagick to generate smaller images from the large ones. That way you only need to store the large images on the server. The downside is the programming effort and the extra server load; since disk space is cheap, it's probably not worth it unless you need to serve many different variants.
You could create a handler to serve up your images. When a request comes in for an image in a size you don't yet have stored on the server's file system, you do the resize, save the file, and send it to the client.
That way, only the files that are actually needed exist on the server.
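To make that concrete, here's a sketch of such a handler using Flask and Pillow; the route shape and directory names are assumptions, not a prescription:

```python
from pathlib import Path

from flask import Flask, abort, send_file
from PIL import Image

app = Flask(__name__)
ORIGINALS = Path("originals")  # full-size masters
CACHE = Path("cache")          # generated sizes, created on demand

@app.route("/img/<int:width>/<name>")
def resized(width, name):
    # In real code, validate `name` against a whitelist of known images.
    cached = CACHE / f"{width}-{name}"
    if not cached.exists():
        source = ORIGINALS / name
        if not source.exists():
            abort(404)
        img = Image.open(source)
        # thumbnail() preserves aspect ratio and only ever scales down.
        img.thumbnail((width, img.height))
        CACHE.mkdir(exist_ok=True)
        img.save(cached)
    return send_file(cached)
```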

Quick question regarding CSS sprites and memory usage

Well, it's more to do with images and memory in general. If I use the same image multiple times on a page, will the copies be consolidated in memory, or will each one use a separate amount of memory?
I'm concerned about this because I'm building a skinning system for a Windows Desktop Gadget, and I'm looking at spriting the images in the default skin so that I can keep the file system looking clean. At the same time I want to keep the memory footprint to a minimum. If I end up with a single file containing 100 images and re-use that image 100 times across the gadget, I don't want to run into performance issues.
Cheers.
What about testing it? Create a simple application with and without spriting, and monitor the memory usage on Windows to see which approach is better.
I'm telling you to test it because of this interesting post from Vladimir, endorsed even by Mozilla's "use sprites wisely" entry:
(...) where this image is used as a sprite. Note that this is a 1299x15,000 PNG. It compresses quite well — the actual download size is around 26K - but browsers don't render compressed image data. When this image is downloaded and decompressed, it will use almost 75MB in memory (1299 * 15000 * 4).
(At the end of Vladimir's post there are some other great references to check)
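To see where that 75MB figure comes from: a decompressed bitmap costs width * height * 4 bytes (one byte each for RGBA), no matter how small the PNG is on disk. A quick check:

```python
# Decompressed size of a bitmap: width * height * 4 bytes (RGBA),
# independent of the compressed PNG size on disk.
width, height = 1299, 15000
bytes_uncompressed = width * height * 4
print(f"{bytes_uncompressed / 1024 / 1024:.1f} MB")  # ~74.3 MB
```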
Since I don't know how Windows renders its gadgets (and whether it handles compressed image data), it's difficult, IMHO, to say which approach is better without testing.
EDIT: The official Windows Desktop blog (not updated since 2007) says the HTML runtime used for Windows gadgets is MSHTML, so I think a test really is needed to know how your application would handle the CSS sprites.
However, if you read some of the official Windows Desktop Gadgets and Windows Sidebar documentation, there's an interesting point relevant to a decision not to use CSS sprites, in the "The GIMAGE Protocol" section:
This protocol is useful for adding images to the gadget DOM more efficiently than the standard HTML <img> tag. This efficiency results from improved thumbnail handling and image caching (it will attempt to use thumbnails from the Windows cache if the requested size is smaller than 256 pixels by 256 pixels) when compared with requesting an image using the file:// or http:// protocols. An added benefit of the gimage protocol is that any file other than a standard image file can be specified as a source, and the icon associated with that file's type is displayed.
I would try to use this protocol instead of CSS sprites and do some testing too.
If none of this information helps you, I would try asking in the official Windows Desktop Gadgets forums.
Good luck!
The image will show up one time in the cache (as long as the URL is the same and there's no query string appended to the file name). Spriting is the way to go.
Web browsers identify cacheable resources by their ETag response header. If it is absent or differs between requests, then the image may be downloaded and stored in the cache multiple times. If you (actually, the web server) supply one unique, stable ETag header per unique resource, then any decent web browser is smart enough to keep a single copy in cache and reuse it for as long as its Expires header allows.
Any decent web server will supply the ETag header automatically for static resources; it is often autogenerated from a combination of the local file name, the file length, and the last-modified timestamp. But servers often don't add the Expires header, so you need to add it yourself. Judging by your post history here at Stack Overflow, I'll safely assume you're familiar with Apache HTTPD as a web server, so I'd suggest having a look at the mod_expires documentation to learn how to configure it to an optimum.
In a nutshell, serve the sprite image along with an ETag and a far-future Expires header and it'll be okay.
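Translated into code, the ETag/Expires dance looks roughly like this minimal sketch using Python's standard library (Apache's mod_expires does the equivalent declaratively; the file name is illustrative):

```python
import email.utils
import hashlib
import os
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

SPRITE = "sprite.png"  # illustrative file name

class SpriteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        stat = os.stat(SPRITE)
        # ETag derived from file size + mtime, much like Apache's default.
        etag = '"%s"' % hashlib.md5(
            f"{stat.st_size}-{stat.st_mtime}".encode()).hexdigest()

        if self.headers.get("If-None-Match") == etag:
            self.send_response(304)  # client's cached copy is still good
            self.end_headers()
            return

        self.send_response(200)
        self.send_header("ETag", etag)
        # Far-future Expires header: one year ahead.
        self.send_header("Expires",
                         email.utils.formatdate(time.time() + 31536000,
                                                usegmt=True))
        self.send_header("Content-Type", "image/png")
        self.end_headers()
        with open(SPRITE, "rb") as f:
            self.wfile.write(f.read())

HTTPServer(("", 8000), SpriteHandler).serve_forever()
```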