Reducing the loading time of an HTML document filled with thousands of PNGs by generating thumbnail JPGs

I'd like to make an HTML document containing all of my own photos saved on my computer and use it as an offline catalog, since most commercially available image-viewing software doesn't satisfy my needs.
There are about 1,000 photos, and the vast majority of them are PNGs of about 10-15 MB each.
I suspect that every time I opened this HTML document, it would take an enormous amount of time to load.
To prevent that, I was told I should create thumbnail versions of those photos (e.g., 200x200px JPGs) and use them instead, wrapping each one in a link that points to the original locally saved PNG file.
I'm just asking to verify whether there's a better alternative, or should I really go with the thumbnail solution? If there isn't one, how would I mass-generate those thumbnails so I don't have to do them one by one?
I'm on a High Sierra Mac and a Windows 7 PC.
Extra question: should I perhaps use a document written in something other than HTML for this purpose? For example, something faster, more efficient, or more cross-platform that I haven't thought of myself? I'm open to suggestions.
Thank you.
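For the mass-generation part, here is a minimal sketch of what the thumbnail approach could look like, assuming Python with the Pillow library is available; the folder names, thumbnail size, and output file name are placeholders, not anything from the original setup:

```python
from pathlib import Path
from PIL import Image  # Pillow: pip install Pillow

PHOTOS = Path("photos")   # placeholder: folder with the original PNGs
THUMBS = Path("thumbs")   # placeholder: folder for the generated JPG thumbnails
THUMBS.mkdir(exist_ok=True)

entries = []
for png in sorted(PHOTOS.glob("*.png")):
    thumb = THUMBS / (png.stem + ".jpg")
    img = Image.open(png)
    img.thumbnail((200, 200))                            # resize in place, keeping aspect ratio
    img.convert("RGB").save(thumb, "JPEG", quality=85)   # JPEG has no alpha channel
    # each thumbnail links back to the full-size local PNG
    entries.append(f'<a href="{png}"><img src="{thumb}" alt="{png.stem}"></a>')

Path("catalog.html").write_text(
    "<!DOCTYPE html><html><body>\n" + "\n".join(entries) + "\n</body></html>",
    encoding="utf-8",
)
```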

Related

Permanently available file to download for internet speed testing

I am implementing a very simple download speed tester.
To do that, I simply download a large image file from the web and see if it's received within a reasonable time.
I'm using the following file because I saw it used in some source code somewhere:
https://cdn.pixabay.com/photo/2017/08/30/01/05/milky-way-2695569_1280.jpg
However, I am afraid that the image might go away at some point.
Which image could I use to make sure it will always be there?
Is there perhaps an image that some foundation has created especially for this purpose, with a promise that it will remain available for a long time?
I was even thinking about downloading a really popular JS file, such as https://code.jquery.com/jquery-3.6.0.min.js, on the assumption that it will be around for a long time, but I am not sure about this either.
How could I handle this task in the most reliable way?
Thank you!
I would recommend Wikimedia Commons.
Reasons:
Wikimedia has very strict guidelines that allow only high-quality uploads, which of course are larger in size.
It's free and persistent (files stay up for years).
You can go with videos if you need even bigger sizes.
For images, see https://commons.m.wikimedia.org/wiki/Category:Images
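For reference, a minimal sketch of the timing approach described in the question, using only Python's standard library; the URL below is a placeholder and should be replaced with whichever stable, large file (e.g. one hosted on Wikimedia Commons) you settle on:

```python
import time
from urllib.request import urlopen

# Placeholder URL: substitute the stable, large file you decide to use.
TEST_URL = "https://example.com/large-test-file.jpg"

def measure_download_speed(url, chunk_size=64 * 1024):
    """Download the file and return (bytes received, seconds elapsed, MB/s)."""
    start = time.monotonic()
    received = 0
    with urlopen(url) as response:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            received += len(chunk)
    elapsed = time.monotonic() - start
    return received, elapsed, received / elapsed / 1_000_000

if __name__ == "__main__":
    size, seconds, mbps = measure_download_speed(TEST_URL)
    print(f"{size} bytes in {seconds:.2f} s (~{mbps:.2f} MB/s)")
```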

Website load/response time (Images)

So, I've made my first website, but I've run into a bit of a problem. On the website, here, I have photos. The image dimensions are fairly large, and I would like to keep them that way so they can be resized to fit as many screen sizes as possible. As you can see on the site, I have uploaded my pictures to imgur. That was before I had a server to work with, for testing purposes. Will uploading the pictures to my server improve the load time of the photos?
Also, on a side note, what are the advantages of uploading them to a server? (One with limited space, mind you; this is mainly why I am asking.)
Would hosting plugins such as jQuery on the server, instead of using, say, <script src="CDN of choice"></script> and <link> for CSS, improve the initial load time?
My apologies if this is a stupid question, but bear in mind that I am new to this. If you'd like to, any other critique of the site would be appreciated.
The biggest improvement you can get here is by compressing your images. When saving in Photoshop, select Save for Web; to decrease the file size further, use something like https://tinyjpg.com/. I was able to compress one of your images from 2.5 MB down to 947 KB. Do this for all your images and you won't have to worry about a CDN and load time. Your JavaScript and CSS files are much, much smaller than your images, and I doubt you will get any real load-time improvement by optimizing them.
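If you would rather do that compression locally and in bulk instead of uploading each file to TinyJPG, here is a rough sketch with Python and Pillow; the folder names and quality setting are assumptions, and the results will not match TinyJPG's exactly:

```python
from pathlib import Path
from PIL import Image  # Pillow: pip install Pillow

SRC = Path("images")              # placeholder: folder with the original JPGs
DST = Path("images_compressed")   # placeholder: output folder
DST.mkdir(exist_ok=True)

for path in sorted(SRC.glob("*.jpg")):
    img = Image.open(path)
    # quality=75 with optimize=True is a common starting point, not TinyJPG's algorithm
    img.save(DST / path.name, "JPEG", quality=75, optimize=True)
    before = path.stat().st_size
    after = (DST / path.name).stat().st_size
    print(f"{path.name}: {before} -> {after} bytes")
```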
Uploading your images (and/or other assets) to your own server makes sense at first, but it will give you the following problems:
The more people visit your website, the more of your allocated bandwidth you lose.
The speed of your website for your visitors will depend on your server and how far they are from it.
For the second issue, I would always suggest using a CDN relay like Cloudflare or CloudFront; it makes load times much faster and also saves bandwidth.
Also, if bandwidth and space are a real problem, you should either consider using a better host, or simply use something like Amazon S3 to host your static files.

HTML5 Banner File Size

Next month Google will automatically stop Flash ads, so we have received a lot of requests to remake our Flash banners in HTML.
In 2013 the IAB published guidelines for HTML5 banners; the maximum file size should be 100 KB.
We publish our banners in two ways: one is through Google DoubleClick, and the other is direct publishing on websites.
We have no problem with DoubleClick, because they say the initial load should be under 100 KB and the rest can be loaded after that.
But the publishers for direct publishing say that the whole banner (all files used, compressed as a ZIP) should be at most 100 KB.
Our client has design guidelines, so we have to include the font (which comes in 4 formats and 2 typefaces). But the font alone is 400 KB.
So the only option we have at the moment is to convert the text into images to keep the file size down. But that isn't really the point of HTML5...
Does any of you have a solution for this problem?
Thanks
Use DoubleClick, because dealing with publishers directly will drive you crazy. They don't know about polite loading and are very strict on file sizes, but when you use DoubleClick you can use the polite-load options.
As for the typefaces, if you only want to put a few pieces of copy in the banner, it is better to generate PNGs for them and optimize those PNGs with software like ImageOptim.
This way each piece of copy will be around 1 KB.
But a whole webfont is around 100 KB, which does not make sense to add to a single banner.
Besides, first check the freely available webfonts that you can use without adding any extra weight to the banner.
Regarding the webfonts, you could use Google Fonts and limit the font to the characters actually used (via the text parameter on the font request).
For instance, if the text is "Hello", only 4 characters are included (H, e, l, o).
And I agree with Majid Kalkatechi, use DFP and Polite loading.
With DoubleClick you will need to host your fonts over HTTPS; they won't be counted toward the file-size limit. They might affect the performance of the ad unit by loading in later, which means the webfonts will only show up after a moment.
Why are you using fonts and not images? Are the ads using dynamic text? If not, use images. Normally, keeping an ad under 100 KB shouldn't be a problem.

Thumbnails from HTML pages created and used automatically in web application

I am working on a Ruby on Rails app that visualizes product trees. The tree is built of nodes, and everything is rendered in HTML/CSS3. Some of the products trigger several hundred SQL queries as the tree builds up (up to 800 queries on the biggest tree).
I'd like to have small thumbnails of each tree to present on an index page. Rendering each tree once again and modifying the CSS to make a tiny representation is one option.
But I think it's probably easier to generate thumbnails, crop them, cache them, and show these on the index page.
Any ideas on how to do this? Any links/articles/blog posts that could help me?
Check out websnapr; it looks like they provide 100,000 free snaps a month.
I should check this site more often. :D Anyway, I've done some more research, and it looks like you'll need to set up some server-side scripts that will open a browser to the page, take a screenshot, and dump the file / store it in a database / etc.
This question has been open for quite a while. I have a proposal which actually fulfills most of the requirements.
Webkit2png can create screenshots and crop parts of the image. You can specify dimensions and crop areas, and it also provides a thumbnail of the pages.
However, it will not support logging in to your application out of the box.
Webkit2png is really easy to use in a shell script, so you can just feed it a number of URLs and it will return all the image files.
More info in this blog post: Batch Screenshots with webkit2png
Webkit2png has an open request to add authentication (so you can use it on logged in pages).
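As an illustration of the "feed it a number of URLs" idea, here is a small sketch that drives webkit2png from Python rather than a shell loop; it assumes the webkit2png command is installed and on the PATH, and the URLs are placeholders for your tree pages:

```python
import subprocess

# Placeholder URLs: replace with the pages of the trees you want thumbnails of.
urls = [
    "http://localhost:3000/trees/1",
    "http://localhost:3000/trees/2",
]

for url in urls:
    # With no extra options, webkit2png writes its PNG output
    # (including a thumbnail variant) into the current directory.
    subprocess.run(["webkit2png", url], check=True)
```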

Best solution for turning a website into a PDF

The company I work for has a CBT system that we developed. We have to create books out of the content in our system, so I developed a program that downloads all of the content from our system and creates an offline version of the different training modules.
I created a program that creates PDF documents from the offline version of the CBT. It works by using Websites Screenshot to take a screenshot of the different pages and then using iTextSharp to create a PDF document from those images.
It seems to be a memory hog and is painfully slow. There are 40 CBT modules that it needs to turn into books. Even though I take every step to clear the memory each time it creates a book, after about 2 books it crashes because there is no memory left.
Is there a better way to do this, instead of taking screenshots of the pages, that will still reproduce the look of the web page inside the PDF document?
I have searched and demoed and found that ABCPdf from WebSuperGoo is the best product for .NET. It is the most accurate and doesn't require a printer driver. It uses IE as the rendering engine, so it looks almost exactly like what you get in IE.
PrinceXML is commercial software that generates PDFs from websites.
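To give an idea of how a tool like that fits into a batch workflow, here is a rough sketch that shells out to Prince's command-line binary for each offline module page; the file and folder names are placeholders, and it assumes the prince executable is installed:

```python
import subprocess
from pathlib import Path

# Placeholder layout: one offline HTML file per CBT module.
modules = sorted(Path("offline_cbt").glob("*.html"))

for page in modules:
    out = page.with_suffix(".pdf")
    # Prince renders the HTML/CSS itself, so no screenshots or image stitching are needed.
    subprocess.run(["prince", str(page), "-o", str(out)], check=True)
```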
I've used PDFSharp in the past and have had good success generating PDFs.
It's open source as well, so in the event of troubles like you've mentioned, you're able to dig into it to improve performance.
If you control the source, it is probably not too difficult to generate the PDF directly instead of going through a screenshot.
Did you try unloading the DLL?
There are also different ways of getting screenshots:
http://mashable.com/2007/08/24/web-screenshots/