Website load/response time (Images) - html

So, I've made my first website, but I've run into a bit of a problem. On the website I have photos. Now, the image dimensions are fairly large, and I would like to keep it that way, so they can be resized to fit as many screen sizes as possible. As you can see on the site, I have uploaded my pictures to imgur. That was before I had a server to work with, for testing purposes. Will uploading the pictures to my server improve the load time of the photos?
Also, on a side note, what are the advantages of uploading them to a server? (One with limited space, mind you; this is mainly why I am asking.)
Would uploading plugins, like, say, jQuery, to the server, instead of using <script src="CDN of choice"></script> and <link> for CSS, improve the initial load time?
My apologies if this is a stupid question, but bear in mind that I am new to this. If you'd like, any other critique of the site would be appreciated.

The biggest improvement you can get here is by compressing your images. When saving in Photoshop, select Save for Web. To decrease the file size further, use something like https://tinyjpg.com/ . I was able to compress one of your images from 2.5 MB down to 947 KB. Do this for all your images and you won't have to worry about CDNs and load time. Your JavaScript and CSS files are much, much smaller than your images, and I doubt you will get any real load-time improvement by optimizing them.
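Since the question also mentions fitting as many screen sizes as possible, one related technique (not covered in the answer above) is responsive images via the standard srcset attribute, so small screens don't download the full-size file. A minimal sketch; the file names and widths are hypothetical resized copies you would generate yourself:

    <!-- Sketch: the browser picks the smallest file that satisfies the display width -->
    <img src="img/photo-800.jpg"
         srcset="img/photo-800.jpg 800w, img/photo-1600.jpg 1600w"
         sizes="100vw"
         alt="Gallery photo">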

Uploading your images (and/or other assets) to your server makes sense at first, but it will give you the following problems:
The more people visit your website, the more of your allocated bandwidth you lose.
The speed of your website for your visitors will depend on your server and how far away they are from it.
For the second issue, I would always suggest using a CDN relay like Cloudflare or CloudFront. It makes the load time much faster and also saves bandwidth.
Also, if bandwidth and space are a real problem, you should either consider using a better host, or simply use something like Amazon S3 to host your static files.
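On the jQuery part of the question: a common pattern is to load the library from a CDN and fall back to a copy on your own server only if the CDN fails, so you get the CDN's caching without depending on it. A minimal sketch; the local path is hypothetical, and the CDN URL is jQuery's official one:

    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
    <script>
      // If the CDN request failed, window.jQuery is undefined, so load a
      // local copy instead ("/js/jquery-3.6.0.min.js" is a hypothetical path).
      window.jQuery || document.write('<script src="/js/jquery-3.6.0.min.js"><\/script>');
    </script>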

Related

Permanently available file to download for internet speed testing

I am implementing a very simple download speed tester.
To do that, I simply download a large image file from the web and see if it's received within a reasonable time.
I'm using the following file because I saw that in a source code somewhere:
https://cdn.pixabay.com/photo/2017/08/30/01/05/milky-way-2695569_1280.jpg
However, I am afraid that the image might go away some time.
Which image could I use to make sure it will always be there?
Is there perhaps an image that a foundation or similar has created especially for such a purpose, and that promises it will be there for a long time?
I was even thinking about downloading a really popular JS file, such as https://code.jquery.com/jquery-3.6.0.min.js, figuring it will be hosted for a long time, but I am not sure about this either.
How could I handle this task in the most reliable way?
Thank you!
I would recommend Wikimedia Commons.
Reasons:
Wikimedia has very strict guidelines that allow only high-quality uploads, which of course are larger in size.
It's free and persistent (files persist for years).
You can go with videos for even bigger sizes.
For images, it's https://commons.m.wikimedia.org/wiki/Category:Images
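Whichever file you settle on, the measuring side can stay very simple. A minimal browser sketch: time how long a known-size image takes to download. It reuses the Pixabay URL from the question; the byte count is a hypothetical placeholder you must replace with the real size of your test file:

    <script>
      // Time the download of a fixed test image and estimate throughput.
      const url = 'https://cdn.pixabay.com/photo/2017/08/30/01/05/milky-way-2695569_1280.jpg';
      const FILE_SIZE_BYTES = 250000; // hypothetical; use your test file's real size
      const img = new Image();
      const start = performance.now();
      img.onload = () => {
        const seconds = (performance.now() - start) / 1000;
        const megabits = (FILE_SIZE_BYTES * 8) / 1e6;
        console.log('~' + (megabits / seconds).toFixed(1) + ' Mbit/s');
      };
      img.src = url + '?nocache=' + Date.now(); // cache-buster so it really downloads
    </script>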

Reducing the loading time in an HTML document filled with thousands of PNGs by generating thumbnail JPGs

I'd like to make an HTML document with all of my own photos saved on my computer, and use it as an offline catalog, as most commercially available image-viewing software doesn't satisfy my needs.
There's about 1000 photos and the vast majority of them are PNGs, about 10-15MB each.
I'm suspecting that each time I'd open this HTML document, loading would take an enormous amount of time.
To prevent that, I was told I should create thumbnail versions of those photos (e.g., 200x200px JPGs), and use them instead, while enclosing them in links that redirect me to the original locally saved PNG files.
Just asking to verify: is there any better alternative to this, or should I really go with the thumbnails solution? If there isn't, how would I mass-generate those thumbnails so I don't have to do it individually?
I'm on a High Sierra Mac and a Windows 7 PC.
Extra question: Should I, by any chance, use a document coded in a language other than HTML, for this purpose? For example, a language faster or more efficient or more cross-platform, that I didn't think of myself? I'm open to suggestions.
Thank you.
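The thumbnail approach is sound: a 200px JPG is a tiny fraction of a 10-15 MB PNG, so the catalog page would load far faster. For the mass-generation part, here is a minimal sketch using Node.js with the third-party sharp package (one option among many; the folder paths are hypothetical), which runs on both macOS and Windows:

    // Sketch: batch-generate 200x200 JPG thumbnails for every PNG in a folder.
    // Requires Node.js and "npm install sharp". Paths are hypothetical.
    const fs = require('fs');
    const path = require('path');
    const sharp = require('sharp');

    const srcDir = '/path/to/photos'; // where the original PNGs live
    const outDir = '/path/to/thumbs'; // where the JPG thumbnails go
    fs.mkdirSync(outDir, { recursive: true });

    for (const name of fs.readdirSync(srcDir)) {
      if (!name.toLowerCase().endsWith('.png')) continue;
      sharp(path.join(srcDir, name))
        .resize(200, 200, { fit: 'inside' }) // keep aspect ratio within 200x200
        .jpeg({ quality: 80 })
        .toFile(path.join(outDir, name.replace(/\.png$/i, '.jpg')))
        .catch(err => console.error(name, err));
    }

In the catalog HTML, each thumbnail would then be wrapped in a link to the original, e.g. <a href="photos/pic.png"><img src="thumbs/pic.jpg"></a>, exactly as described in the question.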

Website showing wrong images

I have a one-page static website. My website is displaying different images than those referenced in the HTML. For example:
<img src="img/About_Us_Graphic.png" alt="About us photo" id="aboutUsPic" style="margin: auto;">
will sometimes be displayed as the image that's actually
<img src="img/Facebook_icon.png">
This happens pretty much randomly. Sometimes the pictures are correct, sometimes they're totally different pictures. And when it's a wrong picture, it isn't consistently the same wrong picture. What causes this? How can I fix it?
My site uses Foundation 5 (not sure if that's relevant). Thanks!
I've found situations similar to the one you described to be the symptom of one of a few causes:
Someone is tinkering with the content on the site without you being aware. Ask your team members if they know of anyone who might do this.
Your client-side cache is taking over. To remedy this specific problem, go to your browser and clear out the temporary files. Sometimes you have to also clear out cookies and other historical items.
Client-side proxies. Sometimes proxy servers store caches of what they serve to reduce the load of their requests. When they work in a round-robin fashion, different servers within the proxy circle might have mismatched content.
Load-balanced web servers (see https://en.wikipedia.org/wiki/Load_balancing_(computing)). I've seen some situations where servers that are load-balancing content will hold onto data. In my specific scenario, a memcache was used and would seemingly hold onto content until its index was refreshed.
Without more information about your set-up, there's not much anyone can do. As oxguy3 suggested - there could even be something in your code causing this.
Please try typing the URL of the image directly into your browser and see if it consistently comes up the same, then try the same URL with "?someArbitraryText" after it, where "someArbitraryText" is just some random characters.
E.g., instead of "http://my.server.com/img/About_Us_Graphic.png", use "http://my.server.com/img/About_Us_Graphic.png?arbitrary". Most servers that I've encountered will still serve the image, but if a load balancer, proxy, or memcache is involved, it will consider this a different URL and load it from the source rather than from some cached file.
I've seen some cases (such as on salesforce clouds) where doing so will bring up different results.
Let us know what you discover. Any little clue could help someone identify and determine the root cause.
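If the cache-busted URL does come up correctly every time, the same idea can be applied in the markup as a stopgap while the real culprit is tracked down. A hedged sketch, reusing the img tag from the question; the ?v=2 token is arbitrary and would be bumped whenever the image changes:

    <!-- Sketch: a version query makes caches and proxies treat this as a new URL -->
    <img src="img/About_Us_Graphic.png?v=2" alt="About us photo" id="aboutUsPic" style="margin: auto;">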

Prevent loading of entire CSS stylesheet - only load needed content?

I'm trying to improve my site speed as I integrate certain features of a theme I purchased into my site and came across this interesting feature on the website of said theme.
The problem I have now is that my CSS stylesheet is absolutely massive (220 KB) and takes ages (3-4 seconds) to load. I'm working to reduce that, but that's a different story...
When I try to load that exact same page on the theme dev's website, it loads much faster. Upon inspection, I find that on my website the 'Size' and 'Content' amounts for the stylesheet in the dev tools are about identical.
On the developer's page, the size is much smaller than the content. Does this mean that ONLY the elements called upon in the HTML are downloaded from the stylesheet? If yes, how does one accomplish this, because it sounds tremendously efficient.
I also thought it was because the dev's page was caching the content on my browser but I've disabled cache and it still only loads a portion of the content.
My website's cache works by the way, I'm just interested in speeding up that initial load time.
Any thoughts?
Relevant imagery:
Cache Disabled:
Dev's website with small size/content ratio - http://i.imgur.com/B3uhOWF.png
My website with ~1:1 size/content ratio - http://i.imgur.com/Ymu2YRz.png
Off the top of my head, you could minify your CSS. There are a few free tools around, including Page Speed Insights for Chrome.
See https://developers.google.com/speed/docs/insights/MinifyResources
It looks like your CSS is already in a minified file, but perhaps the content isn't!
Just so people aren't confused: the solution was to enable gzip on the server, which you can read more about at https://developers.google.com/speed/articles/gzip (but still minify your CSS as well).
Thanks to user worldofjr, I found the answer to be ENABLE GZIP on the server.
Unbelievable increase in speed.
That should be required reading whenever someone signs up for web hosting. I can't believe my site has gone 1 year without this when it took 2 seconds to enable!!!
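For anyone looking for the concrete step on an Apache host: gzip is typically switched on with mod_deflate. A hedged .htaccess sketch, assuming mod_deflate is available on your host:

    # Sketch: serve text-based responses gzip-compressed (requires mod_deflate)
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>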
"On the developer's page, the size is much smaller than the content. Does this mean that ONLY the elements called upon in the HTML are downloaded from the stylesheet? If yes, how does one accomplish this, because it sounds tremendously efficient."
That's not generally how CSS works. And what you describe might not be very efficient even if you did somehow implement it. The server would have to know which styles to send, which would require the browser to load the page and send in its list, and then the server would have to generate the custom CSS and send it back...
You certainly can limit which CSS loads where--and you should. But that has to be done manually, at least in my experience.
Anyway, a few things to consider before you proceed:
I think 200 KB is really, really big for a stylesheet. On most sites I work on, the total CSS weight might be 20 KB. Or less. I'd investigate why yours is so huge. Maybe something has gone wrong. Minifying might help, but it isn't going to knock you down a whole order of magnitude.
However, even if it is the world's largest CSS file, 200 KB isn't THAT big. Not 3-4 seconds big. I'm working on a site right now where an entire page weighs ~350 KB and I have it loading in under a second. Again, I suspect something else has gone wrong. The size of that CSS is probably indicative of something, but I suspect the bottleneck is not purely the time it takes the file to download.

HTML - reduce byte size

I'm testing a website's speed using the PageSpeed Insights tool.
On the results page, one of the warnings suggested I reduce the byte size of the CSS, HTML and JS files.
At first I tried removing comments, but nothing changed.
How can I do that?
Should I remove spaces and tabs?
It seems like a very tedious operation; is it worth it?
The action of removing spaces, tabs and useless characters is called minification.
You don't need to do it by hand; there are a lot of services that can minify files for you,
for example:
http://www.willpeavy.com/minifier/
Be careful if you have jQuery code: sometimes these tools remove spaces in the wrong place.
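To make minification concrete, here is a hedged before/after sketch (the rule itself is made up; the two forms behave identically):

    /* Before: readable, with whitespace and line breaks */
    body {
      margin: 0;
      font-family: sans-serif;
    }

    /* After minification: the same rule, fewer bytes */
    body{margin:0;font-family:sans-serif}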
You have two things to do to reduce page size:
Minify CSS & JS files
On the server side, if you are running your website via Apache, you can install APC (Alternative PHP Cache) for page caching. You'll get better performance.
APC
In addition to the CSS minifier/prettifier tools above, I recommend using proCSSor for optimizing CSS files. It offers a variety of advanced options.
I've never found those tools to be much use beyond giving some tips about what might be slowing a site down. Minifying is unlikely to achieve much. If you want to speed up your site, save the page and see what the largest files are. Generally they will be the image files rather than the code; see if you can reduce those.
Also, try testing it on two servers: is your host slow?
If your HTML file is massive, that suggests a problem with the site's structure; it is rare that a page needs to be large.
Finally, large JavaScript files are most likely to be things like jQuery. If Google hosts these, then use the hosted version. That way, it will probably already be in a user's cache and not impact your loading time.
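As an example, jQuery is available from Google's Hosted Libraries CDN; a sketch (pick the version your site actually uses):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>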
EDIT, after further testing and incorporating the issues discussed in the comments below:
PageSpeed Insights is an utterly amateurish tool that, as standard advice, tells you to reduce HTML, CSS and JS file sizes whenever they are not minified; there are much more effective ways to speed up rendering time than minifying the code. A much, much better tool is the Pingdom Website Speed Test. It compares rendering speed to the average of the sites it has been asked to test, and gives the download times of the site's components.
Just test www.gezondezorg.org on both and see the enormous difference in the test results, on which the Google tool is dead wrong. It advises reducing the CSS and JS files, while its own figures (click the respective headers) show that doing so would reduce their sizes by 3.8 and 7.9 kB, respectively. That comes down to less than 1 millisecond of download-time difference! (1 millisecond = 1/1000 of a second; broadband internet presumed.)
Also, it says that I did do a good thing: enable caching. That is BS as well, because my .htaccess file tells browsers to check for newly updated files at every visit, and to refresh cached files whenever they have been updated. Tests confirm that all browsers heed that instruction.
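A directive of that kind would look something like this hedged .htaccess sketch (not necessarily the author's actual file; "no-cache" means the browser may store the file but must revalidate it with the server on every visit, and it requires mod_headers):

    # Sketch: cache files, but revalidate them on every visit (needs mod_headers)
    <IfModule mod_headers.c>
      Header set Cache-Control "no-cache"
    </IfModule>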
Furthermore, that site is not intended to be viewed on mobile phones; there is just way too much text on it for that. Nevertheless, PageSpeed Insights defaults to showing the results of testing against mobile-phone criteria.
More effective ways to speed up the rendering
So, minifying hardly does anything to speed up the rendering time. What does help is the following:
Put your CSS code and JavaScripts in as few files as possible, ideally one file each. That saves browser-to-server (BTS) requests. (Do keep in mind that quite a number of JavaScripts need the DOM to be fully loaded first, so in practice it comes down to putting the scripts in at most two files: a pre-body and a post-body file.)
Optimize large images for the web. Photoshop and the like even have a special function for that, reducing the file size while keeping the quality good enough for use on the web.
In the case of images that serve as full-size backgrounds for containers: use image sprites (see the sketch after this list). That saves BTS requests as well.
Code the HTML and JS files so that there is no rendering dependency on files from external domains, such as from Twitter, Facebook, Google Analytics, advertisement agencies, etc.
Make sure to get a web host that responds swiftly, has sufficient processing capacity, and has (almost) 100% up-time.
Use vanilla/native JS as much as possible. Use jQuery or other libraries only for tasks that would otherwise be too difficult or too time-consuming. jQuery is not only an extra file to download, it is also processed more slowly than native JS.
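A hedged sprite sketch (the file name and offsets are hypothetical): several small images travel in one download, and CSS shows the right slice of it:

    /* Sketch: one sprite.png download serves several images (hypothetical file/offsets) */
    .icon      { background-image: url('img/sprite.png'); width: 32px; height: 32px; }
    .icon-home { background-position: 0 0; }      /* markup: class="icon icon-home" */
    .icon-mail { background-position: -32px 0; }  /* markup: class="icon icon-mail" */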
Lastly, you should realize that:
having the server minify the code on the fly generally results in a much slower response from the server;
minifying code makes it unreadable;
de-minifying tools are notorious for their poor performance.
Minifying resources refers to eliminating unnecessary bytes, such as extra spaces, line breaks, and indentation. Compacting HTML, CSS, and JavaScript can speed up downloading, parsing, and execution time. In addition, for CSS and JavaScript, it is possible to further reduce the file size by renaming variable names as long as the HTML is updated appropriately to ensure the selectors continue working.
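As a hedged illustration of the renaming point (the function is made up; the behavior is unchanged, only the names and whitespace shrink):

    // Before: descriptive names and whitespace
    function addNumbers(firstValue, secondValue) {
      return firstValue + secondValue;
    }

    // After minification with name shortening: same behavior, fewer bytes
    function a(b,c){return b+c}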
You can find plenty of online tools for this purpose, a few of them are below.
HTML Minify
CSS Minify
JS Minify
good luck!