What is causing such a delay in loading my website?

I use Divi and I created a landing page. After finishing, I noticed that it takes about 20 seconds to load for visitors! Here is a link to the page: my page. I guess it's some kind of block or plugin, but I can't figure out which. Any help? I've been struggling with this for hours. Thanks.

You are loading a lot of content: images, videos, and more. Try to implement lazy loading for images and videos; a sketch follows below.
There are a lot of errors in the console.
Your hosting is slow.
Try to fix these. Even Lighthouse times out on that page.
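For the images, the browser-native loading attribute is the simplest option. A minimal sketch (the file names are placeholders):

<!-- Native lazy loading: the browser defers the request until the
     image nears the viewport. File names here are placeholders. -->
<img src="hero-photo.jpg" loading="lazy" alt="Hero photo" width="800" height="600">

<!-- For video, preload="none" stops the browser downloading video data up front. -->
<video controls preload="none" poster="video-poster.jpg">
  <source src="intro.mp4" type="video/mp4">
</video>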
Regards Tom

I think it has to do with the hosting, because the TTFB (Time To First Byte) is around 13 seconds.

I ran a website speed check with GTmetrix; you can see the actual test here:
• https://gtmetrix.com/reports/universeoflight.xyz/xwhl9MyP
On the "PageSpeed" and "YSlow" tabs I can see a few server-related optimization recommendations:
Leverage browser caching
Add Expires headers
These recommendations can easily be addressed with a good WordPress cache plugin. Recently I started using WPCacheOn on all of the websites I manage, and I am very happy with it. Simply install and activate WPCacheOn; no further configuration is needed for a faster website.
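If you would rather set those headers by hand instead of through a plugin, here is a minimal .htaccess sketch, assuming an Apache host with mod_expires enabled (the expiry times are only illustrative):

# Requires Apache's mod_expires; the lifetimes below are examples only.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>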
Another thing you should look into is lazy loading. You can also consider using a CDN service such as Cloudflare. Ensure that you are using the latest PHP version for optimal performance and security. Another good practice is to reduce image sizes. In the "Waterfall" tab in GTmetrix you can see the following image:
https://universeoflight.xyz/wp-content/uploads/2019/10/19742.jpg
which is 2 MB. Reducing that image's size will also improve the overall loading of your landing page.


Permanently available file to download for internet speed testing

I am implementing a very simple download speed tester.
To do that, I simply download a large image file from the web and see if it's received within a reasonable time.
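Roughly like this (a minimal sketch; the URL, expected file size, and threshold are placeholders):

// Minimal sketch: time the download of a file of known size and derive Mbit/s.
// The URL, expected size, and "fast enough" threshold are placeholders.
async function measureDownloadSpeed(url, fileSizeBytes) {
  const start = performance.now();
  const response = await fetch(url, { cache: 'no-store' }); // bypass the cache
  await response.blob(); // wait until the whole body has arrived
  const seconds = (performance.now() - start) / 1000;
  return (fileSizeBytes * 8) / (seconds * 1e6); // Mbit/s
}

measureDownloadSpeed('https://example.com/large-image.jpg', 500000)
  .then(mbps => console.log(mbps >= 5 ? 'fast enough' : 'too slow'));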
I'm using the following file because I saw it in some source code somewhere:
https://cdn.pixabay.com/photo/2017/08/30/01/05/milky-way-2695569_1280.jpg
However, I am afraid that the image might go away at some point.
Which image could I use to make sure it will always be there?
Is there perhaps an image that a foundation or similar organization has created especially for this purpose, with a promise that it will be there for a long time?
I was even thinking about downloading a really popular JS file, such as https://code.jquery.com/jquery-3.6.0.min.js, on the assumption that it will be around for a long time, but I am not sure about this either.
How could I handle this task in the most reliable way?
Thank you!
I would recommend Wikimedia Commons.
Reasons:
Wikimedia has very strict guidelines that allow only high-quality uploads, which are of course larger in size.
It's free and persistent (files persist for years).
You can go with videos for even bigger sizes.
For images, it's https://commons.m.wikimedia.org/wiki/Category:Images

Prevent loading of entire CSS stylesheet - only load needed content?

I'm trying to improve my site speed as I integrate certain features of a theme I purchased into my site and came across this interesting feature on the website of said theme.
The problem I have now is that my CSS stylesheet is absolutely massive (220 KB) and takes ages (3-4 seconds) to load. I'm working to reduce that, but that's a different story...
When I load that exact same page on the theme dev's website, it loads much faster. Upon inspection, I find that on my website the "Size" and "Content" amounts for the stylesheet in the dev tools are about identical.
On the developer's page, the size is much smaller than the content. Does this mean that ONLY the elements called upon in the HTML are downloaded from the stylesheet? If yes, how does one accomplish this, because it sounds tremendously efficient.
I also thought it might be because the dev's page was cached in my browser, but I've disabled the cache and it still only loads a portion of the content.
My website's cache works, by the way; I'm just interested in speeding up that initial load time.
Any thoughts?
Relevant imagery:
Cache Disabled:
Dev's website with small size/content ratio - http://i.imgur.com/B3uhOWF.png
My website with ~1:1 size/content ratio - http://i.imgur.com/Ymu2YRz.png
Off the top of my head, you could minify your CSS. There are a few free tools around, including PageSpeed Insights for Chrome.
See https://developers.google.com/speed/docs/insights/MinifyResources
It looks like your CSS is already in a minified file, but perhaps the content isn't!
Just so people aren't confused: the solution was to enable gzip on the server, which you can read more about at https://developers.google.com/speed/articles/gzip (but still minify your CSS as well).
Thanks to user worldofjr, I found the answer to be ENABLE GZIP on the server.
Unbelievable increase in speed.
That should be required reading whenever someone signs up for web hosting. I can't believe my site has gone a year without this when it took two seconds to enable!
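For anyone on an Apache host who wants to do the same, a minimal .htaccess sketch, assuming mod_deflate is available (the exact setup varies by host):

# Requires Apache's mod_deflate; compresses text-based responses on the fly.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>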
"On the developer's page, the size is much smaller than the content. Does this mean that ONLY the elements called upon in the HTML are downloaded from the stylesheet? If yes, how does one accomplish this, because it sounds tremendously efficient."
That's not generally how CSS works, and what you describe might not be very efficient even if you did somehow implement it. The server would have to know which styles to send, which would require the browser to load the page first and send in its list of used selectors, and then the server would have to generate the custom CSS and send it back...
You certainly can limit which CSS loads where, and you should. But that has to be done manually, at least in my experience; a sketch follows below.
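As a rough sketch of that manual approach (the file names are made up): keep a small shared stylesheet on every page, and load page-specific rules only where they are needed.

<!-- On every page: shared base styles only (hypothetical file names). -->
<link rel="stylesheet" href="/css/base.css">

<!-- Only on the contact page: its page-specific rules. -->
<link rel="stylesheet" href="/css/contact.css">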
Anyway, a few things to consider before you proceed:
I think 200 KB is really, really big for a stylesheet. On most sites I work on, the total CSS weight might be 20 KB or less. I'd investigate why yours is so huge; maybe something has gone wrong. Minifying might help, but it isn't going to knock you down a whole order of magnitude.
However, even if it is the world's largest CSS file, 200 KB isn't THAT big. Not 4-5 seconds big. I'm working on a site right now where an entire page weighs ~350 KB, and I have it loading in under a second. Again, I suspect something else has gone wrong; the size of that CSS is probably indicative of something, but I suspect it's not purely the time it takes the file to download.

If multiple pages use the same CSS file, does it load again and again?

This is a general question, but I didn't find any satisfactory answer to it. I am creating a website with 7-8 pages. I have a common CSS file which is used on all the pages. When I go to my homepage, this CSS gets loaded. Now when I go to some other page, does this CSS get loaded again from the server, or from the browser's local cache?
I read somewhere that you have to make some changes in the server's .htaccess file to enable browser caching. Doesn't the browser use its cached files by itself? I will be hosting a website for the first time, so I have no idea about this stuff. Please guide me so that I can build a site with better performance.
It depends on the caching policy enforced by the web server. Most good ones will only deliver it once, and that may be across multiple visits to your site. Look at the HTTP headers to find out; Firebug can show you these.
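For example, on an Apache server, a minimal .htaccess sketch of such a policy, assuming mod_headers is enabled (the one-week lifetime is just an example):

# Requires Apache's mod_headers; tells browsers to reuse static files
# from their local cache for a week instead of re-downloading them.
<IfModule mod_headers.c>
  <FilesMatch "\.(css|js|jpe?g|png|gif)$">
    Header set Cache-Control "public, max-age=604800"
  </FilesMatch>
</IfModule>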
I think this also depends on the browser settings. I notice that in Chrome I sometimes need to empty the cache. You can also tell it to cache explicitly; I am not exactly sure how, but you can look up application caching.

Make website load faster

I'm working on my first web app (www.shopperspoll.com). It's a Django-based Facebook app, and I was wondering how I can go about making the pages load faster. Sorry that the main page is a mess right now; I started making some major changes yesterday. But you'll notice that it takes a really long time for the page to load. I was wondering if that has something to do with the way I've set up Django on my host, or something else. If you could give me some pointers on how to make it faster, I'd really appreciate it!
Thanks!
A couple of points need to be considered for better page load times:
1) Put all JavaScript in external files, and always load scripts at the end of the page, in the footer (see the sketch after this list).
2) Always minify JavaScript and CSS.
3) Merge all your CSS into one file, and all your JavaScript into another.
4) Make use of browser caching (for CSS, JavaScript and images).
5) Host static files like CSS, JavaScript and images on a different domain (e.g. static.shopperspoll.com).
6) Use tools like Firebug & YSlow to check the load time.
7) Use a CDN.
Also refer to this URL (better web performance, from Yahoo): http://developer.yahoo.com/performance/rules.html
I hope this helps.
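A minimal sketch of points 1, 3 and 5 together (the file names are hypothetical; static.shopperspoll.com is just the example subdomain from point 5):

<!DOCTYPE html>
<html>
<head>
  <!-- One merged, minified stylesheet, served from a separate static domain. -->
  <link rel="stylesheet" href="http://static.shopperspoll.com/css/all.min.css">
</head>
<body>
  <!-- ...page content... -->

  <!-- One merged, minified script, loaded at the end of the page. -->
  <script src="http://static.shopperspoll.com/js/all.min.js"></script>
</body>
</html>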
Further to what Aby said, I'd also suggest using a Content Delivery Network (CDN): CloudFlare offers a free tier which could be useful for you while you're still early in growth. The majority of the speed-ups are already covered in Aby's answer, but once you've done those, a CDN can help by putting files geographically closer to your users.

What's the most efficient way to add social media "like" and "+1" buttons to your site?

The task sounds trivial but bear with me.
These are the buttons I'm working with:
Google (+1)
Facebook (Like)
Twitter (Tweet)
LinkedIn (Share)
With a little testing on webpagetest.org I found that it's incredibly inefficient if you grab the snippet from each of these services to place these buttons on your page. In addition to the images themselves you're also effectively downloading several JavaScript files (in some cases multiple JavaScript files for just one button). The total load time for the Facebook Like button and its associated resources can be as long as 2.5 seconds on a DSL connection.
Now it's somewhat better to use a service like ShareThis, as you can get multiple buttons from one source. However, they don't have proper support for Google +1: if you get the code from them for the Google +1 button, it still pulls all those resources from Google.
I have one idea which involves loading all the buttons only when a generic-looking "Share" button is clicked. That way they don't add to the page load time. I think this can be accomplished using the code described here as a starting point. This would probably be a good solution, but I figured I'd ask here before going down that road.
I found one possible solution if you don't care about the dynamic aspect of these buttons. In other words, if you don't care to show how many people have +1'd or liked your page, you can just use these links...
https://plusone.google.com/_/+1/confirm?hl=en&url={URL}
http://www.facebook.com/share.php?u={URL}
http://twitter.com/home/?status={STATUS}
http://www.linkedin.com/shareArticle?mini=true&url={URL}&title={TITLE}&summary={SUMMARY}&source={SOURCE}
You'd just have to insert the appropriate parameters. It doesn't get much simpler or more lightweight than that. I'd still use icons for each button, of course, but I could actually use CSS sprites in this case for even more savings. I may actually go this route.
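For example, with the placeholders filled in for a hypothetical page (note the parameter values are URL-encoded):

<!-- Plain share links for a hypothetical page; no third-party JavaScript. -->
<a href="https://plusone.google.com/_/+1/confirm?hl=en&amp;url=http%3A%2F%2Fexample.com%2Fpost">+1</a>
<a href="http://www.facebook.com/share.php?u=http%3A%2F%2Fexample.com%2Fpost">Like</a>
<a href="http://twitter.com/home/?status=Check%20this%20out%3A%20http%3A%2F%2Fexample.com%2Fpost">Tweet</a>
<a href="http://www.linkedin.com/shareArticle?mini=true&amp;url=http%3A%2F%2Fexample.com%2Fpost&amp;title=My%20Post">Share</a>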
UPDATE
I implemented this change and the page load time went from 4.9 seconds to 3.9 seconds on 1.5 Mbps DSL. And the number of requests went from 82 to 63.
I've got a few more front-end optimizations to do but this was a big step in the right direction.
I wouldn't worry about it, and here's why: if the websites in question have managed their resources properly (and, come on, it's Google and Facebook, etc.), the browser should cache them after the first request. You may see the effect in a test service where the cache is small or disabled, but, in all likelihood, all of your clients will already have those resources in their cache before they ever reach your page.
And, just because I was curious, here's another way:
Here's the relevant snippet from Stack Overflow's Facebook share JavaScript:
facebook:function(c,k,j){k=a(k,"sfb=1");c.click(function(){e("http://www.facebook.com/sharer.php?u="+k+"&ref=fbshare&t="+j,"sharefacebook","toolbar=1,status=1,resizable=1,scrollbars=1,width=626,height=436")})}}}();
Minified, because, hey, I didn't bother to rework the code.
It looks like the Stack Overflow engineers are simply opening the share page on click. That means it's just text until you click it; everything is pulled in lazily on demand.
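De-minified, the same idea looks roughly like this (the function and variable names are my own; the share URL and popup options come from the snippet above):

// A readable sketch of the same approach: nothing extra is loaded until
// the user clicks, then the share page opens in a popup window.
function addFacebookShare(button, url, title) {
  button.addEventListener('click', function () {
    window.open(
      'http://www.facebook.com/sharer.php?u=' + encodeURIComponent(url) +
        '&ref=fbshare&t=' + encodeURIComponent(title),
      'sharefacebook',
      'toolbar=1,status=1,resizable=1,scrollbars=1,width=626,height=436'
    );
  });
}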