Long delay on document load with wordpress - html

I'm working on optimizing a site, and I've been able to get all of the pages to load in around 2 seconds just by optimizing the images with Photoshop. But I have one page that doesn't react like the rest. When I open the developer tools in Chrome and watch the waterfall timeline of the loading elements, it shows a lag of over 3 seconds just to load the document. The fastest I've ever gotten the page to load is around 7 seconds, which is far longer than it should be, since the page is only 895 kB.
Is this a server issue or is there some faulty code in the specific page? It makes me wonder because other pages on the same website load much faster even though their size is greater. Any help from you would be awesome!
Link to webpage: http://egi.utah.edu/corporate-associate-program/corporate-associate-list/

A cache plugin will probably solve your problem, since no query will hit the database unless something on the page has changed.
I've used two of them and both sped up sites (but only install one; you can't use both at the same time).
WP Rocket: http://wp-rocket.me/
WP Super Cache: https://wordpress.org/plugins/wp-super-cache/

I was able to solve the problem with your input. It turns out it was an image slider plugin (FlexSlider) that was regenerating the images to fit the size of the element, which was slowing things down so much. There were 7 large images that it regenerated on the server side every time the page was requested. All I had to do was stop the plugin from resizing the images by leaving it at the default setting, and I wrote some JavaScript to do the same thing locally instead, which sped up the loading time drastically. It's now loading in under 3 seconds when it was loading in around 10.
Thanks #drew010
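For anyone hitting the same problem, the client-side approach described above can be sketched roughly as follows. This is only a sketch of the idea, not the asker's actual script: `computeFitSize` is a hypothetical helper name, and the commented usage assumes typical slider markup.

```javascript
// Let the browser scale images down to fit their container instead of
// having the server regenerate resized copies on every request.
function computeFitSize(imgWidth, imgHeight, maxWidth, maxHeight) {
  // Never upscale: cap the scale factor at 1.
  var scale = Math.min(maxWidth / imgWidth, maxHeight / imgHeight, 1);
  return {
    width: Math.round(imgWidth * scale),
    height: Math.round(imgHeight * scale)
  };
}

// In the browser, something like this would apply it to each slide:
// document.querySelectorAll('.flexslider img').forEach(function (img) {
//   var size = computeFitSize(img.naturalWidth, img.naturalHeight, 600, 400);
//   img.style.width = size.width + 'px';
//   img.style.height = size.height + 'px';
// });
```

The key point is that the scaling math runs once in the visitor's browser, instead of the server re-encoding seven large images on every request.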

Related

thumbnails keep reloading on page scroll

The thumbnails on my photos page take a long time to load; then, if you scroll down the page and back up, they have somehow vanished and need to reload. I'm using an old version of Chrome on Windows XP, so I'm sure that's half my problem, BUT it still does this on my cell (Galaxy S5). I'm a novice coder, so please go easy on me lol. Here is the link to my website and the photos page:
http://www.mikemicalizzicontracting.com/photos.html
I can't really help with your precise question, since we don't have your code, but I'm in a good mood so I'll give you some tips to make your website quicker.
Some tips to optimize your images:
1) Resize your images
Right now your images are displayed at 207 × 154, which is small, but you download them at their original size. You should resize them so that the file size of each image (in bytes) is much lower.
You can do that with a lot of online tools, or write a simple script for it, in bash for instance. You'll find a pre-made solution very easily, or you can use ImageMagick. The scripting option also depends on your platform (whether you are using macOS, Windows, Linux...).
2) Cache your images first
It is always painful for the user to wait for images to display. So what you need to do is use the browser cache (if you're a novice you will have some trouble, but it will be worth it) to store the image data and then display the images from that cache.
So the strategy is :
If the image is in the cache, display it from there.
If it isn't, download it from your server.
Once it has loaded, put it in the cache, then display it from there.
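The three steps above can be sketched as a small cache-first loader. This is a simplified illustration: the plain object stands in for the real browser cache (localStorage or the Cache API in practice), and `fetchImage` is a placeholder for the actual download.

```javascript
// Cache-first image loading: serve from the cache when possible,
// otherwise download, store, and then serve.
function makeCachedLoader(fetchImage) {
  var cache = {}; // stands in for the browser cache
  return function load(url) {
    if (cache[url] !== undefined) {
      return Promise.resolve(cache[url]); // 1) already cached: display from there
    }
    return fetchImage(url).then(function (data) { // 2) cache miss: download it
      cache[url] = data;                          // 3) store it, then display it
      return data;
    });
  };
}
```

The payoff is that repeat views never touch the network; only the first view of each image pays the download cost.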
3) (More advanced) Preload the images
If your home page is very cheap to display (not a lot of images, not much script to run, etc.), you can start downloading some images in the client and store them in the cache.
This should always be a background task, meaning it should not slow down the user's navigation. So depending on what the user is doing, you may have to pause the request and start it again.
If the user goes to the photo page while an image is being downloaded for the cache, you don't need to start all over again; just continue.
This last tip can take a while to develop, but it is a good thing to do when you want to display very high-quality images, or just plenty of them.
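A rough sketch of such a pausable background preloader, under the same assumption that `fetchImage` stands in for the real download (`new Image()` with an onload handler, in a browser):

```javascript
// Background preloader: fetches images one at a time, can be paused so it
// never competes with the user's own navigation, and keeps what it has
// already downloaded instead of starting over.
function makePreloader(fetchImage) {
  var queue = [], cache = {}, paused = false, busy = false;

  function next() {
    if (paused || busy || queue.length === 0) return Promise.resolve();
    busy = true;
    var url = queue.shift();
    return fetchImage(url).then(function (data) {
      cache[url] = data; // downloaded once, reused when the user opens the page
      busy = false;
      return next();     // move on to the next queued image
    });
  }

  return {
    enqueue: function (urls) { queue.push.apply(queue, urls); return next(); },
    pause:   function () { paused = true; },             // e.g. while the user is busy navigating
    resume:  function () { paused = false; return next(); },
    cached:  function (url) { return cache[url]; }
  };
}
```

Note that pausing only stops new downloads from starting; an in-flight image is allowed to finish, which matches the "don't start it all over again, just continue" advice above.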

first background image on a website loads slowly

The first background image I have on my website - loads slowly.
Now, it's a background image, but it has to fit a large screen, so it's a 900 KB PNG.
What tricks can I use to make it load faster? (It's the first thing you see, so even the 3-4 seconds it takes is felt...)
There are lots of tricks. The lads above are right in their comments: there are things you can do to compress it, and to analyze what is taking the time.
Load times, and improving them, come down to three components:
- Network Latency (the time it takes to pass data between the browser and your host)
- Image Size (physical size of the image that goes down the pipe)
- Connection Cost (Time taken to establish the connection with the host)
Now, if we take your scenario: you have an image that is large, but not huge. So how can we make it faster? Well, we can cut down the time it takes to download by the following:
a) compressing it - many options here, along with the google one above, here is a simple one: https://tinypng.com/
b) remove repeat downloads by ensuring the host sends a Cache-Control header with a max-age in it. This means that when you refresh, the host returns a 304 and the browser shows the image from its cache.
c) use a CDN such as Cloudflare to do the hard stuff for you. A CDN will handle the compression, set the cache headers, reduce your network latency, and ensure your connections are established quickly. Some, like Cloudflare, have free starting tariffs, so it's worth a look.
d) lazy load the image using JavaScript after the page has rendered. It's still good to do all of the above, but if you write some JavaScript that fires on document.ready and sets the background (via CSS, an img tag, or whatever mechanism you are currently using), the image download starts at that point, letting your users see the page and interact with it before your background has finished loading.
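Option (b) can be illustrated with a sketch of the decision the server makes. This is illustrative only (`respondWithCaching` is a made-up helper; in practice Apache, nginx, or your framework sets these headers for you), but it shows how max-age and an ETag combine to produce a 304:

```javascript
// Serve a resource with caching headers, and answer 304 Not Modified
// when the browser revalidates with a matching ETag.
function respondWithCaching(requestHeaders, etag, maxAgeSeconds) {
  var headers = {
    'Cache-Control': 'public, max-age=' + maxAgeSeconds,
    'ETag': etag
  };
  if (requestHeaders['if-none-match'] === etag) {
    // The browser already has this exact version: no body, use the cache.
    return { status: 304, headers: headers, body: null };
  }
  return { status: 200, headers: headers, body: '<image bytes>' };
}
```

The 304 response carries no body, so revalidating a 900 KB image costs only a round trip of headers, and within the max-age window the browser skips the request entirely.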
Besides improving website performance, I consider it an important courtesy to site visitors to make a point of compressing all images.
(Not to insult anyone's intelligence here, but just to clarify: compressing an image is not the same as shrinking its width/height, although both will reduce file size.)
If you have 1000 visitors and your site has 100 images and you save a half second of load time per image, then you just saved mankind 14 hours of page load time! (and it only takes a few seconds to compress an image.)
I compress the occasional "one-off" manually with:
TinyPNG.com for .png and .jpg files (up to 20 images at a time)
ezGIF.com for animated .gifs
Batches of images can be compressed programmatically using a library like:
pngquant, or
PHP's built-in ImageMagick.
As far as I know, these all use the same kind of algorithm, known as lossy compression, which can reduce file size significantly without any noticeable degradation of image quality.

How do I optimize a very loooong page with TONS of images on it?

I have been tasked with optimizing a marketing landing page for speed. It already does really well, but the problem is that it's very graphics-heavy.
The entire thing, I would guesstimate, is 30+ pages long, all on one page (it must be like this, and everything must be images, for conversion reasons).
Here's what I've done so far but I'm not sure if there's something else I'm missing
I re-compressed over 140 JPEGs on the page to slightly smaller sizes.
I played around with sprites, but most of the images are large (like entire testimonial boxes that are 600px wide). The sprite images ended up being around 2 MB. That page actually took longer to load for some reason (by almost 2s), so I took them off.
The MOST important thing is that everything at the top loads as fast as humanly possible, and before anything else, so the sale isn't lost by someone staring at a bunch of images loading out of order. I used some pre-cached images with this method: http://perishablepress.com/press/2009/01/18/css-image-caching/ - it did seem to work quite well on the smaller images, but when I add the background (which is very graphics-intensive), everything seems to slow down at once, as if the browser is loading a certain number (5?) of images at a time. Ideally I'd love to pre-cache the first 4 smaller images as a group, then focus all bandwidth on the background, then load the rest of the page in standard order.
The Google Page Speed score is 90/100. The only thing it's concerned about is unused styles, but I'm not really worried about that, because it's about 2 KB total... the overhead from the 7 MB of images is far more important.
I have the option of using a CDN for the images, but I'm not sure how I'd go about it or whether it would actually help.
I feel I've done all I can but then again I could totally be missing something...
A CDN would definitely help. And with 140 pictures, I hope it contains more than just one server. Link directly to the IPs of the servers to avoid unnecessary DNS lookups.
Minimize HTTP requests. You mention that you've been experimenting with sprites. I still believe sprites are the way to go, but you might not want to create just one huge image. If you have 140 images, you should probably have about 10 sprites.
Make sure that you have set headers to make the client cache all content. Remember ETags.
Gzip/compress content for browsers that allow it.
If the source code is lengthy, make sure to flush the buffer early.
Consider lazily loading images below the fold. There are a number of JavaScript plugins to accomplish this.
Scrolling pagination, plus combining images into sprites on a per-page basis.
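The lazy-loading suggestion above boils down to one visibility check. This sketch isolates it as a pure function (a hypothetical helper name); in a browser you would call it from a scroll handler and swap a `data-src` attribute into `src` when it returns true, or let IntersectionObserver do the same job for you:

```javascript
// Is an element inside, or within `margin` pixels below, the viewport?
// Images whose top edge passes this check get loaded; the rest wait.
function isNearViewport(elementTop, scrollY, viewportHeight, margin) {
  return elementTop < scrollY + viewportHeight + margin;
}
```

A margin of a few hundred pixels lets images start loading slightly before they scroll into view, so the user rarely sees an empty slot.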

Why is this web page loading so slow?

Can anyone please help with the iframe in the following page:
http://searchbankproperties.com/bank-owned-properties-orlando/
2 questions:
I don't understand why the page is loading slow..
Is it possible to only display the main content of the page I'm trying to iframe without the header and navigation? See the link below.
http://farm5.static.flickr.com/4123/4941392626_3886396314_b.jpg
The websites is http://searchbankproperties.com
The page I'm trying to iframe is located in http://searchorlandoproperties.com
In order to pinpoint a problem like that, I'd recommend that you use a tool like YSlow and look at which resource takes the most time to load. Then try to optimize.
I've tried to load the page in Chrome: not blazingly fast, but acceptable (less than 1 second). Then I tried Firefox; it takes more time (between 2 and 3 seconds). Finally I tried IE... you can go and get a coffee in the meantime ;)
That time basically depends on two factors: iframe rendering (that's the browser's business; you can't do much about it) and page download speed, which depends on how much data has to be downloaded. Try to work on that second factor if you can...
About the second question: it's "theoretically possible" via JavaScript, but before doing that, see if you can produce a lightweight version of the iframed page on searchorlandoproperties.com.

Reducing load time, or making the user think the load time is less

I've been working on a website, and we've managed to reduce the total content for a page load from 13.7 MiB to 2.4 MiB, but the page still takes forever to load.
It's a Joomla site (ick), and it has a lot of redundant DOM elements (2000+ for the home page) and makes 60+ HTTP requests per page load, counting all the CSS, JS, and image requests. Unlike Drupal, Joomla won't merge them all on the fly, and they have to be kept separate or the Joomla components go nuts.
What can I do to improve load time?
Things I've done:
Added background colors to DOM elements that have large images as their backgrounds, so the color loads first, then the image
Reduced excessively large images to much smaller file sizes
Reduced DOM elements to ~2000, from ~5000
Loading CSS at the start of the page, and JavaScript at the end (not totally possible: Joomla injects its own JavaScript and CSS, and always does it in the header)
Minified most javascript
Set up caching and gzipping on the server
The uncached size is 2.4 MB and the cached size is ~300 KB, but even so, with this many DOM elements the page takes a good bit of time to render.
What more can I do to improve the load time?
Check out this article.
http://www.smashingmagazine.com/2010/01/06/page-performance-what-to-know-and-what-you-can-do/
If the link gets removed or lost the tools mentioned are:
YSlow (by Yahoo)
Google's Page speed
AOL's WebPageTest
Smush.it (Image compression tool)
It sounds like you've done a great job of working around the real problem: those giant graphics. You can probably squeeze some more efficiency out of caching, minifying, etc., but there has to be a way to reduce the size of the images. I worked with a team of some of the pickiest designers on Earth, and they never required uncompressed JPEGs. Do you mean images cut out of Photoshop and saved at full quality (10)? If so, the real solution (and I appreciate that you may not be able to accomplish this) is to have a hard conversation where you explain to the design company, "You are not your users." If the purpose of the site is only to impress other visual designers with the fidelity of your imagery, maybe it's OK. If the purpose of the site is to be a portfolio that wins your company work, they need to re-assess who their audience is and what that audience wants. Which, I'm guessing, is not 2-minute load times.
Have you enabled HTTP compression (gzip) on your web servers? That will reduce the transfer size of all text-based files by 60-90%, and those files will download 6-10x faster.
Search StackOverflow or Google for how to enable it. (It varies per server software: Apache, IIS, etc).
Are all the DOM elements necessary? If they are, is it possible to hide them as the page loads? Essentially, you would have your important need-to-be-there DOM elements render with the page, and then once the document has loaded, unhide the rest of the elements as needed:
$(function () { $('.hidden').removeClass('hidden'); }); // once the DOM is ready, reveal the deferred elements
Evidently, there are plugins for compressing and combining JS/CSS files:
http://extensions.joomla.org/extensions/site-management/site-performance
http://extensions.joomla.org/extensions/site-management/site-performance/7350
I would say you can't do much more. You're close to the absolute limit.
Once you reach a critical point, you'll have to weigh the effort of compressing the site further against the effort of throwing the most bandwidth-expensive components out the window. For example, you might spend 20 hours compressing your site by a further 1%. You might also spend 40 hours throwing the Joomla theme away and starting again, saving 50% (though probably not).
I hope you find an easy answer. I feel your pain, as I am struggling to finish a Drupal project that has been massacred by a designer who hasn't implemented his own code in years and has become institutionalized in a cubicle somewhere near a tuna salad sandwich.