How to speed up this page - HTML

Is there any way I can speed up this webpage without too much hassle? http://danydiop.com/node/115
It is too slow to load because it is too long...
Thanks

Yes, add paging to your product list, or add a feature that loads the next part only when the user scrolls, like Google image search does; a minimal server-side paging sketch follows below.
You can also choose to load just a small piece of the product list and add the rest automatically after the page has loaded, even when the user doesn't scroll.
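For illustration, if the site happened to run on a server-side framework such as Django (an assumption - the linked site may well use something else, but the idea translates to any stack), paging is only a few lines. The Product model and the page size here are hypothetical:

    # views.py - a minimal paging sketch for a hypothetical Django app.
    from django.core.paginator import Paginator
    from django.shortcuts import render

    from .models import Product  # hypothetical model

    def product_list(request):
        products = Product.objects.order_by("name")
        paginator = Paginator(products, 50)  # 50 products per page
        page = paginator.get_page(request.GET.get("page"))  # clamps bad input
        return render(request, "products.html", {"page": page})

The template then iterates over page.object_list and renders previous/next links, so each request pulls 50 products instead of the whole catalogue.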

Your best bet is to break the product listing into multiple pages. There is simply too much information to display on that page.
You could in theory strip back the HTML used to render the page and enable HTTP compression on the server, but that won't change the fact that the page is simply too long.

Compress the images further
Enable gzip encoding for the served pages (a quick measurement sketch follows after this list)
Put static files (images, CSS, JS) on a cookieless domain
Ensure HTTP pipelining is supported by your server
Use CSS sprites for the images
In general, follow the suggestions provided by Google PageSpeed
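As a rough way to see what gzip encoding alone buys you, a quick measurement sketch (Python standard library only; the URL is a placeholder for your own page):

    import gzip
    import urllib.request

    # Fetch a page and compare its raw size with its gzipped size.
    html = urllib.request.urlopen("http://example.org/").read()
    compressed = gzip.compress(html, compresslevel=6)
    print(f"raw: {len(html)} bytes, gzipped: {len(compressed)} bytes")

Long, repetitive product listings tend to compress very well, but keep in mind this only shortens transfer time; the browser still has to render the full page.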

1) Use a CDN
2) Minify CSS
3) Minify JS (a sketch of points 2 and 3 follows after this list)
4) Compress images, use thumbnails, and use a lightbox to display close-ups.
5) Use Google's CDN for JS libs
6) Reduce HTTP Requests
7) Cache your pages
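A sketch of points 2 and 3 as a build step, assuming the rcssmin and rjsmin packages are installed (pip install rcssmin rjsmin) - any minifier would do, and the file names are placeholders:

    from pathlib import Path

    import rcssmin
    import rjsmin

    # Write minified copies next to the originals.
    Path("site.min.css").write_text(rcssmin.cssmin(Path("site.css").read_text()))
    Path("site.min.js").write_text(rjsmin.jsmin(Path("site.js").read_text()))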

In addition to the advice given by Zoomzoom83, CAFxX and GolezTrol, you may want to take a look at the YSlow Firefox extension, which analyses your page and offers performance advice. From their page:
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high performance web pages. YSlow is a Firefox add-on integrated with the Firebug web development tool. YSlow grades web page based on one of three predefined ruleset or a user-defined ruleset. It offers suggestions for improving the page's performance, summarizes the page's components, displays statistics about the page, and provides tools for performance analysis, including Smush.it™ and JSLint.
https://addons.mozilla.org/en-us/firefox/addon/yslow/

Related

How to remove white space in web page before serving to client?

I am deploying a web application, and I am able to compress the CSS and JS in my web application using the PageSpeed module in nginx/Apache, but I couldn't remove the HTML whitespace.
Has anyone done this before? I have seen this implemented on major websites such as LinkedIn, Facebook, and Google.
Does removing whitespace in HTML add a performance boost? As per my understanding, removing whitespace saves some extra bytes.
Here is an example of a condensed version of an HTML page from Google.
Does removing whitespace in HTML add a performance boost?
You're unlikely to benefit much, given that gzip is enabled for your site. The more you save during such a stripping phase, the less benefit you gain from gzipping, and vice versa.
BTW, mod_pagespeed has a collapse_whitespace filter that does what you're asking.
If mod_pagespeed doesn't meet your requirements, there are multiple options:
If you have static HTML pages that just need to be returned to the user, it's quite easy to do offline as a build step.
You can also do it at the backend level using your framework's batteries. For instance, if you use a Python framework, django-htmlmin could be used (a usage sketch follows after these options):
"django-htmlmin is an HTML minifier for Python, with full support for HTML 5. It supports Django, Flask and many other Python web frameworks. It also provides a command line tool that can be used for static websites or deployment scripts."
If none of the options above are viable, use a third-party module that does it at the nginx level.
Depending on your needs, you might also consider services like Cloudflare.
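As a sketch of both of the points above (what stripping saves, and how it interacts with gzip), using django-htmlmin's Python API - this assumes the package is installed, and the sample markup is made up:

    # Compare raw vs minified size, before and after gzip.
    # Assumes django-htmlmin is installed (pip install django-htmlmin).
    import gzip

    from htmlmin.minify import html_minify

    html = "<html>  <body>\n    <p>  Hello,   world!  </p>\n  </body></html>"
    minified = html_minify(html)

    for label, doc in (("raw", html), ("minified", minified)):
        data = doc.encode("utf-8")
        print(f"{label}: {len(data)} bytes, {len(gzip.compress(data))} gzipped")

Typically the two gzipped sizes end up much closer together than the two raw sizes, which is exactly the interaction described above.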
You can use tools like https://www.textfixer.com/html/compress-html-compression.php
If you use a text editor like Atom or VS Code, you can also look for a plugin that does this for you.
It doesn't really affect performance much. As you said, it will shave off some bytes, but since I suspect you're not building something as big as Amazon, it doesn't matter much; moreover, the minified code will be a pain to read.

HTML - reduce byte size

I'm testing a website's speed using the PageSpeed Insights tool.
On the results page, one of the warnings suggested that I reduce the byte size of the CSS, HTML and JS files.
At first I tried removing comments, but nothing changed.
How can I do that?
Should I remove spaces and tabs?
It seems like a very tedious operation; is it worth it?
The act of removing spaces, tabs and useless characters is called minification.
You don't need to do it by hand; there are a lot of services that can minify files for you, for example:
http://www.willpeavy.com/minifier/
Be careful if you have jQuery code: such tools sometimes remove spaces in the wrong place.
You have two things to do to reduce page size:
Minify CSS & JS files
On the server side, if you are running your website via Apache (with PHP), you can install APC (Alternative PHP Cache) for caching. You'll get better performance.
APC
In addition to the CSS minifier/prettifier tools above, I recommend using proCSSor for optimizing CSS files. It offers a variety of advanced options.
I've never found those tools to be much use beyond giving some tips on what might be slowing the site down. Minifying is unlikely to achieve much. If you want to speed up your site, save the page and see what the largest files are. Generally they will be the image files rather than the code; see if you can reduce those.
Also, try to test it on two servers - is your host slow?
If your HTML file is massive, that suggests a problem with the site's structure - it is rare that a page needs to be large.
Finally, large JavaScript files are most likely to be things like jQuery. If Google hosts these, then use the hosted version. That way, it will probably already be in a user's cache and not impact your loading time.
EDIT, after further testing and incorporating the issues discussed in the comments below:
PageSpeed Insights is an utterly amateurish tool that, as a matter of standard, advises reducing HTML, CSS and JS file sizes whenever they are not minified. There are much more effective ways to speed up rendering time than minifying the code (see below). A much, much better tool is the Pingdom Website Speed Test, which compares rendering speed to the average of the sites it is asked to test and gives the download times of the site's components.
Just test www.gezondezorg.org on both and see the enormous difference in the results. The Google tool is dead wrong here: it advises reducing the CSS and JS files, while its own figures (click the respective headers) show that doing so would shrink them by only 3.8 and 7.9 kB, respectively. That comes down to less than 1 millisecond of download-time difference (1 millisecond = 1/1000 of a second; presuming broadband internet)!
It also says that I did do a good thing: enable caching. That is BS as well, because my .htaccess file tells browsers to check for newly updated files at every visit and to refresh cached files whenever they have been updated. Tests confirm that all browsers heed that instruction.
Furthermore, that site is not intended to be viewed on mobile phones. There is simply too much text on it for that. Nevertheless, PageSpeed Insights opens by default with the results of testing against mobile-phone criteria.
More effective ways to speed up the rendering
So, minifying hardly does anything to speed up the rendering time. What does do that is the following:
Put your CSS and JavaScript in as few files as possible, ideally one file each; that saves browser-to-server (BTS) requests. (Do keep in mind that quite a number of scripts need the DOM to be fully loaded first, so in practice it comes down to putting the scripts in at most two files: a pre-body and a post-body file. A build-step sketch follows after this list.)
Optimize large images for the web. Photoshop and the likes even have a special function for that, reducing the file size while keeping the quality good enough for use on the web.
In case of images that serve as full-size background for containers: use image sprites. That saves BTS requests as well.
Code the HTML and JS files so that there is no rendering dependency on files from external domains, such as from Twitter, Facebook, Google Analytics, advertisement agencies, etc.
Make sure to get a web-host that will respond swiftly, has a sufficient processing capacity, and has a(n almost) 100% up-time.
Use vanilla/native JS as much as possible. Use jQuery or other libraries only for tasks that would otherwise be too difficult or too time-consuming. jQuery not only is an extra file to download, it is also processed slower than native JS.
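As an aside, the first point in the list above can be a trivial build step. A sketch (the file names are made up):

    # build_bundles.py - concatenate assets into one file each.
    from pathlib import Path

    def bundle(sources, target):
        text = "\n".join(Path(s).read_text() for s in sources)
        Path(target).write_text(text)
        print(f"wrote {target}: {len(text)} bytes from {len(sources)} files")

    bundle(["css/reset.css", "css/layout.css"], "css/site.css")
    bundle(["js/menu.js", "js/gallery.js"], "js/post-body.js")  # DOM-dependent scripts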
Lastly, you should realize that:
having the server minify the code on the fly generally results in a much slower response from the server;
minifying a code makes it unreadable;
de-minifying tools are notorious for their poor performance.
Minifying resources refers to eliminating unnecessary bytes, such as extra spaces, line breaks, and indentation. Compacting HTML, CSS, and JavaScript can speed up downloading, parsing, and execution time. In addition, for CSS and JavaScript, it is possible to further reduce the file size by renaming variable names as long as the HTML is updated appropriately to ensure the selectors continue working.
You can find plenty of online tools for this purpose; a few of them are listed below.
HTML Minify
CSS Minify
JS Minify
good luck!

How to force Chrome to prerender more pages?

I'm learning about Chrome and Native Client.
Basically I want to increase the number of pages that are prerendered by Chrome (currently it's just one page).
I was thinking about creating an extension that would allow prerendering more pages.
Is this the way to go, or am I left with hard-coding it into Chrome and building from scratch?
EDIT
I started a bounty for this question. I would really appreciate some input.
No, there is no other way: you would need to hardcode it in Chrome and rebuild, as you noted.
As you probably already know, Chrome explicitly states that they currently limit the number of pages that can be prerendered:
Chrome will only prerender at max one URL per instance of Chrome (not one per tab). This limit may change in the future in some cases.
There is nothing in their supported APIs or their experimental APIs that will give you access to modify this. There is no toggle in chrome://settings/ or chrome://flags/ or anywhere else in Chrome that currently allows you to change this. You can, however, use the Page Visibility API to determine if a page is being prerendered.
In addition to the one resource on the page that you specify using <link rel="prerender" href="http://example.org/index.html">, you could also consider prefetching resources.
The issue with prefetching is that it will only load the top-level resource at the specified URL. So, if you tried to prefetch the other pages, like so:
<link rel="prefetch" href="http://example.org/index.html">
...then only the index.html resource would be prefetched, not the CSS and JavaScript resources linked in the document. One approach could be to write out link tags for all of the contained resources in the page, but that could get messy and difficult to maintain; a sketch of automating this follows below.
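A hedged sketch of generating such tags automatically (Python standard library only; the URL is a placeholder):

    # Print <link rel="prefetch"> tags for the stylesheets and scripts
    # that a page references. PAGE is a placeholder URL.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    PAGE = "http://example.org/index.html"

    class AssetCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.assets = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "script" and attrs.get("src"):
                self.assets.append(attrs["src"])
            elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
                self.assets.append(attrs["href"])

    parser = AssetCollector()
    parser.feed(urlopen(PAGE).read().decode("utf-8", "replace"))
    for asset in parser.assets:
        print(f'<link rel="prefetch" href="{urljoin(PAGE, asset)}">')

You would run this against your own pages at build time and paste (or template) the output into the referring page.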
Another approach would be to wait for the current page to finish loading and then use JavaScript to create an iframe, hidden off the page, targeted at the URL whose assets you want to prefetch. The browser would then load all of the content of that URL, and it would be in the user's cache when they go to visit the page.
There is also a Chrome extension that combines these two approaches by searching for link tags that define a prefetch and then creating the hidden iframe, causing the assets to be downloaded and cached.
If the goal is to optimize client-side performance for navigating your site, there might be other alternatives, like building the site as a Single Page Application (SPA) to reduce the number of times JS and CSS are loaded and executed. If you are a fan of Google, you might check out their framework for building SPAs, AngularJS.
If it was a good idea to pre-render more pages, Chrome would probably already do it. Pre-rendering too many pages will drain website bandwidth and possibly end up slowing down the whole web, which is the opposite of what you're trying to achieve. So it's most likely intentional that you can only pre-render a single page and you shouldn't try to break that.
Not possible. Chrome manages pre-rendering based on many factors. If this was possible, it could also be easily abused by many sites. You could, depending on your page, keep all content on one page.

Can I "pre-generate" all possible static-html pages of my "dynamic" website?

Some websites, for example http://www.idealo.co.uk, seem to serve only static HTML although their content is dynamic.
For example if I navigate through a certain category, I get a link to a static html page:
http://www.idealo.co.uk/cat/5666/electric-guitars.html
Now if I apply a custom filter, again I get a link to something that seems to be static html:
http://www.idealo.co.uk/cat/5666F456496-735760-1502100/electric-guitars.html
How is this achieved? Are there any frameworks out there that help to "pre-generate" all possible dynamic pages, in such a way that whenever a new input is given, the page already exists (i.e. the static HTML is already available)?
Background: we run a small search engine for real estate offers. Offers are updated by our scraper once a day (the content is static through the day). The content is searchable on a Ruby-on-Rails website.
As the traffic increases, performance is becoming an issue. I'm wondering if there is any framework / tool that could batch-generate all our searches so that we could serve static html.
Their site isn't actually static. They're using URL rewriting (e.g. mod_rewrite) to translate the friendly URLs into a request that can be satisfied by a script.
For example:
/cat/5666/electric-guitars.html
Might be rewritten to:
/cat.php?id=5666
A quick trick to test this is to go to /cat/5666/foo.html - if you get the same page back, the file name is purely cosmetic.
The use of .html in this case is probably to hide what kind of scripting is used on their site, as a weak security-through-obscurity measure.
In response to your problem - no, there's no (easy) way to generate all possible results into static HTML files. You're looking at potentially billions of permutations. If you're having performance issues, look into performance profiling, caching, query optimisation, etc.
What you're describing is, in a sense, caching. With caching your application will generate pages (and even parts of pages) only when their content has changed. Rails has a lot of cache functionality built in, which you can tune to fit your needs. Start by reading the Rails Guide on caching which describes the Rails' capabilities as well as common add-ons. Google around for "Rails 3 caching"—there's tons of information out there. Finally, you can add software to your server stack that does additional caching, such as Squid and Varnish. With the right tools (and research) you can get 95% of the benefit of a static site without the effort of turning your site into a quasi-static Frankenapp by hand.
I finally found this blog post, which points to a few tools that do what I was looking for. I'm adding it here just for future reference:
Hyde
"Hyde is a static website generator powered by Python & Django. Hyde supports all the Django template tags & filters and even has a few of its own. The built-in web server + auto-generator provide instant refresh and unlimited flexibility..."
Jekyll
"Jekyll is a simple, blog-aware, static site generator. It takes a template directory containing raw text files in various formats, runs it through Markdown (or Textile) and Liquid converters, and spits out a complete, ready-to-publish static website suitable for serving with your favorite web server..."
blatter
"Blatter is a tiny tool for creating and publishing static web sites
built from dynamic templates..."
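For the simplest flavor of the batch-generation idea, a sketch that renders a known set of searches to static files (the data, template and file layout are all made up for illustration):

    # Pre-generate static HTML for a known set of searches.
    from pathlib import Path
    from string import Template

    page = Template("<html><body><h1>$title</h1><ul>$items</ul></body></html>")
    searches = {
        "2-bedroom-london": ["Flat A", "Flat B"],
        "3-bedroom-paris": ["House C"],
    }

    out = Path("static_pages")
    out.mkdir(exist_ok=True)
    for slug, offers in searches.items():
        items = "".join(f"<li>{o}</li>" for o in offers)
        (out / f"{slug}.html").write_text(page.substitute(title=slug, items=items))

Since your scraper runs once a day, a script like this could regenerate the popular searches right after each scrape; the long tail of rare filter combinations would still have to fall back to dynamic rendering.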

Make website load faster

I'm working on my first web app (www.shopperspoll.com). It's a Django-based Facebook app, and I was wondering how I can make the pages load faster. Sorry that the main page is a mess right now - I started making some major changes yesterday - but you'll notice that it takes a really long time for the page to load. I was wondering if that has something to do with the way I've set up Django on my host, or with something else. If you could give me some pointers on how to make it faster, I'd really appreciate it!
Thanks!
A couple of points need to be considered for better page loads:
1) Put all JavaScript in external files, and always load scripts at the end of the page (in the footer).
2) Always minify JavaScript and CSS.
3) Merge all CSS into one file, and all JavaScript into another.
4) Make use of browser caching (for CSS, JavaScript and images).
5) Host static files like CSS, JavaScript and images on a different domain (e.g. static.shopperspoll.com). A Django settings sketch for points 4 and 5 follows below.
6) Use tools like Firebug & YSlow to check the load time.
7) Use a CDN.
Also refer to this URL (better web performance, from Yahoo): http://developer.yahoo.com/performance/rules.html
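Since the app in question is Django-based, points 4 and 5 can be sketched directly in settings.py. A hedged sketch - the domain and cache lifetime are placeholders, and recent Django versions call the setting MIDDLEWARE (older ones used MIDDLEWARE_CLASSES):

    # settings.py - a sketch for points 4 and 5; values are placeholders.
    # Point 5: serve static files from a separate, cookieless domain.
    STATIC_URL = "http://static.shopperspoll.com/"

    # Point 4: cache whole pages; UpdateCacheMiddleware also emits
    # Cache-Control headers so browsers can cache the responses too.
    # Order matters: Update* must be first, Fetch* must be last.
    MIDDLEWARE = [
        "django.middleware.cache.UpdateCacheMiddleware",
        "django.middleware.common.CommonMiddleware",
        "django.middleware.cache.FetchFromCacheMiddleware",
    ]
    CACHE_MIDDLEWARE_SECONDS = 600  # cache each page for 10 minutes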
I hope this helps.
As for what Aby said, I'd also suggest using a Content Delivery Network (CDN) - CloudFlare offers a free tier which could be useful while you're early in growth. The majority of the speed-up work is already covered in Aby's answer, but once you've done that, a CDN can help by putting files geographically closer to your users.