How to force Chrome to prerender more pages?

I'm learning about Chrome and Native Client.
Basically I want to increase the number of pages that are prerendered
by Chrome (currently it's just one page).
I was thinking about creating an extension that would
allow prerendering more pages.
Is this the way to go, or am I left with hard-coding it into Chrome and building it from scratch?
EDIT
I started a bounty for this question. I would really appreciate some input.

No, there is no other way to go; you would need to hardcode it in Chrome and rebuild, as you noted.
As you probably already know, Chrome explicitly states that they currently limit the number of pages that can be prerendered:
Chrome will only prerender at max one URL per instance of Chrome (not one per tab). This limit may change in the future in some cases.
There is nothing in their supported APIs or their experimental APIs that will give you access to modify this. There is no toggle in chrome://settings/ or chrome://flags/ or anywhere else in Chrome that currently allows you to change this. You can, however, use the Page Visibility API to determine whether a page is being prerendered.
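A minimal sketch of that check, assuming a Chrome of this era where the Page Visibility API is still vendor-prefixed (newer browsers drop the webkit prefix):

// Read the visibility state, falling back to the webkit-prefixed form.
function visibilityState() {
  return document.visibilityState || document.webkitVisibilityState;
}

if (visibilityState() === 'prerender') {
  // The page is being prerendered and is not yet visible to the user,
  // so defer work like analytics until it actually becomes visible.
  var onChange = function () {
    if (visibilityState() === 'visible') {
      console.log('Prerendered page is now being viewed');
    }
  };
  document.addEventListener('visibilitychange', onChange);
  document.addEventListener('webkitvisibilitychange', onChange);
}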
In addition to the one resource per page that you specify using <link rel="prerender" href="http://example.org/index.html">, you could also consider prefetching resources.
The issue with prefetching is that it will only load the top-level resource at the specified URL. So, if you tried to prefetch other pages, like so:
<link rel="prefetch" href="http://example.org/index.html">
...then only the index.html resource would be prefetched, not all of the CSS and JavaScript links contained in the document. One approach could be to write out link tags for all of the contained resources in the page, but that could get messy and difficult to maintain.
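For illustration, that messy approach would look something like this (the asset paths are hypothetical):

<link rel="prefetch" href="http://example.org/index.html">
<link rel="prefetch" href="http://example.org/css/main.css">
<link rel="prefetch" href="http://example.org/js/app.js">
<link rel="prefetch" href="http://example.org/images/logo.png">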
Another approach would be to wait for the current page to finish loading, and then use JavaScript to create an iframe, that is hidden off the page, targeted at the URLs you want to prefetch all the assets for. The browser would then load all of the content of the URL and it would be in the user's cache when they go to visit the page.
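A minimal sketch of that iframe approach (the prefetch target is a hypothetical URL):

// Wait for the current page to finish loading, then warm the cache for
// another page by loading it in an iframe hidden off the page.
window.addEventListener('load', function () {
  var iframe = document.createElement('iframe');
  iframe.src = 'http://example.org/index.html';
  // Position the frame off-screen so the user never sees it.
  iframe.style.cssText = 'position:absolute;left:-9999px;width:1px;height:1px;border:0;';
  document.body.appendChild(iframe);
});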
There is also a Chrome extension that combines these two approaches by searching for link tags that define a prefetch and then creating the hidden iframe, causing the assets to be downloaded and cached.
If the goal is to optimize client-side performance for navigating your site, there might be other alternatives, like building it as a Single Page Application (SPA) to reduce the number of times JS and CSS are loaded and executed. If you are a fan of Google, then you might check out their framework for building SPAs, called AngularJS.

If pre-rendering more pages were a good idea, Chrome would probably already do it. Pre-rendering too many pages will drain website bandwidth and possibly end up slowing down the whole web, which is the opposite of what you're trying to achieve. So it's most likely intentional that you can only pre-render a single page, and you shouldn't try to break that.

Not possible. Chrome manages pre-rendering based on many factors. If this were possible, it could also be easily abused by many sites. Depending on your page, you could instead keep all content on one page.

Related

Advantages of the html5Mode?

I've read quite a few posts about the AngularJS html5Mode, including this one, and I'm still puzzled:
What is it good for? I can see that I can use http://example.com/home instead of http://example.com/#/home, but the two chars saved are not worth mentioning, are they?
How is this related to HTML5?
The answer links to a page showing how to configure a server. It seems like the purpose of this rewriting is to make the server always return the same page, no matter what the URL looks like. But doesn't that lead to needlessly increased traffic?
Update after Peter Lyons's answer
I started to react in a comment, but it grew too long. His long and valuable answer raises some more questions of mine.
option of rendering the actual "/home"
Yes, but that means a lot of work.
crazy escaped fragment hacks
Yes, but this hack is easy to implement (I did it just a few hours ago). I actually don't know what I should do in the case of html5Mode (as I haven't finished reading this SEO article yet).
Here's a demo
It works neither in my Chromium 25 nor in my Firefox 20. Sure, they're both ancient, but everything else I needed works in both of them.
Nope, it's the opposite. The server ONLY gets a full page request when the user first clicks
But the same holds for the hashbang, too. Moreover, a user following an external link to http://example.com/#!/home and then another link to http://example.com/#!/foreign will always be served the same page via the same URL, while in html5Mode they'll be served the same page (unless the burdensome optimization you mentioned gets done) via a different URL (which means it has to be loaded again).
but the two chars saved are not worth mentioning, are they?
Many people consider the URL without the hash considerably more "pretty" or "user friendly". Also, a very big difference is that when you browse to a URL with a hash (a "fragment"), the browser does NOT include the fragment in its request to the server, which means the server has a lot less information available to deliver the right content immediately. Compare that to a regular URL without any fragment, where the full path "/home" IS included in the HTTP GET request to the server; thus the server has the option of rendering the actual "/home" content directly, instead of sending the generic "index.html" content and waiting for JavaScript in the browser to update it once it loads and sees the fragment is "#home".
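To make that concrete, here is what the server actually sees for each style of URL:

GET /home HTTP/1.1     (from http://example.com/home - the full path is sent)
GET / HTTP/1.1         (from http://example.com/#/home - the "#/home" fragment is never sent)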
HTML5 mode is also better suited for search engine optimization without any crazy escaped fragment hacks. My guess is this is probably the largest contributing factor to the push for HTML5 mode.
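For reference, opting into this mode in AngularJS 1.x is a one-liner in the module config (the module name here is hypothetical):

// Opt in to html5Mode so routes use /home instead of #/home.
// The server must then be configured to serve the app for all such paths.
angular.module('myApp', []).config(function ($locationProvider) {
  $locationProvider.html5Mode(true); // AngularJS falls back to fragments in old browsers
});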
How is this related to HTML5?
HTML5 introduced the necessary JavaScript APIs to change the browser's location bar URL without reloading the page and without just using the fragment portion of the URL. Here's a demo
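A minimal sketch of those APIs:

// Change the location bar to /home without a reload and without a fragment.
history.pushState({view: 'home'}, '', '/home');

// React when the user presses back/forward so the app can re-render the view.
window.addEventListener('popstate', function (event) {
  console.log('Restore the view for state:', event.state);
});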
It seems like the purpose of this rewriting is to make the server always return the same page, no matter what the URL looks like. But doesn't that lead to needlessly increased traffic?
Nope, it's the opposite. The server ONLY gets a full page request when the user first clicks a link onto the site OR does a manual browser reload. Otherwise, the user can navigate around the app clicking like mad and in some cases the server will see ZERO traffic from that. More commonly, each click will make at least one AJAX request to an API to get JSON data, but overall the approach serves to reduce browser<->server traffic. If you see an app responding instantly to clicks and the URL is changing, you have HTML5 to thank for that, as compared to a traditional app, where every click includes a certain minimum latency, a flicker as the full page reloads, input forms losing focus, etc.
It works neither in my Chromium 25 nor in my Firefox 20. Sure, they're both ancient, but everything else I needed works in both of them.
A good implementation will use HTML5 when available and fall back to fragments otherwise, but work fine in any browser. But in any case, the web is a moving target. At one point, everything was full page loads. Then there was AJAX and single page apps with fragments. Now HTML5 can do single page apps with fragmentless URLs. These are not overwhelmingly different approaches.
My feeling from this back and forth is like you want someone to declare for you one of these is canonically more appropriate than the other, and it's just not like that. It depends on the app, the users, their devices, etc. Twitter was all about fragments for a good long while and then they realized their mobile users were seeing too much latency and "time to first tweet" was too long, so they went back to server-side rendering of HTML with real data in it.
To your other point, about rendering on the server being "a lot of work": it's true, but some consider it the "holy grail" of web app development. Look at what airbnb has done with their rendr framework. See also DerbyJS. My point being, if you decide you want rendering in both the browser and the server, you pick a framework that offers that. Granted, you don't have a lot of options to choose from at the moment, but I wouldn't advise hacking together your own.

If multiple pages are using the same CSS file, does it load again and again?

This is a general question, but I didn't find any satisfactory answer to it. I am creating a website with 7-8 pages, and I have a common CSS file which is used on all of them. When I go to my homepage, this CSS gets loaded. Now when I go to some other page, will this CSS get loaded again from the server, or from the browser's local cache?
I read somewhere that you have to make some changes in the server's .htaccess file to enable browser caching. Doesn't the browser itself use the cached files? I will be hosting a website for the first time, so I have no idea about this stuff. Please guide me so that I can make a site with better performance.
It depends on the caching policy enforced by the web server. Most good ones will only deliver it once, and that may be across multiple visits to your site. Look at the HTTP headers to find out; Firebug can do this for you.
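As a sketch, the .htaccess change you mention would look something like this on Apache (assuming mod_expires is enabled; the MIME types and lifetimes are just examples):

<IfModule mod_expires.c>
  ExpiresActive On
  # Let browsers reuse the common CSS/JS/images across pages and visits.
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>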
I think this depends on the browser settings. I do notice that in Chrome I sometimes need to empty the cache. You can also explicitly tell it to cache; I am not exactly sure how, but you can look up application caching.

Make website load faster

I'm working on my first web app (www.shopperspoll.com). It's a Django-based Facebook app, and I was wondering how I can go about making the pages load faster. Sorry that the main page is a mess right now (I started making some major changes yesterday), but you'll notice that it takes a really long time for the page to load. I was wondering if that has something to do with the way I've set Django up on my host, or something else. If you could give me some pointers on how to make it faster, I'd really appreciate it!
Thanks!
A couple of points need to be considered for better page loads:
1) Put all JavaScript in external files, and always load scripts at the end of the page (in the footer) - see the skeleton after this list.
2) Always minify JavaScript and CSS.
3) Merge all CSS into one file and all JavaScript into another, so each loads in a single request.
4) Make use of browser caching (for CSS, JavaScript and images).
5) Host static files like CSS, JavaScript and images on a different domain (e.g. static.shopperspoll.com).
6) Use tools like Firebug & YSlow to check the load time.
7) Use a CDN.
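A skeleton illustrating points 1), 3) and 5); the file names are hypothetical, and the static domain is the one from point 5:

<!DOCTYPE html>
<html>
<head>
  <!-- one merged, minified stylesheet served from the static domain -->
  <link rel="stylesheet" href="http://static.shopperspoll.com/css/all.min.css">
</head>
<body>
  <!-- ...page content... -->
  <!-- one merged, minified script, loaded last so it doesn't block rendering -->
  <script src="http://static.shopperspoll.com/js/all.min.js"></script>
</body>
</html>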
Also refer to this URL (better web performance, from Yahoo): http://developer.yahoo.com/performance/rules.html
I hope this helps.
Adding to what Aby said, I'd also suggest using a Content Delivery Network (CDN) - CloudFlare offer a free tier which could be useful for you while you're still early in growth. The majority of the speed-ups are already covered in Aby's answer, but once you've done those, a CDN can help by putting files geographically closer to your users.

Prevent site configuration info from showing up on Google

I have a site that's running WordPress.
The main page has an embedded Flash player and an embedded iframe, and for some reason, all the configuration info from the Flash player is showing up on Google for my site, and nothing else.
How can I have the main site information show up on Google, without having that Flash player config info show up?
And can I customize what shows up at all?
If there's some way to tag the info I don't want to show up, or tag the info I want to show up, I can probably do most of the edits myself, I just don't know where to start...
EDIT: I tried most of the suggestions below, and I didn't get anywhere...
Any other ideas?
Thanks a lot!
If you don't want Google, or other crawlers, to access certain parts of your website, you should use a robots.txt file. Inside it you specify which parts are accessible and which aren't; when crawlers get to your website, they will always look for this file for instructions.
You can check some documentation on how to do it here and here
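A minimal robots.txt sketch; the disallowed path is a placeholder for wherever the player config actually lives:

User-agent: *
Disallow: /player-config/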
In order to influence what text is used in the Google search result, try putting this within your head tags:
<meta name="description" content="WHATEVER YOU WANT DISPLAYED ON GOOGLE">
Source: http://static.googleusercontent.com/external_content/untrusted_dlcp/www.google.com/en/us/webmasters/docs/search-engine-optimization-starter-guide.pdf
Some more information from Google on controlling parts of a page. Apparently there are googleon/googleoff tags.
http://perishablepress.com/press/2009/08/23/tell-google-to-not-index-certain-parts-of-your-page/
Hope this helps.
If you want Google to index only part of your pages, you can't follow normal SEO routines. You would have to provide a mechanism to detect whether the current client (requester) is a robot or not; if it is, don't render that part. This is the only way. Otherwise, a robot either gets the whole rendered content, or doesn't get access at all based on the robots.txt file (Robots Exclusion Protocol).
Another way (which is not really smart, and isn't guaranteed to work) is to dynamically inject your content into the page via JavaScript, because, AFAIK, robots don't run JavaScript.
Since search spiders won't render JavaScript-generated markup (JS is not run, as it is client-side in the browser), a quick fix would be to not output any of the Flash markup in the initial HTML document, and then use JS to add the Flash stuff on load.
Note: as far as I'm aware, Google is currently testing a JS reading spider so this may not work long term.
Google is returning this data because it simply can't find any content where it normally would. Search engines require content - they're not advanced enough to process your multimedia to determine what it's all about.
Google will IGNORE your meta description if it doesn't feel that it reflects your page content (which here is only iframes and JS).
Use SWFObject to provide alternate content for users without Flash (including search engines) - ensure it's not some dinky text like "download flash here", but a lengthy, descriptive piece of content about your site or the media they would normally experience.
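A sketch of that setup; the file names, element id and player version are placeholders:

<!-- The div's text is what search engines and Flash-less visitors see;
     SWFObject replaces it with the player when Flash is available. -->
<div id="player">
  A real, descriptive paragraph about the media this player presents,
  so crawlers have genuine text content to index.
</div>
<script src="swfobject.js"></script>
<script>
  swfobject.embedSWF('player.swf', 'player', '640', '360', '9.0.0');
</script>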
Use robots.txt or <meta name="robots" content="noindex,follow"> for the iframe content to prevent it from being indexed.
For the love of all things holy, please look at reducing the number of JS files and the amount of inline JS on your site (I'd recommend WP-minify, since it's so obvious that you love plugins).

how to speed up this page

Is there any way I can speed up this webpage without too much hassle? http://danydiop.com/node/115
It is too slow to load because it is too long...
thanks
Yes, add paging to your product list, or add a feature to load the next part only when the user scrolls, like Google image search does.
You can also choose to load just a small piece of the product list and add the rest automatically after the page has loaded, even if the user doesn't scroll.
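A minimal sketch of the scroll-triggered variant; loadNextChunk is a hypothetical function that fetches and appends the next slice of products:

var loading = false;

window.addEventListener('scroll', function () {
  // Fire when the user gets within 200px of the bottom of the page.
  var nearBottom = window.innerHeight + window.pageYOffset
                   >= document.body.offsetHeight - 200;
  if (nearBottom && !loading) {
    loading = true; // avoid duplicate requests while one is in flight
    loadNextChunk(function () { loading = false; });
  }
});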
Your best bet is to break the product listing into multiple pages. There is simply too much information to display on that page.
You could in theory strip back the HTML used to render the page, and enable http compression on the server, but it won't help the fact that the page is simply too long.
- Compress the images more
- Enable gzip encoding for the served pages
- Put static files (images, CSS, JS) on a cookieless domain
- Ensure HTTP pipelining is supported by your server
- Use CSS sprites for the images (see the snippet after this list)
- In general, follow the suggestions provided by Google PageSpeed
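As an illustration of the sprites point, one image holds many icons and CSS shifts it into view; the file name and offsets are hypothetical:

.icon      { background: url('/img/sprite.png') no-repeat; width: 16px; height: 16px; }
.icon-cart { background-position: 0 0; }
.icon-user { background-position: -16px 0; }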
1) Use a CDN
2) Minify CSS
3) Minify JS
4) Compress images/use thumbnails and use a lightbox to display close-ups
5) Use Google's CDN for JS libs (see the example after this list)
6) Reduce HTTP requests
7) Cache your pages
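For point 5), loading a common library from Google's CDN looks like this (the library version is just an example):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>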
In addition to the advice given by Zoomzoom83, CAFxX and GolezTrol, you may want to take a look at the YSlow Firefox extension, which analyses your page and offers performance advice. From their page:
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high performance web pages. YSlow is a Firefox add-on integrated with the Firebug web development tool. YSlow grades web page based on one of three predefined ruleset or a user-defined ruleset. It offers suggestions for improving the page's performance, summarizes the page's components, displays statistics about the page, and provides tools for performance analysis, including Smush.it™ and JSLint.
https://addons.mozilla.org/en-us/firefox/addon/yslow/