DOMContentLoaded/load (event), how to increase the speed. - html

I'm trying to do everything I can to increase the load speed of my pages, and in particular the ajax loaded components.
In Firebug, my output looks like this.
I'm not entirely sure if I'm reading this correctly, but it is either +2.19s for DOMContentLoaded (or it may only be 0.8s if we are supposed to subtract the waiting/response time from that).
But then 4.67s for the 'load' (event).
Both of these seem like significantly long loading times.
I haven't been able to figure out what would cause these sorts of times.
These stats are from loading a straight html page which I normally load via ajax.
But this is just the HTML. No javascript in the page, and the page is being loaded directly, not through an ajax request.
However, when I do load this page via ajax, I notice a significant delay while the page attempts to load.
Any suggestions?
I've been looking through the html in IE debugbar, and it all looks surprisingly clean.
There are 30 images in the page. Could that be what the 'load' event is waiting for? And if so, is there any way to speed this up?
In particular, as the user will never load this page directly, but only via an ajax request, is there a way to improve the page loading performance in ajax? The problem isn't with the ajax loading script, but with the html page specifically.
----------------------EDITED------------------------------
The results of the page are loaded into a jQuery Cycle slideshow where multiple images are visible at a single time, so using a lazy loader provides a pretty poor user experience (assuming it is the images that are causing this problem).

Those firebug stats are telling you that:
It takes 2.1 seconds from the time you started the request for your server-side code to start returning a response
It then takes 19ms to download the response (i.e. the HTML)
71ms or so after that, the DOMContentLoaded event fires
After the page has completely loaded (after the images are all downloaded), the load event is fired.
DOMContentLoaded fires as soon as the DOM is ready. The load event waits until the whole page is loaded (including any external resources like images)
Most of the time in your request (other than downloading the images) is spent waiting for the server to construct the HTML. Maybe your server-side code can be optimized to run faster?
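One way to confirm where the time goes is to read the Navigation Timing numbers yourself rather than eyeballing the Firebug waterfall. This is a minimal sketch assuming a browser that exposes the older `performance.timing` interface; the helper is a pure function, so it works against any timing-like object.

```javascript
// Sketch: compute how long the DOM and the full page (images included) took.
// Uses the Navigation Timing API's millisecond epoch timestamps.
function pageTimings(t) {
  return {
    // time until DOMContentLoaded fires (DOM is parsed and ready)
    domReadyMs: t.domContentLoadedEventStart - t.navigationStart,
    // time until the load event fires (all images etc. downloaded)
    fullLoadMs: t.loadEventStart - t.navigationStart,
    // the gap between the two is roughly the time spent fetching images
    // and other external resources
    resourceGapMs: t.loadEventStart - t.domContentLoadedEventStart
  };
}

if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    console.log(pageTimings(performance.timing));
  });
}
```

If `resourceGapMs` dominates, the images are what the load event is waiting on; if `domReadyMs` dominates, the server-side generation is the bottleneck.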

Related

Reload/ Refresh Mechanism

I would like to know the mechanism behind reloading/refreshing a webpage. My questions are:
1. Does reloading/ refreshing a webpage that completely failed to load earlier always result in instant loading of the entire webpage correctly with all its contents?
2. Or does it result in delayed loading of the entire webpage?
3. Or does it result in delayed loading of the webpage with some contents (Ex. images) missing?
4. Or does it do nothing, i.e., the page still fails to load because the server is down?
I know that all these scenarios are possible. But I would like to know what causes each of these above scenarios? (For Ex. is it the server being down or busy or the content not being available anymore?)
Also, I would like to know what happens in these two cases:
-> when content is fetched from a single server
-> when content is fetched from multiple servers
Reloading does not always result in instant loading of the entire webpage correctly. There are many different reasons a page may fail to load correctly, and many of them will not be fixed simply by reloading; there are too many to cover them all here.
The page should load with the same data it received before, unless something has changed on the server in the interim.
If the server is down, reloading will not make the page load correctly the second time.
When content is fetched from a single server, all of it comes from just that server. When content is served from multiple servers (say, the images are served from a CDN), each piece is loaded from its own origin.
Reloading a page does nothing different than when you first visit it, although in certain circumstances (e.g. a forced refresh) the browser may bypass its local cache.

Polymer Hybrid Rendering

I'm interested in improving the initial page load of an app that is currently rendering entirely clientside. Right now the app loads an initial app frame and then, once the initial page is loaded, fires a request to the server to fetch the data. While the request is processing, the user effectively sees a partially rendered page. Once the data comes back from the server, the page finishes rendering on the client.
What is the best way to remove the delay caused by fetching the initial page and data separately? Should I just bootstrap the data into the initial page load, or should I leverage some sort of server side templating engine (Jade, Handlebars, etc)? It seems like doing the latter means not being able to leverage features like dom-repeat as easily, thus losing the ability to have Polymer handle some of the more complex re-rendering scenarios.
I had the same problem: the page took 4.5 seconds to load because it had to receive data from the server. I looked for ways to make Polymer faster, and I think I found some; now the page loads in 1.2 seconds (with no cache) and the request to the server takes 0.4 seconds.
Steps To Make Polymer Faster
Don't render things that you don't need. Use dom-if to avoid rendering; for example, if you have pages that are shown only when the user clicks a button, do not render them up front.
You can make the request to the server before the body is parsed, so you receive the response sooner.
If you want the user to feel that the site loads faster, you can remove the unresolved attribute from the body. The components will then be visible (unstyled) during the loading process, so the user sees something instead of a blank screen.
Use this script (before importing Polymer):
window.Polymer = window.Polymer || {dom: 'shadow'};
This makes the browser use shadow DOM (if supported) instead of the shady DOM polyfill. Shadow DOM is faster, but not all browsers support it yet.
Use vulcanize (https://github.com/polymer/vulcanize) to merge all the imports into one file and also minify it.
For long lists, use iron-list; it renders only the items that are on screen.
If you import scripts, use the async attribute so they don't block rendering.
Edit
If you don't want to use iron-list, there is a new option in dom-repeat:
<template is="dom-repeat" items="{{items}}" initial-count="20">
</template>
This does not block the thread for a long time; it renders the list in chunks.
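The "request the data before the body" step above can be sketched like this. The `/api/data` URL and the global `__dataPromise` name are hypothetical placeholders; the idea is just to start the request from a small inline script in the head and let components await the stored promise later instead of issuing their own request after they render.

```javascript
// Sketch: kick off the data request as early as possible, before the body
// is parsed, so the response arrives while the page is still rendering.
// The fetch function is passed in so the helper stays testable.
function earlyFetch(url, fetchFn) {
  return fetchFn(url).then(function (res) {
    // parse JSON if the response object supports it
    return res.json ? res.json() : res;
  });
}

if (typeof window !== 'undefined') {
  // hypothetical: components later do `window.__dataPromise.then(render)`
  window.__dataPromise = earlyFetch('/api/data', window.fetch.bind(window));
}
```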

what is the order of html assets when page load

What is the order of loading?
php
html
java script
css
jquery
ajax
please give me little explanation too
Thanks,
1) HTML is downloaded.
2) HTML is parsed progressively. When a request for an asset is reached the browser will attempt to download the asset. A default configuration for most HTTP servers and most browsers is to process only two requests in parallel. IE can be reconfigured to download an unlimited number of assets in parallel. Steve Souders has been able to download over 100 requests in parallel on IE. The exception is that script requests block parallel asset requests in IE. This is why it is highly suggested to put all JavaScript in external JavaScript files and put the request just prior to the closing body tag in the HTML.
3) Once the HTML is parsed the DOM is rendered. CSS is rendered in parallel to the rendering of the DOM in nearly all user agents. As a result it is strongly recommended to put all CSS code into external CSS files that are requested as high as possible in the head section of the document. Otherwise the page is rendered up to the occurrence of the CSS request position in the DOM and then rendering starts over from the top.
4) Only after the DOM is completely rendered and requests for all assets in the page are either resolved or time out does JavaScript execute from the onload event. IE7, and I am not sure about IE8, does not time out assets quickly if an HTTP response is not received from the asset request. This means an asset requested by JavaScript inline to the page, that is JavaScript written into HTML tags that is not contained in a function, can prevent the execution of the onload event for hours. This problem can be triggered if such inline code exists in the page and fails to execute due to a namespace collision that causes a code crash.
Of the above steps the one that is most CPU intensive is the parsing of the DOM/CSS. If you want your page to be processed faster then write efficient CSS by eliminating redundant instructions and consolidating CSS instructions into the fewest possible element references. Reducing the number of nodes in your DOM tree will also produce faster rendering.
Keep in mind that each asset you request from your HTML or even from your CSS/JavaScript assets is requested with a separate HTTP header. This consumes bandwidth and requires processing per request. If you want to make your page load as fast as possible then reduce the number of HTTP requests and reduce the size of your HTML. You are not doing your user experience any favors by averaging page weight at 180k from HTML alone. Many developers subscribe to some fallacy that a user makes up their mind about the quality of content on the page in 6 nanoseconds and then purges the DNS query from his server and burns his computer if displeased, so instead they provide the most beautiful possible page at 250k of HTML. Keep your HTML short and sweet so that a user can load your pages faster. Nothing improves the user experience like a fast and responsive web page.
~source
So, in short:
The html will load first, obviously, as this instructs the browser on the extended requirements: images, scripts, external stylesheets, etc.
After that, the load order is pretty random - multiple connections will be initiated by most browsers and the order in which they return cannot be predicted.
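That unpredictable completion order can be observed directly with the Resource Timing API. A minimal sketch (the sorting helper is a pure function, so the browser-only part is guarded):

```javascript
// Sketch: the order in which assets *finish* is not the order in which
// they were requested. Sort resource entries by when their response ended.
function completionOrder(entries) {
  // entries: PerformanceResourceTiming-like objects with name + responseEnd
  return entries
    .slice() // don't mutate the caller's array
    .sort(function (a, b) { return a.responseEnd - b.responseEnd; })
    .map(function (e) { return e.name; });
}

if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    console.log(completionOrder(performance.getEntriesByType('resource')));
  });
}
```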

Prevent incomplete loading of css and javascript files

I have the problem that some webpages on my webserver sometimes do not load completely; for instance, it seems that the CSS and JavaScript are not loaded, and therefore the website appears unformatted. Is there a way to force the complete loading of the CSS and JavaScript files so that these errors are prevented? Maybe some loading progress could be shown while the whole website loads. I'm sure this is a solved problem, but I need some directions on where to look for implementations.
Regards,
Juan
Is it just that the page takes a long time to load but gets there eventually? If so, you can wait for the onLoad event to fire before starting up your JS code. If the page never gets there, well, that's some kind of server or connection problem, as thirtydot says.
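Waiting for the load event as suggested can be sketched like this; the `#loading` overlay element is a hypothetical placeholder for whatever progress indicator you show while styles and scripts finish downloading.

```javascript
// Sketch: defer initialization until everything (CSS, images, scripts)
// has loaded. document/window are parameters so the helper is testable.
function onFullyLoaded(doc, win, init) {
  if (doc.readyState === 'complete') {
    init(); // the page has already finished loading
  } else {
    win.addEventListener('load', init);
  }
}

if (typeof window !== 'undefined') {
  onFullyLoaded(document, window, function () {
    // hypothetical: hide a "#loading" overlay once everything is in place
    var overlay = document.getElementById('loading');
    if (overlay) overlay.style.display = 'none';
  });
}
```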

Problem with modifying a page with ajax, and the browser keeping the unmodified page in cache

I have a situation where my page loads some information from a database, which is then modified through AJAX.
I click a link to another page, then use the 'back' button to return to the original page.
The changes to the page through AJAX I made before don't appear, because the browser has the unchanged page stored in the cache.
Is there a way of fixing this without setting the page not to cache at all?
Thanks :)
Imagine that each request to the server for information, including the initial page load and each ajax request, is a distinct entity. Each one may or may not be cached anywhere between the server and the browser.
You are modifying the initial page that was served to you (and cached by the browser, in most cases) with arbitrary requests to the server and dynamic DOM manipulation. The browser has no capacity to track these changes.
You will have to maintain state, maybe using a cookie, in order to reconstruct the page. In fact, it seems to me that a dynamically generated document that you may wish to navigate to and from should definitely have a workflow defined that persists and retrieves its state.
Perhaps set a cookie for each manipulated element with the key that was sent to the server to get the data?
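The persist-and-replay idea above can be sketched with `sessionStorage` (a lighter per-tab alternative to the cookie approach mentioned); the `item-42` key and element ID are hypothetical placeholders for whatever your AJAX calls actually modify.

```javascript
// Sketch: save each AJAX-driven change keyed by the request that produced
// it, then replay the changes when the user navigates back to the page.
// The storage object is a parameter so the helpers are testable.
function saveState(storage, key, value) {
  storage.setItem('ajax-state:' + key, JSON.stringify(value));
}

function restoreState(storage, key) {
  var raw = storage.getItem('ajax-state:' + key);
  return raw === null ? null : JSON.parse(raw);
}

if (typeof window !== 'undefined') {
  // "pageshow" fires on normal loads AND when the page is restored from
  // the back/forward cache, which is exactly the case described above
  window.addEventListener('pageshow', function () {
    var saved = restoreState(sessionStorage, 'item-42'); // hypothetical key
    if (saved) {
      document.getElementById('item-42').textContent = saved.text;
    }
  });
}
```

After each successful AJAX update you would call `saveState` with the same key, so the page can always rebuild itself from the stored values.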