Prefetching imports in Polymer?

Today I was browsing the internet and found that an HTML import can take a rel attribute with the prefetch value.
Well, since Polymer is all about imports, that raises a question: should I use prefetch when dynamically loading imports? In my project I am using Polymer.Base.importHref, and I don't know how it works internally. Prefetching saves the file in the cache for future use, but does a Polymer import load the file from that cache? Does it pick up what the prefetch fetched, or does it load the file without ever looking at the cache?
And the last question: is it even useful? Does it speed up the site or improve performance in some measurable way?

I came across this Medium article that explains the different prefetch and prerender options. It doesn't necessarily answer your question, but it does shed some light on the whole subject.
Regarding the Polymer importHref:
I don't think adding prefetch parameters to importHref will have any effect; however, this is just an assumption I am making from reading the docs.
This method creates a new <link rel="import"> element with the provided URL and appends it to the document to start loading.
From my understanding, this would overwrite any rel value you set.
But as far as I know, importHref doesn't load resources that have already been loaded, so it might well be looking into the cache.
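To make the setup concrete, here is how the two pieces could sit side by side. The element URL is hypothetical, and whether the import is actually served from the prefetched copy depends on the browser's HTTP cache rather than on Polymer itself:

<!-- Hint the browser to fetch the import early -->
<link rel="prefetch" href="/elements/my-element.html">

<script>
  // Later, load the import for real. If the prefetched copy is still
  // in the HTTP cache, the browser can serve the import from there.
  Polymer.Base.importHref('/elements/my-element.html',
    function (e) { console.log('import loaded'); },
    function (e) { console.error('import failed'); });
</script>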

Related

Does having multiple CSS files, although small, affect my page loading time?

MY CONDITION
I have multiple pages, index.html and menu.html, and a single stylesheet, style.css.
More than half of the styling needed for index.html is not needed in menu.html.
MY CONFUSION
Should I create three CSS files: index.css, global.css, and menu.css? Each stylesheet would be focused on its corresponding HTML document, with global.css for global styling.
MY DOUBT
Will doing so affect my page load performance or not? We would be loading multiple files, but the total number of lines would decrease significantly.
As Heretic Monkey pointed out, the performance of a website is a very complicated topic and there are no easy answers. Sometimes a single large bundle is better, sometimes a few smaller files are.
If you would like to split your CSS into multiple files during development but still publish it as a single file on the website, you could use a CSS preprocessor like Sass. Check it out: https://sass-lang.com/guide
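As a minimal sketch of that workflow, assuming hypothetical partials named _global.scss, _index.scss and _menu.scss:

// style.scss - compiles down to a single style.css
@import 'global';
@import 'index';
@import 'menu';

Compiling style.scss then produces the one style.css you already serve, while you keep editing the split-up files.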
Anyway, it sounds like your project is pretty small, so I would not worry about this kind of optimization yet. It is pretty crazy how much CSS (and how many other resources) modern websites use.
For example, this is one of four CSS bundles that this Stack Overflow page is using: https://cdn.sstatic.net/Sites/stackoverflow/primary.css?v=c05ce93d5306
More important on the performance side is that the resources (CSS and others) are cached properly. That means a user only needs to download the CSS file once; after that, the browser will use the locally cached copy. Caching can be configured on your web server or on the hosting service you are using.
Keep in mind, though, that caching can be a bit risky: if it is configured wrongly, your visitors might end up running old styles or JavaScript. Here is a good read on the topic: https://simonhearne.com/2022/caching-header-best-practices/
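As a rough illustration of what such configuration boils down to (the exact directives depend on your server and caching strategy), the responses might carry headers along these lines:

# Versioned or fingerprinted assets: safe to cache for a long time
Cache-Control: public, max-age=31536000, immutable

# The HTML itself: always revalidate so visitors pick up new releases
Cache-Control: no-cache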

Browser (Chrome, Safari etc.) continues to show the old HTML website, even after updates

I've built a website in HTML, CSS, Bootstrap 4, etc., through Atom. Every time I make changes and upload the new version to the hosting platform (HostPresto in my case), browsers load the previous (presumably cached) version. I know browsers use cached data for loading and data efficiency. However, my aim is to force any browser to reload from scratch ONLY if some part of the website has changed. How do I do this?
I've seen some answers on Stack Overflow to similar queries over the years, where people suggested putting code in the website header to effectively wipe the cookies and cached data each time the site is loaded. I don't want to do this every time, since more often than not no changes will have been made, and my website would be less efficient.
My hosting provider said it would help if I uploaded ALL of the website files each time, even when only a bit has changed. I tried this today and it doesn't work; I still have to clear the cache for the site to load properly.
Can anyone help me work out how to get a browser to detect when changes have been made to my site, so that it loads them automatically? Thanks!
I've cracked my own issue, so to speak, so I thought I'd post the answer here in case it helps others.
Basically, I noticed that the HTML files would always reflect the latest changes; it was the CSS files that wouldn't. The CSS files were the culprits loading from the browser cache.
To correct this, I have started adding versioning to my CSS files, to force the browser to load the new CSS file ONLY when the CSS has been updated. It works like this:
The name of the CSS file isn't changed. It is, and always was, styles.css. (You might call yours something else.)
The old call I would make for my CSS file, in the head of the HTML file, was: <link rel="stylesheet" href="css/styles.css">.
The new call is: <link rel="stylesheet" href="css/styles.css?version=1.0">
All I have done is add ?version=1.0 to the end of the CSS file name in the HTML head. If the version number changes, the browser will fetch the newest version; if it stays the same, the browser keeps using the cached copy.
I imagine this is quite straightforward for some, but since I managed to sort it out myself, I thought it might be useful to share with others. There's now no need to add code to the HTML head that causes efficiency-losing reloads from scratch unnecessarily!
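If bumping the number by hand ever becomes tedious, the same effect can be automated by deriving the version from the file's contents at upload time. A minimal sketch in Node.js, assuming the file layout above:

// bust-cache.js: stamp the stylesheet link with a content hash
const crypto = require('crypto');
const fs = require('fs');

// Hash the current stylesheet, so the version changes only when the CSS does
const css = fs.readFileSync('css/styles.css');
const hash = crypto.createHash('md5').update(css).digest('hex').slice(0, 8);

// Rewrite the version query string in the HTML head
let html = fs.readFileSync('index.html', 'utf8');
html = html.replace(/styles\.css\?version=[^"]*/, 'styles.css?version=' + hash);
fs.writeFileSync('index.html', html);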

Design Pattern for PolymerJS

I have seen many links and videos about Polymer and I am very excited about it, but the one thing that keeps me doubtful is the performance cost of using Polymer today. For example, if I import 5 external components, each linking to an external CSS file, I have to make 5 HTTP requests just for the CSS. So what design pattern should I follow to improve web performance? Should I remove the CSS links from the imported components and concatenate all the CSS files into one file used on my main page?
When it comes time to optimize for production, you can use the vulcanizer to concatenate imports and move your stylesheets inline, or perform various other transformations (e.g. forcing all JavaScript to be external for CSP).
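For example, a typical invocation of the vulcanize CLI looks roughly like this (flag names have varied between versions, so check vulcanize --help for the version you have installed):

# Flatten every HTML import reachable from index.html into one file,
# inlining the external stylesheets along the way
vulcanize --inline-css --inline-scripts index.html > build.html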

How to force Chrome to prerender more pages?

I'm learning about Chrome and Native Client.
Basically, I want to increase the number of pages that Chrome prerenders (right now it's just one page).
I was thinking about creating an extension that would allow prerendering more pages.
Is this the way to go, or am I left with hard-coding it into Chrome and building it from scratch?
EDIT
I started a bounty for this question. I would really appreciate some input.
No, there is no other way: you would need to hard-code it in Chrome and rebuild, as you noted.
As you probably already know, Chrome explicitly states that they currently limit the number of pages that can be prerendered:
Chrome will only prerender at max one URL per instance of Chrome (not one per tab). This limit may change in the future in some cases.
There is nothing in their supported APIs or their experimental APIs that will give you access to modify this. There is no toggle in chrome://settings/ or chrome://flags/ or anywhere else in Chrome that currently allows you to change it. You can, however, use the Page Visibility API to determine whether a page is being prerendered.
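For example (older Chrome builds exposed the prerender state through the prefixed webkitVisibilityState, so this sketch checks both):

// Returns true while the browser is prerendering this page
function isPrerendering() {
  var state = document.visibilityState || document.webkitVisibilityState;
  return state === 'prerender';
}

document.addEventListener('visibilitychange', function () {
  if (!isPrerendering()) {
    console.log('page is now visible to the user');
  }
});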
In addition to the one resource on the page that you specify using <link rel="prerender" href="http://example.org/index.html">, you could also consider prefetching resources.
The issue with prefetching is that it will only load the top-level resource at the specified URL. So if you tried to prefetch the other pages, like so:
<link rel="prefetch" href="http://example.org/index.html">
...then only the index.html resource would be prefetched, not the CSS and JavaScript links contained in the document. One approach would be to write out link tags for every resource the page contains, but that could get messy and difficult to maintain.
Another approach would be to wait for the current page to finish loading and then use JavaScript to create an iframe, hidden off the page, targeted at the URL whose assets you want to prefetch. The browser would then load all of the content of that URL, and it would be in the user's cache when they go to visit the page.
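A minimal sketch of that approach, reusing the example URL from above:

// Once the current page has loaded, warm the cache for another page
// by loading it in an off-screen iframe
window.addEventListener('load', function () {
  var frame = document.createElement('iframe');
  frame.src = 'http://example.org/index.html';
  frame.style.cssText = 'position:absolute; left:-9999px; width:1px; height:1px;';
  document.body.appendChild(frame);
});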
There is also a Chrome extension that combines these two approaches: it searches for link tags that define a prefetch and then creates the hidden iframe, causing the assets to be downloaded and cached.
If the goal is to optimize client performance when navigating your site, there may be other alternatives, such as building a web application in the Single Page Application (SPA) style to reduce the number of times JS and CSS are loaded and executed. If you are a fan of Google, you might check out their framework for building SPAs, AngularJS.
If it were a good idea to prerender more pages, Chrome would probably already do it. Prerendering too many pages would drain website bandwidth and could end up slowing down the whole web, which is the opposite of what you're trying to achieve. So it's most likely intentional that you can only prerender a single page, and you shouldn't try to break that.
Not possible. Chrome manages prerendering based on many factors. If this were possible, it could easily be abused by many sites. Depending on your page, you could instead keep all content on one page.

Will my users' old cached files prevent my new design from rendering correctly?

I may be wrong in blaming my users' browsers, but I have a question about updating my online store.
I have recently redesigned my store's homepage, which I am currently in the middle of completing.
It can be found at http://www.hot-water-cylinders.com/
Basically, on a computer that had already visited the website and stored it in its cache, I realized that everything was out of place: the image files linked into the document were the old ones, and clearing the browser's data was the only way to fix the layout.
My question is: will my returning visitors have to deal with this issue too, i.e. having to clear their cache, and is there any way to prevent it?
Thanks,
Kieren.
This is how HTTP caching is supposed to work.
The best way to avoid this is to make the resource URLs unique. If you're doing all of this manually, try putting the new resources in a versioned folder (/2.0/css, /2.0/img, etc.).
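For example, each release would then reference its own paths (the folder names here are illustrative):

<!-- release 1 -->
<link rel="stylesheet" href="/1.0/css/main.css">
<!-- release 2: a new folder means a new URL, so no stale cache hit -->
<link rel="stylesheet" href="/2.0/css/main.css">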
At first, yes, but the browser cache usually lasts for only a day or so (depending on the cache headers). If you want to prevent any stale caching, you need to add a version to your style request, as below:
<link rel="stylesheet" href="path/to/my/css/mystyle.css?v=1">
The v parameter corresponds to your CSS version; when you upload a new version of the stylesheet, you'll need to change it.