What does it mean to "host Bootstrap yourself"? - html

I'm getting started with Bootstrap and followed a website that says:
There are two ways to start using Bootstrap on your own web site: you can download Bootstrap from getbootstrap.com, or include Bootstrap from a CDN.
and later:
you want to download and host Bootstrap yourself
then:
If you don't want to download and host Bootstrap yourself, you can include it from a CDN (Content Delivery Network).
What does this mean, and what is the process?

Hosting any CSS/JS file yourself means that you put it on your own website/server.
It means people will download it from your server every time they open your site (unless it's cached locally by the browser, but at least the very first time).
A CDN is used so that people may already have the files in their cache from another website they visited that uses the same CDN (for example, a Google font).
This drastically reduces load times for first-time visitors, but you do risk delays that are out of your control by loading something from an external site (if it's down, yours won't work properly!).
So it's a speed vs. risk trade-off, basically.

Hosting it yourself means you download the file and put it in the same place as your website on your web-hosting server.
Otherwise, you can reference it in your website via a CDN (Content Delivery Network). These networks hold files for you to use: you add a reference to them in your website, and you don't have to keep the Bootstrap files on your own server.
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" integrity="sha384-BVYiiSIFeK1dGmJRAkycuHAHRg32OmUcww7on3RYdg4Va+PmSTsz/K68vbdEjh4u" crossorigin="anonymous">
^ This is an example of a CDN. They keep the file bootstrap.min.css on a server, register a domain (bootstrapcdn.com), create a subdomain (maxcdn), and you can request the resource (the bootstrap.min.css file) from it.
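To make the two options concrete, here are the two kinds of reference side by side (the local path below is just an assumed layout on your own server):

```html
<!-- Option 1: self-hosted - the file lives on your own server -->
<link rel="stylesheet" href="/css/bootstrap.min.css">

<!-- Option 2: CDN - the file lives on MaxCDN's servers -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
```

Either way the browser ends up with the same stylesheet; the only difference is which server answers the request.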
Of the two options, choose the one that is best for YOU.
Here are the pros and cons of both:
Availability: Hosting on your own server means you never have to worry about a vendor's downtime: as long as your own server (where your website files live) is up, your resources are available too. Whereas if your vendor resources (jQuery, Bootstrap) come from a CDN, the CDN server being down will affect your visitors too. A good CDN service, however, gives uptime of around 99.9%.
Updates: What do you do when you want to update your jQuery or Bootstrap? If you're hosting it yourself, you go to the jQuery or Bootstrap website, download the file, put it on your server, and update the reference in your HTML. With a CDN, you just update the version number in the URL (given that the CDN has the updated file).
Caching: Every unique visitor to your website will download the resources (jQuery, Bootstrap, etc.) if they're hosted on your server. With a CDN, these files might already be cached in visitors' browsers if they visited another website that uses the same CDN, resulting in faster loading of YOUR page.
Bandwidth: Let's say you're using very cheap hosting that gives you 100 MB of bandwidth per month, and you get 30 unique visitors daily. If your page weighs 100 KB with jQuery bundled, your monthly bandwidth usage is about (30 × 100 × 30 / 1000 =) 90 MB. With jQuery (~84 KB) served from a CDN, the page drops to about 16 KB and usage becomes (16 × 30 × 30 / 1000 =) 14.4 MB. (Again, this is a hypothetical case; I don't think you can find hosting as bad as 100 MB a month, but you get the point.)
I'll add more when I remember them. Hope it helps.
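The bandwidth arithmetic above can be sketched as a quick script (the numbers are the hypothetical ones from this answer, not measurements):

```javascript
// Monthly bandwidth in MB: page size (KB) x visitors per day x days, divided by 1000.
function monthlyBandwidthMB(pageSizeKB, visitorsPerDay, days) {
  return (pageSizeKB * visitorsPerDay * days) / 1000;
}

// 100 KB page with jQuery bundled locally vs ~16 KB page with jQuery on a CDN.
console.log(monthlyBandwidthMB(100, 30, 30)); // 90 (MB)
console.log(monthlyBandwidthMB(16, 30, 30));  // 14.4 (MB)
```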

Related

Using Font Awesome CDN

I am wondering why you get a unique embed code every time you sign up for the Font Awesome CDN. Is there any downside to using the same embed code multiple times?
You can create an account on their CDN service, which gives you access to some additional features and simplification tools, such as specifying which Font Awesome version will be served to your users, or cache invalidation when the served version changes.
Since they don't have user accounts on their main page, and they don't do user tracking, they have no way of knowing whether a requested embed code will be used by a new user or by an existing user on a new site. To save you the hassle of creating multiple accounts on the CDN service, and to let you manage all installations from one page, they ask you to always give the same email address, but they send you a unique embed code each time.
There are no downsides to using one embed code on multiple sites as long as you want the same configuration on all of them. If, for whatever reason, you want one site to use a different version than another, you should use separate embed codes.
Browser caching: if new fonts are added, they won't show on the client unless the user does a forced refresh.

Best to import assets from CDNs or bundle them together with custom code? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 2 years ago.
I usually bundle my CSS as one minified file using Gulp, then import assets like Font Awesome, Google Fonts, or plugin files separately from CDNs. Is that the best option, or is it even better performance-wise to download these third-party assets and bundle them together with our code as one file?
So the first point here is why we use a CDN:
A Content Delivery Network (CDN) works by providing alternative server nodes for users to download resources (usually static content like images and JavaScript). These nodes are spread throughout the world, and are therefore geographically closer to your users, ensuring a faster response and download time due to reduced latency.
So if you're planning to serve your website across the globe, I'd say you're better off using a CDN.
Alternatively, you can download and bundle using Gulp, as you're doing right now, but put the resulting CSS file on a CDN. It will increase your performance (and it's cheap).
Here's the Amazon CloudFront pricing link:
https://aws.amazon.com/cloudfront/pricing/
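For example, once your bundled file is uploaded to a CloudFront distribution, the reference might look like this (the domain below is a made-up placeholder; CloudFront assigns you one per distribution):

```html
<link rel="stylesheet" href="https://d1234abcd5678.cloudfront.net/css/bundle.min.css">
```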
Like so many other great answers to great questions, we begin with the nearly ubiquitous opening, "It depends..."
How important is the performance of your website?
What time interval are you choosing as a meaningful measure of latency (microseconds, tenths of a second, seconds)?
How big are the assets?
Who is your audience?
Where is your audience?
Which devices is your audience using?
How often do your resources update/change?
Etc.
I also try to compress resources and minimize the number of network requests to reduce load times. However, I tend to test various bundling strategies to see if the changes actually produce faster results, because there are so many variables to consider (e.g., is the resource already in the user's cache, such as jQuery; is the CDN fast enough to counteract the reduction in network requests; etc.).
In your example, I like the minification you're doing for your CSS, and I like that you're considering the potential benefits of bundling your assets. Try putting this setup to the test and get some numbers.
My hunch is that users not near a particular node of your CDN could experience a benefit (i.e., fewer network requests, taking advantage of HTTP pipelining, etc.). However, that depends on: the quality of your CDN; how often the CDN-hosted resources are used on other sites (as noted by @Ryan, if the CDN resources are already cached, the CDN avoids redundant downloads); how many assets you're using; the size of the assets (e.g., bundling essential elements like the main style sheet with large files could slow the rendering of the page); and, in terms of perception, whether the progressive rendering of your page is something users would notice without bundling (i.e., if a user sees a default font for a split second and then the Google font, is it jarring enough to matter, or was their attention elsewhere?).
Finally, if you do test the bundling, please post a comment. We're curious, too :)
If you are using commonly shared CDN files, e.g. https://code.jquery.com/jquery-3.2.1.min.js, it is likely that the user already has that version cached in their browser, because jQuery is used on a high percentage of websites. What this means is that when they visit your website they do not have to download jQuery again, leading to faster load times.
If you use, for example, CloudFront, you don't get the above benefit, but it does mean that the static files will be closer to your users, meaning lower latency when retrieving the files and therefore quicker load times from the user's point of view.
However, I usually like to use shared CDN files and also include local fallbacks - see here
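A common local-fallback pattern looks like this (the local path is an assumed location on your own server; the check works because window.jQuery is only defined once jQuery has loaded):

```html
<script src="https://code.jquery.com/jquery-3.2.1.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined: load the local copy.
  window.jQuery || document.write('<script src="/js/jquery-3.2.1.min.js"><\/script>');
</script>
```

This way you get the shared-cache benefit of the CDN in the common case, and the site still works if the CDN is unreachable.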

Bootstrap CDN Rendering Delay

I have been working on a site and have used Bootstrap via MaxCDN by putting
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
in the head section.
Google Pagespeed is showing:
Your page has 1 blocking CSS resources. This causes a delay in rendering your page.
None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.
Optimize CSS Delivery of the following:
https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css
Is there any way of fixing this?
Thanks
The other answer is incorrect, so I am writing this in case anyone still needs help with it.
CDNs speed up your production websites too, not only your local development environment. Hosting a static file locally on your own host does not solve the render-blocking issue that the PageSpeed tool points out here: a static file hosted locally is still render-blocking.
The solution would be to inline the render-blocking CSS files completely. However, I would recommend not inlining the resources, for several reasons:
Resources delivered over a CDN are delivered faster. If you inline them, the entire payload is delivered at the slower pace of your own server.
Resources delivered separately are cached by the browser, while inlined CSS cannot be reused across requests.
Inlined resources are compressed on every request along with the other dynamic contents of the file, so you cannot take advantage of the pre-compression that comes with a CDN.
A bigger HTML document, due to inlined CSS, is slower to parse.
So what should one do to resolve this issue? You might think differently, but I would suggest you do nothing.
It's not that bad if the browser has to wait a little for resources to download. If you are using a CDN, chances are visitors will not perceive the download, as most CDNs nowadays have less than 50 ms average global latency.
Resources are cached by the browser. While the PageSpeed tool loads fresh resources on every request, note that the browser may load some of the resources from cache, completely eliminating the CDN request.
Resources are shared across websites. If you are using Bootstrap from a public CDN, chances are that the same Bootstrap file is already in the visitor's browser cache, downloaded when they visited another website that used Bootstrap. This gives 0 ms latency and 100% bandwidth saving for that resource, even for first-time visitors who have no other resources of your site in their browser cache. The browser can then spend the saved bandwidth elsewhere to speed other things up.
Manually inlining external libraries makes it a little harder to keep track of all the inlined copies of a library, and makes edits and updates hard.
Dynamic inlining adds a few more disk seeks per request, which just adds to the server load. These IOs can be saved if the files are served separately; your OS may cache the files if they are hot enough. Using a CDN eliminates this load completely.
Although not exactly a solution that will remove this PageSpeed warning, it is possible to reduce the impact of render-blocking resources for real visitors (as opposed to performance-measurement tools):
Serve resources with aggressive compression to reduce the payload size.
Serve resources with an immutable Cache-Control header to tell the browser it can confidently store the file for a long period, as it is not going to change in the future. If you use the Bootstrap CDN by PageCDN, these two optimizations are enabled by default.
If you know a file is going to be needed immediately after page load, you can use HTTP/2 Server Push to deliver the file before the browser even asks for it. If you do this, though, you will need to make sure the same file is not aggressively pushed on every request (that is not a good option, as the file should load from the browser cache from the second request onwards).
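For completeness, if you do want to make the stylesheet non-render-blocking, one widely used pattern is to load it via rel="preload" and swap it to a stylesheet once it arrives, with a noscript fallback. This is a sketch, and it trades the PageSpeed warning for a possible flash of unstyled content:

```html
<link rel="preload" as="style"
      href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css"
      onload="this.onload=null;this.rel='stylesheet'">
<noscript>
  <link rel="stylesheet"
        href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
</noscript>
```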
Either change the CDN (which will most likely do nothing for you), or simply store the file locally. This will raise your Google PageSpeed score.
Download a copy of the bootstrap file you are using and store it like so root/css/bootstrap.min.css
Where root is your project folder.
CDNs are used mainly for test purposes (instant access to files) or in larger-scale projects with multiple requirements that can't always be met locally.
Read this thread to better understand.
While the error that Google gives you might not be a serious issue for your project, it is always good practice to host your resources locally, so that your website can load by itself, without referencing external sources that each require a separate request.
Static files = better load times = happy Google.

Which uses fewer resources (data and time): using font-awesome.min.css locally, or via the online link?

Which uses fewer resources while loading a website that uses Font Awesome icons?
Using the online link:
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/font-awesome/4.4.0/css/font-awesome.min.css">
or using a locally downloaded font-awesome.min.css file and referencing it in the HTML?
The main benefit of having them on a CDN is that the files can be downloaded in parallel with files downloaded from your own website, reducing latency on every page. The flip side of this is a pitfall of hosting locally: increased latency. The main reason is that browsers limit the number of concurrent connections they can make to the same web server. In IE6 this defaulted to 2 concurrent connections per domain - shared between all open windows of IE! In IE8 it is a bit better, defaulting to 6, which is in line with Firefox, but still, if you have a lot of images and you are not using sprites, you will experience heavy latency.
When using a CDN, I would always set the library version explicitly rather than requesting the latest one. This reduces the risk of a new version breaking your code - not very likely with jQuery, but possible.
The other main benefit of using a CDN is reduced traffic on your site. If you pay per GB, or you are on a virtual server with limited resources, you might find that overall site performance increases and hosting costs come down when you farm off some of your content to a public CDN.
from - Benefits vs. Pitfalls of hosting jQuery locally

Would embedding my CSS/JS/Sprite in my single-page web-app make it load faster?

I have a single-page web app that currently consists of four files:
index.html
main.js
style.css
sprites.png
This means that every user who loads the site has to request index.html, parse it for the other three files, and then make three more HTTP requests (serially, I believe) to fetch the remaining files.
It seems to me that it might be (a tiny bit) faster to embed the JavaScript, CSS, and sprite image (base64-encoded) directly in the index.html file.
The main reasons I can think of not to do this, along with why I don't think they apply in this case, are as follows:
It would prevent any of these additional files from being cached separately. This is not an issue for me because they will never be loaded from another page (since there is only one HTML page).
If the files were on different CDN servers, they could be downloaded in parallel. (Currently this project is not large enough to merit multiple servers.)
I should disclose that this site is a small pet project that is nowhere near large enough to merit this kind of meticulous performance tuning, but I like using pet projects as an avenue to explore problems (and their solutions) that I may face in my day job.
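For reference, the embedded version of such a page would look roughly like this sketch (the base64 string is a real 1x1 transparent PNG standing in for the actual sprite sheet):

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    /* contents of style.css inlined here */
    .icon { background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mP8z8BQDwAEhQGAhKmMIQAAAABJRU5ErkJggg==); }
  </style>
</head>
<body>
  <div class="icon"></div>
  <script>
    /* contents of main.js inlined here */
  </script>
</body>
</html>
```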
This isn't usually done because you increase the size of the entire HTML page. You'll save a couple of requests on the first visit, but you'll force the client to re-download everything every time they fetch the HTML file.
It would improve performance for users who visit your site once, and only once. For any kind of long-term strategy, it's unsuitable.
When your page is reloaded, the JS, images, and CSS are cached on the client and don't need to be reloaded. Also, base64-encoded images are roughly a third larger than the binary originals, and a weak client may well take longer to decode your base64 than to download the separate files.
So, in short: don't overthink some things.