Does using frameworks make a website heavy?

Recently I made a sample website[1] from scratch. The website's main page weighs merely 168 KB, of which the image accounts for 138 KB. So the programming stuff comes to only 168 - 138 = 30 KB. This 30 KB is divided as:
index.html -- 14 KB
style.css -- 16 KB
--------------------
Total -- 30 KB
Now suppose that, instead of making the website from scratch, I make it with frameworks. Then, apart from the 138 KB image, the programming stuff would include:
bootstrap.min.css -- 119 KB
jquery-2.2.2.js -- 253 KB
index.html -- 14 KB
style.css -- 16 KB
-----------------------------
Total -- 402 KB
Let's call the website made from scratch WB-1 and the other one WB-2. Suppose a user from a developing country, on a 2G network or a broadband connection with a speed of 32 KB/s, tries to access WB-1. The website would take nearly 5 seconds to open (168 / 32 ≈ 5.3 s). Another user from the same country with the same speed tries to access WB-2; it will take him nearly 18 seconds to open the full page (568 / 32 ≈ 17.8 s). So WB-2 is more than 3 times slower than WB-1. And not only that. Suppose WB-1 is hosted by server S-1 and WB-2 by server S-2, and that every month 10,000 visitors visit WB-1 and 10,000 visit WB-2. Then S-1 has to transmit 168 KB * 10,000 = 1.68 GB of data, while S-2 has to transmit 568 KB * 10,000 = 5.68 GB. So S-2 needs more than 3 times the bandwidth of S-1.
Questions:
Does using frameworks make the website heavy and increase the load on the server?
Is there any way by which the user agent would download only those portions of the framework that are actually used in the HTML code? That is, suppose I use only 100 classes from Bootstrap; then, instead of downloading the whole bootstrap.min.css, the user agent would just parse the HTML and download only those classes that are used in it. Similarly, for jquery.js, the user agent would download only those functions that are used, instead of downloading the whole jQuery library?
Are SVG files lighter than JPG or PNG files? That is, is it worth converting a company's logo from the usual raster image to a vector image?
Finally, do clients really care about this? That is, would clients say, "OK, I'll pay you more if you make my website lighter with your custom programming"?

Let's answer each one of your questions:
Does using frameworks make the website heavy and increase the load on the server?
Framework usage often makes a site heavier (than custom, vanilla code), not necessarily heavy. The main issue with frameworks is bloat: shipping a lot of code which you won't use on your site. Fortunately, most frameworks have customizers that let you include only the features you need. Using a framework is a personal choice and, like every third-party tool, it often comes with trade-offs: for example, you take a framework that adds some KB to the site, but the project is delivered earlier, or with better responsive features, and so on.
Is there any way by which the user agent would download only those portions of the framework that are actually used in the HTML code? That is, suppose I use only 100 classes from Bootstrap; then, instead of downloading the whole bootstrap.min.css, the user agent would just parse the HTML and download only those classes that are used in it. Similarly, for jquery.js, the user agent would download only those functions that are used, instead of downloading the whole jQuery library?
Quick answer: no. The user agent doesn't analyze which parts of a file will be used and which won't, and that applies to both CSS and JS. The closest option is to use the framework's customizer, so you ship only the features your site needs. On the JS side, if you need only a few small features, you could use another library such as Zepto, which may fulfill your needs at a smaller size. Another good practice is to load third-party libraries from a CDN; that way there is a good chance the user agent already has the library cached.
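For example, here is a minimal sketch of loading jQuery from the official CDN with a local fallback (the local path js/jquery-2.2.2.min.js is an assumed location for your own copy):

    <script src="https://code.jquery.com/jquery-2.2.2.min.js"></script>
    <script>
      // If the CDN request failed, window.jQuery is undefined;
      // fall back to a copy hosted on your own server.
      window.jQuery || document.write(
        '<script src="js/jquery-2.2.2.min.js"><\/script>'
      );
    </script>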
Are SVG files lighter than JPG or PNG files?
Not always, but in most cases, yes. In web development it is very important to use the proper image format for each case. SVG images also have the advantage of being fully resizable, looking crisp on a mobile phone or on a computer. I'd recommend them not only for the logo, but for every image that is vector in nature.
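As an illustration of why vector logos tend to be tiny, here is a complete, hypothetical logo as an SVG file; it is only a couple of hundred bytes and scales to any size:

    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">
      <!-- a simple two-tone mark; stays crisp at any resolution -->
      <circle cx="50" cy="50" r="45" fill="#1976d2"/>
      <text x="50" y="63" font-size="42" text-anchor="middle" fill="#fff">W</text>
    </svg>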
Finally, do clients really care about this?
Most clients worry about the due date more than about the byte size of the final project. Most would be delighted to hear that you built it "the best way possible", but if you showed them a cheaper and a more expensive option, most clients would pick the cheapest. With current internet speeds, a few KB here and there won't make them lose sleep.
If you want to take a look at some framework customizers:
Bootstrap
Foundation

You can use CDNs.
SVGs are lighter and recommended for logos.
It doesn't make any difference to the customer as long as you fulfill the requirements.

A customer will always be concerned about the speed of your website; that is inevitable. Google also cares about the loading time of your website: if you are slow, you will be penalized in search rankings.
It is best to convert PNG images to SVG where possible; there is a big difference. SVG retains its quality at any size and fills the available space, and its loading time is very fast. You can also compress your JS, CSS and HTML files, and compress your image files as well.
You will also reduce loading time if your JS files are placed just before the closing </body> tag.
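A minimal sketch of that placement (app.js is a placeholder name):

    <!DOCTYPE html>
    <html>
      <head>
        <link rel="stylesheet" href="style.css">
      </head>
      <body>
        <p>Page content renders first...</p>
        <!-- scripts last, so they don't block rendering of the content above -->
        <script src="app.js"></script>
      </body>
    </html>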

Related

How to minimise the time for new static content to appear on the GitHub Pages CDN?

Assume we are only pushing lightweight static content like small HTML or JS files with no Liquid tags. There are no plugins, there is no _posts/ directory, and files are never changed once committed.
Because nothing really needs to be built, in theory if we configure incremental_build: true and keep_files: ['.html', '.js'], then the build should be very fast.
However, right now, the GitHub Pages build only happens every 5 minutes or so, so effectively there is a lag of 0 to 10 minutes.
Is there a way to reduce the time it takes for the file to appear at [repo].github.io/[path]? Is there some logic to it, for example do more commits or more files or more reads have an effect one way or another?
GitHub Pages does not respect those options. You could try prebuilding your site, but that will possibly increase the total time to deploy. It's also possible that the build happens instantly and it's the CDN that takes time to receive updates and invalidate caches.
You can try using another host (like running your own Jekyll server on EC2) or having your build upload the static content to S3 instead.
However, I recommend taking a step back and asking why you need less than 10 minute latency on deploy. If there are highly volatile resources you need to serve, then perhaps you need to identify those and serve them in a different way. Static site generators are good at, well, static content, not so much for highly volatile content.
If the volatile resources are page content, then it sounds like you have a use case better served by a mainstream CMS like WordPress. If it's code, then deploy that separately to S3 and reference it in your site.

Social networks: is there a design pattern for profile pages presentation?

In social networks there are two basic ways to deal with profile page presentation:
1) Every time that a page is requested, a query is sent to the database and the profile page is built.
2) Every time that a profile is updated, a new html page is automatically built for future reference and requests.
There are advantages to both approaches: in (1) it is not necessary to keep the generated pages, while in (2) it is not necessary to access the database every time a profile page is requested.
Is there a common-sense way to deal with profile pages? Are there other ways to deal with this problem? Are there issues that I am not considering here?
Any help will be greatly appreciated.
Option 2 will not scale well at all. Even at 0.1% of Facebook's traffic, you'd be generating HTML pages often enough to completely choke your storage's throughput.
Option 1 can be scaled infinitely (well, not infinitely, but you get the point) easily by sharding your DB, adding memcache etc.
Further, what happens when you decide to change the layout of the profile page? You'd have to regenerate the profile page for every user, which would be very expensive.
Option 1 allows you to decouple layout and presentation, from the data.
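For illustration, here is a rough sketch of option 1 with a small cache in front of the database (getProfileFromDb and the in-memory Map are assumed stand-ins for your real DB layer and something like memcached):

    // Naive in-memory cache standing in for memcached/Redis.
    const cache = new Map();
    const TTL_MS = 60 * 1000; // cache profiles for one minute

    async function getProfile(userId) {
      const hit = cache.get(userId);
      if (hit && Date.now() - hit.at < TTL_MS) {
        return hit.profile; // serve from cache, no DB round trip
      }
      const profile = await getProfileFromDb(userId); // assumed DB accessor
      cache.set(userId, { profile, at: Date.now() });
      return profile;
    }

    // Rendering stays decoupled from the data: change the template freely
    // without regenerating anything per user.
    function renderProfilePage(profile) {
      return `<h1>${profile.name}</h1><p>${profile.bio}</p>`;
    }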

Multiple css files or one big css file?

Which one is better and faster? Why?
using only one file for styling:
css/style.css
or
using several files for styling:
css/header.css
css/contact.css
css/footer.css
css/tooltip.css
The reason I'm asking is that I'm developing a site for users who have very slow internet connections (in Uganda), so I want to make it as fast as possible.
Using a single file is faster because it requires less HTTP requests (assuming the amount of styles loaded is still the same).
So it's better to keep it in just one file.
Separating CSS should only be done if you want to keep, for example, IE-specific styles separate.
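For example, IE-specific styles can live in their own file that only old IE downloads, via conditional comments (css/ie.css is a placeholder name):

    <link rel="stylesheet" href="css/style.css">
    <!--[if lt IE 9]>
      <link rel="stylesheet" href="css/ie.css">
    <![endif]-->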
As per Yahoo's performance rules [source], it is VERY IMPORTANT to minimize HTTP requests.
From the source
Combined files are a way to reduce the number of HTTP requests by combining all scripts into a single script, and similarly combining all CSS into a single stylesheet. Combining files is more challenging when the scripts and stylesheets vary from page to page, but making this part of your release process improves response times.
It is quite inconvenient to develop using combined files, so stick to multiple files during development, but combine them once you deploy the system to the web.
I really recommend using Boilerplate's Ant build script. You can find it here.
It combines and minifies CSS.
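If an Ant build is too heavy for your project, even a tiny script can do the combining step at deploy time. A minimal Node sketch (the file names mirror the example in the question):

    // combine-css.js - concatenate several stylesheets into one for deployment.
    const fs = require('fs');

    const files = [
      'css/header.css',
      'css/contact.css',
      'css/footer.css',
      'css/tooltip.css',
    ];

    const combined = files
      .map((f) => `/* ${f} */\n` + fs.readFileSync(f, 'utf8'))
      .join('\n');

    fs.writeFileSync('css/style.css', combined);
    console.log(`Wrote css/style.css (${combined.length} bytes)`);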
One CSS file is better than multiple CSS files because of the overhead involved when the end user's browser makes multiple requests, one per file. Other things you can do to improve performance include:
Enable gzip compression on your web server (e.g. on Apache) so that files are compressed before downloading
Where possible, host your files geographically as close to the majority of your end users as possible
Use a CDN for your static content such as CSS files
Use CSS sprites
Cache your content
Note that there are tools available to help you do this. See 15 ways to optimise css for more information
It is generally better to bundle or combine multiple CSS or JavaScript files into fewer HTTP requests. This causes the browser to request far fewer files, which in turn reduces the time it takes to fetch them.
With proper caching, you can save extra bandwidth and make even fewer HTTP requests.
Update:
There's a new bundling feature in ASP.NET 4.5 which you might be interested in.
It allows you to keep your CSS files separate at development time and, at runtime, gain the benefit of the resources being combined into one.
One resource file is always the fastest approach since you reduce the number of HTTP requests made to fetch those files.
I would suggest using YSlow, a great extension for Firebug that analyzes web pages and suggests ways to improve their performance.

HTML5 Plotting Library Performance Issue on Mac?

I am looking into plotting very large data sets. I've tried Flot, Flotr and Protovis (and other JS-based packages), but there is one constant problem I'm faced with. I've tested 1600, 3000, 5000, 8000 and 10k points on a 1000w x 500h graph, all of which render within a reasonable time in PC browsers (IE and FF). But when rendered on a Mac in FF/Safari, starting at around 500 data points the page becomes significantly slow and/or crashes.
Has anyone come across this issue?
Yes, don't do that. It seems pretty unlikely to me that 10k points are actually going to be visible/useful to the user all at once.
You should aggregate your data (server-side) and then if they want to zoom in on areas of the data, use AJAX requests to get that area and replot.
If you use flot, they have examples showing selection, i.e. here: http://people.iola.dk/olau/flot/examples/zooming.html
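Here is a rough sketch of that approach with flot's selection plugin (the /data endpoint and its from/to parameters are assumptions about your own server-side aggregation):

    // Requires jquery.flot.js and jquery.flot.selection.js, and assumes the
    // initial $.plot call enabled selection: { mode: 'x' }.
    // When the user selects an x-range, fetch an aggregated slice and replot.
    $('#placeholder').on('plotselected', function (event, ranges) {
      $.getJSON('/data', {            // assumed aggregation endpoint
        from: ranges.xaxis.from,
        to: ranges.xaxis.to
      }, function (points) {
        // server returns a reduced set of [x, y] pairs for the visible window
        $.plot($('#placeholder'), [points], { selection: { mode: 'x' } });
      });
    });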
(I can't comment on Ryley's answer yet; that's why I put some remarks here.)
What about offline use? HTML is a great format for documents, setting aside the server/client stuff.
JavaScript, Canvas and all those fancy client-side technologies could be used to build nice interactive files, like data reports containing graphs with zoom and pan features...

Why do people always encourage a single js file for a website?

I read some website development materials on the Web, and every time a person asks about the organization of a website's js, css, html and php files, people suggest a single js file for the whole website. And the argument is speed.
I clearly understand that the fewer requests there are, the faster the page responds. But I never understood the single-js argument. Suppose you have 10 webpages and each one needs a js function to manipulate the dom objects on it. Put those 10 functions in a single js file and let that file execute on every single webpage: 9 out of 10 functions are doing useless work, and CPU time is wasted searching for non-existing dom objects.
I know that CPU time on an individual client machine is trivial compared to bandwidth on a single server machine. I am not saying that you should have many js files on a single webpage. But I don't see anything wrong if every webpage refers to 1 to 3 js files and those js files are cached on the client machine. There are many good ways to do caching; for example, you can use an expiry date, or you can include a version number in the js file name. Compared to cramming the functionality for all the needs of many webpages into one big js file, I far prefer splitting the js code into smaller files.
Any criticism/agreement on my argument? Am I wrong? Thank you for your suggestion.
A function does zero work unless it is called. So 9 uncalled functions do zero work; they just take up a little extra space.
A client only has to make one request to download one big JS file, which is then cached for every other page load. That is less work than making a small request on every single page.
I'll give you the answer I always give: it depends.
Combining everything into one file has many great benefits, including:
less network traffic - you might be retrieving one file, but you're sending/receiving multiple packets, and each transaction has a series of SYN, SYN-ACK and ACK messages sent across TCP. A large part of the transfer time is spent establishing the session, and there is a lot of overhead in the packet headers.
one location/manageability - although you may only have a few files, it's easy for functions (and class objects) to grow between versions. With the multiple-file approach, functions from one file sometimes call functions/objects from another file (e.g. ajax in one file and arithmetic functions in another - your arithmetic functions might grow to need to call the ajax and have a certain variable type returned). What ends up happening is that your set of files needs to be treated as one version, rather than each file being its own version. Things get hairy down the road if you don't have good management in place, and it's easy to fall out of line with JavaScript files, which are always changing. Having one file makes it easy to manage the version across each of your pages and your (1 to many) websites.
Other topics to consider:
dormant code - you might think that the uncalled functions reduce performance by taking up space in memory, and you'd be right; however, this cost is so minuscule that it doesn't matter. Functions are indexed in memory, and while the index table may grow, the overhead is trivial for small projects, especially given today's hardware.
memory leaks - this is probably the biggest reason not to combine all the code; however, it is a small issue given the amount of memory in today's systems and the better garbage collection browsers now have. Also, this is something that you, as a programmer, can control: quality code leads to fewer problems like this.
Why does it depend?
While it's easy to say "throw all your code into one file", that would be wrong. It depends on how large your code is, how many functions it has, who maintains it, etc. Surely you wouldn't pack your locally written functions into the jQuery package, and you may have different programmers maintaining different blocks of code - it depends on your setup.
It also depends on size. Some programmers embed base64-encoded images as text in their files to reduce the number of files sent; this can bloat files. Surely you don't want to package everything into one 50 MB file, especially if there are core functions that are needed for the page to load.
So, to bring my response to a close: we'd need more information about your setup, because it depends. Surely 3 files is acceptable regardless of size, combining where you see fit. It probably won't really hurt network traffic, but 50 files is unreasonable. I use the hand rule (no more than 5), and surely you'll see a benefit from combining those five 1 KB files into one 5 KB file.
Two reasons that I can think of:
Less network latency. Each .js file requires another request/response round trip to the server it's downloaded from.
Fewer bytes on the wire and less memory. If it's a single file you can strip out unnecessary characters and minify the whole thing.
The JavaScript should be designed so that the extra functions don't execute at all unless they're needed.
For example, you can define a set of functions in your script but only call them from (very short) inline <script> blocks in the pages themselves.
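A minimal sketch of that pattern (initContactForm is a made-up function name for illustration):

    <!-- site.js is shared by every page and only defines functions -->
    <script src="site.js"></script>

    <!-- each page then calls just what it needs in a tiny inline block -->
    <script>
      initContactForm(); // defined in site.js; other pages simply never call it
    </script>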
My line of thought is that you have fewer requests. When you make a request in the header of the page, it stalls the output of the rest of the page; the user agent cannot render the rest of the page until the javascript files have been obtained. Also, javascript files download synchronously: they queue up instead of being pulled all at once (at least that is the theory).