I'm trying to make a site enhancement.
Bootstrap.min.css takes 0.5 seconds to finish loading on our website.
I think I can save that 0.5 seconds by placing all of the minified CSS inline in my HTML document (in production mode). Is that a good idea for run-time performance? Does it make the browser slower to load the CSS?
That depends on your page. Among the zillion factors that influence page speed, two matter most for your question:
bandwidth
number of requests
If you have one of those single-page Angular websites, for example, then putting all the CSS into your HTML makes sense. This reduces the number of requests, while bandwidth consumption stays the same.
If you have a "normal" website, with the user loading a new html page every time he clicks on a link, then its better to put the css into .css files, so he doesnt have to load the same information over and over again. This will increase the amount of requests, while dramatically reducing the bandwidth consumption (because of browser side caching of the css).
If you want to increase the speed of your website, look into caching, CDNs, and tools like those explained here, which will point you in the right direction: https://developers.google.com/speed/pagespeed/
My website is really slow because I have a lot of images.
Does somebody know a way to make it fast so I can keep all the images?
Use TinyPNG to compress your images and replace the existing ones with them; lower-resolution images help the page load faster.
Place all your JavaScript at the bottom of your page, and use deferred parsing for scripts that are not needed on initial load. This lets the page render first, with non-essential scripts loading later (see the sketch below).
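A minimal sketch of that pattern (the script names are placeholders):

    <head>
      <!-- defer: downloaded in parallel, executed only after the HTML is parsed -->
      <script src="analytics.js" defer></script>
    </head>
    <body>
      <!-- visible content renders without waiting on scripts -->
      <h1>Welcome</h1>
      <!-- scripts needed right away still go at the bottom of the body -->
      <script src="app.js"></script>
    </body>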
Minify all your JS and CSS, which will reduce file size and therefore page size.
These are some basic points to take care of while developing a website.
Read this tutorial too to learn more: http://learntocodewith.me/posts/make-your-website-fast/
This is the issue. I am making a social networking website and I want to display online/offline user status without refreshing the page. I have everything done in MySQL and the online status is displayed on the page, but changes in online status are not picked up without a page refresh. So I suppose this could be solved with an AJAX request: I want to fetch the changes from the MySQL query and display them directly on the page without refreshing.
A couple of different options, but two that I will highlight:
1) jQuery. A straightforward JavaScript library for asynchronously accessing the user data, as you specify. AJAX calls are built in by design; see http://api.jquery.com/jquery.ajax/. This is pretty much out-of-the-box functionality, and easier to implement if you're crunched for time (see the polling sketch after this list).
2) If you're totally new to getting back end data into the front end, and refreshing on the fly, I might suggest you choose a more recently designed JavaScript framework, such as AngularJS. This would give structure to the front end of your application. The only real drawbacks for this are learning curve (higher than jQuery) and SEO, as it is a fully JavaScript-driven output. For SEO, there is a well-known workaround: http://www.yearofmoo.com/2012/11/angularjs-and-seo.html.
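For the jQuery route, a minimal polling sketch; the /status.php endpoint, its JSON shape, and the element id are assumptions you would adapt to your own backend:

    <span id="status-42">offline</span>
    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
    <script>
      // Hypothetical endpoint returning e.g. {"userId": 42, "online": true}
      function refreshStatus() {
        $.ajax({ url: '/status.php', dataType: 'json' })
          .done(function (data) {
            // Update the indicator in place, no page refresh needed
            $('#status-' + data.userId).text(data.online ? 'online' : 'offline');
          });
      }
      setInterval(refreshStatus, 5000); // re-check every 5 seconds
    </script>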
Hope this high level overview helps put you on the right track...
I know you commonly refactor back-end code to improve its speed or security, or to make it more readable for the next person who takes over your project, but do you refactor HTML and CSS? Since they are markup languages it doesn't seem so worthwhile: shaving a few bytes off your code versus the time spent looking for alternatives doesn't seem worth the effort, especially if you are working to a tight deadline.
There are innumerable things that can increase or decrease page performance. As with any optimisation, though, you should start where people are actually seeing problems or slowdown.
On a broader level, reducing payloads to the smallest possible size makes a big difference. This involves gzip, caching, and minification. You can rewrite your code a thousand times, but it probably won't end up much smaller, if at all, than it would if you were to use gzip and minify your CSS. Don't minify HTML, though, as it's too prone to rendering issues.
On a finer level, specific CSS features such as resizing large images and stacking up browser-generated gradients and shadows can bring performance down significantly. If you notice sluggishness when scrolling, things like this are probably what you need to focus on. Just one image of 640x480 or larger being resized by CSS can bring performance crashing down in some browsers, as illustrated below.
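As an illustration (the file names and dimensions are placeholders):

    <!-- Costly: the browser downscales a large image on every paint -->
    <img src="photo-1600x1200.jpg" style="width: 320px; height: 240px;" alt="photo">

    <!-- Cheaper: serve an image already sized for display -->
    <img src="photo-320x240.jpg" width="320" height="240" alt="photo">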
Then of course there's latency. Using content distribution networks or at the very least highly optimised servers will ensure your HTML, CSS, JavaScript, and image files are delivered to users and shown as quickly as possible.
I am planning a redesign of my page (4-5 years old, with PageRank 3-4). No URLs will change, meaning the same content will stay under the same URLs. But I am still bothered, because I have heard that changing the HTML structure across a whole page can have an effect, mainly negative. Yet there is no way to change the design and layout of the page without changing its HTML structure.
Could you please sum up all the things to take into account when redesigning a website in a search-engine-friendly way?
I could go into some detail, but basically check your site with this tool to get a detailed breakdown, both before the redesign and on your test version (preferably on a test domain, e.g. test.mywebsite.com): http://nibbler.silktide.com/
Basic things not to do: do not use HTML tables for anything but displaying data in a grid, and do not use semantic HTML where it is not needed, since its purpose is to flag things as important.
Order of importance of tags on a page:
H1 > H2 > H3 > B
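A sketch of that hierarchy in practice:

    <h1>Page topic (one per page)</h1>
    <h2>Major section</h2>
    <h3>Subsection</h3>
    <p>Use <b>bold</b> sparingly, for genuinely important phrases.</p>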
Make sure your HTML is valid and that you have all the appropriate meta tags in place, as per the W3C standard you choose for your design.
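For example, a typical head might look like this (all values are placeholders):

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>Descriptive page title</title>
      <meta name="description" content="A short summary of this page">
    </head>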
Content is key: keyword density and page themes are what matter. Don't dilute a page; if you have new material, add a new page.
Make sure you add a sitemap, submit it to all the search engines, and have a robots.txt file pointing to your XML sitemap.
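A minimal robots.txt along those lines (the domain is a placeholder):

    User-agent: *
    Allow: /
    Sitemap: https://www.example.com/sitemap.xml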
For anything I said that you didn't understand, google the phrases in bold and you will find more detail on implementation.
I maintain an HTML page that contains a list of links to photo galleries. Over the past few years it has gone from a small page to a list containing hundreds of links. My fear is that this has hurt the SEO of the page as a whole, with spiders interpreting it as a link farm. Of course, I have no real way of knowing for sure, but I have started to suspect it.
Is there an efficient, simple way to deal with a large number of links that is still easy for the user to browse? While having hundreds of links stacked on top of one another may not be the best-looking method, it is easy to search since they are all in chronological order. I am looking for a way to keep the page simple without creating more of a maintenance nightmare for myself.
One idea I had was to use XML to store the links with some kind of dropdown, so that when a spider hits the page it would not see a mountain of links, just a reference to the XML.
Use a "pager" script to show, say 10 at a time. They are available in every web framework or you could quickly hack up your own.
... or how about this: put the links in separate file(s) (or otherwise store them outside the page, in a database, a flat file, etc.) and load them via an AJAX call as needed. Say, something like a 'Category A' button that, when clicked, loads those links into a div. That should keep them out of view for spiders (see the sketch below).
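A sketch of that idea using jQuery's load() (the fragment file name and the ids are assumptions):

    <button id="cat-a">Category A</button>
    <div id="link-target"></div>
    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
    <script>
      // Fetch a fragment of links only when the user asks for it;
      // crawlers that don't execute JavaScript never see them.
      $('#cat-a').on('click', function () {
        $('#link-target').load('links/category-a.html');
      });
    </script>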
Then there's this: http://www.robotstxt.org/meta.html and this: http://en.wikipedia.org/wiki/Nofollow
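Those two pages boil down to markup like this (the link target is a placeholder):

    <!-- Page-level: ask robots not to follow any links on this page -->
    <meta name="robots" content="nofollow">

    <!-- Link-level: mark an individual link as not endorsed -->
    <a href="http://example.com/galleries/2012" rel="nofollow">Gallery</a>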