Searching for Stack Overflow on Google, I get this result:
Inspecting the HTML source of the Stack Overflow front page, I can't find any reference to the
"A language-independent collaboratively edited question and answer site for programmers" text.
I thought it was something related to the meta description, but that's not the case.
Also, where are the Log In, Questions, Ask Question, etc. links declared? Sitemap.xml?
In short: could I obtain the same result simply by editing my website's content, or is it something configured in some Google webmaster panel?
When search engines try to determine which content to use for the snippet shown in the search results, they look at two things:
Meta description tag
Content on the page
Because the goal of the search engines is to show a snippet that is as informative and useful for searchers as possible, they will first evaluate the meta description tag, trying to determine how relevant this description is to the keywords (search terms) used by the searcher in their query. A very straightforward indication of relevance is the presence of any of those keywords in the meta description tag. If the meta description is not present, or it is determined to be irrelevant to the query performed, the search engines will try to pull content from the page and use it as the snippet for the search results. This behavior holds for several search engines, such as Bing (and by extension Yahoo!, whose search results are powered by Bing), Ask.com, etc.
Ok, let's now take a look at Google's search results for the query "stack overflow":
Stack Overflow
A language-independent collaboratively edited question and answer site for programmers.
stackoverflow.com
As we can see, the snippet includes the following text: "A language-independent collaboratively edited question and answer site for programmers." Surprisingly enough, this line of text appears neither in the meta description (which is actually missing from the page) nor in the content of Stack Overflow's homepage. Where did that line of text come from, then?
Here's the answer: Google and AOL evaluate a third source of information to determine the search result snippet of a page: the DMOZ directory. DMOZ is an open content directory, which includes links to millions of websites, organized by category. Let's do a search for Stack Overflow on DMOZ:
Stack Overflow - A language-independent collaboratively edited question and answer site for programmers.
As we can see, Stack Overflow is listed on DMOZ, and its description is being used by Google and AOL to populate the snippet.
If we try the same search for Yahoo! (or Bing), this is what we obtain:
Stack Overflow
Stack Overflow. Questions; Tags; Users; Badges; Unanswered. Ask Question. Top Questions active 171 featured hot week month
Because the meta description tag is missing from Stack Overflow's homepage, and because Yahoo! doesn't use the DMOZ directory as an extra source of information, the only thing that Yahoo! has left is to pull content from the page, with poor results.
Instead of blaming Yahoo!, however, Stack Overflow is at fault for not including a meta description tag on their homepage, which would give them more control over what gets displayed in the search results. Remember that the snippet has a strong influence on the Click-Through Rate (CTR), the percentage of searchers who click on Stack Overflow's link from the search results page. Common sense says that it's more likely for someone to click on a descriptive snippet than on a snippet that reads "Stack Overflow. Questions; Tags; Users; Badges; Unanswered. Ask Question. Top Questions active 171 featured hot week month".
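As a minimal sketch, this is the kind of tag Stack Overflow could have added to the <head> of its homepage (the wording is borrowed from the DMOZ description above, so treat it as illustrative):

```html
<head>
  <!-- Hypothetical meta description; search engines may use it as the snippet -->
  <meta name="description"
        content="A language-independent collaboratively edited question and answer site for programmers.">
</head>
```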
Finally, regarding the sitelinks: as David Dorward mentioned, those links are automatically generated by Google, and the only control the webmaster has over them is deciding whether to block them or not. There are a few factors that Google considers when determining whether your website deserves sitelinks: the link structure/site architecture of your site, the number of inbound links to your landing pages, the anchor text of those links, traffic coming from Google's search results directly to your landing pages, etc.
The description is just content extracted from the page.
The links aren't under the control of authors.
Just provide semantic markup as usual (i.e. make sure your navigation is expressed as lists of links, and so on), and hope that Google decides you are important enough to be worth adding those links.
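For instance, a minimal sketch of navigation expressed as a list of links (the labels are taken from Stack Overflow's own navigation; the URLs are assumptions):

```html
<nav>
  <ul>
    <!-- Each major section exposed as a plain link that crawlers can follow -->
    <li><a href="/questions">Questions</a></li>
    <li><a href="/tags">Tags</a></li>
    <li><a href="/users">Users</a></li>
    <li><a href="/questions/ask">Ask Question</a></li>
  </ul>
</nav>
```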
I published my first website and I'm still trying to solve its problems. I have two main questions about it:
I published my website and it suddenly appears on Google, but not in the way I want. For example, my site is www.mysite.com, but on Google www.mysite.com/contact.html or www.mysite.com/blog.html appear before the root URL.
I would like my website to be displayed the way Envato's is. (Search for envato on Google and look at the first result: the main link is on top, with sub-links below it.) How can I achieve this?
There is no "right" way to appear on Google. Google decides in which order it presents your pages according to what it thinks is best for its users; you have no direct control over this. However, you can influence it by creating more backlinks to your preferred URLs and by focusing on their content, for example by making the home page more valuable than the other pages.
This is an SEO-related question. Next time, it should be asked on Pro Webmasters.
Google will rank what they deem the most relevant result the highest. If they're ranking your contact or blog pages higher, chances are there's not much useful content on the home page.
You can't affect this. Google does it based on an algorithm, and only for sites where they deem it to be sufficiently useful.
Optimise your home page for the keywords/phrases for which the related pages are showing. You cannot control what Google deems relevant as per their algorithm.
Re-optimise your inner pages (the unwanted links) for related phrases. Building a sitemap with a proper hierarchy also helps.
Also, are your inner pages within sub-folders, or at the top level alongside your index.html? Defining a proper "folder-drill-down" breadcrumb also helps search engines understand page hierarchy and display it accordingly; a sketch follows below.
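As a sketch, a "folder-drill-down" breadcrumb marked up with schema.org microdata (the paths and labels are hypothetical):

```html
<ol itemscope itemtype="https://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="https://www.mysite.com/blog/">
      <span itemprop="name">Blog</span></a>
    <meta itemprop="position" content="1">
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="https://www.mysite.com/blog/first-post.html">
      <span itemprop="name">First post</span></a>
    <meta itemprop="position" content="2">
  </li>
</ol>
```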
I've been researching microformats for a while, to style my site's information differently on Google's results page.
I found some detail about microformat in this links:
http://microformats.org/wiki/hcard-authoring#The_Importance_of_Names
http://blog.teamtreehouse.com/add-microformats-magic-to-your-site
http://microformats.org/get-started
which produce a result like this:
Now, I'm trying to find out whether I can use microformats to force Google to show my site's information on the results page, just like it does for Stack Overflow and other very popular sites:
Is that even possible?
Thanks in advance...
You can't force Google to show your website and sub-pages like the Stack Overflow example you posted. Your search term was stackoverflow, so the information displayed on the results page was far and away the most relevant; hence it displays like that.
If someone searched for your website by name, you might get a result like that. You'll need to submit an XML sitemap to Google Webmaster Tools, give it time to index, and hopefully your website name will be unique enough.
I guess the main thing is that your website is first on Google's results page for a given search term and the sitemap shows Google what your other pages are.
With respect to microdata: it's really good for giving extra information to search engines. The CSS-Tricks one is a perfect example. You'd need a Google+ profile and, using the microdata, specify that profile as the author.
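A minimal sketch of that authorship markup, assuming a hypothetical Google+ profile URL and author name (note that Google+ authorship has since been retired):

```html
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Post title</h1>
  <span itemprop="author" itemscope itemtype="https://schema.org/Person">
    <!-- rel="author" points at the author's profile (URL hypothetical) -->
    <a itemprop="url" rel="author" href="https://plus.google.com/1234567890">
      <span itemprop="name">Jane Doe</span>
    </a>
  </span>
</article>
```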
Again, Webmaster Tools has some great microdata validation tools. You can even load up your page's source code, highlight the text you want to tag, and it'll show you exactly which tags to add and how. Link below:
https://www.google.com/webmasters/markup-helper/
I want to create a website where people can create events related to education, entertainment, sports, regional and cultural topics, etc. Every user will have their own page, which will be publicly available. Now I want to make sure that these events are searchable by all the search engines, such as Google, Bing, etc., as soon as an event is created. How can I achieve this?
There is nothing special you need to do to make your website crawlable: as long as your text is in HTML rather than generated exclusively by JavaScript, and it can be reached at a URL, search engines will find it.
That said, if you want to speed up the indexing process, you can programmatically ping Google for every new content entry you make.
E.g. to ping Google:
http://www.google.com/webmasters/sitemaps/ping?sitemap=URLOFSITEMAP.xml
To ping Bing:
http://www.bing.com/webmaster/ping.aspx?siteMap=http://www.yourdomain.com/sitemap.xml
Most search engines have a similar pinging option. Keep in mind that Yahoo! and Bing use the same technology, so you only need to ping Bing.
You also need to update your XML sitemap every time you create new content, and then automatically ping it using the URLs above. This will make sure that all your content is indexed as fast as possible.
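For reference, a minimal sitemap.xml sketch following the standard sitemap protocol (the event URL below is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Add one <url> entry per event page as it is created -->
  <url>
    <loc>http://www.yourdomain.com/events/education/python-workshop</loc>
    <lastmod>2014-06-01</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```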
Important SEO techniques to follow:
#1 – Optimize your title tags (see the sketch after this list).
#2 – Create compelling meta descriptions. Make sure that words used in the meta description are reused in the page content.
#3 – Utilize keyword-rich headings.
#4 – Add alt attributes to your images.
#5 – Create a sitemap.
#6 – Build internal links between pages.
#7 – Update your site regularly.
#8 – Provide accessibility for all users.
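A minimal sketch of #1 and #2 in a page's <head> (the site name and wording are hypothetical):

```html
<head>
  <title>Education, Sports and Cultural Events Near You | ExampleEvents</title>
  <!-- Reuse the description's key phrases in the visible page content too -->
  <meta name="description"
        content="Create and discover education, sports and cultural events in your region.">
</head>
```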
High-powered link building campaigns always require that good old optimized content:
Optimize your head section within the HTML; use a unique and relevant title and meta description on every page.
Use title attributes for a tags.
Use alt text for img tags.
Use proper h1, h2, ..., h6 tags.
Use keywords in your URLs and file names.
Use your keywords as anchor text when linking internally (see the sketch after this list).
Social network marketing can also help; use press releases wisely.
Start a blog and participate in other related blogs.
Do keyword research at the start of the project. (A good free tool is Google's AdWords Keyword Tool, which doesn't show exact numbers, so you may use the paid versions of Keyword Discovery or WordTracker.)
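A sketch of the on-page basics from the list above (the headings, paths, and keywords are hypothetical):

```html
<h1>Live Music Events in Austin</h1>
<h2>This Weekend's Concerts</h2>
<!-- Descriptive alt text and a keyword-bearing file name -->
<img src="/images/austin-concert.jpg" alt="Outdoor concert in Austin">
<!-- Keyword anchor text and a title attribute on an internal link -->
<a href="/events/austin/" title="Austin events calendar">Austin events</a>
```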
SEO isn’t a one-time event. Search engine algorithms change regularly, so the tactics that worked last year may not work this year. SEO requires a long-term outlook and commitment.
For more help, these are really nice articles:
12 steps for better SEO
21 Tips for SEO
Jump-start with the links below:
https://support.google.com/webmasters/answer/35291?hl=en
http://www.bing.com/toolbox/seo-analyzer
Good luck :)
I know that <meta name="Description" content="[description here]" /> can be used, but I wonder how to get a description like the one Facebook has.
Does this description use the <meta> tag as well, or is there some other secret behind it?
Edit: I code my site myself (no WordPress and the like) :)
I believe this is how it happens.
Google primarily displays multi-link listings when they feel a query has a strong chance of being navigational in nature. I think they can determine that something is navigational in nature based on linkage data and click streams. If the domain is well aligned with the term, that could be another signal to consider.
If you have 10,000 legitimate links for a term that nobody else has more than a few dozen external citations for, then odds are pretty good that your site is the official brand source for that term. I think overall relevancy, as primarily determined by link reputation, is the driving factor for whether or not they post mini site map links near your domain.
This site ranks for many terms, but for most of them I don't get the multi-link map love. For the exceptionally navigational type terms (like seobook or seo book) I get multi links.
The mini site maps are query specific. For Aaron Wall I do not get the mini site map. Most people usually refer to the site by its domain name instead of my name.
Google may also include subdomains in their mini sitemaps. In some cases they will list those subdomains as part of the mini site map and also list them in the regular search results as additional results.
Michael Nguyen put together a post comparing the mini site maps to Alexa traffic patterns. I think that the mini site maps may roughly resemble traffic patterns, but I think the mini links may also be associated with internal link structure.
For instance, I have a sitewide link to my sales letter page which uses the word testimonials as the anchor text. Google lists a link to the sales letter page using the word testimonials.
When I got sued, the page referencing the lawsuit got tons and tons of links from many sources, which not only built up a ton of linkage data, but also sent tons of traffic to that specific page. That page was never listed in the Google mini site map, which would indicate that if they place heavy emphasis on external traffic or external linkage data, either they try to smooth the data out over a significant period of time and/or they have a heavy emphasis on internal linkage.
My old site also used to list the monthly archives on the right side of each page, and the February 2004 category used to be one of the mini site map links in Google.
You should present the pages you want people to visit the most to search bots the most often as well. If you can get a few extra links to some of your most important internal pages and use smart channeling of internal linkage data, then you should be able to help control which pages Google picks as being the most appropriate matches for your mini site map.
Sometimes exceptionally popular sites will get mini site map navigational links for broad queries. SEO Chat had them for the term SEO, but after they ticked off some of their lead moderators, they stopped being as active and stopped getting referenced as much. The navigational links may ebb and flow like that on broad generic queries. For your official brand term it may make sense to try to get them, but for broad generic untargeted terms in competitive markets, the amount of effort necessary to try to get them will likely exceed the opportunity cost for most webmasters.
Source.
Hope this helps.
It depends on the website's popularity.
Google does it, you don't.
Google may do it, but you can persuade them. Also check this out: sub-sitelinks in Google search results.
For starters, be sure you have a “sitemap.xml” file. This is a file that tells the search engine about the pages on your site and makes it easier for its spiders to crawl and understand it. Your webmaster, website provider, or Content Management System (like WordPress) should have handled this for you, but it's worth checking. If you're not a master of website technical stuff, whoever is your technical support person will be able to tell you if that page is there and properly set up.
You should register your site with Google Webmaster Tools, if you haven't already. The exact process changes from time to time, but basically, you'll give Google the URL of your sitemap file, which you'll have from the previous step. You'll have to put a “Site Verification Code” on your site to prove to them that you own the site, and there are a few other simple steps.
Whenever you link one page to another in your site, use anchor text and alt text that is descriptive, as succinct as possible, and consistent. For example, you've linked to your “concierge services” page from another page using the anchor text “concierge services.” That's perfect. Now, don't link from another page using “guest services.” You don't want to confuse the poor Google spider, after all.
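A tiny sketch of that consistency (the URL is hypothetical):

```html
<!-- Always link to the page with the same anchor text... -->
<a href="/concierge-services.html">concierge services</a>
<!-- ...and avoid variants like "guest services" for the same destination. -->
```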
Why do sites like Stack Overflow, TechCrunch, Smashing Magazine, etc. make the page titles (i.e. the text at the top of the page) clickable URLs that link back to the same page the user is on?
Some examples:
I believe that this does not affect SEO, as search engines ignore internal links.
Is it for usability purposes?
It allows you to right-click on it and choose Copy link location (or equivalent) so that you can easily paste it into an email, for example. This takes less time than copying it from the location bar, and some people run their browser without a visible location bar to save precious screen space.
More than anything, it provides a link to the default state of the page.
For example, for this very Stack Overflow page, a user can get here through any of the following non-default links:
Why are Page Titles on some websites (including Stack Overflow) Clickable URLs?
https://stackoverflow.com/questions/904381#foobar
https://stackoverflow.com/questions/904381?sort=date
While the default link is actually:
Why are Page Titles on some websites (including Stack Overflow) Clickable URLs?
If users are unable to get to the default state, they end up bookmarking or emailing the non-default link which propagates to new users and the problem just multiplies.
Clicking on the title link of the post will restore the default state, stripping off any query parameters (?sort=date) and named anchors (#foobar) and fixing the story slug (/why-are-page-titles/...).
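As a sketch, such a title is simply a heading whose text links to the page's own default URL (the slug here is shortened and hypothetical):

```html
<h1>
  <!-- The self-link restores the default, parameter-free URL -->
  <a href="https://stackoverflow.com/questions/904381/why-are-page-titles-clickable-urls">
    Why are Page Titles on some websites Clickable URLs?
  </a>
</h1>
```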
I use it to refresh the page (yes, I could press F5 too).
Yes, Jakob Nielsen has stated that linking to yourself is a web design mistake (no. 10), and I agree.
More reading here (no. 10).
The URL redirects to the beginning of the page, in case you arrived on the page via a specific answer (all answers are also clickable URLs). This way, you get the URL of the question, not of an answer.
Not sure if this is why they did it, but I find it useful for siphoning off tabs:
If I look at something briefly and think "I'd like to read this thoroughly in a minute, but continue with what I was doing before", I can right-click the link, choose "Open in a new tab", then click "Back" and continue nicely.
It's called a permalink... the name implies what it is: a permanent link.
It's the same reason that each answer on SO has a link you can copy.
I think it inherits the behavior from a CMS, where each question is a node with zero or more answers. Now suppose you search for Apache questions.
The results are displayed one after another.
In CMS terms this is called a teaser: you get a full page with lots of questions, where each question's title links to the full article (question + answers).
It's not a must, but you'll find it on most sites that use a CMS.
As long as it does not harm anyone, why would people be against it?
I prefer to have those links available, as hitting refresh reloads all elements of the page, whereas following the direct link (to the same page) uses cached elements.
Makes sense to me; I find it useful! I have a lot of tabs open, so I just right-click the link and go back.
To me this makes perfect sense, and from an SEO view this is also good! It encourages crawlers to re-read the page because it's linked.
UX-wise, clickable titles which don't bring the user anywhere may seem unusable, though that leads us into the realm of Affordance Theory and whether or not the affordance is perceptible to users.
For example, clickable page titles may provide:
A simple method for bookmarking a page to the desktop from a browser window.
A context menu with additional choices allowing users to share a blog post or article.
A method for updating the location bar so it's pointing at the canonical URL of the page (see the sketch below).
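A sketch of pairing the title self-link with an explicit canonical URL declaration (the URL is hypothetical):

```html
<head>
  <!-- Declares the canonical URL that the clickable title should also point to -->
  <link rel="canonical" href="https://example.com/articles/my-post/">
</head>
```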
For the sites you mentioned, however, it seems more likely the page titles were turned into hyperlinks using absolute URLs so analytics tooling could pick up inbound link clicks (those sending the referer info), resulting in DMCA takedown notices when people copied work and didn't update the URLs.
You'd be surprised what people do when they're being incentivized to produce work contractually.