I've been researching microformats for a while, hoping to style my site's information differently on Google's results page.
I found some details about microformats at these links:
http://microformats.org/wiki/hcard-authoring#The_Importance_of_Names
http://blog.teamtreehouse.com/add-microformats-magic-to-your-site
http://microformats.org/get-started
which produce a result like this:
Now I'm trying to find out whether I can use microformats to make Google show my site's information on the results page the way it does for Stack Overflow and other popular sites:
Is it even possible to do that?
Thanks in advance.
You can't force Google to show your website and subpages like the Stack Overflow example you posted. Your search term was stackoverflow, so the information displayed on the results page was far and away the most relevant, which is why it displays like that.
If someone searched for your website by name, you might get a result like that. You'll need to submit an XML sitemap to Google Webmaster Tools, give it time to index, and hopefully your website name will be unique enough.
I guess the main thing is that your website comes first on Google's results page for a given search term, and the sitemap shows Google what your other pages are.
With respect to microdata: it's really good for giving extra information to search engines. The CSS-Tricks one is a perfect example. You'd need a Google+ profile and, using the microdata, specify that profile as the author.
Again, Webmaster Tools has some great microdata validation tools. You can even load your page's source code, highlight the text you want to tag, and it'll show you exactly which tags to add and how. Link below:
https://www.google.com/webmasters/markup-helper/
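To make the authorship idea concrete, here is a minimal sketch of what that markup could look like. The author name and Google+ profile URL are placeholders, not real accounts; this assumes the rel="author" convention Google supported at the time, combined with schema.org microdata.

```html
<!-- Hypothetical authorship markup; the name and profile URL are placeholders -->
<div itemscope itemtype="http://schema.org/Person">
  Written by <span itemprop="name">Jane Doe</span>
</div>
<!-- Link the page to the author's Google+ profile -->
<a rel="author" href="https://plus.google.com/112345678901234567890">
  My Google+ profile
</a>
```

Google's markup helper linked above can verify that the search engine sees the author data the way you intended.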
This is a rephrasing of my original question https://stackoverflow.com/questions/14516983/google-sites-trying-to-script-announcements-page-on-steroids:
I've been looking into ways to make subpages of a parent page appear in a grid like "articles" on the home page of my Google Site — like on a Joomla home page and almost like a standard "Announcements" template, except:
The articles should appear in a configurable order, not chronologically (or alphabetically).
The first two articles should be displayed full-width and the ones beneath in two columns.
All articles will contain one or more images, and at least the first one should be displayed.
The timestamp and author of each subpage/article shouldn't be displayed.
At the moment I don't care if everything except the ordering is hardcoded, but ideally there should be a place to input prefs like the number of articles displayed, image size, snippet length, css styling etc.
My progress so far:
I tried using an iframe with an outside-hosted Javascript (using google.feeds.Feed) that pulls the RSS feed from the "Announcements" template, but I can't configure the order of the articles. One possibility would be to put a number at the beginning of every subpage title and parse it, but that will get messy over time, and the number would also be visible on the standalone article page. Or could the number be hidden with Javascript?
I tried making a spreadsheet with a row for each article, with columns "OrderId", "Title", "Content" and "Image", and processing and formatting the data with a Google Apps Script (using createHTML and createImage), but a) there doesn't seem to be a way to get a spreadsheet image to show up inside the web app, and b) these articles are not "real" pages that can be linked to easily from the menus.
This feature would be super-useful for lots of sites, and to me it just seems odd that it isn't a standard gadget (edit: or template). Ideas, anyone?
I don't know if this is helpful, but I wanted something similar and used the RSS XML announcements feed within a Google Gadget embedded in my site's page.
Example gadget / site:
http://hosting.gmodules.com/ig/gadgets/file/105840169337292240573/CBC_news_v3_1.xml
http://www.cambridgebridgeclub.org
It is badly written and messy, and I'm sure someone could do better than me, but it seems to work fairly reliably. The XML seems to have all the necessary data to be able to chop up articles, and I seem to remember it has image URLs as well, so you can play with them (although that's not implemented in my gadget).
Apologies if I am missing the point. I agree with your feature request - it would be great not to have to get so low-level to implement stuff like this in Sites.
How these sites gather all the data - questionhub, bigresource, thedevsea, developerbay?
Is this legal to show data in frame as bigresource do?
#amazed
EDITED : fixed some spelling issues 20110310
How these sites gather all data- questionhub, bigresource ...
Here's a very general sketch of what is probably happening in the background at a website like questionhub.com:
Spider program (google "spider program" to learn more)
a. Configured to start reading web pages at stackoverflow.com (for example).
b. Run the program so it goes to the home page of stackoverflow.com and starts visiting all links that it finds on those pages.
c. Returns HTML data from all of those pages.
Search index program
Reads the HTML data returned by the spider and creates a search index,
storing the words that it found AND the URLs those words were found at.
User interface web page
Provides a feature-rich user interface so you can search the sites that have been spidered.
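The spider-then-index pipeline above can be sketched in a few lines. This is a toy illustration, not a real crawler: the "web" is an in-memory dict of URL-to-HTML standing in for actual HTTP fetches, and link extraction uses a simple regex instead of a proper HTML parser.

```python
import re
from collections import defaultdict

# A fake "web": URL -> HTML, standing in for real HTTP responses.
PAGES = {
    "http://example.com/": '<a href="http://example.com/a">python tips</a>',
    "http://example.com/a": '<p>python and search indexing</p>',
}

def crawl(start, pages):
    """Visit the start page and every page it links to; return url -> html."""
    seen, todo, fetched = set(), [start], {}
    while todo:
        url = todo.pop()
        if url in seen or url not in pages:
            continue
        seen.add(url)
        fetched[url] = pages[url]
        # Follow every href found on this page.
        todo.extend(re.findall(r'href="([^"]+)"', pages[url]))
    return fetched

def build_index(fetched):
    """Map each word to the set of URLs where it appears."""
    index = defaultdict(set)
    for url, html in fetched.items():
        text = re.sub(r"<[^>]+>", " ", html)  # strip tags, keep the text
        for word in text.lower().split():
            index[word].add(url)
    return index

index = build_index(crawl("http://example.com/", PAGES))
print(sorted(index["python"]))
# -> ['http://example.com/', 'http://example.com/a']
```

A search UI then just looks words up in the index and renders the matching URLs (intersecting the sets for multi-word queries).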
Is this legal to show data in frame as bigresource do?
To be technical, "it all depends" ;-)
Normally, websites want to be visible in Google, so why not in other search engines too.
Just as Google displays part of the text that was found when a site was spidered, questionhub.com (or others) has chosen to show more of the text found on the original page, possibly keeping the formatting that was in the original HTML or changing the formatting to fit their standard visual styling.
A remote site can 'request' that spiders do NOT go through some or all of its web pages by adding a rule in a well-known file called robots.txt. Spiders do not have to honor the robots.txt, but a vigilant website will track the IP addresses of spiders that do not honor its robots.txt file and then block those IP addresses from looking at anything on the website. You can find plenty of information about robots.txt here on stackoverflow OR by running a query on google.
There are several industries (besides Google) built around what you are asking. There are tags on Stack Overflow for search-engine and search; read some of those questions and answers. Lucene/Solr are open-source search engine components. There is a companion open-source spider, but the name eludes me right now. Good luck.
I hope this helps.
P.S. As you appear to be a new user, if you get an answer that helps you, please remember to mark it as accepted, or give it a + (or -) as a useful answer. This goes for your other posts here too ;-)
Searching for Stack Overflow on Google, I get this result:
Inspecting the HTML source of the Stack Overflow front page, I can't find any reference to the
"A language-independent collaboratively edited question and answer site for programmers" text.
I thought it was something related to the meta description, but that turns out not to be the case.
Also, where are the Log In, Questions, Ask Question etc. declared? Sitemap.xml?
In a few words, could I obtain the same result simply by editing my web site's content, or is it something configured in some Google webmaster panel?
When the search engines try to determine which content they should use for the snippet showed in the search results, they take a look at two things:
Meta description tag
Content on the page
Because the goal of the search engines is to show a snippet that is as informative and useful for the searchers as possible, the search engines will first evaluate the meta description tag trying to determine how relevant this description is to the keywords (search terms) used by the searcher in his query. A very straightforward indication of relevance is the presence of any of those keywords in the meta description tag. If the meta description is not present or it is determined to be irrelevant to the query performed, the search engines will try to pull content from the page and use it as the snippet for the search results. This behavior is true for several search engines such as Bing (and by extension for Yahoo!, whose search results are powered by Bing), Ask.com, etc.
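As a concrete example, the meta description the paragraph above refers to is a single tag in the page's head; the wording here is a placeholder, not any real site's markup.

```html
<head>
  <title>Example Takeaways</title>
  <!-- Search engines may use this text as the result snippet
       when it is relevant to the user's query -->
  <meta name="description"
        content="Order takeaway food online from restaurants in your area.">
</head>
```

If this tag is missing or judged irrelevant to the query, the engines fall back to text extracted from the page body, as described above.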
Ok, let's now take a look at Google's search results for the query "stack overflow":
Stack Overflow
A language-independent collaboratively edited question and answer site for programmers.
stackoverflow.com
As we can see, the snippet includes the following text: "A language-independent collaboratively edited question and answer site for programmers." Surprisingly enough, this line of text appears in neither the meta description (which is actually missing from the page) nor the content of Stack Overflow's homepage. Where did that line of text come from, then?
Here's the answer: Google and AOL evaluate a third source of information to determine the search result snippet of a page: the DMOZ directory. DMOZ is an open content directory, which includes links to millions of websites, organized by category. Let's do a search for Stack Overflow on DMOZ:
Stack Overflow - A language-independent collaboratively edited question and answer site for programmers.
As we can see, Stack Overflow is listed on DMOZ, and its description is being used by Google and AOL to populate the snippet.
If we try the same search for Yahoo! (or Bing), this is what we obtain:
Stack Overflow
Stack Overflow. Questions; Tags; Users; Badges; Unanswered. Ask Question. Top Questions active 171 featured hot week month
Because the meta description tag is missing from Stack Overflow's homepage, and because Yahoo! doesn't use the DMOZ directory as an extra source of information, the only thing that Yahoo! has left is to pull content from the page, with poor results.
Instead of blaming Yahoo!, however, it is Stack Overflow's fault for not having included a meta description tag on their homepage, which would allow them more control over what gets displayed in the search results. Remember that the snippet has a strong influence on the Click-Through Rate (CTR), which is the percentage of searchers who clicked on Stack Overflow's link from the search results page. Common sense says that it's more likely for someone to click on a descriptive snippet than on one that reads "Stack Overflow. Questions; Tags; Users; Badges; Unanswered. Ask Question. Top Questions active 171 featured hot week month".
Finally, regarding the sitelinks, and as David Dorward mentioned, those links are automatically generated by Google and the only control that the webmaster has over them is to decide whether he wants to block them or not. There are a few factors that Google considers when determining if your website deserves to receive sitelinks: link structure/site architecture of your site, number of inbound links to your landing pages, anchor text of those links, traffic coming from Google's search results directly to your landing pages, etc.
The description is just content extracted from the page.
The links aren't under the control of authors.
Just provide semantic markup, as usual (i.e. make sure your navigation is expressed as lists of links, and so on), and hope that Google decides you are important enough to be worth adding those links.
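The "navigation as lists of links" advice above might look like this in practice; the page names are placeholders.

```html
<!-- Site navigation expressed as a plain list of links, so crawlers
     can recognize the site's main sections -->
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/questions">Questions</a></li>
  <li><a href="/tags">Tags</a></li>
  <li><a href="/users">Users</a></li>
</ul>
```

Google's sitelink selection remains automatic either way; clean markup only improves the odds.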
I have a food takeaway website where users can search restaurants by their area name. I want my website's LONDON search page to be listed when a user searches Google for TAKEAWAYS IN LONDON.
I think Google doesn't crawl websites with query strings. How can we achieve that?
Remember that Google just looks at the page you create, not how you create it.
Now this basically translates your question to "how do we make our dynamic pages visible to Google"? There are a number of tricks. Basically you build links from your homepage to specific other pages. You could have a "Top 5 searches" box, with links to "http://www.example.com/london/takeaways" etc. On the result pages, you can have links to similar queries, e.g. "http://www.example.com/london/finedining".
Basically, Google will see and follow those links, and remember the results per URL. The basic SEO rules still apply: good content, clear structure, etc. Don't bother Google or users with query strings; use URL rewriting if a query string is easier for you internally.
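The URL rewriting mentioned above could be done with a rule like the following, assuming an Apache server with mod_rewrite; the script name and parameter names are hypothetical.

```
# Internally map a clean, crawlable URL like /london/takeaways
# to the query-string handler the application actually uses
RewriteEngine On
RewriteRule ^([a-z-]+)/([a-z-]+)$ /search.php?area=$1&category=$2 [L]
```

Visitors and crawlers only ever see the clean URL; the query string stays server-side.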
Maybe you're supposed to have a sitemap, which could have a discoverable link to a page of yours whose URL is http://www.food.com/london and whose title and heading is 'TAKEAWAYS IN LONDON' (and whose contents you can retrieve dynamically).
When you go to Google and perform a search, it will return one of two types of results:
just the title of your web page, or
the title of your web page plus a list of subpages it found on that site
Here is an example of option #2: http://37assets.s3.amazonaws.com/svn/grub-ellis-googlelisting.png
My website on a google.com search only lists my web page title (option #1), how do I get google to list my subpages on the search results (option #2)?
Is it an HTML issue? How do I get Google to know what my subpages are so that it can also list them in a Google search?
Those are called "sitelinks" and are automated but you can partially configure them in Google's webmaster's tools. In webmaster's tools, click "sitelinks" in the navigation menu on the left. From the sitelinks page:
Sitelinks are links to a site's interior pages. Not all sites have sitelinks. Google generates these links automatically, but you can remove sitelinks you don't want.
Here is another Google page explaining sitelinks.
You should add a sitemap using the Google Webmaster Tools site, or by maintaining your own. For an explanation, check out the Sitelinks page.
Google has not generated any sitelinks for your site. Sitelinks are completely automated, and we show them only if we think they'll be useful to the user. If your site's structure doesn't allow our algorithms to find good sitelinks, or we don't think that the sitelinks are relevant to the user's query, we won't show them. However, we are always working to improve how we find and display sitelinks.
You can also directly enable sitelinks (you don’t have to get lucky) in Google’s Pay-Per-Click platform (AdWords), and it will have a similar very positive impact on your clickthrough rate.
You need to create an XML sitemap. Here is all you need to know. Check if your open-source CMS has a plugin/add-on/module to do this automatically; there are standalone generators out there too.
http://www.google.lv/search?q=XML+sitemap
http://en.wikipedia.org/wiki/Sitemaps
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184
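Following the sitemaps.org format described at the links above, a minimal sitemap looks like this; the URLs and dates are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/london/takeaways</loc>
  </url>
</urlset>
```

Save it as sitemap.xml at the site root and submit its URL in Google Webmaster Tools.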
You are describing "search engine optimization" with your question. If you have a small site, the best thing you can do is ensure every page has a unique title and links back to your home page, provide a good sitemap so search engines can easily discover ALL of your pages, and, most importantly, make your pages THE definitive place for information about whatever you're selling.
Content is king, and once you become the authority, your page will pop up in the first one or two results.
Contact some local SEO folks in your area and ask for a site evaluation. Many will do it for free with their automated tools. You can use the webmaster tools from Bing or Google if you're on a tight budget.