How to edit the Google description of your site? [closed] - html

I know that <meta name="Description" content="[description here]" /> can be used, but I wonder how to get a description like the one Facebook has.
Does this description use the <meta> tag as well? Or is there some other secret behind it?
Edit: I code my site myself (no WordPress and such) :)

I believe this is how it happens.
Google primarily displays multi link listings when they feel a query
has a strong chance of being navigational in nature. I think they can
determine that something is navigational in nature based on linkage
data and click streams. If the domain is well aligned with the term
that could be another signal to consider.
If you have 10,000 legit links for a term that nobody else has more than a few dozen external citations for, then odds are pretty good that your site is the official brand source for that term. I think overall relevancy, as primarily determined by link reputation, is the driving factor for whether or not they post mini site map links near your domain.
This site ranks for many terms, but for most of them I don't get the
multi link map love. For the exceptionally navigational type terms
(like seobook or seo book) I get multi links.
The mini site maps are query specific. For Aaron Wall I do not get the mini site map. Most people usually refer to the site by its domain name instead of my name.
Google may also include subdomains in their mini sitemaps. In some
cases they will list those subdomains as part of the mini site map and
also list them in the regular search results as additional results.
Michael Nguyen put together a post comparing the mini site maps to
Alexa traffic patterns. I think that the mini site maps may roughly
resemble traffic patterns, but I think the mini links may also be
associated with internal link structure.
For instance, I have a sitewide link to my sales letter page for which I use the word testimonials as the anchor text. Google lists a link to the sales letter page using the word testimonials.
When I got sued, the page referencing the lawsuit got tons and tons of links from many sources, which not only built up a ton of linkage data, but also sent tons of traffic to that specific page. That page was never listed in the Google mini site map, which would indicate that if they do place heavy emphasis on external traffic or external linkage data, they either smooth the data out over a significant period of time and/or place a heavy emphasis on internal linkage.
My old site used to also list the monthly archives on the right side
of each page, and the February 2004 category used to be one of the
mini site map links in Google.
You should also present the pages you want people to visit the most to search bots the most often. If you can get a few extra links
to some of your most important internal pages and use smart channeling
of internal linkage data then you should be able to help control which
pages Google picks as being the most appropriate matches for your mini
site map.
Sometimes exceptionally popular sites will get mini site map
navigational links for broad queries. SEO Chat had them for the term
SEO, but after they ticked off some of their lead moderators they
stopped being as active and stopped getting referenced as much. The
navigational links may ebb and flow like that on broad generic
queries. For your official brand term it may make sense to try to get
them, but for broad generic untargeted terms in competitive markets
the amount of effort necessary to try to get them will likely exceed
the opportunity cost for most webmasters.
Source.
Hope this helps.

It depends on the website's popularity.
Google does it, you don't.
Google may do it, but you can persuade them. Also check this out: sub-sitelinks in Google search results.
For starters, be sure you have a “sitemap.xml” file. This is a file
that tells the search engine about the pages on your site and makes
it easier for its spiders to crawl and understand it. Your
webmaster or website provider or Content Management System (like
WordPress) should have handled this for you, but it’s worth
checking. If you’re not a master of website technical stuff, your technical support person will be able to tell you if that file is there and properly set up.
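If you want to see what such a file contains, here is a minimal sketch of a sitemap.xml (the URLs and dates below are placeholders, not real pages):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/concierge-services.html</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>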
You should register your site with Google Webmaster Tools, if you
haven’t already. The exact process changes from time to time, but
basically, you’ll give Google the URL of your Sitemap file, which
you’ll have from the previous step. You’ll have to put a “Site
Verification Code” on your site to prove to them that you own the
site, and there are a few other simple steps.
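For illustration only, the verification step usually comes down to pasting a meta tag that Google generates for you into the <head> of your home page (the content value below is a placeholder; use the exact tag Google gives you):
<meta name="google-site-verification" content="PASTE-THE-TOKEN-GOOGLE-GIVES-YOU" />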
Whenever you link one page to another in your site, use anchor text and alt text that’s descriptive, as succinct as possible, and consistent. For example, you’ve linked to your “concierge services”
page from another page using the anchor text “concierge services.”
That’s perfect. Now, don’t link from another page using “guest
services.” You don’t want to be confusing the poor Google spider,
after all.
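As a small sketch of that advice (the URL below is hypothetical), every page that points to the concierge page should use the same descriptive anchor text:
<!-- Good: the same descriptive anchor text on every page that links to it -->
<a href="/concierge-services.html" title="Concierge services">concierge services</a>
<!-- Avoid: linking to that same page elsewhere as "guest services" -->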


How to make website searchable for different search engines [closed]

I want to create a website where people can create events related to education, entertainment, sports, regional matters, culture, etc. Every user will have their own page, which will be publicly available. Now I want to make sure that these events are searchable by all the search engines such as Google, Bing, etc., as soon as the event is created. How can I achieve this?
There is nothing special you need to do to make your website crawlable: as long as your text is in HTML, not generated exclusively by JavaScript, and can be found at a URL, search engines will find it.
That said, if you want to speed up the indexing process, you can programmatically ping Google for every new content entry you make.
E.g. to ping Google:
http://www.google.com/webmasters/sitemaps/ping?sitemap=URLOFSITEMAP.xml
To ping Bing:
http://www.bing.com/webmaster/ping.aspx?siteMap=http://www.yourdomain.com/sitemap.xml
Most search engines have a similar option for pinging. Keep in mind that Yahoo and Bing use the same technology, so you only need to ping Bing.
You also need to update your XML sitemap every time you create new content and then automatically ping it using the URL above. This will make sure that all your content is indexed as fast as possible.
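For example, assuming your sitemap lives at http://www.example.com/sitemap.xml (a made-up address), the ping is just a plain HTTP GET to:
http://www.google.com/webmasters/sitemaps/ping?sitemap=http://www.example.com/sitemap.xml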
Important SEO Techniques to follow:
#1 – Optimize your title tags.
#2 – Create compelling meta descriptions. Make sure that words used in the meta description are also used in the page content.
#3 – Utilize keyword-rich headings.
#4 – Add ALT text to your images.
#5 – Create a sitemap.
#6 – Build internal links between pages.
#7 – Update your site regularly.
#8 – Provide accessibility for all users.
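To make #1-#4 concrete, here is a minimal sketch of an optimized page (the hotel, titles, descriptions, and file names are made up for illustration):
<head>
  <!-- #1: a unique, descriptive title for this page -->
  <title>Concierge Services | Example Hotel</title>
  <!-- #2: a compelling meta description that reuses words from the page content -->
  <meta name="description" content="Concierge services at Example Hotel: tours, tickets and restaurant bookings." />
</head>
<body>
  <!-- #3: a keyword-rich heading -->
  <h1>Concierge Services at Example Hotel</h1>
  <!-- #4: descriptive alt text on images -->
  <img src="concierge-desk.jpg" alt="Concierge desk at Example Hotel" />
</body>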
High-powered link building campaigns always require that good old optimized content.
Optimize your head section within HTML: use a unique and relevant title and meta description on every page.
Use title attributes for <a> tags.
Use alt text for <img> tags.
Use proper H1, H2, ..., H6 tags.
Use keywords in your URLs and file names.
Use your keywords as anchor text when linking internally.
Social network marketing can also help; use press releases wisely.
Start a blog and participate in other related blogs.
Do keyword research at the start of the project. (A good free tool is Google’s AdWords Keyword Tool, which doesn’t show exact numbers, so you may want to use the paid Keyword Discovery or WordTracker instead.)
SEO isn’t a one-time event. Search engine algorithms change regularly, so the tactics that worked last year may not work this year. SEO requires a long-term outlook and commitment.
For more help, refer to these really nice articles:
12 steps for better SEO
21 Tips for SEO
Jump start with below links:
https://support.google.com/webmasters/answer/35291?hl=en
http://www.bing.com/toolbox/seo-analyzer
Good luck :)

Crawling data or using API

How do these sites gather all the data - questionhub, bigresource, thedevsea, developerbay?
Is it legal to show data in a frame as bigresource does?
#amazed
EDITED : fixed some spelling issues 20110310
How do these sites gather all data - questionhub, bigresource ...
Here's a very general sketch of what is probably happening in the background at a website like questionhub.com:
Spider program (google "spider program" to learn more)
a. Configured to start reading web pages at stackoverflow.com (for example).
b. Run so that it goes to the home page of stackoverflow.com and starts visiting all links that it finds on those pages.
c. Returns the HTML data from all of those pages.
Search Index Program
Reads the HTML data returned by the spider and creates a search index, storing the words that it found AND what URL those words were found at.
User Interface web page
Provides a feature-rich user interface so you can search the sites that have been spidered.
Is it legal to show data in a frame as bigresource does?
To be technical, "it all depends" ;-)
Normally, websites want to be visible in Google, so why not in other search engines too?
Just as Google displays part of the text that was found when a site was spidered, questionhub.com (or others) has chosen to show more of the text found on the original page, possibly keeping the formatting that was in the original HTML OR changing the formatting to fit their standard visual styling.
A remote site can 'request' that spiders do NOT go through some or all of their web pages by adding a rule in a well-known file called robots.txt. Spiders do not have to honor the robots.txt, but a vigilant website will track the IP addresses of spiders that do not honor their robots.txt file and then block those IP addresses from looking at anything on their website. You can find plenty of information about robots.txt here on Stack Overflow OR by running a query on Google.
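For reference, a minimal robots.txt lives at the root of the site (e.g. http://www.example.com/robots.txt); the disallowed path below is just an example:
# Applies to all spiders
User-agent: *
# Ask them not to crawl anything under /private/
Disallow: /private/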
There are several industries (besides Google) built around what you are asking. There are tags on Stack Overflow for search-engine and search; read some of those questions/answers. Lucene/Solr are open-source search engine components. There is a companion open-source spider, but the name eludes me right now. Good luck.
I hope this helps.
P.S. As you appear to be a new user, if you get an answer that helps you, please remember to mark it as accepted, or give it a + (or -) as a useful answer. This goes for your other posts here too ;-)

Google Semantic results question

http://www.google.co.uk/search?q=mark+zuckerberg+crunchbase
Guys, check out that search, in particular the first result's URL: Crunchbase.com > People, where the People link points to the /people section of the site.
How are they achieving it? I know Google's algorithm is intelligent and looks for links and then makes these assumptions itself in some cases, but is there any particular markup they are using to help Google make these connections?
Google is light with details, but here's what they said in their announcement.
The information in these new hierarchies comes from analyzing destination web pages. For example, if you visit the ProductWiki Spidersapien page, you'll see a series of similar links at the top, "Home> Toys & Games> Robots." These are standard navigational tools used throughout the web called "breadcrumbs," which webmasters frequently show on their sites to help users navigate. By analyzing site breadcrumbs, we've been able to improve the search snippet for a small percentage of search results, and we hope to expand in the future.
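To illustrate, a breadcrumb trail like the one described is often nothing more than a row of ordinary links near the top of the destination page (the paths below are hypothetical), which Google can then analyze:
<div class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/toys-games/">Toys &amp; Games</a> &gt;
  <a href="/toys-games/robots/">Robots</a>
</div>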

HTML: How to get my subpages listed on a google search

When you go to Google and perform a search, it will return one of two types of results:
just the title of your web page, or
the title of your web page plus a list of subpages it found on that website
Here is an example of option #2: http://37assets.s3.amazonaws.com/svn/grub-ellis-googlelisting.png
My website on a google.com search only shows my web page title (option #1); how do I get Google to list my subpages in the search results (option #2)?
Is it an HTML issue? How do I get Google to know what my subpages are so that it can also list those in a Google search?
Those are called "sitelinks" and are automated, but you can partially configure them in Google's Webmaster Tools. In Webmaster Tools, click "sitelinks" in the navigation menu on the left. From the sitelinks page:
Sitelinks are links to a site's interior pages. Not all sites have sitelinks. Google generates these links automatically, but you can remove sitelinks you don't want.
Here is another Google page explaining sitelinks.
You should add a sitemap using the Google Webmaster Tools site, or by maintaining your own. For an explanation, check out the Sitelinks page.
Google has not generated any sitelinks for your site. Sitelinks are completely automated, and we show them only if we think they'll be useful to the user. If your site's structure doesn't allow our algorithms to find good sitelinks, or we don't think that the sitelinks are relevant to the user's query, we won't show them. However, we are always working to improve how we find and display sitelinks.
You can also directly enable sitelinks (you don’t have to get lucky) in Google’s Pay-Per-Click platform (AdWords), and it will have a similar very positive impact on your clickthrough rate.
You need to create an XML sitemap. Here is all you need to know. Check if your open-source CMS has a plugin/add-on/module to do this automatically; there must be standalone generators out there too.
http://www.google.lv/search?q=XML+sitemap
http://en.wikipedia.org/wiki/Sitemaps
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184
You are describing "Search Engine Optimization" with your question. If you have a small site, the best thing you can do is ensure that every page has a unique title and links back to your home page, that you have a good "site map" so search engines can easily discover ALL of your pages, and, most important, that your pages are THE definitive place for information about whatever you're selling.
Content is king, and once you become the authority, your page will pop up in the first 1-2 results.
Contact some local SEO folks in your area and ask for a site evaluation. Many will do it for free with their automated tools. You can use the webmaster tools from Bing or Google if you're on a tight budget.

What should a main page of a web application be? [closed]

When designing a web application, how do you design the main page? By this I mean the page that is displayed to a user after entering the base URL, like http://www.foo.com.
It would probably depend on the website, but...
stackoverflow welcomes us with a list of questions, no silly "what is stackoverflow" landing page,
last.fm presents a kind of dashboard, very popular lately, a kind of personalized landing page for registered users,
google welcomes us with a search box, but iGoogle is a completely different story - it looks different for everyone (well, and that's the point actually).
The other thing is, if the user is logged in (provided the website supports logging in), should we present them with different content than some new, random incomer? And I don't mean some personalized content, but something completely different, like their user profile instead of the main page?
From one perspective it could be good - registered users usually know our site, and get a kind of special greeting as soon as they come back. On the other hand, this could cause problems - when I show the website to a friend, and he then goes there from his computer, he sees something totally different.
Another thing is, when I show http://www.foo.com to a friend and it takes me directly to my user profile / dashboard - this isn't always what I'd like to show everyone, as it might expose some of my personal data, etc.
What do you do when you design your web applications? What's best, in your opinion, from the user's point of view? Do my concerns about the website looking different for registered and unregistered users make any sense? (Again, I don't mean small differences like hiding a huge "register now" link, but showing a completely different view.)
It really depends on the focus of your application, but if you were to generalise I would say determine the one or two most critical paths in your application and focus on those.
Registration is probably what you want to drive more than anything else, so make it clear how users can sign up and get involved.
Make it easy for existing users to sign in.
Consider the amount of text you have on your front page and pare it down as much as possible. Keep the messages and information you convey here as succinct as possible.
Provide some content immediately showing what your application or site provides. Don't make users follow a link to access the core functionality of your site; e.g. if you're building an auction site, ensure there are listings on the front page.
Consider your audience. If your site is non-technical, the fewer UI elements you present the better. Portal like sites, with lots of compartmentalised functionality and information can be confusing and overwhelming for many non-technical users.
Make it clear how users can get Help if they require it
Without knowing the business area of your site then it's going to be tricky to answer this, but...
You should get the user into the main flow of your website as soon as possible, and the home page is the best place to do this.
If you're an online store, start showing your products.
If you're a search engine, give the user the ability to search.
If you're a blog/news site, show the user the latest news.
Yes - make the experience for a logged on/registered user better (show them THEIR news, show them their recommended products, etc.), but the purpose of your site should be obvious and accessible from that home page. Get your existing users into their flow as soon as possible, and attract new users to your site by showing them the meat of your site.
There are plenty of places out there that discuss good web design, making your site "sticky" etc. Check out SmashingMagazine.com (it's one such site) but there are plenty of others.
Oh, and remember that there's one very important user of your home page that you need to accommodate - search engines. Make their life easy, make the content discoverable and indexable, and drive people to your site via search.
What I've found works best for me is to "role-play" the end-user's experience.
When they initially hit your site, what do they most want to see, or in other words, what are they most likely to be looking for and wanting to do?
I work on many intranet websites for a very large company, and what I've learned is that a home page with detailed information about the site and what it does is useless; consequently, my end-users just skip over it in order to get to the pages that they really need. So, my strategy usually consists of a home page that allows them to get straight down to business and whatever they're there to do.
BUT, that's just for the sites that I create. I think it totally depends on your target market and what they're wanting to do.
For the most part, a visitor landing on your page will already know the gist of what your application is about, so there shouldn't be a need to explain in detail what it is you do. Instead, show them that you have the information they are looking for. Screenshots and screencasts are becoming popular these days as a means of getting this across to the short-attention-spanned user.
For registered users, I'd recommend taking them directly to the primary application page instead of the homepage (unless the homepage is the primary application page). For many apps this is a Dashboard (Flickr, Basecamp, Campaign Monitor). If your app's main focus is the homepage, you may want to show them a personalized version of that page (think Google vs. iGoogle).
With all this said, it really does depend on what you are building. Every application is different and there's no right way to do it - only conventions that work for most.
I would start by looking at the types of tasks that can be performed inside your web app: what's important? What's important when they are a new user? What's important when they are a repeat user? What's important when they haven't even registered yet?
Although all of these things happen on the same page, it's likely that you'll need to define different states, e.g. if a user is on the homepage and not logged in, should we prompt them to log in or register?
Perhaps also look at Personas so you can figure out exactly who will be using the app and what is relevant to them.
It should be whatever makes sense for the application, and this should be verified by testing the application with a group of expected users.
The main page should provide a first-time user with enough visual and/or written information to understand what the application is about. They should have some idea as to what actions they can take to interact with the app and what the outcomes of these actions could be.
I know people hate this answer on Stack Overflow, but there's only one way to find out what the most appropriate thing for your users is - you need to brainstorm ideas with potential users, or at the very least ask them.
I'm not suggesting that you do a focus group, or put up a flawed poll (neither of those things works). Rather, I'm suggesting that you go out and talk to people who will potentially be among your target users and do planning games with them (like card sorting), or go out and do some user testing with paper prototyping.
Anything else is guessing.