How Does Google Chrome know to use HNSearch to search Hacker News? - google-chrome

Hacker News' URL is news.ycombinator.com. When I type the full URL into Chrome, the rightmost part of the URL bar shows the text "Press to search HNSearch". HNSearch is a separate site, located at hnsearch.com, which indexes and searches Hacker News. There is nothing in the metadata of Hacker News to indicate that HNSearch is its search engine.
So my question is: how does Google Chrome know to use HNSearch to search Hacker News?

Chrome is likely asking Google which search forms are available for the site; the search form it finds points at hnsearch.com.
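For reference, the standard way a site advertises its search form to browsers is an OpenSearch description document linked from the page's head. A hedged sketch of what such a descriptor for HNSearch might look like (the element names come from the OpenSearch 1.1 spec; the exact URL template here is an assumption, not taken from hnsearch.com):

```xml
<!-- A page would reference this file with:
     <link rel="search" type="application/opensearchdescription+xml"
           href="/opensearch.xml" title="HNSearch"> -->
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>HNSearch</ShortName>
  <Description>Search Hacker News</Description>
  <!-- {searchTerms} is replaced with the user's query; this template is a guess -->
  <Url type="text/html" template="http://www.hnsearch.com/search?q={searchTerms}"/>
</OpenSearchDescription>
```

Chrome's tab-to-search feature can pick up descriptors like this, which would explain the prompt even when Hacker News itself carries no such metadata.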

Related

How can I tell Google that I have removed the .html from my URLs?

Hi, I have recently removed the '.html' from the end of my URLs to make them look more professional, which was brilliant. However, the old URLs including '.html' still appear when I see my site on Google, which, as expected, sends people to an error page. How can I tell Google that I have new URL addresses so that people can visit my site again?
Thanks!
The best way to remove .html extensions is with a rewrite rule in your .htaccess file. That way search engines will "understand" it, but you will not see the change in the search results immediately, since the search engine crawler will take some time to update.
Also make sure to submit your URL to Google. If you use Google Webmaster Tools you will be able to follow this process and the status of your website more clearly.
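A minimal .htaccess sketch of the kind of rule being described, assuming Apache with mod_rewrite enabled (the file layout, i.e. that each clean URL maps to a same-named .html file, is an assumption):

```apache
RewriteEngine On

# Permanently redirect old /page.html requests to the extensionless URL,
# so Google learns the new address (a 301 tells crawlers the move is permanent)
RewriteCond %{THE_REQUEST} \s/([^.\s?]+)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# Internally serve the .html file when the extensionless URL is requested
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]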

Google bot can't fetch some text from my blog

When I view my website from a spider's point of view (http://www.feedthebot.com/tools/spider/index.php), the word count is very low and it is not fetching my links as text.
My blog address is zemtv .com.
When I perform the same check on my other site, dramasonline .com, it does fetch the links as text.
Please suggest what I should do.
It is probably because the links on your first website contain the attribute rel="bookmark", while those on the second don't.
I would also check the website using the "Fetch as Google" tool that is available in Webmaster Tools.
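For illustration, rel="bookmark" is a standard HTML link type that many blog themes put on permalinks. A hypothetical example of the kind of markup being described (the URL and title are placeholders):

```html
<!-- A permalink as many blog themes emit it; rel="bookmark" marks it as a permalink -->
<a href="http://example.com/2012/05/some-post/" rel="bookmark" title="Some Post">Some Post</a>
```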

GOOGLE: how to prevent subpages from appearing in results

I have a fairly new website which allows people to create their own profiles and such. The issue is that when someone links to their profile from their website/blog, their profile shows up in Google searches for my website - and to date the one person who has done this has an NSFW profile. This means that when you search for my site on Google, one of the top results is an NSFW page.
How do I prevent Google from listing subpages in the results? Would robots.txt solve this? And if a page is already listed, will adding a robots.txt entry disallowing access to profile pages in general end up removing it from the results?
robots.txt will solve it to some extent, but if there are direct external links to a page, I have found that Google still indexes it.
Go to http://webmaster.google.com, claim your website, and then use their URL removal tool.
Yes, see http://www.robotstxt.org/. Just list entries like "Disallow: /profile/" and Google will stop indexing those pages and, after a time, remove them.
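A robots.txt sketch along those lines (the /profile/ path prefix is an assumption about how the profile URLs are structured):

```
# Placed at http://yoursite.com/robots.txt
User-agent: *
Disallow: /profile/
```

Note that, as mentioned above, robots.txt stops crawling but a URL with external links pointing at it may linger in the index until you request removal through Webmaster Tools.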

Short question about Google indexing of website and Google Webmaster Tools

As you may know, in Google Webmaster Tools one can submit a sitemap or sitemap_index file, and Google will then fetch it and crawl the website when it "has time to".
I have searched for this but can't find an answer anywhere...
In the interface of webmaster tools, there is a section for "sitemaps" which lists all sitemaps submitted to google.
To the right of these sitemap names there is a column labelled something like "web addresses in web index".
This has always shown 0 for all my sitemaps.
I am guessing this means the number of pages from the sitemap that are indexed.
My question is: why does this show 0 all the time? And is this actually the number of pages indexed by Google?
FYI, I have a very good and search-engine-friendly website.
However, you should know that it has only been a week since I submitted the sitemaps.
Any ideas?
Well, sometimes it can take some time; unfortunately it's quite random.
It once happened to me that, submitting 5 different sitemaps for 5 different websites at the same time, 4 were done in a week and 1 took a month...
Anyway,
in your sitemap, did you put <changefreq>monthly</changefreq> for the main page?
on the "Sitemaps" page, click on the sitemap you sent, look at the URL of the sitemap (e.g. Sitemap: http://www.mydomain.com/sitemap.xml) and see if there's any typo.
Finally, did you try hitting the "resend" link on that page?
I have some experience with the sitemapping process. Some software programs that create XML sitemaps deliver XML that gets 'stuck'.
Have you tried creating the simplest possible sitemap for your site by hand and submitting that?
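A minimal hand-written sitemap of the sort being suggested, following the sitemaps.org protocol (the URL and date are placeholders to replace with your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

If Webmaster Tools accepts this and starts reporting indexed pages, the problem is with the generated sitemap rather than with the site.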

Do I need to submit the sitemap to search engines every time it is updated?

If I have a sitemap_index.xml like this:

    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.domain.com/sitemap.xml</loc>
        <lastmod>2010-09-28</lastmod>
      </sitemap>
    </sitemapindex>
If I then change the content or update a page, and update the lastmod accordingly, will I have to submit the index again to the search engines, for example in Google Webmaster Tools (the section where you submit sitemaps)?
Thanks
As long as you've told Google about the sitemap, they'll check it periodically. The more often it changes, the more they'll tend to check it.
If you go to Site configuration | Sitemaps, it'll tell you the last date they downloaded your sitemap.
No. It is however worth taking a look at the sitemaps page on webmaster tools every now and then and seeing if any errors were reported with the sitemap.
#Skilldrick is right!
Also, Google states that the search results are not affected by sitemaps anyway. They only give guidance to the search spider, which will make the final decision.