On my website, users can create their own blogs.
When a user creates a blog, all of its content is saved in a database, and the content is loaded from the database whenever someone requests it.
My question is: are these blogs searchable by search engines like Google?
If not, how do I make them searchable, and what are the ways I can optimize their discoverability in search engines?
If your pages are rendered server-side, your articles will be crawled by bots and indexed in search engines; it's just a matter of time.
However, you can improve your chances of being indexed faster and ranked better with these simple techniques:
Add the correct meta tags in your HTML head (see: Meta tags)
Add a robots.txt file at the root of your site (see: robots.txt)
Add a sitemap file at the root of your site (see: Sitemap)
Add a JSON-LD description of your blog and of each article in the head of your pages (see: JSON-LD)
Be sure to use semantic HTML for your content (see: Semantic HTML)
Provide social links and social pages that link back to your site
Those are basic yet effective ways to ensure your site is properly indexed by search engines; a minimal example of the head markup is sketched below.
You can also test your SEO ranking with online tools like rankgen.com.
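As a rough illustration of the meta tag and JSON-LD points above, here is a minimal head section for a single article. The domain, names, and dates are placeholders, and the exact schema.org fields you need may vary:

    <head>
      <meta charset="utf-8">
      <title>My First Post - Example Blog</title>
      <!-- The description is often used as the snippet shown in search results -->
      <meta name="description" content="A short summary of this article for search results.">
      <!-- JSON-LD structured data describing the article as a schema.org BlogPosting -->
      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "BlogPosting",
        "headline": "My First Post",
        "author": { "@type": "Person", "name": "Jane Doe" },
        "datePublished": "2015-01-01",
        "url": "https://www.example.com/blog/my-first-post"
      }
      </script>
    </head>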
I am building my own version of thewikigame.com, which is based on the concept of Wikiracing. On that website, the developer embeds Wikipedia pages into his own site and tracks how many clicks the user makes to reach the target page. I am not sure how he did this. What would be the best possible way to get the HTML and styling needed to display these pages on my own website?
As I understand your question, you first need to download the whole content of enwiki to your website. To do this, download the XML dump of the English Wikipedia from this mirror, then import that XML into your wiki using one of these methods.
I'm doing SEO-type work for a client with many diverse site properties, none of which I built myself. One of them in particular, linked here, appears to be having issues being indexed by search engines. Interestingly, I've tried multiple sitemap generator tools and they too seem to have problems crawling the site; although the site is made up of only a few pages and external links, the sitemap tools (and, I suspect, search engines) are only seeing the homepage itself and nothing else.
In Google Webmaster Tools, I'm seeing a couple of crawl errors (404) relating to home/index.html, but nothing else. Also, in Google Analytics, over 80% of the traffic is direct (i.e. not search traffic), which seems alarming. The site has been live for about a month and is being promoted by various sources. Even searching Google for the domain name itself doesn't bring the homepage up in the results (!), let alone any related keywords.
My ultimate question is whether there appear to be any glaring issues with the code that might prevent proper indexing. I notice the developer chose to structure the navigation by naming directories, i.e. linking to "home/index.html", "team/index.html", "about/index.html", etc., when it seems optimal to name the HTML file itself, i.e. "team.html" and "about.html". Could this be part of the problem?
Thanks for any insight here.
You have two major issues here.
The first issue is that the root http://www.raisetheriver.org/ has a meta refresh that redirects the page to http://www.raisetheriver.org/home/index.html.
Google recommends against using meta refresh; 301 redirects should be used if you want to redirect pages. However, I recommend against redirecting the root home page to another page at all, as a website's home page is expected to be the root.
The second issue is that all the pages on the site are blocked from being indexed by Google, as they contain the following code in their source: <meta name="robots" content="noindex">, which instructs search engines not to index the page.
Correct these issues and the site will be able to get indexed in Google, and sitemap generators will be able to crawl it; a sketch of the fixes follows.
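Assuming the site runs on Apache (the answer doesn't say what the server is, so treat this as a sketch), the meta refresh could be replaced with a proper 301 in an .htaccess file at the site root:

    # .htaccess at the site root (Apache mod_alias)
    # Consolidate the old deep home page onto the root with a permanent redirect
    Redirect 301 /home/index.html http://www.raisetheriver.org/

Then serve the actual home page content at the root itself, with no meta refresh, and either delete the <meta name="robots" content="noindex"> tag from every page or change it to content="index, follow".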
Having subdirectories for the pages won't be an issue for web crawlers, because even large sites like Amazon, eBay, and many others organise their pages in subdirectories.
This error may have occurred because your sitemap.xml or sitemap.html contains invalid or broken links that have been indexed by Google. You can generate a sitemap using http://www.xml-sitemaps.com/; I use it myself and it works perfectly.
Also, please check manually whether all the directories and pages in your cPanel are working. If you find any invalid links, you should be able to fix them.
If you have a website www.yourdomain.com and a subdomain blog.yourdomain.com (both sites containing similar information), what is the best sitemap setup?
Is it best to have one sitemap for both sites (and if so, what would it look like), or two separate sitemaps?
Which would be most effective with regard to search traffic optimisation?
If the content is identical, use a rel="canonical" tag to tell Google (and the other search engines) which URL should be used to accrue PageRank.
If you don't, Google will split your PageRank 'juice' across both pages. Ideally, you want to concentrate your juice on one URL, as it will earn a better PageRank.
Choose either the main site or the subdomain to produce your sitemap for. It doesn't really hurt anything if you do both.
The rel="canonical" tags go in your HTML pages; for example:
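As a sketch (the URLs here are placeholders), the duplicate page on the subdomain would point at the preferred copy in its head:

    <head>
      <!-- Tells search engines the www version is the URL that should accrue PageRank -->
      <link rel="canonical" href="http://www.yourdomain.com/some-article.html">
    </head>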
You can simply create a single sitemap at http://yourdomain.com/sitemap.xml and list all of your blog post URLs in it.
This sitemap will help get the large archive of content pages on blog.yourdomain.com indexed, and Google's crawlers can index everything from a single place in less time.
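One caveat worth adding: under the sitemaps.org protocol, a sitemap may normally only list URLs from the host it is served on, so for yourdomain.com/sitemap.xml to include blog.yourdomain.com URLs, the subdomain should declare that sitemap in its own robots.txt (a cross-submission), roughly like this:

    # robots.txt served at http://blog.yourdomain.com/robots.txt
    # Declaring the sitemap here authorizes it to list blog.yourdomain.com URLs
    Sitemap: http://yourdomain.com/sitemap.xml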
I am using various online services to create pages for our site, and I would like to use iframes to include pages within one another (for example, header and footer blocks). I have a lot of blocks of content that I need to repeat across landing pages, so this mechanism is convenient.
Will this cause a problem for search engines visiting our site or for our ranking?
Thanks
B
Yes, of course. Try to avoid using iframes as much as possible.
In your case, you could use the include or require functions instead. For example, in PHP:
<?php include('header.html'); ?>
Update
If you cannot use PHP or an alternative server-side language, consider including HTML inside HTML.
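One common way to do that, assuming your server is Apache with mod_include enabled (any SSI-capable server works similarly), is a server-side include directive placed where the shared block should appear:

    <!--#include virtual="/header.html" -->

Note that pages typically need an .shtml extension, or SSI explicitly enabled for .html files, for the directive to be processed.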
It certainly won't help your page in the SEO stakes. You shouldn't be penalised for the iframe per se, but a lot of your page content will be ignored by Google. If the iframed content contains your header and footer links, Google will ignore them, which is a big no-no.
Is this practice widespread in existing pages?
If the site runs on a number of HTML files with iframes and you have no way to use PHP, you might be best served by simply putting the header and footer HTML into each static HTML file and doing a large-scale find-and-replace whenever you need to change the header or footer.
You are working within the constraints of quite old techniques and competing with people who can optimise their sites using modern techniques, so if SEO is that important, you should sort out the way you build pages or do it the long, hard way.
Yes, search engines essentially ignore what's inside them. That's unfortunate, especially if you have keyword-rich content within the iframe. Additionally, the overall page will have less content in the eyes of the search engines. Always try to serve the content you want searched in the presentation layer; avoid placing text, links, images, etc. in the behavior layer, as that too is hidden from search engines.
SEOmoz's blogs are a good place to learn http://www.seomoz.org/blog
While not optimal, iframes can be used to serve up link-heavy or SEO-irrelevant content, keeping the primary page lighter and more search-engine friendly; in other words, less diluted.
If the portions of your site in the iframes are essential to navigation or search engine spidering, iframes should be avoided at all costs.
It is also important to note that if you do use iframes for content, you should block search engines from spidering those pages using a robots.txt file (a sketch follows). Even if your pages are W3C-validated, they will appear broken to visitors landing on them via a SERP, since the page will only include a small portion of the site markup.
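A minimal sketch, assuming the iframed fragments live in a hypothetical /fragments/ directory:

    # robots.txt at the site root
    User-agent: *
    # Keep crawlers out of the bare header/footer fragments loaded via iframes
    Disallow: /fragments/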
Hello fellow programmers,
I am building a website and I have read about sitemap.xml, but I can't find a definition of what it is or what it contains.
Can someone help me? What does it do? What is it for? What is in it?
http://www.sitemaps.org/ is the official resource.
The protocol page is probably the most important part of the entire site. It describes how to properly format your sitemap.xml file so that search engines can properly crawl your website.
from sitemaps.org
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
It provides search engines with information about the structure of your site. See the Wikipedia article.
It's an XML file that contains all of the URLs in your application, along with some other information that makes your site easier for search engines to crawl; a minimal example follows.
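A minimal sketch following the sitemaps.org protocol (the domain and dates are placeholders; only <loc> is required per URL):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2015-01-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>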