I am using GitHub Pages and Hexo to make my own blog.
I wrote a few articles and it looks fine. (https://bmy4415.github.io/2018/05/08/ssh-key/)
(The site is written in Korean.)
But I cannot find my blog on Google Search, even when I search for "bmy4415's note", which is my blog's name.
I also used the Hexo plugins 'hexo-generator-seo-friendly-sitemap' and 'hexo-generator-search', but they seem to have no effect. Is there any additional setting that will make my blog show up on Google Search? Thanks.
Google does NOT know everything on its own.
Go to Google Webmaster Tools, fill in the required data, and verify ownership of your site. Googlebot will then start visiting and crawling your site.
Also have a look at your robots.txt.
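As a starting point, here is a minimal robots.txt sketch that allows all crawling and points Googlebot at your sitemap; it assumes the Hexo sitemap plugin writes sitemap.xml to the site root, so adjust the path if yours differs:

# allow all crawlers to fetch everything
User-agent: *
Disallow:
# advertise the sitemap (assuming the Hexo plugin writes it to the site root)
Sitemap: https://bmy4415.github.io/sitemap.xml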
Hi, I have recently removed the '.html' from the end of my URLs to make them look more professional, which was brilliant. However, now when I see my site on Google, the old URLs that include the '.html' still appear, which presents people with an error page, as expected. How can I tell Google that I have new URL addresses so that people can visit my site again?
Thanks!
The best way to remove the .html extensions is to add a rewrite rule in your .htaccess file. That way search engines will "understand" it, but you will not see the change in the search results immediately, since the search engine crawler will take some time to update.
Also make sure to submit your URL to Google. If you have Google Webmaster Tools, you will be able to see this process and the status of your website more clearly.
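For example, here is a minimal .htaccess sketch, assuming Apache with mod_rewrite enabled, that 301-redirects the old URLs and keeps the new extensionless ones working:

RewriteEngine On
# 301-redirect the old /page.html URLs to the new extensionless /page
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]
# internally map the extensionless URL back to the .html file on disk
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]

The 301 tells Google the move is permanent, so the old .html results get replaced over time instead of erroring.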
I have an internal deployment of MediaWiki, and some articles contain external links. I have another page that makes API calls to the wiki to pull articles into another website. When I pull those articles in, the links do not come through properly. Here is an example.
Wiki article:
Use [http://example.com THIS LINK] to contact the vendor.
API URL:
https://mysite.com/mediawiki/api.php?action=query&format=json&prop=extracts&titles=Vendor
API results:
Use THIS LINK to contact the vendor.
Notice that the link is completely stripped away. I've also tried adding my own HTML for the links in MediaWiki, but MediaWiki escapes the < and > symbols, so the API sees the escaped entities and the wiki page displays the raw HTML as text rather than an actual link.
How do I make MediaWiki API calls and keep the link information?
For this, you can use action=parse instead; unlike prop=extracts, which returns stripped-down plain extracts, it returns the page's rendered HTML with the links intact. The query would look like this:
https://mysite.com/mediawiki/api.php?action=parse&format=json&page=Vendor&prop=text
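If it helps, here is a minimal PHP sketch of that call; the endpoint and the page title "Vendor" are the hypothetical ones from the question:

<?php
// hypothetical endpoint and page title taken from the question
$url = 'https://mysite.com/mediawiki/api.php?action=parse&format=json&page=Vendor&prop=text';
$data = json_decode(file_get_contents($url), true);
// action=parse returns the rendered page HTML under parse.text.*,
// so the <a href="http://example.com">THIS LINK</a> markup survives
echo $data['parse']['text']['*'];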
I already have a WordPress.com blog (abc.wordpress.com), and I have my own website: www.xyz.com.
I would like to integrate my WordPress blog content into my own site, hopefully as something like blog.xyz.com, or by just replacing the home page of xyz.com with abc.wordpress.com.
I know that I can download the WordPress code from wordpress.org, run my own WordPress installation, and have my own MySQL database, but WordPress is always releasing new code and I don't have the time to keep updating the source on my end to match it.
I'm running my site as a hobby, so I would prefer to let WordPress.com manage the content for me and continue using my blog at abc.wordpress.com, but have the content show up on my own site: xyz.com.
I hope I was clear when explaining this.
Does anyone know a way to do this?
Thanks.
If your main worry is the updates, I would say don't be. A simple click of the 'Updates' button in the WordPress admin is all you need to do in order to apply WordPress updates, and a notification will pop up alerting you to any that are available.
And as Calle has already mentioned, you can retrieve your content via RSS, or you could just export your current content from WordPress.com, import it into your own site, and manage it there. Everything would be in one spot.
Good luck.
I don't know how good you are with programming, but there's a PHP library called SimplePie that will help you retrieve your content via RSS (which WordPress automatically generates for you). The address is http://simplepie.org/.
If you are not very good with programming, perhaps you can get someone to do it for you, or find a script that is already written somewhere. I do think RSS is definitely the best way to go.
I also think you exaggerate the problems of hosting WordPress yourself. It's not something you have to constantly keep up with; all you have to do is log in from time to time, perhaps once a month (how often are you writing articles?), and click "Update", and WordPress will do everything for you, both for your plugins and for the WP version.
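To give you an idea, here is a minimal SimplePie sketch; the feed URL uses your abc.wordpress.com placeholder, and the include path depends on where you unpack the library:

<?php
require_once 'simplepie.inc'; // adjust to where you unpacked SimplePie
$feed = new SimplePie();
$feed->set_feed_url('http://abc.wordpress.com/feed');
$feed->init();
// print the five most recent posts as links back to the blog
foreach ($feed->get_items(0, 5) as $item) {
    echo '<a href="' . $item->get_permalink() . '">' . $item->get_title() . '</a><br>';
}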
Using your own domain (xyz.com) and having WordPress redirect users from abc.wordpress.com (your WordPress blog) to your domain requires a premium account.
If you have a premium account, just log in to wordpress.com, click 'Upgrades' and select 'Domains'. From there you will see the option "Map an Existing Domain", where you will want to enter your domain. Your wordpress.com blog is then what will show when users enter your domain's URL (xyz.com).
Alternatively, if you need a workaround with a free wordpress.com account, you can embed your blog, and for that you will need to use an RSS feed. Note: this method will not maintain your WordPress styles; it will merely transport the content. Also, not all browsers support RSS feeds by default.
You can view your blog's current feed by adding '/feed' to the end of your wordpress.com URL, i.e. abc.wordpress.com/feed. You can read more about feeds here: http://en.support.wordpress.com/feeds/. Now you are just left with the task of figuring out how to embed the feed into your page, as in the sketch below.
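Here is a rough sketch of that embedding step using plain PHP's SimpleXML (no extra library needed), assuming your page is PHP-based and abc.wordpress.com stands in for your real blog:

<?php
// load the blog's RSS feed and walk its items (rss -> channel -> item)
$feed = simplexml_load_file('http://abc.wordpress.com/feed');
foreach ($feed->channel->item as $item) {
    // escape the values before printing them into the page
    printf('<a href="%s">%s</a><br>', htmlspecialchars((string)$item->link), htmlspecialchars((string)$item->title));
}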
One final hail-mary you might attempt is simply redirecting your domain to your blog. A reference on the different ways to do this is here: http://css-tricks.com/redirect-web-page/. For example, place this tag in the <head> section of your domain's pages:
<meta http-equiv="refresh" content="0; URL='http://google.com'" />
(this will redirect after 0 seconds to the specified url)
I have a fairly new website that lets people create their own profiles and such. The issue is that when someone links to their profile from their website/blog, the profile shows up in Google searches for my website, and to date the one person who has done this has an NSFW profile. This means that when you search for my site on Google, one of the top results is an NSFW page.
How do I prevent Google from listing subpages in the results? Would robots.txt solve this? And if a page is already listed, will adding a robots.txt entry disallowing access to profile pages in general end up removing it from the results?
robots.txt will solve it to some extent, but if there are direct external links to a page, I have found that Google still indexes it.
Go to http://webmaster.google.com, claim your website, and then use their URL removal tool.
Yes, see http://www.robotstxt.org/. Just list entries like "Disallow: /profile/" and Google will stop indexing those pages and, after a time, remove them.
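A minimal sketch, assuming the profile pages all live under /profile/:

# keep all crawlers out of the profile pages
User-agent: *
Disallow: /profile/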
As those of you who know will be aware, in Google Webmaster Tools one can submit a sitemap or sitemap_index file, and Google will then fetch it and crawl the website when it "has time to".
I have searched for this but can't find an answer anywhere...
In the Webmaster Tools interface, there is a "Sitemaps" section that lists all the sitemaps submitted to Google.
To the right of each sitemap name, there is a column saying something like "web addresses in web index".
This has always shown 0 for all my sitemaps.
I am guessing this means the number of pages from the sitemap that have been indexed.
My question is: why is this showing 0 all the time? And is this actually the number of pages indexed by Google?
FYI, I have a very good and search-engine-friendly website.
However, you should know it has only been a week since I submitted the sitemaps.
Any ideas?
Well, sometimes it can take some time; unfortunately it's quite random.
It happened to me once that, having submitted 5 different sitemaps for 5 different websites at the same time, 4 were done in a week and 1 took a month...
Anyway:
In your sitemap, did you put <changefreq>monthly</changefreq> for the main page?
On the "Sitemaps" page, click on the sitemap you sent, check the sitemap's URL (i.e. Sitemap: http://www.mydomain.com/sitemap.xml) and see if there's any typo.
Finally, did you try hitting the 'resend' link on that page?
I have had some experience with the sitemapping process. Some software programs that create XML sitemaps deliver XML that gets 'stuck'.
Have you tried creating the simplest possible sitemap for your site by hand and submitting that?
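Something like this is about as simple as a valid sitemap gets; www.mydomain.com is a placeholder, and the <changefreq> tag is the one mentioned above:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- a single entry for the main page; add one <url> block per page -->
  <url>
    <loc>http://www.mydomain.com/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>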