Help Google Crawler to Crawl your Blog More Effectively

by Taswir Haider in SEO

Getting indexed by Google and ranking higher in Google's SERPs for organic search visitors is the primary aim of almost every blogger. But have you made your blog friendly to the Google crawler? There was a time when it took a week or more for Google to index a new page. Those days are gone; Google now crawls and indexes a new page within hours, if not minutes.

In my case here at TheTopBlogger, new posts get indexed within a minute or two of publishing, and I was surprised the first time I saw it happen. Frankly, I have had no such luck with Bing and Yahoo; their crawlers are not as fast or as refined as Google's, so it takes a couple of weeks for my posts to get indexed there. If you have had better luck, please share your tips with me. :) Bing even had a long-standing problem with sitemap submission (submissions failed for years, and many users complained about it on forums and blogs), which they finally fixed at the end of 2012.


Help Google Crawler to Crawl Effectively

Ok! The first task for better indexing is to make sure the Google crawler can move smoothly throughout your blog.

1. Google Webmaster Tools

The first task after you set up your blog is to sign up for Google Webmaster Tools, then add your site and verify it. Submitting your blog to Google Webmaster Tools helps it get indexed faster, notifies you of critical errors, reports crawl errors and malware problems, and more.
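One common way to verify ownership in Google Webmaster Tools is to paste a verification meta tag into your site's `<head>`. A minimal sketch (the `content` token below is a placeholder; Google generates a unique one for your site):

```html
<head>
  <!-- Paste the tag Google gives you during verification; this token is a placeholder -->
  <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
</head>
```

Alternatives such as uploading an HTML file to your site root or verifying through your DNS records work too; pick whichever is easiest for your setup.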

2. Sitemap to Google

Submit your blog's sitemap to Google. A sitemap helps the crawler understand the structure of your site more effectively. To generate one, you can use Google Sitemap Generator or any plugin of your choice. Also make sure you link to your sitemap from your homepage footer: since Googlebot crawls the homepage more frequently, this lets the bot track changes faster.
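If you are curious what a sitemap actually contains, here is a minimal sketch following the standard sitemaps.org XML format (the URL and dates are placeholders; a plugin will generate real entries for you):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want crawled -->
    <loc>https://example.com/sample-post/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

`lastmod`, `changefreq`, and `priority` are optional hints; `loc` is the only required child of each `<url>` entry.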

3. Robots.txt

Make sure that Google and other search engine crawlers are not blocked by your robots.txt. You can check the validity of your blog's robots.txt file using the tool from SEOBook. You may also want to use robots.txt to block the directories you do not want the Google crawler to visit.
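A minimal robots.txt along these lines might look like the sketch below (the blocked directories are WordPress-style examples, not a recommendation for every site; the sitemap URL is a placeholder):

```text
# Apply these rules to all crawlers
User-agent: *
# Block admin and script directories you don't want crawled
Disallow: /wp-admin/
Disallow: /cgi-bin/
# Point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` line (or no robots.txt at all) means everything is crawlable, so the main thing to check is that you have not accidentally shipped `Disallow: /`.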

In addition to the above, make sure:

4. You add fresh content frequently. Read how to generate topic ideas if you are running short of new topics.

5. Your server has good uptime, with close to zero downtime

6. You focus on increasing website speed

7. You use ping services effectively with a WordPress ping list

8. You interlink your blog posts, using both manual and automatic interlinking, so that Google's crawlers can move from page to page and index more effectively. While interlinking, vary the anchor text when linking to the same post from several different posts.
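For step 7, a WordPress ping list is just a list of XML-RPC update services pasted into Settings → Writing → Update Services, one URL per line. A short sketch using two well-known services of that era:

```text
http://rpc.pingomatic.com/
http://blogsearch.google.com/ping/RPC2
```

Ping-O-Matic itself fans your ping out to many search engines and blog directories, so a short list like this often covers more ground than a long list of individual services.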
