
Saturday, 11 June 2016

How To Add Sitemap And Robots.txt In Wapka, Blogger And Mywapblog For Fast Google Crawling

You may wonder why your website is not being crawled by Google. Two things, the robots.txt file and the sitemap, play a big part in getting your website crawled by Google, so I will be teaching you webmasters how to activate these functions on your different website platforms.

Google is the world's best search engine and can supply your website with lots of traffic, but only if you allow it and set up your SEO, sitemap and robots.txt properly. When everything is set properly, you can make the top of Google's search results, which will bring you a high volume of traffic every day.

What is a robots.txt file?

The robots.txt file is a simple text file placed on your web server which tells webcrawlers like Googlebot if they should access a file or not. Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol.

It works like this: a robot wants to visit a Web site URL. Before it does so, it first checks the site's /robots.txt file, and finds:

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
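By contrast, a robots.txt file can also block only part of a site while leaving the rest open. The directory name below is just a hypothetical illustration, not something from this post:

```text
# All robots may crawl the site, except anything under /private/
User-agent: *
Disallow: /private/
```

Here only URLs beginning with /private/ are excluded; every other page may still be crawled.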

There are two important considerations when using /robots.txt:

  • Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention to it.
  • The /robots.txt file is a publicly available file. Anyone can see what sections of your server you don't want robots to use, so don't try to use /robots.txt to hide information.

Which robots.txt should I use for fast crawling by Google?

Use this below:

User-agent: *
Disallow:


The * sign means all search engines are allowed to crawl your website.
The empty Disallow: line means you have allowed all search engines to access your entire site.
You can use this robots.txt on your BLOGGER and MYWAPBLOG websites, but you should add the version below on your wapka site for fast crawling.

For wapka websites:

User-agent: *
Disallow:
Http://

With these robots.txt files, your website will be easy for Google to crawl; they improve SEO and can get your site indexed very quickly. Remember to change that Http:// line to your own site address.


This document describes the XML schema for the Sitemap protocol.
The Sitemap protocol format consists of XML tags. All data values in a Sitemap must be entity-escaped. The file itself must be UTF-8 encoded.

The Sitemap must:

  • Begin with an opening <urlset> tag and end with a closing </urlset> tag.
  • Specify the namespace (protocol standard) within the <urlset> tag.
  • Include a <url> entry for each URL, as a parent XML tag.
  • Include a <loc> child entry for each <url> parent tag.

All other tags are optional. Support for these optional tags may vary among search engines. Refer to each search engine's documentation for details.
Also, all URLs in a Sitemap must be from a single host.
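Putting those rules together, a minimal valid Sitemap looks like the sketch below (the URL is only a placeholder for your own site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```

Each additional page on your site gets its own <url> entry with a <loc> child inside it.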

Screenshot of a sitemap:

By now you should know what a sitemap is, and that it is also responsible for improving your website's SEO.

How can I insert robots.txt and a sitemap in my website?

You may not know how to insert these in BLOGGER, MYWAPBLOG and WAPKA, but I will teach you now with screenshots and clear explanations.

Insert robots.txt in wapka

Go to Edit site>>Global settings>>HEAD tags (meta,style,...)>>Edit robots file (robots.txt)>>Insert the robot.txt above and submit.

Screenshot here below:

That's all for wapka. Your website will be crawled and indexed day and night, and your traffic will grow every day.

Insert robots.txt in blogger

This code is divided into three sections. Let's first study each of them; after that we will learn how to add a custom robots.txt file to blogspot blogs.

1. User-agent: Mediapartners-Google

This line is for the Google AdSense robot, which helps it serve better ads on your blog. Whether you are using Google AdSense on your blog or not, simply leave it as it is.

2. User-agent: *

This applies to all robots, marked with an asterisk (*). In the default settings, our blog's label links are restricted from being indexed by search crawlers; that is, the web crawlers will not index our label page links because of the line below.

Disallow: /search

That means links having the keyword search just after the domain name will be ignored, such as the link of a label page named SEO.

If we remove Disallow: /search from the above code, then crawlers will access our entire blog and index and crawl all of its content and web pages.

Here Allow: / refers to the Homepage that means web crawlers can crawl and index our blog’s homepage.
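Taken together, this second section of the default Blogger code looks like this:

```text
# All crawlers: skip label/search pages, but crawl everything else
User-agent: *
Disallow: /search
Allow: /
```

The Disallow rule only blocks URLs whose path starts with /search; the Allow: / line keeps the homepage and the rest of the blog open.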

Disallow Particular Post
Now suppose we want to exclude a particular post from indexing; then we can add the lines below to the code.

Disallow: /yyyy/mm/post-url.html

Here yyyy and mm refer to the publishing year and month of the post respectively. For example, if we published a post in March 2013, then we have to use the format below.

Disallow: /2013/03/post-url.html

To make this task easy, you can simply copy the post URL and remove the blog name from the beginning.

Disallow Particular Page

If we need to disallow a particular page, we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will look something like this:

Disallow: /p/page-url.html

3. Sitemap:

This line refers to the sitemap of our blog. By adding the sitemap link here, we are optimizing our blog's crawl rate: whenever the web crawlers scan our robots.txt file, they will find a path to our sitemap, where all the links of our published posts are present, and it will be easy for them to crawl all of our posts. Hence, there is a better chance that the web crawlers will crawl all of our blog posts without ignoring a single one.

Note: This sitemap will only tell the web crawlers about the 25 most recent posts. If you want to increase the number of links in your sitemap, replace the default sitemap with a larger one; that will work for the first 500 recent posts.


If you have more than 500 published posts on your blog, then you can use two sitemaps.
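As a sketch of what such sitemap lines can look like: Blogger blogs have historically exposed their feed as a sitemap, limited with start-index and max-results parameters. The blog address below is a placeholder, and the exact parameter values are my assumption based on common practice rather than taken from this post:

```text
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
```

The first line covers posts 1 to 500; the second picks up from post 501.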



How to add custom robots.txt in blogger

  • Go to your blogger blog. 
  • Navigate to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt >> Edit >> Yes
  • Now paste your robots.txt file code in the box.
  • Click on Save Changes button. 
See your default blogger robot.txt below:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap:
Check my blogger robot.txt file 

You can check it by adding /robots.txt to your blog address.
That's all!

How to add sitemap in wapka, blogger and mywapblog

I will be teaching you how to add this on the platforms mentioned in the heading above.

Insert sitemap in wapka

Go to Edit site >> Global settings >> HEAD tags >> Edit sitemap file (sitemap.xml) and then insert your sitemap using the site numbers. Insert only the site IDs of the sites you want to place in the sitemap.xml file.
You can also add links of forums in the form fID.
For example: 0,1,2,10,223,f123,f456, as shown below.

That's all and submit.

Insert your sitemap in blogger and submit it to Google Webmaster Tools

Open the Sitemap Generator and type the full address of your blogspot blog (or your self-hosted Blogger blog). Click the Create Sitemap button and this tool will instantly generate the necessary text for your sitemap. Copy the entire generated text to your clipboard (see screenshot below).

Next go to your Blogger dashboard and, under Settings >> Search Preferences, enable the Custom robots.txt option (available in the Crawling and Indexing section). Paste the clipboard text there and save your changes.
And we are done. Search engines will automatically discover your XML sitemap files via the robots.txt file, so you don't have to ping them manually. Below are some easy steps to submit your blogspot blog's sitemap.

How to Submit it to Google webmasters tools

Sign in to Google Webmaster Tools and click on the blog title for which you want to add the sitemap. Click on the Sitemaps button, then at the top right corner of the page, press the Add/Test Sitemap button. Once you click the button, a small box will appear.

Add the below code in the text field.


This is the sitemap code for your blogger blog, which you need to add. Press the "Submit Sitemap" button and refresh the page.
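For reference, the text entered in that box is typically the sitemap path relative to your blog address; the parameter values below are an assumption based on common practice with Blogger feeds, not something taken from this post:

```text
atom.xml?redirect=false&start-index=1&max-results=500
```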

Congratulations! You have finished the process of submitting your blog's sitemap.

Note: The above sitemap will work for 500 posts only. If you have more than 500 posts published on your blog, then you have to add one more sitemap. The whole procedure is the same, but this time you have to add the code for the next batch of posts.


The sitemap we submit to Google Webmaster Tools is an XML sitemap, which is used by search engines to find our content easily. There is an HTML sitemap too, which is for our blog's readers. We should add a sitemap page to our blog so that our readers can easily view all of our blog posts in a single location; it will be easy for them to read the posts they want.

I am sure you cannot tell me this article is not helpful. If you have suggestions or any problem while setting this up, kindly drop your comments below.


  1. Hello! Thank you so much for this post. Please I've added my blogger robot.txt file following the steps you outlined.

    But I have a problem. When I try to check it by doing the
    , it just shows me that this page isn't available. It doesn't show me anything.

    Please can you advise me?

    1. Visit it like this:

  2. How do you see mine


  4. This post is actually useful for getting any URL indexed in Google Search Console. A robots.txt file is a must for any blog; it gives Google suggestions for indexing.
    You can add this robots.txt to your blogger blog:
    User-agent: Mediapartners-Google
    User-agent: *
    Disallow: /search
    Allow: /

    Customize: Replace with your blog name.
    Thank you.


Is this post helpful?

Your comments on this post are highly appreciated. Be the first to comment, and make sure you also share this post with your friends and family so they can benefit from it too.

Share it on all social networks, like Facebook, Twitter and any other social network you may know.

Are you finding it difficult to drop a comment here? Kindly click here and learn how to comment on