
KB Plugins blog

The best WordPress plugins are free

KB Robots.txt

When robots (like the Googlebot) crawl your site, they begin by requesting http://example.com/robots.txt and checking it for special instructions. Use this plugin to create and edit your robots.txt file from within WordPress (using Options -> Robots.txt).

Whenever a user (or a robot, more likely) appends “robots.txt” to your blog URL (e.g. http://blog.example.com/robots.txt), this plugin will serve up the robots.txt file that you created in the WordPress admin menu.
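For illustration, the text entered into the plugin's admin box is served verbatim as robots.txt. A minimal example of the kind of content people typically put there (these directives are a common WordPress convention, not the plugin's defaults):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```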

This plugin should work with most versions of WordPress, but it is particularly intended for WP-MU installations, since it allows each WPMU blog to have a unique robots.txt file.

Download this plugin from the wordpress.org repository.


Note that robots make only top-level requests for robots.txt files. If you have WordPress installed in a subdomain or at your root, this plugin will work as intended. But if you have WordPress installed in a subdirectory, then this plugin won’t do much for you, since the search engines won’t look for http://example.com/blog/robots.txt, only for http://example.com/robots.txt. Just to make sure this is clear:

  • WordPress in a subdomain: http://blog.example.com/
  • WordPress at the root: http://example.com/
  • WordPress in a subdirectory: http://example.com/blog/


If the FAQs at the download page don’t answer your questions, you can post support questions below.


  1. Tene
    Posted May 3, 2007 at 2:10 am | Permalink

    Brilliant plugin; placed a note about it in the WPMU forum. WPMU doesn’t behave exactly intuitively at the blogs.dir level, and this plugin neatly sidesteps the whole issue to meet a very important requirement for subdomains and full external domains running off WPMU.
    Have you considered serving a Google Sitemap using the same strategy?

  2. Posted May 3, 2007 at 4:04 pm | Permalink


    I’m not sure about the Google Sitemap thing. There would need to be some sort of caching mechanism, or else large blogs would take a very long time to generate a sitemap each time. I guess people could write a sitemap by hand, but that’s not so fun.

  3. Tene
    Posted May 5, 2007 at 1:22 pm | Permalink

    The caching could be achieved by generating the sitemap after new content is created, and storing the output in wp_options – equivalent to the config text you store for robots. So when Google requests sitemap.xml, you just serve the text straight out of the table.

    Maybe there is a way to queue a long running job – e.g. after updating content, launch the sitemap generation in batch mode? – that would avoid any performance hit for the author.

    There is a 10 MB limit on file size – while a compressed version doesn’t seem to be mandatory, producing and storing a .gz version to support very large blogs may present issues.

    The reason I suggested this is that – as I read it from the wpmu forum – getting sitemaps working in WPMU means installing the plugin, pre-populating the sitemap.xml and sitemap.xml.gz (to avoid permission errors), modifying .htaccess, and modifying allowed files. Messy. Your dynamic generation solution would avoid the .htaccess mods and physical file altogether, which for WPMU installations makes a *lot* of sense.

    Possibly Arne Brachold’s code (for generating the sitemap itself) would be reusable – so it might just be a matter of wrapping Arne’s map generation code within your page serving model?

  4. Posted May 7, 2007 at 10:32 am | Permalink

    I see. Not a bad idea, but I probably won’t have time to do it.

  5. Posted September 8, 2007 at 8:30 pm | Permalink

    I installed this plugin and changed the robots.txt file through the interface, but nothing has changed in the actual file itself. What am I missing here? I’m not using WP-MU, just the regular WordPress.org install.


  6. Posted September 9, 2007 at 9:06 am | Permalink

    I’m not sure what you mean, Alex. There isn’t any file involved here. The robots.txt gets served from your blog’s database, just like your posts do. The plugin doesn’t create or edit any files.

    Incidentally, if you do have a file called robots.txt installed in your blog’s directory, then you might need to delete it for this plugin to work correctly (depending on your htaccess settings).

  7. Posted September 9, 2007 at 11:12 am | Permalink

    Thanks for the speedy reply! I’m not sure what the problem was either. I tried deleting the robots.txt file that was already there, but when I entered my changes through KB Robots in the WordPress interface and clicked “View Robots.txt” after saving the changes, none of them were reflected in what I saw. I ended up manually creating the robots.txt and putting it back in the blog directory.


  8. Posted October 23, 2007 at 12:49 pm | Permalink

    Hi Adam,

    I installed your plugin. This is what I entered in the robots.txt plugin window:
    User-agent: *

    But when I go to [link] this is what I get, and when the plugin says to check the robots.txt file after I submit, I get this also:
    Sitemap: [link]

    So I uninstalled the Google sitemap and analytics plugins, and still the same thing. I was wondering if you could help me solve this problem.


  9. Posted October 24, 2007 at 6:13 pm | Permalink

    I fixed it. I wasn’t putting it in the root directory.

  10. Posted October 24, 2007 at 6:17 pm | Permalink

    Hmm. It doesn’t go in the root, it goes in the plugins folder, but if you’ve got it working, great.

  11. Posted February 19, 2008 at 8:45 am | Permalink

    I am using WP-MU and installed your plugin, and pasted the WordPress robots text from one of your posts. But now when I want to see the robots.txt, instead of going to [link] it goes to the [link] URL.

    Please help me and tell me what to do.

  12. Posted February 19, 2008 at 10:56 am | Permalink

    I don’t know what to say. I use it on WPMU without problems. One thing I haven’t tried is using it on the root WPMU blog–I’ve only tried it on member blogs. Maybe that’s the problem. Or, it could be that your version of WPMU doesn’t support the action hook that the plugin uses. I think it’s the init hook, but I don’t remember.

    Regardless, I don’t have the time to look into this at the moment, and probably won’t for several weeks. But if you’re running WPMU, you are probably familiar enough with PHP and WP to debug this fairly quickly. If you find a bug, let me know and I’ll patch the plugin. Good luck.

  13. Scott
    Posted July 29, 2008 at 9:05 am | Permalink

    Hi Adam,

    I am writing to ask a question about your WP MU robots.txt plugin “KB Robots.txt”

    It sounds great and thanks for creating it.

    I am new to using WP MU and just installed it on a site of mine however, I already have a WP blog on my main domain so WP MU is in a sub-directory.

    Therefore, it’s like [link] (where blogs is my WP MU install)

    Since you say, “But if you have WordPress installed in a subdirectory, then this plugin won’t do much for you, since the search engines won’t look for [link]” — does this mean it’s useless for me to install your plugin as well as even having a robots.txt file with WP MU installed in a sub-directory?

    Even though I have a robots.txt for my root domain could I not somehow add robot information for my WP MU?

    For example, say right now in my main root folder [link] I have a robots.txt file with:
    User-agent: *
    Allow: /

    Could I not simply amend this and also put Allow: /blogs/ ?

    This way my robots.txt file is in my main root folder (public_html) and the SE’s can hopefully follow on to the MU blogs??

    If the answer is “yes”, then should I duplicate all (or most) of the instructions in my robots.txt file but add /blogs/ to each? Does that make sense? So if I had User-agent: * Allow: / I would also have Allow: /blogs/ as a command?

    I apologize for this long email. Just wondering if I can apply robot info to my WP MU in my sub-directory somehow :)

    Thanks again

  14. Posted July 29, 2008 at 10:22 am | Permalink

    Putting a robots.txt file into a subdirectory will not accomplish anything, so, no, the plugin won’t help you. More info:
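A hand-written sketch of the approach that does work for a subdirectory install: the rules go into the root robots.txt, written against the subdirectory’s paths (the /blogs/ prefix follows Scott’s setup above; the exact paths are just an example):

```
User-agent: *
Disallow: /blogs/wp-admin/
Disallow: /blogs/wp-includes/
```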


  15. gadget
    Posted September 10, 2008 at 9:19 am | Permalink

    Adam, from the comments above it sounds as though this is a must have plugin. I just don’t fully understand the benefits I’ll get from it though. Is there any other info I can read to understand it better? Sorry to have to ask, I’m sure you’re very busy.

  16. monroe
    Posted September 26, 2008 at 4:06 am | Permalink

    Hi – I need to create the robot.txt file to exclude the robots just from specific pages on the wp site rather than whole areas – like the archives. I looked at the examples you have in the plugin – I just need some clarification on how to exclude specific pages.
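To exclude individual pages rather than whole areas, robots.txt takes one Disallow line per URL path. A sketch (these page paths are hypothetical, not from the plugin’s examples); note that a path ending in / also matches everything beneath it, so each line should be as specific as the page’s URL:

```
User-agent: *
Disallow: /about-page/
Disallow: /2008/09/some-specific-post/
```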

  17. Rami Abughazaleh
    Posted December 8, 2008 at 4:21 am | Permalink


    Thank you for the KB Robots.txt WordPress plugin. It works very well!

    I just wanted to let you know that there is a bug:

    If /wp-admin/ requires HTTPS/SSL, then Firefox will complain that the submit button will transfer data in a non-secure way, and that is because http:// is hard-coded in kb-robots-txt.php line 59.

    The fix for me was just to replace http:// with https://; however, I would like the plugin to determine this dynamically, by checking PHP’s $_SERVER['HTTPS'] variable for example.

    Thank you.
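Rami’s suggested dynamic check could be sketched like this (the function name is hypothetical, not from the plugin; WordPress also ships an is_ssl() helper that performs essentially this test):

```php
<?php
// Pick the URL scheme by inspecting $_SERVER['HTTPS'] instead of
// hard-coding "http://". Some servers set the variable to "off"
// rather than leaving it unset, so both cases are handled.
function kb_request_scheme() {
    $https = isset($_SERVER['HTTPS']) ? $_SERVER['HTTPS'] : '';
    return ($https !== '' && strtolower($https) !== 'off') ? 'https' : 'http';
}
```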

  18. John
    Posted March 16, 2009 at 3:06 pm | Permalink

    Just reporting that the plugin works flawlessly on WPMU 2.7 with subdomains – very nice job. Thanks for putting it together.

  19. Posted March 19, 2009 at 9:58 am | Permalink

    Hi Adam,

    Just a quick note to say thanks for this.


  20. Posted March 26, 2009 at 3:56 pm | Permalink

    I’ve installed the plug-in on WP 2.7.1 and I would like to think that I’m of average intelligence, but I can’t seem to figure out how to use this. I went to [link] and it returns a 404 Not Found. I know you’re a busy man and I am checking the plug-in’s documentation; thanks in advance.

  21. Posted April 19, 2009 at 11:11 am | Permalink

    I was looking for a robots.txt manager plugin and looked at your limitations – and I think you are over limiting your plugin. I use wordpress in a subdirectory named wordpress. To get the robots.txt you generate to be used I tried a symbolic link in the document root – that did not work. So I added a rewrite rule to my document root .htaccess file:
    # use wordpress robots.txt file
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteRule ^robots\.txt$ /wordpress/robots.txt [L]
    </IfModule>

    This is working great at this time. I am not a security expert – so if anyone sees a security issue with this, please let me know.

    Great plugin – thanks for creating it and sharing it with everyone.

  22. Posted September 23, 2009 at 5:02 am | Permalink

    this is brilliant! thanks so much for sharing

  23. Jon
    Posted January 9, 2010 at 6:16 pm | Permalink

    Hey, I’m confused. Where does it create the robots.txt? Because I don’t see it in my root directory. I’m also using wp-hive, so if I knew how to manually create the robots.txt file, I could move it to the directory that I need it to be in. Please email me back.

  24. Peter
    Posted January 11, 2010 at 11:50 am | Permalink

    This plugin doesn’t work with most of the new WP versions, that’s why Adam is not responding to your massages.

    [immature profanity removed by site owner]

  25. Posted January 11, 2010 at 4:12 pm | Permalink

    @Peter: Actually, if somebody offered a “massage,” I might make responding a higher priority. I generally write plugins for my own use, share them on wp.org, then don’t tend to them further unless I have a need to. I don’t actually use this plugin anymore, hence minimal attention to it. That being said, it’s a very simple plugin that should work on any version of WP unless they changed something drastic in the code.

    @Jon: It does not create a robots.txt file. Just like how WP doesn’t create an HTML file for each blog post.

    @Ken C: Shouldn’t require any changes to htaccess for this plugin to work, as long as you’ve got WP installed in the root, e.g. example.com, not example.com/blog/.

  26. tunecrew
    Posted June 3, 2010 at 2:40 pm | Permalink

    just installed this on a fresh WP 3.0RC1 install

    One thing I noticed is that it will not work unless you have triggered a change in the settings that generates an .htaccess file, e.g. changed your permalink settings.

    then it works brilliantly

  27. A
    Posted June 7, 2010 at 10:02 am | Permalink

    I am trying to have one of my pages NOT be indexed. I used the Ultimate Noindex Nofollow Tool plugin, but that is not working – the pages I specify are getting indexed anyway. Someone suggested I use a robots.txt file instead.
    Since I am not sure how to do that, I looked through the forum and found the KB Robots.txt plugin.
    I’m wondering – if I turn OFF (uncheck) the option in the Google XML Sitemaps plugin to have the URL placed in a virtual robots.txt file, can I then also use KB Robots.txt to generate a robots.txt file with the pages I don’t want indexed disallowed?
    Not sure why the Ultimate plugin is not working, but I am looking for a workaround.


  28. Terry
    Posted August 18, 2010 at 4:00 pm | Permalink

    The instructions say, “Use this plugin to create and edit your robots.txt file from within WordPress (using Options -> Robots.txt).”

    Someone please tell me where in the world I might find this “Options -> Robots.txt”.

    The only Options menu item I see is in the Super Admin Menu (of the blog network), and that leads to a Network Options page which has:

    Operational Settings,
    Dashboard Settings,
    Registration Settings,
    New Site Settings,
    Upload Settings, &
    Menu Settings.

    But no Robots.txt editing area.

    What’s the deal, am I blind?

  29. Posted August 19, 2010 at 7:42 am | Permalink

    @tunecrew: Yes, as stated in the instructions, it won’t work unless you’re using permalinks.

    @A: I’m not familiar with the other plugin that you’re asking about.

    @Terry: First off, relax. Second, remember that you’re using a very old plugin that hasn’t been updated in a long time, nor will it be anytime soon. The WP developers have renamed the “Options” menu to the “Settings” menu. Look there.

  30. Posted September 1, 2010 at 11:19 am | Permalink

    Great plugin, but you should have posted the default (kind of) robots.txt rule set for WordPress blogs along with the post.

  31. Posted September 5, 2010 at 6:59 am | Permalink

    The upgrade to WordPress 3.0 has caused the number of items listed by Google Webmaster Tools as restricted by robots.txt to go down from 4500 to only 700. Is there a bug with 3.0? The plugin has worked GREAT until now, so I may just re-install it to see if that fixes it.

  32. mike
    Posted September 17, 2010 at 11:18 am | Permalink

    Hi there! Thanks for the plugin. I am able to access mydomain.com/robots.txt; however, in Webmaster Tools it shows a 404 error. Below is how Google fetches the page, and it has a 404 header. Can you please tell me why that is generated and how to fix it? Thank you!

    HTTP/1.1 404 Not Found
    Date: Fri, 17 Sep 2010 17:48:45 GMT
    Server: Apache/2.0.63 (Unix) mod_ssl/2.0.63 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/
    X-Powered-By: PHP/5.2.9
    Vary: Accept-Encoding
    Keep-Alive: timeout=15, max=100
    Connection: Keep-Alive
    Transfer-Encoding: chunked
    Content-Type: text/plain

    User-agent: *
    Disallow: /wp-admin
    Disallow: /wp-includes
    Disallow: /wp-content/plugins
    Disallow: /wp-content/cache
    Disallow: /wp-content/themes
    Disallow: /trackback
    Disallow: /comments
    Disallow: /category/*/*
    Disallow: */trackback
    Disallow: */comments
    Disallow: /media/wp-includes
    Disallow: /media/wp-content
    Allow: /media/uploads

  33. Phil
    Posted October 16, 2010 at 4:43 am | Permalink

    hey adam.

    thanks for the plugin. quick question , for the sake of clarity, as i am a bit new here…

    when i enter the code in the space provided in the plugin and then click on the View robots.txt link that has recorded the update, i see nothing…it’s just blank.

    does this mean that *nothing* has happened, or that your plugin has created a *virtual* robots.txt file that the spiders will crawl…

    in other words, i am asking, do i *really* not need to edit the robots.txt file via cpanel > file manager???

    many thanks,


  34. Ok
    Posted December 20, 2010 at 10:03 am | Permalink

    I just installed your plugin, and I accidentally deleted the code and pasted in some other code, and I got an error, something like wrong heading or user. Now when I go into Plugins I can’t find KB Robots.txt anywhere – Active, All, or Inactive. All I’m trying to do now is restore the default code in it or remove it totally.

  35. Posted September 8, 2011 at 12:28 am | Permalink

    This is my blog.
    I want to modify its robots.txt file. Tell me how I can do this. Also, do tell me how to increase traffic to my blog?

  36. Posted September 8, 2011 at 7:33 am | Permalink

    Vinil: You need a self-hosted blog if you want to tinker with things like that. Yours is hosted by wp.com. As far as increasing traffic, no plugin solves that. The only way to get more traffic is through content and link building.

  37. T Casey
    Posted November 25, 2011 at 8:20 am | Permalink

    Hi, thank you for your very helpful robots.txt plugin. I have been reading up on its use, but I’m still not sure about how detailed “Allow:” needs to be, as in the case of

    User-agent: Mediapartners-Google
    Allow: /

    User-agent: Adsbot-Google
    Allow: /

    User-agent: Googlebot-Image
    Disallow: /wp-content

    Does this just apply to Google, or should I be specifically allowing other user agents?

    Thanks again.

  38. Posted November 28, 2011 at 12:27 pm | Permalink

    Strictly speaking, there is no “Allow” directive in the original robots.txt standard, so you can’t count on every crawler honoring it – though Google’s crawlers do support it. Anything you don’t explicitly “disallow” is automatically allowed.