Shams Nelson
2,888 Points
robots.txt??
Does anybody have any clue as to what this means and how to resolve it??
http://www.100coachingtips.com/: Googlebot can't access your site
Over the last 24 hours, Googlebot encountered 1 error while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
You can see more details about these errors in Webmaster Tools.
Recommended action
If the site error rate is 100%:
1. Using a web browser, attempt to access http://www.100coachingtips.com/robots.txt.
2. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
3. If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
4. If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.
If the site error rate is less than 100%:
1. Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
2. The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
3. If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname whose serving of its robots.txt file is exhibiting one or more of these issues.
After you've fixed the problem, use Fetch as Google to fetch http://www.100coachingtips.com/robots.txt to verify that Googlebot can properly access your site.
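The first check above can also be run from the command line rather than a browser. This is just a sketch (it assumes `curl` is installed): fetching the headers once normally and once while identifying as Googlebot can reveal whether the server is blocking the crawler specifically.

```shell
# Check whether robots.txt is reachable at all (headers only).
curl -I http://www.100coachingtips.com/robots.txt

# Check again while identifying as Googlebot; if this request fails
# but the one above succeeds, the server is likely blocking the crawler
# by user agent (firewall rule, security plugin, etc.).
curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" \
     http://www.100coachingtips.com/robots.txt
```

A 200 response to both requests suggests the problem is intermittent (server overload) rather than a configuration block.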
5 Answers

Wayne Priestley
19,579 Points
Hi Shams,
Could you paste the text from your robots.txt file in a post, please?

Shams Nelson
2,888 Points
Hey Wayne, thanks for the reply. Here is the text from the robots.txt file:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
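For what it's worth, you can sanity-check what these rules actually permit with Python's standard-library `urllib.robotparser` (a sketch; the URLs are just the site from the error message above):

```python
from urllib.robotparser import RobotFileParser

# The rules from the post, on separate lines as robots.txt requires.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard (*) group, so the homepage is allowed...
print(parser.can_fetch("Googlebot", "http://www.100coachingtips.com/"))           # True
# ...while anything under /wp-admin/ is blocked.
print(parser.can_fetch("Googlebot", "http://www.100coachingtips.com/wp-admin/"))  # False
```

Since these rules don't block the site root, the 100% error rate points to Googlebot failing to *fetch* the file (server or firewall issue), not to the file's contents.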

Kevin Korte
28,149 Points
Are you using any plugins that generate a site map, or anything of the sort? Something like an SEO plugin?

Wayne Priestley
19,579 Points
Hi Shams,
Can you try this:
User-agent: *
Disallow: /wp-admin.php
Disallow: /wp-includes/
Okay, I don't know if you had it laid out like this and it just appeared all on one line because it was posted in the forum.
The first Disallow is for the webpage wp-admin.php; that's me presuming that this is the admin page.
The second Disallow is for a directory which I'm assuming wp-includes is.
All the above code is case sensitive.
Hope this helps.

Shams Nelson
2,888 Points
They were on separate lines originally. So just changing the "/" to ".php" could solve the problem? What is the problem, anyway?

Wayne Priestley
19,579 Points
When you have /wp-admin/, that means you want to disallow a directory called wp-admin.
wp-admin is actually a webpage, not a directory, so changing it to /wp-admin.php tells the robot to disallow that webpage. I'm assuming includes is a directory, hence keeping both slashes.
This is just an idea though, so let us know how it goes.
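The practical difference between the two forms can be checked directly, again with Python's stdlib robots.txt parser (a sketch; the helper function and URLs are illustrative, not from the thread). Disallow rules are path-prefix matches, so a trailing slash blocks everything under a directory, while a .php path only matches that one file:

```python
from urllib.robotparser import RobotFileParser

def blocked(disallow_line, path):
    # Build a one-rule robots.txt and ask whether Googlebot may fetch `path`.
    parser = RobotFileParser()
    parser.parse(["User-agent: *", disallow_line])
    return not parser.can_fetch("Googlebot", "http://www.100coachingtips.com" + path)

# A directory rule blocks everything beneath it...
print(blocked("Disallow: /wp-admin/", "/wp-admin/options.php"))      # True
# ...while a page rule only matches URLs starting with that exact path.
print(blocked("Disallow: /wp-admin.php", "/wp-admin.php"))           # True
print(blocked("Disallow: /wp-admin.php", "/wp-admin/options.php"))   # False
```

So switching /wp-admin/ to /wp-admin.php loosens the rule; either form is valid robots.txt, and neither by itself would cause Googlebot's fetch of the file to fail.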