Googlebot cannot access CSS and JS files on your site! Don’t be alarmed…

… it’s easily fixed.

You might have received a message like this; lots of people have:

[Image: the Google Webmaster Tools message]

Most websites have a robots.txt file (your developer will know about it if you don't).  It tells Google what to crawl and what not to crawl: essentially a list of allowed and disallowed URL paths.  Googlebot reads it and skips anything listed as disallowed.
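
For example, a minimal robots.txt might look like this (the /private/ path is purely illustrative):

User-agent: *        # these rules apply to every crawler, including Googlebot
Disallow: /private/  # do not crawl anything under /private/

Anything not matched by a Disallow rule is fair game for the crawler.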

Google has now decided that it wants to render pages the way a browser does, which means fetching the JavaScript and styling (CSS) files as well as the pages themselves.  All we need to do is delete the Disallow lines that block those files from the robots.txt file.

So with a WordPress website, a robots.txt file that looks like this will certainly need some lines removed.  The usual culprits are Disallow: /wp-includes/ and Disallow: /wp-content/cache/ (WordPress core scripts live in the former; caching plugins store minified CSS and JS in the latter), plus Disallow: /*? (WordPress appends ?ver= query strings to its CSS and JS URLs, so that rule blocks them too):

# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/cache/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?
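
After deleting those three rules, a sketch of the cleaned-up file (assuming the same WordPress setup as above) would be:

# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/

Googlebot can now fetch the core scripts in /wp-includes/, anything under /wp-content/cache/, and any URL with a ?ver= query string.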

Yoast SEO, arguably the best SEO plugin for WordPress, now recommends a completely empty robots.txt file; there is more on that on the Yoast site.
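
An "empty" robots.txt simply means nothing is disallowed.  If you would rather be explicit than serve an empty file, the standard allow-everything form is:

User-agent: *
Disallow:

An empty Disallow value means "nothing is off limits", so every crawler may fetch every URL.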

In summary, keeping the Googlebot happy is just a case of removing a few lines of code from a file.  Nothing that cannot be quickly fixed by your developer!


