Webmaster Tools Google Bot Warning

Written by: Jason Bayless | August 24, 2015

Webmaster Tools is a free service provided by Google to help monitor and maintain a website’s presence in Google’s search results. More information about this service is available at http://support.google.com/webmasters/answer/4559176?hl=en

In the middle of this year, Google changed the name of this service to Google Search Console.

A few months later, on July 28, 2015, Google sent a mass bot warning via the Search Console announcing that 'Googlebot cannot access your JavaScript and CSS files'. This message caused panic among webmasters all over the world.

What is Googlebot?

Googlebot is the crawling software Google uses to gather documents from the web and add them to the Google search engine's index. Large clusters of computers run this software to crawl billions of web pages. Detailed information regarding Googlebot is available in Google's official documentation.


In the Googlebot warning on July 28, 2015, Google reminded its users that Googlebot's inability to access files could result in 'suboptimal rankings'. This sounded scary, especially to web marketing professionals.

The Googlebot warning read as follows:

"Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings."

This warning was issued because many websites and blogs had been blocking Googlebot from accessing their CSS and JavaScript files. Fortunately, the problem can be solved by allowing Googlebot full access to these important files.

How to allow Googlebot full access to a website or blog

If a website or blog is blocking Googlebot's access to the files mentioned above, the most important thing to do is to edit the site's robots.txt file. This can be done by following these steps.

First, it is necessary to check the robots.txt file for the following blocking rules:

  1. Disallow: /*.js$
  2. Disallow: /*.inc$
  3. Disallow: /*.css$
  4. Disallow: /*.php$

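Taken together, the fix might look like the following robots.txt sketch. This is a hypothetical example: the exact rules in any given file will differ, and the Allow lines shown are one common way to grant access explicitly rather than the only correct configuration.

```
# Before: these rules hide scripts and stylesheets from crawlers
User-agent: *
Disallow: /*.js$
Disallow: /*.css$

# After: the blocking rules above are removed, and Googlebot is
# explicitly permitted to fetch script and stylesheet files
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
```

Removing the Disallow lines alone is usually sufficient; the explicit Allow directives simply make the intent unambiguous.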
If any of these lines are present, they should be removed, because they prevent Googlebot from fetching the site's scripts and stylesheets and can therefore hurt how Google renders and ranks the site.

After this has been done, the next step is to run the website through the Fetch as Google tool in the Search Console, whose Fetch and Render option can confirm whether the problem has been fixed.

The robots.txt Tester in Google's Search Console can also be used to check for other crawling problems. Crawling is the process by which Googlebot discovers new and updated pages in order to add them to the Google index.
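To see why rules like "Disallow: /*.js$" block these assets, the pattern matching Googlebot applies can be sketched in a few lines of Python. This is an illustrative re-implementation, not Google's actual code, and note that Python's standard urllib.robotparser does not understand these wildcard extensions, which is why a small custom matcher is shown instead.

```python
import re

def pattern_blocks(pattern: str, path: str) -> bool:
    """Return True if a Google-style Disallow pattern matches a URL path.

    In Google's robots.txt syntax, "*" matches any run of characters
    and a trailing "$" anchors the pattern to the end of the path.
    """
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    # Translate the pattern into a regular expression.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    regex = "^" + regex + ("$" if anchored else "")
    return re.search(regex, path) is not None

# The rules the article tells readers to look for:
blocked = ["/*.js$", "/*.inc$", "/*.css$", "/*.php$"]

print(any(pattern_blocks(p, "/assets/site.css") for p in blocked))  # True
print(any(pattern_blocks(p, "/index.html") for p in blocked))       # False
```

Because "/*.css$" expands to "any path ending in .css", every stylesheet on the site is hidden from the crawler, which is exactly the situation the warning describes.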

More information on this topic is available in the article 'Google Bot Cannot Access Your Javascript And CSS Files'.

Failure to make these changes, needless to say, can have far-reaching and adverse consequences for a business.

Consequences of low search engine ranking

One of the repercussions of a poor search engine ranking is reduced website traffic and, with it, reduced business revenue. Marketing experts have established that there is a direct correlation between search engine ranking and how consumers rate a product or service. In simple terms, this means that a poorly ranked website struggles to attract visitors and attain profitability.

Another consequence of a poor search engine ranking is that it reflects negatively on the image of the website or business among consumers, suppliers, and the general public. This can reduce the propensity to visit the site or purchase the products it promotes.

Reduced profitability, low sales, and a negative image can in turn cause many other problems, such as the failure to meet important obligations like paying staff wages and taxes.

In conclusion, the Googlebot warning was a wakeup call for websites to maintain high search engine visibility or face regrettable financial consequences.