Google: "Blocked by robots.txt"

Recently, Google has become more transparent regarding what they show in Google Search Console.

Users have reported seeing an increase in errors, specifically "Blocked by robots.txt".


What is robots.txt?

It's simply a text file, uploaded to your website, that tells web crawlers such as Google and Bing which portions of the website they are allowed to visit.
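A robots.txt file is just a short list of plain-text directives. A minimal sketch is below; the paths shown are illustrative examples only, as your platform generates its own rules:

```
# Applies to all crawlers (Googlebot, Bingbot, etc.)
User-agent: *

# Block the cart and checkout sections of the store
Disallow: /cart/
Disallow: /checkout/
```

Crawlers request this file from the root of your domain (e.g. yourstore.com/robots.txt) before crawling, and skip any path matching a Disallow rule.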

Which portions of the website does robots.txt tell crawlers not to visit?

There are quite a few, but common ones you'll see as errors or warnings in Google are:

Stock Notification

PayPal Express Checkout

Product Feeds e.g. Google Feeds

Customer Account: Forgot Password

Certain pages with forms

Customer Account: Logout

Customer Account: Dashboard

Preview Store URLs

Cart Service

All Cart Pages

All Customer Account pages

The Checkout

All Checkout pages

Confirm Payment Page

Unsubscribe from Newsletter

Review Product

Cart Page

Start of Checkout page

Checkout Form page

Why do we block these pages?

We typically do not want Google and other crawlers to crawl these pages and rank them in search results.

For example, when a customer is on your store's checkout, they have a specific URL for their checkout page, such as /cart/*. If Google crawled this and ranked it in its search results, that link would be public, and anyone could visit the checkout and see what's in that customer's cart.

These pages offer zero SEO value to your store, and if they were ranked in Google you'd see a large increase in bounce rate.

What should you do with the errors/warnings?

In general, nothing.

It doesn't hurt your SEO or ranking at all. The rest of your store will be crawled as normal, and the pages and products that you do want visible in search results will be.
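If you want to double-check that the right URLs are blocked, you can test them against your robots.txt rules. Here is a sketch using Python's standard urllib.robotparser; the example.com domain and the paths are placeholder assumptions, not your store's actual rules:

```python
# Check which store URLs a robots.txt allows crawlers to visit.
from urllib.robotparser import RobotFileParser

# Placeholder rules; substitute the contents of your own robots.txt.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Cart and checkout URLs are blocked for all crawlers...
print(parser.can_fetch("Googlebot", "https://example.com/cart/abc123"))      # False
print(parser.can_fetch("Googlebot", "https://example.com/checkout/"))        # False

# ...while normal product pages remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
```

A blocked URL returning False here corresponds exactly to the "Blocked by robots.txt" status you see in Google Search Console; a False for a cart or checkout page is expected and harmless.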
