Google: "Blocked by ROBOTS.TXT"
Recently, Google has become more transparent about what it shows in Google Search Console. Users have reported seeing an increase in errors, specifically "Blocked by robots.txt".
A robots.txt file is simply a text file uploaded to your website that tells web crawlers, such as Google and Bing, which portions of the website they are allowed to visit.
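To make that concrete, here is a minimal, hypothetical sketch of what such a file can look like (the paths shown are illustrative examples, not necessarily your store's actual rules):

```
# Hypothetical robots.txt excerpt - illustrative paths only
User-agent: *        # these rules apply to all crawlers (Google, Bing, etc.)
Disallow: /cart/     # don't crawl anything under /cart/
Disallow: /checkout  # don't crawl the checkout
```

Any path not matched by a `Disallow` rule remains crawlable by default.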
Quite a few URL paths are blocked by default, but the common ones you'll see flagged as errors or warnings in Google Search Console are:
| URL Path | Page on store |
|---|---|
| /notify | Stock Notification |
| /ExpressCheckout | PayPal Express Checkout |
| /Feeds | Product Feeds, e.g. Google Feeds |
| /ForgotPwd | Customer Account: Forgot Password |
| /Forms | Certain pages with forms |
| /logout | Customer Account: Logout |
| /MyAccount | Customer Account: Dashboard |
| /preview/ | Preview Store URLs |
| /CartService | Cart Service |
| /cart/* | All Cart Pages |
| /account/* | All Customer Account pages |
| /checkout | The Checkout |
| /checkout/* | All Checkout pages |
| /confirm/* | Confirm Payment page |
| /unsubscribe/* | Unsubscribe from Newsletter |
| /review/* | Review Product |
| /SidebySideCartView | Cart Page |
| /SidebySideCheckout | Start of Checkout page |
| /SidebySideCheckoutForm | Checkout Form page |
| /SidebySideConfirm | Confirm Payment page |

We typically do not want Google and other crawlers to crawl these pages or rank them in search results.

For example, a customer at your store's checkout will have a specific URL for their checkout page, such as /cart/*. If Google crawled that URL and ranked it in its search results, the link would be public, and anyone could visit the checkout and see what's in that customer's cart.

These pages offer zero SEO value to your store, and if they were ranked in Google you'd see a large increase in bounce rate.

In general, you don't need to do anything about these errors. They don't hurt your SEO or ranking at all. The rest of your store will be crawled as normal, and the pages and products that you do want to be visible on search engines will be.
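If you'd like to verify this behaviour yourself, one option is a quick check with Python's standard-library `urllib.robotparser`. This is just a sketch: `www.example.com` and the two paths below are placeholders for your own store's domain and URLs.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain - substitute your own store's URL
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt file

# A blocked path from the table above: expect False
print(parser.can_fetch("Googlebot", "https://www.example.com/checkout"))

# An ordinary product page (hypothetical path): expect True
print(parser.can_fetch("Googlebot", "https://www.example.com/product/example"))
```

If the blocked paths return `False` and your normal pages return `True`, the "Blocked by robots.txt" entries in Search Console are working as intended.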