robots.txt
robots.txt is a text file at the root of your domain (yourstore.com/robots.txt) that tells search engine crawlers which pages they can and cannot access. It uses Disallow and Allow directives to control crawl behavior. Shopify generates a default robots.txt automatically.
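To make the directives concrete, here is a short illustrative excerpt in the spirit of a Shopify default file (not the exact file Shopify generates — the real default is longer):

```text
User-agent: *
Disallow: /admin
Disallow: /checkout
Allow: /

Sitemap: https://yourstore.com/sitemap.xml
```

Disallow tells crawlers to skip a path prefix, Allow carves out exceptions, and the Sitemap line points crawlers at your sitemap.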
Why It Matters for Shopify Stores
For Shopify stores, robots.txt prevents search engines from crawling admin pages, checkout, API endpoints, and thank-you pages — pages you don't want indexed. Accidentally blocking important pages (like /collections or /products) in robots.txt can cause catastrophic drops in organic traffic. The Shopify-generated default is usually correct, but custom robots.txt modifications for SEO apps or international setups can introduce errors that block Google from key content.
How to Check Your Store
Visit yourstore.com/robots.txt in a browser to see your current file. Check that /collections, /products, /pages, and /blogs are not disallowed. Use Google Search Console's URL Inspection tool to verify important pages are crawlable.
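If you want to script this check rather than eyeball the file, Python's standard-library robots.txt parser can tell you whether a path is crawlable. A minimal sketch, assuming a hypothetical store domain and a simplified robots.txt pasted inline (in practice you would paste your store's actual file, or fetch it with RobotFileParser.set_url and read):

```python
from urllib.robotparser import RobotFileParser

# Simplified, illustrative robots.txt — replace with your store's actual file.
robots_txt = """\
User-agent: *
Disallow: /admin
Disallow: /checkout
Allow: /

Sitemap: https://yourstore.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages that must stay crawlable for organic traffic.
for path in ["/collections", "/products", "/pages", "/blogs"]:
    allowed = parser.can_fetch("*", f"https://yourstore.com{path}")
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```

If any of those key paths prints BLOCKED, that is the kind of misconfiguration that can cause the traffic drops described above.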
How to Fix It
In Shopify, go to Online Store > Preferences > SEO > robots.txt template. Only modify this if you have a specific reason. Never disallow /products, /collections, or /blogs unless you specifically want those pages excluded from Google.
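If you do have a specific reason to customize, Shopify's template is written in Liquid and the safest pattern is to keep the defaults and append your extra rule, rather than replacing the file wholesale. A hedged sketch based on Shopify's documented customization pattern — the /internal-search-test path is a hypothetical example, not a recommendation:

```liquid
{%- comment -%}
  Keep all of Shopify's default groups and rules,
  then append one extra Disallow for the general user agent.
{%- endcomment -%}
{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /internal-search-test' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Because the loop reproduces every default group and rule, you cannot accidentally delete the protections Shopify ships, and your custom rule is visibly additive.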
Check your store's robots.txt with our free tools
Get a full audit across all 6 performance categories — including SEO — in under 60 seconds.
Run a Free Store Audit