Robots.txt

Configure search engine crawler access rules to control how search bots interact with your store

What is Robots.txt?

Robots.txt is a standard plain-text file, served at the root of your site, that tells search engine crawlers which pages and resources they can or cannot access on your Shopify store. It acts as a gatekeeper, controlling how search engines like Google, Bing, and others interact with your site.

Proper robots.txt configuration helps prevent indexing of duplicate content, admin pages, or low-value pages while ensuring important product and content pages are accessible. This improves crawl efficiency and protects your SEO by directing search engine resources to pages that matter most.
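
The file itself uses a simple directive syntax. The snippet below is a simplified illustration in the spirit of Shopify's defaults, not the exact file your store serves; the domain is a placeholder:

    User-agent: *
    Disallow: /admin
    Disallow: /cart
    Disallow: /checkout
    Allow: /products/
    Sitemap: https://yourstore.myshopify.com/sitemap.xml

Each User-agent line names a crawler (* means all), and the Allow and Disallow lines beneath it grant or block access to path prefixes.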

Key Benefits

Control Search Engine Access

Direct crawlers away from duplicate or low-value pages to protect SEO

Improve Crawl Efficiency

Help search engines focus on your most important product and content pages

Boost Google Images

Enable special rules to improve product image indexing in Google Images

Custom Rules Management

Create specific rules for different search engines with an easy-to-use interface

Main Features

Comprehensive robots.txt management with visual editor and testing tools. Control crawler access with basic or advanced custom rules for precise optimization.

View Current Rules: See active robots.txt configuration

Edit Rules: Modify crawler access rules

Test URLs: Verify URL accessibility

Restore Default: Reset to Shopify defaults

Basic Rules Toggle

Enable standard Shopify rules

Google Images Boost

Optimize image crawler access (see the sketch after this list)

Custom Rules Table

View and manage all custom rules

Add Custom Rules

Create engine-specific rules

Delete Rules

Remove unwanted rules

Video Tutorial

Learn with guided walkthrough
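
The exact directives the Boost Google Images option writes are not documented here, but a permissive group for Google's image crawler generally looks like this (illustrative only):

    User-agent: Googlebot-Image
    Allow: /

Googlebot-Image is the user-agent Google uses for image crawling, so this group affects image indexing without changing the rules that apply to the main Googlebot.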

Getting Started

Navigate to Tools › Robots.txt

1. View Current Configuration

Click View Robots.txt to see your store's current crawler access rules

2. Enable Basic Protection

Toggle Basic Rules Only for standard Shopify crawler settings, or enable Boost Google Images for enhanced image indexing

3. Add Custom Rules

Select a search engine, choose Allow or Disallow, enter a path, and click Add Custom Rule for advanced control

4. Test and Verify

Use Test a URL to verify that important pages remain accessible to search engines (a scripted alternative is sketched below)
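
For checks outside the app, Python's standard urllib.robotparser answers the same allowed-or-blocked question against your live file. A minimal sketch, assuming a placeholder store domain and hypothetical example paths:

    from urllib import robotparser

    # Placeholder domain; substitute your own store
    ROBOTS_URL = "https://yourstore.myshopify.com/robots.txt"

    rp = robotparser.RobotFileParser()
    rp.set_url(ROBOTS_URL)
    rp.read()  # download and parse the live robots.txt

    # Ask whether a given crawler may fetch a given URL
    checks = [
        ("Googlebot", "https://yourstore.myshopify.com/products/example"),
        ("Googlebot", "https://yourstore.myshopify.com/admin"),
    ]
    for agent, url in checks:
        verdict = "allowed" if rp.can_fetch(agent, url) else "blocked"
        print(f"{agent} -> {url}: {verdict}")

With Shopify-style defaults, the product URL should print allowed and the admin URL blocked; if the results surprise you, revisit your rules before saving.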

Important Settings

Basic Rules Only

Enable standard Shopify rules to protect admin pages and system resources from crawling

Boost Google Images

Add special rules to improve product image visibility in Google Images search results

Search Engine Selection

Target specific crawlers (Google, Bing, etc.) or use the wildcard (*) to cover all search engines

Rule Type

Choose between Allow (grant access) and Disallow (block access) for each crawler rule; the example below shows both
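
For illustration, here is how a wildcard group and an engine-specific group coexist in robots.txt; the paths are hypothetical:

    # Applies to every crawler
    User-agent: *
    Disallow: /search

    # Applies only to Bing's crawler
    User-agent: Bingbot
    Disallow: /collections/clearance

A crawler obeys the group with the most specific matching User-agent, so Bingbot follows its own group and ignores the wildcard one.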

CAUTION

Use With Caution

Can Block Important Pages

Incorrect Disallow rules can prevent search engines from indexing your products and content (see the example below). Always test URLs before saving to avoid blocking critical pages; mistakes here can significantly harm search rankings.

Changes Affect All Crawlers

Modified robots.txt rules apply immediately to all search engine crawlers. Overly restrictive rules can reduce organic traffic and product visibility. Use the Restore Default option if issues occur.
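
Disallow matches URLs by path prefix, so a single careless line can block far more than intended. This hypothetical rule would hide your entire catalog from every crawler:

    User-agent: *
    Disallow: /products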

Verify Your Implementation

Test a URL (In-App)

Use the built-in Test a URL button to verify that specific URLs are accessible to search engines before saving rules

Google Robots.txt Tester

Use Google Search Console's robots.txt testing tool to validate your configuration

Direct File Access

Visit yourstore.myshopify.com/robots.txt in your browser to view the live robots.txt file (or fetch it with the short script below)
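
As a scripted alternative to the browser check, a few lines of standard-library Python fetch and print the live file; the domain is a placeholder:

    import urllib.request

    # Placeholder domain; substitute your own store
    with urllib.request.urlopen("https://yourstore.myshopify.com/robots.txt") as resp:
        print(resp.read().decode("utf-8"))

Reviewing the raw file after each change is the quickest way to confirm that your saved rules actually reached the live store.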
