---
title: Creating and customizing a robots.txt file
summary: null
url: >-
  https://www.fastly.com/documentation/guides/full-site-delivery/responses/creating-and-customizing-a-robots-file
---


The robots.txt file tells web robots how to crawl webpages on your website. You can use the Fastly control panel to create and configure a robots.txt file. If you follow the instructions in this guide, Fastly will serve the robots.txt file from cache so the requests won't hit your origin.

## Creating a robots.txt file

To create and configure your robots.txt file, follow the steps below:

1. <Partial name='step-login' inline />
1. <Partial name='step-select-service' inline />
1. <Partial name='step-click-edit' inline />
1. <Partial name='step-click-content-tab' inline />
1. Click the **robots.txt** switch to enable the robots.txt response.

   ![the robots.txt quick config](/img/robots-txt-quick-config.png)

1. In the **TXT Response** field, customize the response for the robots.txt file.
1. Click **Save** to save the response.
1. <Partial name='step-activate-deploy' inline />
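
The **TXT Response** field takes the literal body of the robots.txt file. As a sketch (the paths here are illustrative; adjust them for your site), a response that blocks all crawlers from a directory and a single file might look like:

```text
User-agent: *
Disallow: /tmp/*
Disallow: /foo.html
```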

## Manually creating and customizing a robots.txt file

If you need finer control over the robots.txt response, follow the steps below to manually create the synthetic response and the condition that triggers it:

1. <Partial name='step-login' inline />
1. <Partial name='step-select-service' inline />
1. <Partial name='step-click-edit' inline />
1. <Partial name='step-click-content-tab' inline />
1. Click **Set up advanced response**.

   ![a synthetic response dialog](/img/new-synthetic-response-robots-txt.png)

1. Fill out the **Create a synthetic response** fields as follows:
   * In the **Name** field, enter a descriptive name (for example, `robots.txt`).
   * Leave the **Status** menu set at its default `200 OK`.
   * In the **MIME Type** field, enter `text/plain`.
   * In the **Response** field, enter at least one User-agent string and one Disallow string. For instance, the example shown in the dialog above tells all user agents (via the `User-agent: *` string) that they are not allowed to crawl anything in the `/tmp/` directory or the `/foo.html` file (via the `Disallow: /tmp/*` and `Disallow: /foo.html` strings, respectively).
1. Click **Create**.
1. Click the **Attach a condition** link to the right of the newly created response.

   ![a req.url condition for robots](/img/new-condition-robots-txt-req-url.png)

1. Fill out the **Create a condition** fields as follows:
   * From the **Type** menu, select the type of condition (in this case, `Request`).
   * In the **Name** field, enter a meaningful name for your condition (e.g., `Robots`).
   * In the **Apply if** field, enter the VCL logical expression that determines whether the condition evaluates to true. In this case, the expression should match the path of your robots.txt file (e.g., `req.url.path == "/robots.txt"`).
1. Click **Save**.
1. <Partial name='step-activate-deploy' inline />
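
Under the hood, the response and condition above correspond roughly to VCL like the following. This is a simplified sketch for illustration, not the exact VCL Fastly generates (the internal status code and subroutine structure will differ):

```vcl
sub vcl_recv {
  # Condition: trigger the synthetic response for robots.txt requests
  if (req.url.path == "/robots.txt") {
    error 900 "robots";
  }
}

sub vcl_error {
  # Synthetic response: 200 OK with a text/plain robots.txt body
  if (obj.status == 900) {
    set obj.status = 200;
    set obj.response = "OK";
    set obj.http.Content-Type = "text/plain";
    synthetic {"User-agent: *
Disallow: /tmp/*
Disallow: /foo.html
"};
    return(deliver);
  }
}
```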

> **HINT:** For an in-depth explanation of creating custom responses, check out our [Responses Tutorial](/guides/full-site-delivery/responses/responses-tutorial).

## Why can't I customize my robots.txt file with global.prod.fastly.net?

You can test how your production site will perform on Fastly's services by appending `.global.prod.fastly.net` to your domain (for example, `www.example.com.global.prod.fastly.net`) in a browser or in a curl command.

To prevent Google from accidentally crawling this test URL, Fastly serves an internal robots.txt file that instructs Google's web crawlers to ignore all pages on hostnames ending in `.prod.fastly.net`.

![a default robots.txt file](/img/robots-file-default.png)
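
This internal file is effectively a disallow-all policy. Conceptually, it is equivalent to:

```text
User-agent: *
Disallow: /
```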

This internal robots.txt file cannot be customized in the Fastly control panel until you have created a CNAME DNS record pointing your domain to `global.prod.fastly.net`.

## Related content

* [Response object API documentation](/reference/api/vcl-services/response-object/)
