robots.md - hugo - [fork] hugo port for 9front
(HTM) git clone https://git.drkhsh.at/hugo.git
---
title: robots.txt template
linkTitle: robots.txt templates
description: Hugo can generate a customized robots.txt in the same way as any other template.
categories: []
keywords: []
weight: 180
aliases: [/extras/robots-txt/]
---

To generate a robots.txt file from a template, change the [site configuration]:

{{< code-toggle file=hugo >}}
enableRobotsTXT = true
{{< /code-toggle >}}

By default, Hugo generates robots.txt using an [embedded template].

```text
User-agent: *
```

Search engines that honor the Robots Exclusion Protocol will interpret this as permission to crawl everything on the site.

## robots.txt template lookup order

You can override the embedded template with a custom template. Hugo selects the template using this lookup order:

1. `/layouts/robots.txt`
1. `/themes/<THEME>/layouts/robots.txt`
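
For example, you can install a custom template at the first lookup location, which takes precedence over any theme-provided template. A minimal sketch (the `Disallow` path and `Sitemap` line are illustrative, not required):

```shell
# sketch: install a project-level robots.txt template,
# which takes precedence over a theme's layouts/robots.txt
mkdir -p layouts
cat > layouts/robots.txt <<'EOF'
User-agent: *
Disallow: /private/
Sitemap: {{ "sitemap.xml" | absURL }}
EOF
```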

## robots.txt template example

```text {file="layouts/robots.txt"}
User-agent: *
{{ range .Pages }}
Disallow: {{ .RelPermalink }}
{{ end }}
```

This template creates a robots.txt file with a `Disallow` directive for each page on the site. Search engines that honor the Robots Exclusion Protocol will not crawl any page on the site.
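
To disallow only selected pages instead of everything, a variation might range over the same pages and filter on a front matter parameter. A sketch, assuming a hypothetical `noindex` front matter parameter:

```text {file="layouts/robots.txt"}
User-agent: *
{{ range .Pages }}
{{- if .Params.noindex }}
Disallow: {{ .RelPermalink }}
{{- end }}
{{ end }}
```

Pages that set `noindex: true` in their front matter would then be excluded, while everything else remains crawlable.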

> [!note]
> To create a robots.txt file without using a template:
>
> 1. Set `enableRobotsTXT` to `false` in the site configuration.
> 1. Create a robots.txt file in the `static` directory.
>
> Remember that Hugo copies everything in the [`static` directory][static] to the root of `publishDir` (typically `public`) when you build your site.

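The steps in the note can be sketched as follows (file contents are illustrative):

```shell
# sketch: bypass templating entirely with a static robots.txt
# (assumes enableRobotsTXT = false in the site configuration)
mkdir -p static
printf 'User-agent: *\nDisallow:\n' > static/robots.txt
# at build time, hugo copies static/robots.txt to <publishDir>/robots.txt
```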
[embedded template]: {{% eturl robots %}}
[site configuration]: /configuration/
[static]: /getting-started/directory-structure/