Hugo module which creates robots.txt using the Dark Visitors API.


# hugo-dark-visitors

Give AI company scraper bots a gentle "no" with this Hugo module. It uses the Dark Visitors API to import the latest robots.txt rules.

Requires Hugo 0.141.0 or later.

## Installing

1. Grab your Dark Visitors API key from the Projects page
2. Set the `HUGO_DARKVISITORS` environment variable to your API key
3. Import the module
4. Tell Hugo to generate robots.txt
5. Configure API options (optional)
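For step 2, a minimal sketch of setting the variable in a POSIX shell before building (the key value below is a placeholder, not a real key):

```shell
# Export the API key so Hugo can read it at build time.
# Replace the placeholder with your actual Dark Visitors key.
export HUGO_DARKVISITORS="your-api-key-here"
```

Then run `hugo` as usual in the same shell session.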

Import the module in your Hugo config:

```yaml
module:
  imports:
    - path: github.com/lkhrs/hugo-dark-visitors
```

Tell Hugo to generate robots.txt by adding this line to your Hugo config:

```yaml
enableRobotsTXT: true
```

## API options

By default, the module uses the "AI Data Scraper" category. The API supports more categories, which you can request by adding them to your Hugo config:

```yaml
params:
  darkVisitors:
    - AI Assistant
    - AI Data Scraper
    - AI Search Crawler
```
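If your site is configured in TOML instead of YAML, the equivalent setting (a mechanical translation, not taken from the module's own docs) would be:

```toml
[params]
darkVisitors = ["AI Assistant", "AI Data Scraper", "AI Search Crawler"]
```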

## Customizing robots.txt

You can override the template provided by this module and use a partial to add the rules after your custom directives. Add `layouts/robots.txt` to the root of your Hugo site and include the partial inside it:

```go-html-template
{{ partial "dark-visitors.html" . }}
```

Add your custom directives before or after the partial.
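For example, a `layouts/robots.txt` that adds a custom rule before the Dark Visitors rules might look like this (the `Disallow` path is purely illustrative):

```go-html-template
User-agent: *
Disallow: /private/

{{ partial "dark-visitors.html" . }}
```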

## Reducing API calls

Here are two methods for reducing API calls in Hugo:

  1. Hugo's file cache (see Configure file caches)
  2. Only import the module on production builds (see Configuration directory and Issue #1)
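As a sketch of these two options: assuming the module fetches the rules with Hugo's remote resource functions (which use the `getresource` file cache), raising the cache age reduces repeat calls; and moving the import into a production-only config directory skips the API call in development builds entirely. Both snippets below are illustrative, not taken from the module's docs:

```yaml
# hugo.yaml: cache remote resources for a day
caches:
  getresource:
    maxAge: 24h
```

```yaml
# config/production/hugo.yaml: import only for production builds
module:
  imports:
    - path: github.com/lkhrs/hugo-dark-visitors
```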
