Django Env Robots (.txt)
Control robots.txt files from environment variables and templates.
Serve a different robots.txt from your production | stage | etc. servers by setting environment variables. Rules are managed via templates. By default it excludes all robots entirely.
Installation
Install from PyPI:
pip install django-env-robots
Then add the following to your project's INSTALLED_APPS:

'django_env_robots',
Usage
settings.py
Set the following:
SERVER_ENV
identifies the nature of the server and thus the robots.txt template that will be used.
ROBOTS_SITEMAP_URLS
a list of relative URLs to your sitemap(s).
E.g:
SERVER_ENV = 'production'
ROBOTS_SITEMAP_URLS = ['/sitemap.xml', '/other_sitemap.xml']
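In practice you would usually derive SERVER_ENV from the process environment so each server serves a different file; a minimal sketch (reading an environment variable also named SERVER_ENV is an assumption here, use whatever your deployment exports):

```python
import os

# Pick the robots template per server from the process environment.
# Assumption: your deployment exports SERVER_ENV; "production" is used
# as a fallback purely for illustration.
SERVER_ENV = os.environ.get("SERVER_ENV", "production")
ROBOTS_SITEMAP_URLS = ["/sitemap.xml", "/other_sitemap.xml"]
```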
urls.py
from django.urls import include, path
from django_env_robots import urls as robots_urls
...
urlpatterns = [
    path("robots.txt", include(robots_urls)),
]
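Conceptually, the included view renders the template named after SERVER_ENV. A sketch of that mapping (the helper below is illustrative, not the app's actual internals):

```python
def robots_template_name(server_env: str) -> str:
    """Map a SERVER_ENV value to the robots template path,
    e.g. "production" -> "robots/production.txt"."""
    return f"robots/{server_env}.txt"

print(robots_template_name("production"))  # robots/production.txt
```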
robots templates
Create corresponding template files for each SERVER_ENV you will be using.
These live in your project's templates directory, in a robots subfolder.
For example, if SERVER_ENV can be production or stage, then create:
templates/robots/production.txt
templates/robots/stage.txt
e.g:
User-agent: *
Disallow: /admin/*
{% for sitemap_url in sitemap_urls %}Sitemap: {{ sitemap_url }}
{% endfor %}
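With ROBOTS_SITEMAP_URLS = ['/sitemap.xml', '/other_sitemap.xml'], that template renders output along these lines (whether the app expands the URLs to absolute ones before passing them to the template is not shown here, so they appear as configured):

```
User-agent: *
Disallow: /admin/*
Sitemap: /sitemap.xml
Sitemap: /other_sitemap.xml
```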
Other considerations
A robots.txt served from a WhiteNoise public directory will take precedence over this app. That is due to WhiteNoise's middleware behaviour; quite correct, but watch out for it.