Control robots.txt files from environment variables and templates.
Django Env Robots (.txt)
Serve a different robots.txt from your production, stage, etc. servers by setting an environment variable. Rules are managed via templates. By default, robots are excluded entirely.
Installation
Install from PyPI:
pip install django-env-robots
Then add the following to your project's INSTALLED_APPS:

'django_env_robots',
Usage
settings.py
Set the following:

SERVER_ENV identifies the nature of the server and thus selects the robots.txt template that will be used. E.g.:
SERVER_ENV = 'production'
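In a real deployment you will usually read this from an environment variable rather than hard-coding it. A minimal sketch, assuming the environment variable is also named SERVER_ENV and that 'production' is an acceptable fallback (both are assumptions, not part of the package):

```python
import os

# Read the server role from the environment; fall back to 'production'
# if the variable is unset (variable name and fallback are assumptions).
SERVER_ENV = os.environ.get("SERVER_ENV", "production")
```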
urls.py
from django.urls import include, path
from django_env_robots import urls as robots_urls
...
urlpatterns = [
    path("robots.txt", include(robots_urls)),
]
robots templates
Create a corresponding template file for each SERVER_ENV you will be using. These live in your project's templates directory, in a robots subfolder.

For example, if SERVER_ENV can be production or stage, then create:

templates/robots/production.txt
templates/robots/stage.txt

e.g. templates/robots/production.txt:
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www2.example.com/sitemap.xml
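For a non-production environment you will typically want to block all crawling. For example, templates/robots/stage.txt might contain (this rule set is illustrative, not shipped by the package):

```
User-agent: *
Disallow: /
```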
Other considerations

A robots.txt served from a Whitenoise public directory will take precedence over this app. That is a consequence of Whitenoise's middleware behaviour (quite correct, but watch out for it).
Hashes for django_env_robots-0.0.5-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | dc8ff6418197ee64c43b7062a1d8fc0045a7cd161e90056f2145661507a9b2f6
MD5 | 9f6e7c425c6f2b89e81ab51c5878170d
BLAKE2b-256 | f5e8025fb002b19faa56aad69553c9907166e78cefebcbe52c33881d06b1f0e0