Control robots.txt files from environment variables and templates.
# Django Env Robots (.txt)

Serve a different robots.txt from your production, stage, etc. servers by setting an environment variable. Rules are managed via templates. By default it excludes robots entirely.
## Installation

Install from PyPI:

```shell
pip install django-env-robots
```

Then add the following to your project's `INSTALLED_APPS`:

```python
'django_env_robots',
```
## Usage

### settings.py

Set `SERVER_ENV`. It identifies the nature of the server and thus the robots.txt template that will be used. E.g.:

```python
SERVER_ENV = 'production'
```
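In practice `SERVER_ENV` is usually read from the process environment rather than hard-coded. A minimal sketch — the environment-variable name and the fallback value are illustrative choices, not something the package mandates:

```python
import os

# Read the deployment environment injected by your hosting setup
# (Docker, systemd, PaaS config, etc.). The "stage" fallback is an
# assumed default for this sketch; pick whatever suits your deploy.
SERVER_ENV = os.environ.get("SERVER_ENV", "stage")
```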
### urls.py

```python
from django.urls import include, path

from django_env_robots import urls as robots_urls

urlpatterns = [
    path("robots.txt", include(robots_urls)),
]
```
### robots templates

Create a corresponding template file for each `SERVER_ENV` you will be using. These live in your project's `templates` directory, in a `robots` subfolder.

For example, if `SERVER_ENV` can be `production` or `stage`, create:

- `templates/robots/production.txt`
- `templates/robots/stage.txt`
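Since a staging server usually should not be indexed at all, its template will often just deny everything (standard robots.txt syntax, shown here as an illustrative example):

```text
User-agent: *
Disallow: /
```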
E.g. a `production.txt` might look like:

```text
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www2.example.com/sitemap.xml
```
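The dispatch the app performs can be pictured as a small helper that maps the environment to a template name and falls back to excluding robots entirely. This is a hypothetical sketch of the idea, not the package's actual API:

```python
def select_robots_template(server_env, available=("production", "stage")):
    """Map a SERVER_ENV value to a robots template path.

    Returns None when no template matches, in which case the caller
    should serve a deny-all robots.txt, mirroring the documented
    default of excluding robots entirely. All names here are
    illustrative, not part of django-env-robots.
    """
    if server_env in available:
        return f"robots/{server_env}.txt"
    return None

# Fallback body served when no template matches.
DENY_ALL = "User-agent: *\nDisallow: /\n"
```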
## Other considerations

A robots.txt served from a Whitenoise public directory will take precedence over this app. That is a consequence of Whitenoise's middleware behaviour — quite correct, but watch out for it.