Control robots.txt files from environment variables and templates.
Reason this release was yanked: clumsy implementation
Project description
Django Env Robots (.txt)
Serve a different robots.txt from your production, staging, etc. servers by setting environment variables. Rules are managed via templates.
Installation
Install from PyPI:
pip install django-env-robots
Then add the following to your project's INSTALLED_APPS:
'django_env_robots',
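In context, the app entry in settings.py might look like this (the surrounding apps are illustrative placeholders, not requirements of this package):

```python
# settings.py -- register the app; other entries shown are illustrative
INSTALLED_APPS = [
    "django.contrib.contenttypes",
    "django.contrib.staticfiles",
    "django_env_robots",  # serves robots.txt from per-environment templates
]
```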
Usage
settings.py
# robots
# SERVER_ENV selects which robots template is rendered.
# Env here is your project's environment-variable helper (e.g. django-environ).
SERVER_ENV = Env.get('SERVER_ENV', 'production')
ROBOTS_ROOT = os.path.join(BASE_DIR, 'robots')
ROBOTS_SITEMAP_URLS = Env.list('ROBOTS_SITEMAP_URLS', '/sitemap.xml')
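The templates themselves are ordinary robots.txt rule files kept under ROBOTS_ROOT, one per environment. A hypothetical layout, assuming each file is named after a SERVER_ENV value (file names and rules here are illustrative):

```text
# robots/production.txt — let crawlers in
User-agent: *
Disallow:

# robots/stage.txt — keep crawlers out
User-agent: *
Disallow: /
```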
urls.py
from django.urls import include, path
from django_env_robots import urls as robots_urls
...
urlpatterns = [
    path("robots.txt", include(robots_urls)),
]
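Conceptually, the view resolves the template name from the environment. A minimal sketch of that lookup, assuming the template is named after the SERVER_ENV value (robots_template_name is a hypothetical helper, not part of this app's API):

```python
import os

def robots_template_name(default_env="production"):
    # The SERVER_ENV environment variable picks which template under
    # ROBOTS_ROOT gets rendered, e.g. robots/stage.txt on a staging server.
    server_env = os.environ.get("SERVER_ENV", default_env)
    return f"robots/{server_env}.txt"
```

With SERVER_ENV=stage this would resolve robots/stage.txt; when the variable is unset, it falls back to the production rules.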
Other considerations
A robots.txt served from a WhiteNoise public directory will win over this app. That is because WhiteNoise's middleware answers the request before Django's URL routing runs - correct behaviour, but watch out for it.
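To illustrate why: WhiteNoise sits in the middleware stack and serves matching static files before the request ever reaches the URL resolver, so a static robots.txt shadows this app's view. A typical (abridged, illustrative) ordering:

```python
# settings.py -- abridged middleware stack, shown for illustration
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    # WhiteNoise serves files from its static/public directories here,
    # before the request reaches URL routing and the robots.txt view.
    "whitenoise.middleware.WhiteNoiseMiddleware",
]
```

Make sure no robots.txt file sits in WhiteNoise's root/public directory if you want this app to handle the URL.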
Hashes for django_env_robots-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | cc2fc48162e88a40119c92311cdc798abaa69d4fd107697c108e91e5800c4959
MD5 | 043a098ac21392bfd89c5361e8b5436a
BLAKE2b-256 | 88893ba2acfcb4a6ec61fafdb8a62eaaf52eaebd84a046d43bfa6e867eff2c55