Pure-Python robots.txt parser with support for modern conventions
Project description
Protego is a pure-Python robots.txt parser with support for modern conventions.
Install
To install Protego, simply use pip:
pip install protego
Usage
>>> from protego import Protego
>>> robotstxt = """
... User-agent: *
... Disallow: /
... Allow: /about
... Allow: /account
... Disallow: /account/contact$
... Disallow: /account/*/profile
... Crawl-delay: 4
... Request-rate: 10/1m                 # 10 requests every 1 minute
...
... Sitemap: http://example.com/sitemap-index.xml
... Host: http://example.co.in
... """
>>> rp = Protego.parse(robotstxt)
>>> rp.can_fetch("http://example.com/profiles", "mybot")
False
>>> rp.can_fetch("http://example.com/about", "mybot")
True
>>> rp.can_fetch("http://example.com/account", "mybot")
True
>>> rp.can_fetch("http://example.com/account/myuser/profile", "mybot")
False
>>> rp.can_fetch("http://example.com/account/contact", "mybot")
False
>>> rp.crawl_delay("mybot")
4.0
>>> rp.request_rate("mybot")
RequestRate(requests=10, seconds=60, start_time=None, end_time=None)
>>> list(rp.sitemaps)
['http://example.com/sitemap-index.xml']
>>> rp.preferred_host
'http://example.co.in'
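In a crawler these values are usually combined before each request. The sketch below is a minimal, hypothetical helper (the function name, default user agent, and robots.txt body are illustrative and not part of Protego's API) that consults can_fetch and honours any Crawl-delay before permitting a fetch:

import time

from protego import Protego

def allowed_after_delay(rp, url, user_agent="mybot"):
    # Hypothetical helper, not part of Protego: check the parsed robots.txt
    # and sleep for any declared crawl delay before allowing the fetch.
    if not rp.can_fetch(url, user_agent):
        return False
    delay = rp.crawl_delay(user_agent)
    if delay is not None:
        time.sleep(delay)
    return True

rp = Protego.parse("User-agent: *\nDisallow: /private\nCrawl-delay: 2\n")
allowed_after_delay(rp, "http://example.com/index.html")  # True, after a 2-second pause
allowed_after_delay(rp, "http://example.com/private")     # False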
Using Protego with Requests:
>>> from protego import Protego
>>> import requests
>>> r = requests.get("https://google.com/robots.txt")
>>> rp = Protego.parse(r.text)
>>> rp.can_fetch("https://google.com/search", "mybot")
False
>>> rp.can_fetch("https://google.com/search/about", "mybot")
True
>>> list(rp.sitemaps)
['https://www.google.com/sitemap.xml']
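Protego only needs the robots.txt body as a string, so any HTTP client works. As a sketch, here are the same lookups using only the standard library (urllib.request) instead of Requests; the results mirror the Requests example above and depend on the robots.txt Google serves at the time:

>>> from urllib.request import urlopen
>>> from protego import Protego
>>> body = urlopen("https://google.com/robots.txt").read().decode("utf-8")
>>> rp = Protego.parse(body)
>>> rp.can_fetch("https://google.com/search", "mybot")
False
>>> rp.can_fetch("https://google.com/search/about", "mybot")
True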
Comparison
The following table compares Protego to the most popular robots.txt parsers implemented in Python or featuring Python bindings:
|                          | Protego | RobotFileParser | Reppy  | Robotexclusionrulesparser |
|--------------------------|---------|-----------------|--------|---------------------------|
| Implementation language  | Python  | Python          | C++    | Python                    |
| Reference specification  |         |                 |        |                           |
| Wildcard support         | ✓       |                 | ✓      | ✓                         |
| Length-based precedence  | ✓       |                 | ✓      |                           |
| Performance              |         | +40%            | +1300% | -25%                      |
API Reference
Class protego.Protego:
Properties
- sitemaps {list_iterator}: An iterator over the sitemap URLs specified in robots.txt.
- preferred_host {string}: Preferred host specified in robots.txt.
Methods
- parse(robotstxt_body): Parse the given robots.txt body and return a new protego.Protego instance.
- can_fetch(url, user_agent): Return True if the user agent can fetch the URL, otherwise return False.
- crawl_delay(user_agent): Return the crawl delay specified for the user agent as a float. If nothing is specified, return None.
- request_rate(user_agent): Return the request rate specified for the user agent as a named tuple RequestRate(requests, seconds, start_time, end_time). If nothing is specified, return None.
- visit_time(user_agent): Return the visit time specified for the user agent as a named tuple VisitTime(start_time, end_time). If nothing is specified, return None (see the sketch below).
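As a quick illustration of the defaults described above, assuming a minimal robots.txt with only a Disallow rule, so that none of the optional directives are present:

>>> from protego import Protego
>>> rp = Protego.parse("User-agent: *\nDisallow: /private\n")
>>> rp.can_fetch("https://example.com/", "mybot")
True
>>> rp.can_fetch("https://example.com/private/data", "mybot")
False
>>> print(rp.crawl_delay("mybot"))
None
>>> print(rp.request_rate("mybot"))
None
>>> print(rp.visit_time("mybot"))
None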
Download files
Source Distribution
Protego-0.3.1.tar.gz (3.2 MB)

Built Distribution
Protego-0.3.1-py2.py3-none-any.whl (8.5 kB)
File details
Details for the file Protego-0.3.1.tar.gz.
File metadata
- Download URL: Protego-0.3.1.tar.gz
- Upload date:
- Size: 3.2 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e94430d0d25cbbf239bc849d86c5e544fbde531fcccfa059953c7da344a1712c |
| MD5 | 200c5f8947240a59ecee2b12efd26fd5 |
| BLAKE2b-256 | 8a12cab9fa77ff4e9e444a5eb5480db4b4f872c03aa079145804aa054be377bc |
File details
Details for the file Protego-0.3.1-py2.py3-none-any.whl.
File metadata
- Download URL: Protego-0.3.1-py2.py3-none-any.whl
- Upload date:
- Size: 8.5 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2fbe8e9b7a7dbc5016a932b14c98d236aad4c29290bbe457b8d2779666ef7a41 |
| MD5 | 68ec8dbe4fd0f1481eb2b8d1ca9ff839 |
| BLAKE2b-256 | 74efece78585a5a189d8cc2b4c2d2b92a0dc025f156a6501159b026472ebbedc |