Automatic robots.txt support for the requests library.
Project description
Currently just a proof of concept, this module is an extension to requests that adds automatic robots.txt support: requests for resources that a site's robots.txt disallows are refused.
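The package can be installed from PyPI; the distribution name here is taken from the source tarball listed under Download files below:

$ pip install requests-robotstxt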
How to use
Simply use RobotsAwareSession instead of the built-in requests.Session. If a resource is not allowed, a RobotsTxtDisallowed exception is raised.
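A minimal usage sketch; the requests_robotstxt import path is an assumption based on the package name, while RobotsAwareSession and RobotsTxtDisallowed are the names described above:

# Import path assumed from the package name; adjust if it differs.
from requests_robotstxt import RobotsAwareSession, RobotsTxtDisallowed

# Drop-in replacement for requests.Session.
session = RobotsAwareSession()

try:
    # The session consults the target site's robots.txt before fetching.
    response = session.get("https://example.com/some/page")
    print(response.status_code)
except RobotsTxtDisallowed:
    # Raised when robots.txt disallows the requested resource.
    print("Blocked by robots.txt")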
How do I run the tests?
The easiest way is to extract the source tarball and run:
$ python test/test_robotstxt.py
Change Log
0.1.0
Initial published version.
Download files
Source Distribution
File details
Details for the file requests-robotstxt-0.1.0.tar.gz.
File metadata
- Download URL: requests-robotstxt-0.1.0.tar.gz
- Upload date:
- Size: 3.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
File hashes
Algorithm | Hash digest
--- | ---
SHA256 | 502292aac0e7c2ef7de34921d4e59ff38a82b4412255f551f8f99d1718213563
MD5 | 620945ddd2fa5d81a09d0d6ac81bd5d8
BLAKE2b-256 | d2962c5d5d1420370b8f6193f562573623095245cc09e4820385b4e7d43540a5
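A downloaded tarball can be checked against the SHA256 digest above; a minimal sketch using Python's standard hashlib:

import hashlib

# Published SHA256 digest from the table above.
expected = "502292aac0e7c2ef7de34921d4e59ff38a82b4412255f551f8f99d1718213563"

with open("requests-robotstxt-0.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "Hash mismatch!")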