Package for buildout-based scrapy spider development
Project description
This package provides the core components for buildout-based scrapy spider development. Scrapy spider packages built this way can be installed, scheduled and processed by the MongoDB-based s01.worker daemon using the JSON-RPC proxy located in the s01.client package. The package also provides recipes which allow external files to be used as scrapy settings.
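A minimal sketch of what such a settings recipe could look like as a zc.buildout recipe follows; the class name ScrapySettingsRecipe, the settings-file option and the target file name are illustrative assumptions only and are not the recipe shipped with this package:

    import os
    import shutil

    class ScrapySettingsRecipe(object):
        """Illustrative zc.buildout recipe: copy an external file into the
        buildout part so it can be used as the scrapy settings module.
        Names and options are assumptions, not this package's API."""

        def __init__(self, buildout, name, options):
            self.buildout = buildout
            self.name = name
            self.options = options
            # directory of the buildout part this recipe manages
            options['location'] = os.path.join(
                buildout['buildout']['parts-directory'], name)

        def install(self):
            location = self.options['location']
            if not os.path.exists(location):
                os.makedirs(location)
            # copy the external settings file given in the part options
            target = os.path.join(location, 'settings.py')
            shutil.copyfile(self.options['settings-file'], target)
            # buildout removes the returned paths on uninstall
            return [target]

        update = install

In a buildout.cfg such a part would point the hypothetical settings-file option at a file kept outside the spider package, and buildout would copy it next to the spiders at install time.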
Changes
0.5.0 (unreleased)
removed unused dependencies and imports
initial release
0.0.7 (2011-01-02)
alpha version released for development and testing the tool chain
write logging.ERROR to sys.stderr where it can be read in the subprocess
0.0.6 (2010-12-31)
alpha version released for development and testing the tool chain
print logging.ERROR to stdout, which is required for error handling in the subprocess
0.0.5 (2010-12-29)
alpha version released for development and testing the tool chain
implemented a different scrapy item and field concept: use a field property instead of a dict-based item and field. Implemented ScrapyFieldProperty and a ScrapyItemBase class. Added tests showing how the scrapy item and field work, including converter and serializer (see the illustrative sketch after this release entry).
implemented a new extractor which can handle the new scrapy item and field concept
implemented several basic ScrapyFieldProperty converter methods
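The field-property idea behind ScrapyFieldProperty and ScrapyItemBase can be pictured with a minimal, purely illustrative sketch; the names FieldProperty, ItemBase and ProductItem below are placeholders and do not show this package's actual API:

    class FieldProperty(object):
        """Illustrative descriptor: a field defined as a property with an
        optional converter, instead of a dict-based scrapy Field."""

        def __init__(self, name, converter=None, default=None):
            self.name = name
            self.converter = converter
            self.default = default

        def __get__(self, instance, owner):
            if instance is None:
                return self
            return instance.__dict__.get(self.name, self.default)

        def __set__(self, instance, value):
            # converters normalize raw extracted values on assignment
            if self.converter is not None:
                value = self.converter(value)
            instance.__dict__[self.name] = value

    class ItemBase(object):
        """Base class for items whose fields are descriptors."""

    class ProductItem(ItemBase):
        title = FieldProperty('title', converter=str.strip)
        price = FieldProperty('price', converter=float)

    item = ProductItem()
    item.title = '  Example  '
    item.price = '9.99'
    assert item.title == 'Example' and item.price == 9.99

In this sketch the converter runs when a value is assigned, so an item carries already normalized data instead of raw dict entries.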
0.0.4 (2010-12-22)
alpha version released for development and testing the tool chain
removed the spider name from the crawl recipe
0.0.3 (2010-11-29)
alpha version released for development and testing the tool chain
fixed hex data parts in the settings content
use s01.worker as the default logging handler name
0.0.2 (2010-11-29)
alpha version released for development and testing the tool chain
added a settings recipe
0.0.1 (2010-11-21)
alpha version released for development and testing the tool chain