
Scrapy spider middleware to clean up query parameters in request URLs

Project description


This is a Scrapy spider middleware that cleans up the GET query parameters of request URLs at the spider's output, according to patterns provided by the user.
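
In essence, the middleware filters each request's query string, dropping or keeping parameters whose names match the configured regular expressions. A minimal standalone sketch of that idea (not the middleware's actual implementation), using only the Python standard library:

    import re
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    def clean_query(url, remove_pattern):
        # Drop every query parameter whose name matches remove_pattern.
        parts = urlsplit(url)
        params = parse_qsl(parts.query, keep_blank_values=True)
        kept = [(name, value) for name, value in params
                if not re.search(remove_pattern, name)]
        return urlunsplit(parts._replace(query=urlencode(kept)))

    print(clean_query('http://www.example.com/product.php?pid=135&cid=12&ttda=12',
                      r'cid|ttda'))
    # http://www.example.com/product.php?pid=135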

Installation

Install scrapy-querycleaner using pip:

$ pip install scrapy-querycleaner

Configuration

  1. Add QueryCleanerMiddleware by including it in SPIDER_MIDDLEWARES in your settings.py file:

    SPIDER_MIDDLEWARES = {
        'scrapy_querycleaner.QueryCleanerMiddleware': 100,
    }

    Here, priority 100 is just an example. Set its value depending on other middlewares you may have enabled already.

  2. Enable the middleware using either QUERYCLEANER_REMOVE or QUERYCLEANER_KEEP (or both) in your settings.py.
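
    For example, a minimal addition to settings.py could look like this (the pattern is purely illustrative; see Usage for the semantics):

    QUERYCLEANER_REMOVE = 'sessionid|utm_'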

Usage

At least one of the following settings needs to be present for the middleware to be enabled.

Supported settings

QUERYCLEANER_REMOVE

A pattern (regular expression) that a query parameter name must match for the parameter to be removed from the URL. (All other parameters are kept.)

QUERYCLEANER_KEEP

A pattern (regular expression) that a query parameter name must match for the parameter to be kept in the URL. (All other parameters are removed.)

You can combine both settings if some query parameter patterns should be kept and others removed.

The remove pattern takes precedence over the keep pattern.
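
For example, here is a sketch of the expected behaviour under that precedence rule (both patterns and the URL are illustrative):

    QUERYCLEANER_KEEP = 'id|session'
    QUERYCLEANER_REMOVE = 'session'

    # For http://www.example.com/item.php?id=1&session=abc :
    #   'id'      matches only the keep pattern        -> kept
    #   'session' matches both patterns; remove wins   -> removed
    # Expected result: http://www.example.com/item.php?id=1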

Example

Let’s suppose that the spider extracts URLs like:

http://www.example.com/product.php?pid=135&cid=12&ttda=12

and we want to keep only the pid parameter.

To achieve this, we can use either QUERYCLEANER_REMOVE or QUERYCLEANER_KEEP (the expected result is shown after the two options):

  • In the first case, the pattern would be cid|ttda:

    QUERYCLEANER_REMOVE = 'cid|ttda'
  • In the second case, pid:

    QUERYCLEANER_KEEP = 'pid'
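
With either setting, the URL above would be expected to leave the middleware with only pid remaining:

http://www.example.com/product.php?pid=135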

Which setting is best depends on the particular case, that is, on how the query filter will affect the other URLs the spider is expected to extract.

Download files

Download the file for your platform.

Source Distribution

scrapy-querycleaner-1.0.0.tar.gz (3.2 kB)

Built Distribution

scrapy_querycleaner-1.0.0-py2.py3-none-any.whl (3.8 kB)

File details

Details for the file scrapy-querycleaner-1.0.0.tar.gz.


File hashes

Hashes for scrapy-querycleaner-1.0.0.tar.gz:

SHA256: 3f3fdc7558076e7a0dfdadb803d42661c372b75f05c436a7a721e28d16ec5d5a
MD5: e38bc7780bb86d577ebdb5f3676a9919
BLAKE2b-256: 1915ede0e13684f7eb1d685e3428c78899408ffb72f4a63c613240feddbbb8af


File details

Details for the file scrapy_querycleaner-1.0.0-py2.py3-none-any.whl.


File hashes

Hashes for scrapy_querycleaner-1.0.0-py2.py3-none-any.whl:

SHA256: a40002384a277db89797fcf6c029bec46b05642c1e646f1476385b803745b333
MD5: 156fa3f1a03f8c64a1dad77c345e2778
BLAKE2b-256: 802aa3d6b7779dff0932017fcbfdad4b67b710160e524ac2914103ee963dee71

