
Scrapy spider middleware to clean up query parameters in request URLs

Project description

Build status: https://travis-ci.org/scrapy-plugins/scrapy-querycleaner | Coverage: https://codecov.io/gh/scrapy-plugins/scrapy-querycleaner

This is a Scrapy spider middleware that cleans up the GET query parameters of request URLs at the spider output, according to patterns provided by the user.

Installation

Install scrapy-querycleaner using pip:

$ pip install scrapy-querycleaner

Configuration

  1. Add QueryCleanerMiddleware by including it in SPIDER_MIDDLEWARES in your settings.py file:

    SPIDER_MIDDLEWARES = {
        'scrapy_querycleaner.QueryCleanerMiddleware': 100,
    }

    Here, priority 100 is just an example. Set its value depending on other middlewares you may have enabled already.

  2. Enable the middleware by setting either QUERYCLEANER_REMOVE or QUERYCLEANER_KEEP (or both) in your settings.py, for example as shown below.
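
For example, a minimal settings.py fragment that enables the middleware could look like the following; the pattern values are only placeholders for illustration, not recommendations:

    SPIDER_MIDDLEWARES = {
        'scrapy_querycleaner.QueryCleanerMiddleware': 100,
    }

    # Remove any query parameter whose name matches this pattern
    # (placeholder pattern, adjust to your site).
    QUERYCLEANER_REMOVE = 'sessionid|utm_.*'

    # Alternatively (or additionally), keep only parameters matching this pattern.
    # QUERYCLEANER_KEEP = 'id|page'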

Usage

At least one of the following settings needs to be present for the middleware to be enabled.

Supported settings

QUERYCLEANER_REMOVE

A pattern (a regular expression) that a query parameter name must match for the parameter to be removed from the URL. (All other parameters are kept.)

QUERYCLEANER_KEEP

A pattern (a regular expression) that a query parameter name must match for the parameter to be kept in the URL. (All other parameters are removed.)

You can combine both settings if some query parameter patterns should be kept and others removed.

The remove pattern takes precedence over the keep pattern; a rough sketch of this logic follows below.
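
As an illustration of how the two patterns interact, here is a small Python sketch; this is not the middleware's actual implementation, and the clean_query function and its arguments are made up for this example:

    import re
    from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

    def clean_query(url, remove_pat=None, keep_pat=None):
        """Drop parameters whose names match remove_pat; if keep_pat is set,
        also drop names that do not match it. Remove wins over keep."""
        parts = urlsplit(url)
        kept = []
        for name, value in parse_qsl(parts.query, keep_blank_values=True):
            if remove_pat and re.search(remove_pat, name):
                continue  # the remove pattern has precedence
            if keep_pat and not re.search(keep_pat, name):
                continue  # not matched by the keep pattern
            kept.append((name, value))
        return urlunsplit(parts._replace(query=urlencode(kept)))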

Example

Let’s suppose that the spider extracts URLs like:

http://www.example.com/product.php?pid=135&cid=12&ttda=12

and we want to keep only the pid parameter.

To achieve this, we can use either QUERYCLEANER_REMOVE or QUERYCLEANER_KEEP:

  • In the first case, the pattern would be cid|ttda:

    QUERYCLEANER_REMOVE = 'cid|ttda'
  • In the second case, pid:

    QUERYCLEANER_KEEP = 'pid'

The best choice depends on the particular case, that is, on how the query filters will affect the other URLs the spider is expected to extract.
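
Using the clean_query sketch from the previous section (again, only illustrative), both settings lead to the same result for this URL:

    >>> url = 'http://www.example.com/product.php?pid=135&cid=12&ttda=12'
    >>> clean_query(url, remove_pat='cid|ttda')
    'http://www.example.com/product.php?pid=135'
    >>> clean_query(url, keep_pat='pid')
    'http://www.example.com/product.php?pid=135'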

