
pycql

pycql is a pure Python parser implementation of the OGC CQL standard

Installation

pip install pycql

Usage

The basic functionality parses the input string to an abstract syntax tree (AST) representation. This AST can then be used to build database filters or similar functionality.

>>> import pycql
>>> ast = pycql.parse(filter_expression)

Inspection

The easiest way to inspect the resulting AST is to use the get_repr function, which returns a nice string representation of what was parsed:

>>> ast = pycql.parse('id = 10')
>>> print(pycql.get_repr(ast))
ATTRIBUTE id = LITERAL 10.0
>>>
>>>
>>> filter_expr = '(number BETWEEN 5 AND 10 AND string NOT LIKE "%B") OR INTERSECTS(geometry, LINESTRING(0 0, 1 1))'
>>> print(pycql.get_repr(pycql.parse(filter_expr)))
(
    (
            ATTRIBUTE number BETWEEN LITERAL 5.0 AND LITERAL 10.0
    ) AND (
            ATTRIBUTE string NOT ILIKE LITERAL '%B'
    )
) OR (
    INTERSECTS(ATTRIBUTE geometry, LITERAL GEOMETRY 'LINESTRING(0 0, 1 1)')
)

Evaluation

In order to create useful filters, the resulting AST has to be evaluated. For the Django integration this is done with a recursive descent into the AST: the sub-nodes are evaluated first and their results are combined into a Q object. Assume a filters API that creates the actual filters (see the Django integration for an example). An evaluator then looks something like this:

from pycql.ast import *
from myapi import filters   # <- this is where the filters are created.
                            # of course, this can also be done in the
                            # evaluator itself

class FilterEvaluator:
    def __init__(self, field_mapping=None, mapping_choices=None):
        self.field_mapping = field_mapping
        self.mapping_choices = mapping_choices

    def to_filter(self, node):
        to_filter = self.to_filter
        if isinstance(node, NotConditionNode):
            return filters.negate(to_filter(node.sub_node))
        elif isinstance(node, CombinationConditionNode):
            return filters.combine(
                (to_filter(node.lhs), to_filter(node.rhs)), node.op
            )
        elif isinstance(node, ComparisonPredicateNode):
            return filters.compare(
                to_filter(node.lhs), to_filter(node.rhs), node.op,
                self.mapping_choices
            )
        elif isinstance(node, BetweenPredicateNode):
            return filters.between(
                to_filter(node.lhs), to_filter(node.low),
                to_filter(node.high), node.not_
            )

        # ... Some nodes are left out for brevity

        elif isinstance(node, AttributeExpression):
            return filters.attribute(node.name, self.field_mapping)

        elif isinstance(node, LiteralExpression):
            return node.value

        elif isinstance(node, ArithmeticExpressionNode):
            return filters.arithmetic(
                to_filter(node.lhs), to_filter(node.rhs), node.op
            )

        return node

As mentioned, the to_filter method performs the recursive descent.
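
To see how such an evaluator is driven, here is a minimal usage sketch; the field mapping, mapping choices and CQL expression are made up for illustration, and the returned filter object depends entirely on what the assumed myapi.filters module builds:

import pycql

# hypothetical mappings, analogous to the Django integration shown below
field_mapping = {'height': 'height', 'category': 'category'}
mapping_choices = {}

ast = pycql.parse('height > 100 AND category = "tower"')

evaluator = FilterEvaluator(field_mapping, mapping_choices)
query_filter = evaluator.to_filter(ast)  # the recursive descent starts here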

Testing

The basic functionality can be tested using pytest.

python -m pytest

There is a test project/app for the Django integration, which is tested with the following command:

python manage.py test testapp

Django integration

For Django there is a default bridging implementation that translates the filters to the Django ORM. To use this integration, we need two dictionaries: one mapping the available field names to the Django model fields, and one mapping the fields that use choices. Consider the following example models:

from django.contrib.gis.db import models


optional = dict(null=True, blank=True)

class Record(models.Model):
    identifier = models.CharField(max_length=256, unique=True, null=False)
    geometry = models.GeometryField()

    float_attribute = models.FloatField(**optional)
    int_attribute = models.IntegerField(**optional)
    str_attribute = models.CharField(max_length=256, **optional)
    datetime_attribute = models.DateTimeField(**optional)
    choice_attribute = models.PositiveSmallIntegerField(
        choices=[(1, 'ASCENDING'), (2, 'DESCENDING')], **optional
    )


class RecordMeta(models.Model):
    record = models.ForeignKey(Record, on_delete=models.CASCADE, related_name='record_metas')

    float_meta_attribute = models.FloatField(**optional)
    int_meta_attribute = models.IntegerField(**optional)
    str_meta_attribute = models.CharField(max_length=256, **optional)
    datetime_meta_attribute = models.DateTimeField(**optional)
    choice_meta_attribute = models.PositiveSmallIntegerField(
        choices=[(1, 'X'), (2, 'Y'), (3, 'Z')], **optional
    )

Now we can specify the field mappings and mapping choices to be used when applying the filters:

FIELD_MAPPING = {
    'identifier': 'identifier',
    'geometry': 'geometry',
    'floatAttribute': 'float_attribute',
    'intAttribute': 'int_attribute',
    'strAttribute': 'str_attribute',
    'datetimeAttribute': 'datetime_attribute',
    'choiceAttribute': 'choice_attribute',

    # meta fields
    'floatMetaAttribute': 'record_metas__float_meta_attribute',
    'intMetaAttribute': 'record_metas__int_meta_attribute',
    'strMetaAttribute': 'record_metas__str_meta_attribute',
    'datetimeMetaAttribute': 'record_metas__datetime_meta_attribute',
    'choiceMetaAttribute': 'record_metas__choice_meta_attribute',
}

MAPPING_CHOICES = {
    'choiceAttribute': dict(Record._meta.get_field('choice_attribute').choices),
    'choiceMetaAttribute': dict(RecordMeta._meta.get_field('choice_meta_attribute').choices),
}

Finally, we can connect the CQL AST to the Django database models. Factory functions are also provided to parse timestamps, durations, geometries, and envelopes, so that they can be used with the ORM layer:

from pycql.integrations.django import to_filter, parse

cql_expr = 'strMetaAttribute LIKE "%parent%" AND datetimeAttribute BEFORE 2000-01-01T00:00:01Z'

# NOTE: we are using the django integration `parse` wrapper here
ast = parse(cql_expr)
filters = to_filter(ast, FIELD_MAPPING, MAPPING_CHOICES)

qs = Record.objects.filter(**filters)
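
The same pattern also covers the spatial predicates; a short sketch along the same lines, reusing the models and mappings from this section and the geometry syntax from the Inspection example:

from pycql.integrations.django import to_filter, parse

# spatial + numeric predicate, analogous to the earlier INTERSECTS example
cql_expr = 'INTERSECTS(geometry, LINESTRING(0 0, 1 1)) AND intAttribute BETWEEN 1 AND 5'

ast = parse(cql_expr)
filters = to_filter(ast, FIELD_MAPPING, MAPPING_CHOICES)

qs = Record.objects.filter(**filters)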

