
API for CSV files converted by udata-hydra


Api-tabular

This service connects to udata-hydra and serves the converted CSV files as an API.

Run locally

Start hydra via docker compose.

Launch this project:

docker compose up

You can now access the raw postgrest API on http://localhost:8080.
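
As a quick sanity check, PostgREST serves its OpenAPI description at the root, so the following should return a JSON document describing the exposed tables (the exact content depends on what hydra has loaded):

curl http://localhost:8080/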

Now you can launch the proxy (i.e. the app):

poetry install
poetry run adev runserver -p8005 api_tabular/app.py        # API for CSV files converted by udata-hydra
poetry run adev runserver -p8005 api_tabular/metrics.py    # API for udata's metrics

You can then query postgrest via the proxy using a resource_id (see below). The test resource_id is aaaaaaaa-1111-bbbb-2222-cccccccccccc

API

Meta information about a resource

curl http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/
{
  "created_at": "2023-04-21T22:54:22.043492+00:00",
  "url": "https://data.gouv.fr/datasets/example/resources/fake.csv",
  "links": [
    {
      "href": "/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/profile/",
      "type": "GET",
      "rel": "profile"
    },
    {
      "href": "/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/data/",
      "type": "GET",
      "rel": "data"
    },
    {
      "href": "/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/swagger/",
      "type": "GET",
      "rel": "swagger"
    }
  ]
}

Profile (csv-detective output) for a resource

curl http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/profile/
{
  "profile": {
    "header": [
        "id",
        "score",
        "decompte",
        "is_true",
        "birth",
        "liste"
    ]
  },
  "...": "..."
}
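
To extract just the detected column names from this output, you can pipe the response through jq (a convenience sketch, assuming jq is installed; it is not part of the API itself):

curl -s http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/profile/ | jq '.profile.header'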

Data for a resource (i.e. the resource API)

curl http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/data/
{
  "data": [
    {
        "__id": 1,
        "id": " 8c7a6452-9295-4db2-b692-34104574fded",
        "score": 0.708,
        "decompte": 90,
        "is_true": false,
        "birth": "1949-07-16",
        "liste": "[0]"
    },
    ...
  ],
  "links": {
      "profile": "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/profile/",
      "swagger": "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/swagger/",
      "next": "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/data/?page=2&page_size=20",
      "prev": null
  },
  "meta": {
      "page": 1,
      "page_size": 20,
      "total": 1000
  }
}

This endpoint can be queried with the following operators as query string parameters (replace column_name with the name of an actual column):

# sort by column
column_name__sort=asc
column_name__sort=desc

# exact value
column_name__exact=value

# differs
column_name__differs=value

# contains (for strings only)
column_name__contains=value

# in (value in list)
column_name__in=value1,value2,value3

# less
column_name__less=value

# greater
column_name__greater=value

# strictly less
column_name__strictly_less=value

# strictly greater
column_name__strictly_greater=value

For instance:

curl "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/data/?score__greater=0.9&decompte__exact=13"

returns

{
  "data": [
    {
      "__id": 52,
      "id": " 5174f26d-d62b-4adb-a43a-c3b6288fa2f6",
      "score": 0.985,
      "decompte": 13,
      "is_true": false,
      "birth": "1980-03-23",
      "liste": "[0]"
    },
    {
      "__id": 543,
      "id": " 8705df7c-8a6a-49e2-9514-cf2fb532525e",
      "score": 0.955,
      "decompte": 13,
      "is_true": true,
      "birth": "1965-02-06",
      "liste": "[0, 1, 2]"
    }
  ],
  "links": {
    "profile": "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/profile/",
    "swagger": "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/swagger/",
    "next": null,
    "prev": null
  },
  "meta": {
    "page": 1,
    "page_size": 20,
    "total": 2
  }
}
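
Filters can also be combined with sorting in the same request, for example to keep only high scores and return them in descending order (a sketch against the test resource, using only the operators documented above):

curl "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/data/?score__greater=0.9&score__sort=desc"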

Pagination is handled through the page and page_size query parameters:

curl "http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/data/?page=2&page_size=30"
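
To iterate over a whole resource, you can follow links.next until it is null. A minimal shell sketch, assuming jq is installed (the page_size value and the processing step are placeholders):

url="http://localhost:8005/api/resources/aaaaaaaa-1111-bbbb-2222-cccccccccccc/data/?page_size=50"
while [ "$url" != "null" ]; do
  page=$(curl -s "$url")
  echo "$page" | jq '.data | length'    # replace with actual processing of .data
  url=$(echo "$page" | jq -r '.links.next')
done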

Contributing

Pre-commit hook

This repository uses a pre-commit hook which lints and formats code before each commit. Please install it with:

poetry run pre-commit install

Lint and format code

This repository uses Ruff to lint, format, and sort imports. Run the following command to lint and format the code:

poetry run ruff check --fix && poetry run ruff format
