grlc, the git repository linked data API constructor
Project description
grlc, the git repository linked data API constructor, automatically builds Web APIs using SPARQL queries stored in git repositories. http://grlc.io/
Contributors: Albert Meroño, Rinke Hoekstra, Carlos Martínez
Copyright: Albert Meroño, VU University Amsterdam
License: MIT License (see LICENSE.txt)
What is grlc?
grlc is a lightweight server that takes SPARQL queries curated in GitHub repositories, and translates them to Linked Data Web APIs. This enables universal access to Linked Data. Users are not required to know SPARQL to query their data, but instead can access a web API.
Quick tutorial
For a quick usage tutorial check out our wiki walkthrough here
Features
- Request parameter mappings into SPARQL: grlc is compliant with BASIL's convention on how to map GET/POST request parameters into SPARQL (see the sketch after this list)
- Automatic, user customizable population of parameter values in swagger-ui's dropdown menus via SPARQL triple pattern querying
- Parameter values as enumerations (i.e. closed lists of values that will fill a dropdown in the UI) can now also be specified in the query decorators to save endpoint requests (see this example)
- Parameter default values can now also be indicated through decorators (see this example)
- URL-based content negotiation: you can request specific content types by attaching them to the operation request URL, e.g. http://localhost:8088/CEDAR-project/Queries/residenceStatus_all.csv will request the results in CSV
- Pagination of API results, via the pagination decorator and GitHub's API Pagination Traversal
- Docker images in Docker Hub for easy deployment
- Compatibility with Linked Data Fragments servers, RDF dumps, and HTML+RDFa files
- [NEW] grlc now integrates SPARQLTransformer, allowing queries to be written in JSON (see this example)
- Generation of provenance in PROV of both the repo history (via Git2PROV) and grlc's own activity
- Commit-based API versioning that is coherent with the repository's git hashes
- The SPARQL endpoint address can be set at the query level, at the repository level, and now also as a query parameter. This makes your APIs endpoint agnostic, and enables generic and transposable queries!
- CONSTRUCT queries are now mapped automatically to GET requests, accept parameters in the WHERE clause, and return content in text/turtle or application/ld+json
- INSERT DATA queries are now mapped automatically to POST requests. Support is limited to queries with no WHERE clause, and parameters are always expected to be values for g (the named graph where the data will be inserted) and data (the triples to insert, in N-Triples format). The INSERT query pattern is so far static, as defined in static.py. Only tested with Virtuoso.
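As an illustration of the BASIL mapping mentioned above, here is a minimal sketch of a parameterized query; the query itself and the variable names are made up for this example:

#+ summary: Lists entities of a given type
#+ endpoint: http://dbpedia.org/sparql

SELECT ?entity WHERE {
  ?entity a ?_type_iri .
} LIMIT 100

Following BASIL's convention, ?_type_iri is exposed as a required type parameter of the generated operation, and its value is injected into the query as an IRI before the query is sent to the endpoint.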
Install and run
Running via docker is the easiest and preferred way of deploying grlc. You'll need a working installation of docker and docker-compose. To deploy grlc, just pull the latest image from Docker Hub, and run docker-compose with a docker-compose.yml that suits your needs (an example is provided in the root directory):
git clone https://github.com/CLARIAH/grlc
cd grlc
docker pull clariah/grlc
docker-compose -f docker-compose.default.yml up
To run directly from Docker Hub it is sufficient to do:
docker run --rm -p 8088:80 -e GRLC_SERVER_NAME=grlc.io -e GRLC_GITHUB_ACCESS_TOKEN=xxx -e GRLC_SPARQL_ENDPOINT=http://dbpedia.org/sparql -e DEBUG=true clariah/grlc
(You can omit the first two commands if you just copy this file somewhere in your filesystem)
If you use the supplied docker-compose.default.yml, your grlc instance will be available at http://localhost:8001.
If you want your grlc instance to forward queries to a different service than grlc.io, edit the GRLC_SERVER_NAME variable in your docker-compose.yml or docker-compose.default.yml file.
In order for grlc to communicate with GitHub, you'll need to tell grlc what your access token is:
- Get a GitHub personal access token: in your GitHub profile page, go to Settings, then Developer settings, Personal access tokens, and Generate new token
- You'll get an access token string, copy it and save it somewhere safe (GitHub won't let you see it again!)
- Edit your docker-compose.yml or docker-compose.default.yml file, and paste this token as the value of the environment variable GRLC_GITHUB_ACCESS_TOKEN
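For instance, the environment section of the grlc service in your compose file could look roughly like this (just a sketch; the variable names are the ones used in the docker run example above, and xxx is a placeholder for your own token):

environment:
  - GRLC_SERVER_NAME=grlc.io
  - GRLC_GITHUB_ACCESS_TOKEN=xxx
  - GRLC_SPARQL_ENDPOINT=http://dbpedia.org/sparql
  - DEBUG=true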
If you want to run grlc at system boot as a service, you can find example upstart scripts at upstart/
Alternative install methods
With these you'll miss some of the cool docker-bundled features (like nginx-based caching). We provide these alternatives only for testing, development, or docker compatibility reasons.
Prerequisites
- Python3
- development files:
sudo apt-get install libevent-dev python-all-dev
pip
If you want to use grlc as a library, you'll find it useful to install it via pip:
pip install grlc
Once installed, you can start the grlc server with:
grlc-server
More details can be found at grlc's PyPi page (thanks to c-martinez!).
Flask application
You can run grlc natively as follows:
gunicorn -c gunicorn_config.py src.server:app
Note: Since gunicorn does not work under Windows, you can use waitress instead:
waitress-serve --port=8088 src.server:app
You can also find an example here
Usage
grlc assumes a GitHub repository (support for general git repos is on the way) where you store your SPARQL queries as .rq files (like in this one). grlc will create one API operation per SPARQL query (.rq file).
Once your grlc instance is up and running, it is ready to build APIs. Assuming it is running at http://localhost:8088/ and your queries are at https://github.com/CEDAR-project/Queries, just point your browser to the following locations:
- To request the swagger spec of your API: http://localhost:8088/api/username/repo/spec, e.g. http://localhost:8088/api/CEDAR-project/Queries/spec or http://localhost:8088/api/CLARIAH/wp4-queries/spec
- To request the api-docs of your API, swagger-ui style: http://localhost:8088/api/username/repo/api-docs, e.g. http://localhost:8088/api/CEDAR-project/Queries/api-docs or http://localhost:8088/api/CLARIAH/wp4-queries/api-docs
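For example, assuming the instance and query repository above, you can fetch the spec and call a generated operation from the command line; the CSV URL is the content negotiation example from the features list:

curl http://localhost:8088/api/CEDAR-project/Queries/spec
curl http://localhost:8088/CEDAR-project/Queries/residenceStatus_all.csv

The first request returns the swagger spec of the API, the second the results of the residenceStatus_all query in CSV.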
By default grlc will direct your queries to the DBpedia SPARQL endpoint. To change this, do one of the following:
- Add an endpoint parameter to your request: http://grlc.io/user/repo/query?endpoint=http://sparql-endpoint/. You can add a #+ endpoint_in_url: False decorator if you do not want the endpoint parameter to show up in the swagger-ui of your API.
- Add a #+ endpoint: decorator in the first comment block of the query text (preferred, see below)
- Add the URL of the endpoint on a single line in an endpoint.txt file within the GitHub repository that contains the queries
- Or directly modify the grlc source code (but it's nicer if the queries are self-contained)
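As a quick sketch of the first option, the endpoint can be overridden per request by appending the endpoint parameter to the operation URL (the URL pattern is the one shown above; the endpoint value is just an example):

curl 'http://grlc.io/user/repo/query?endpoint=http://dbpedia.org/sparql'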
That's it!
Example APIs
Check these out:
- http://grlc.io/api/CLARIAH/wp4-queries-hisco/
- http://grlc.io/api/albertmeronyo/lodapi/
- http://grlc.io/api/albertmeronyo/lsq-api
You'll find the sources of these and many more on GitHub
Decorator syntax
A number of decorators, embedded as SPARQL comments, are available to make your swagger-ui look nicer (note that all decorator comments start with #+, and that the use of ':' is restricted to list representations and cannot be used in the summary text):
- To specify a query-specific endpoint: #+ endpoint: http://example.com/sparql
- To indicate the HTTP request method: #+ method: GET
- To paginate the results in e.g. groups of 100: #+ pagination: 100
- To create a summary of your query/operation: #+ summary: This is the summary of my query/operation
- To assign tags to your query/operation:
#+ tags:
#+   - firstTag
#+   - secondTag
- To indicate which parameters of your query/operation should get enumerations (and get dropdown menus in the swagger-ui) using values from the SPARQL endpoint:
#+ enumerate:
#+   - var1
#+   - var2
- These parameters can also be hard-coded into the query decorators to save endpoint requests and speed up the API generation:
#+ enumerate:
#+   - var1:
#+     - value1
#+     - value2
Notice that these should be plain variable names, without SPARQL/BASIL conventions (so var1 instead of ?_var1_iri).
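Putting several decorators together, a query file could look roughly like the following sketch (the query, tags and values are made up; see the real examples linked below):

#+ summary: Lists people born in a given city
#+ endpoint: http://dbpedia.org/sparql
#+ method: GET
#+ pagination: 100
#+ tags:
#+   - people
#+   - demo
#+ enumerate:
#+   - city

PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?person WHERE {
  ?person dbo:birthPlace ?_city_iri .
}

Here city is both enumerated from the endpoint (filling a dropdown in the swagger-ui) and mapped, via ?_city_iri, to a required IRI parameter of the operation.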
See examples at https://github.com/albertmeronyo/lodapi.
Use this GitHub search to see examples from other users of grlc.
Contribute!
grlc needs you to continue bringing Semantic Web content to developers, applications and users. Whether you are a curious user, a developer, or a researcher, there are many ways in which you can contribute:
- File bug reports
- Request new features
- Set up your own environment and start hacking
Check our contributing guidelines for these and more, and join us today!
If you cannot code, that's no problem! There's still plenty you can contribute:
- Share your experience of using grlc on Twitter (mention the handle @grlcldapi)
- If you are good with HTML/CSS, let us know
Related tools
- SPARQL2Git is a Web interface for editing SPARQL queries and saving them in GitHub as grlc APIs.
- grlcR is a package for R that brings Linked Data into your R environment easily through grlc.
- Hay's tools lists grlc as a Wikimedia-related tool :-)
This is what grlc users are saying
- Flavour your Linked Data with grlc, by Carlos Martinez
- Egon Willighagen's blog
Academic publications
- Albert Meroño-Peñuela, Rinke Hoekstra. “grlc Makes GitHub Taste Like Linked Data APIs”. The Semantic Web – ESWC 2016 Satellite Events, Heraklion, Crete, Greece, May 29 – June 2, 2016, Revised Selected Papers. LNCS 9989, pp. 342-353 (2016). (PDF)
- Albert Meroño-Peñuela, Rinke Hoekstra. “SPARQL2Git: Transparent SPARQL and Linked Data API Curation via Git”. In: Proceedings of the 14th Extended Semantic Web Conference (ESWC 2017), Poster and Demo Track. Portoroz, Slovenia, May 28th – June 1st, 2017 (2017). (PDF)
- Albert Meroño-Peñuela, Rinke Hoekstra. “Automatic Query-centric API for Routine Access to Linked Data”. In: The Semantic Web – ISWC 2017, 16th International Semantic Web Conference. Lecture Notes in Computer Science, vol 10587, pp. 334-339 (2017). (PDF)
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file grlc-1.3.0.tar.gz.
File metadata
- Download URL: grlc-1.3.0.tar.gz
- Upload date:
- Size: 70.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.5.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9f797e84fc3f6322c6583ce05227a84c99e3019606531748e2d39c4d359031c3
MD5 | 2ede264e4a9c8b56830cd477dacd7c40
BLAKE2b-256 | 2cbbbdb644240eb567036c933fa16c80797675d7ebd7d7b87eaa669a56419db7
File details
Details for the file grlc-1.3.0-py3-none-any.whl.
File metadata
- Download URL: grlc-1.3.0-py3-none-any.whl
- Upload date:
- Size: 74.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.2 CPython/3.5.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 65ebd73ae31a1caa216f08cae3968bba4fc126f3219686408c5038715bf9a78d
MD5 | a150e71234e13d661cdf6e4b047c058a
BLAKE2b-256 | e53c007989f18df88726e78b2a95a74bc83bd7b2dae328e11ff571cdb51a7d3e