An easy to use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving up state-of-the-art NLP models.
Project description
Welcome to AdaptNLP
A high level framework and library for running, training, and deploying state-of-the-art Natural Language Processing (NLP) models for end to end tasks.
What is AdaptNLP?
AdaptNLP is a Python package that allows users ranging from beginner Python coders to experienced Machine Learning Engineers to leverage state-of-the-art Natural Language Processing (NLP) models and training techniques in one easy-to-use package.
Utilizing fastai with HuggingFace's Transformers library and Humboldt University of Berlin's Flair library, AdaptNLP provides Machine Learning Researchers and Scientists a modular and adaptive approach to a variety of NLP tasks, simplifying what it takes to train, perform inference with, and deploy NLP-based models and microservices.
What is the Benefit of AdaptNLP Rather Than Just Using Transformers?
Despite quick inference functionality such as the `pipeline` API in `transformers`, it is still not quite as flexible or as fast as it could be. AdaptNLP's `Easy*` inference modules tend to be slightly faster than the `pipeline` interface (at bare minimum the same speed), while also giving the user simple, intuitive return values free of any unneeded clutter.
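As a hedged sketch of that `Easy*` interface (assuming `adaptnlp` is installed; the model name is just an example from the HuggingFace Hub, and the return value is the `Result` class covered later in these docs):

```python
def classify_sentiment(text: str):
    # Imported lazily so this sketch only requires adaptnlp when called.
    from adaptnlp import EasySequenceClassifier

    classifier = EasySequenceClassifier()
    # tag_text downloads the model on first use; the model name here is
    # an illustrative choice, not a requirement.
    return classifier.tag_text(
        text=text,
        model_name_or_path="nlptown/bert-base-multilingual-uncased-sentiment",
    )
```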
Along with this, the integration of the `fastai` library gives the code needed to train or run inference on your models a completely modular API through the `fastai` Callback system. Rather than writing your entire torch loop, if a model needs anything special, a Callback can be written in fewer than 10 lines of code to achieve that specific functionality.
Finally, when training your model, fastai is at the forefront of being a library that constantly brings in the best practices for achieving state-of-the-art training, with new research methodologies heavily tested before integration. As such, AdaptNLP fully supports training with the One-Cycle policy, and with new optimizer combinations such as the Ranger optimizer with Cosine Annealing, through simple one-line fitting functions (`fit_one_cycle` and `fit_flat_cos`).
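Those one-line fitting calls can be sketched as thin wrappers like the following; `learner` is assumed to be any fastai `Learner` (its construction is not shown here):

```python
def train_one_cycle(learner, epochs=3, lr=1e-3):
    """Train with fastai's One-Cycle policy (sketch; assumes a fastai Learner)."""
    learner.fit_one_cycle(epochs, lr)

def train_flat_cos(learner, epochs=3, lr=1e-3):
    """Train with a flat LR followed by cosine annealing (pairs well with Ranger)."""
    learner.fit_flat_cos(epochs, lr)
```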
Installation Directions
PyPi
To install from PyPI, use:
pip install adaptnlp
Or if you have pip3:
pip3 install adaptnlp
Conda (Coming Soon)
Developmental Builds
To install a development build, follow the directions below to install directly from git:
Stable Master Branch The master branch is generally only updated for hotfixes and new releases. To install it, use:
pip install git+https://github.com/Novetta/adaptnlp
Developmental Branch Note: this branch can become unstable, and it is only recommended for contributors or those who really want to test out new features. Please make sure the latest tests are passing (a green checkmark on the commit message) before trying this branch. You can install the developmental builds with:
pip install git+https://github.com/Novetta/adaptnlp@dev
Docker Images
There are actively updated Docker images hosted on Novetta's DockerHub.
The guide to each tag is as follows:
- `latest`: the latest PyPI release; a complete, CUDA-capable installation
- `dev`: developmental builds made occasionally at certain stages; built from the `dev` branch and generally stable
- `*api`: API builds for the REST API
To pull and run any AdaptNLP image immediately, you can run:
docker run -itp 8888:8888 novetta/adaptnlp:TAG
Replace `TAG` with any of the aforementioned tags.
Afterwards, check `localhost:8888` or `localhost:8888/lab` to access the notebook containers.
Navigating the Documentation
The AdaptNLP library is built with nbdev, so any documentation page you find (including this one!) can be directly run as a Jupyter Notebook. Each page at the top includes an "Open in Colab" button as well that will open the notebook in Google Colaboratory to allow for immediate access to the code.
The documentation is split into six sections, each with a specific purpose:
Getting Started
This group contains quick access to the homepage, an overview of the AdaptNLP Cookbooks, and how to contribute.
Models and Model Hubs
These contain any relevant documentation for the `AdaptiveModel` class, the HuggingFace Hub model search integration, and the `Result` class that the various inference APIs return.
Class API
This section contains the module documentation for the inference framework, the tuning framework, as well as the utilities and foundations for the AdaptNLP library.
Inference and Training Cookbooks
These two sections provide quick access to single-use recipes for starting any AdaptNLP project for a particular task, with easy-to-use code designed for that specific use case. There are currently over 13 different tutorials available, with more coming soon.
NLP Services with FastAPI
This section provides directions on how to use the AdaptNLP REST API for deploying your models quickly with FastAPI.
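Once a service is running, it can be queried with nothing but the standard library. The sketch below is illustrative only: the URL, endpoint path, and payload shape are assumptions, not the documented API schema.

```python
import json
import urllib.request

def query_service(text: str, url: str = "http://localhost:8000/api/token-tagger"):
    """POST text to a running AdaptNLP FastAPI service (endpoint path is assumed)."""
    payload = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```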
Contributing
There is a contribution guide available here
Testing
AdaptNLP runs on the `nbdev` framework. To run all tests, do the following:
pip install nbverbose
git clone https://github.com/Novetta/adaptnlp
cd adaptnlp
pip install -e .
nbdev_test_nbs
This will run every notebook and ensure that all tests have passed. Please see the nbdev documentation for more information about it.
Contact
Please contact Zachary Mueller at zmueller@novetta.com with questions or comments regarding AdaptNLP.
Follow us on Twitter at @TheZachMueller and @AdaptNLP for updates and NLP dialogue.
License
This project is licensed under the terms of the Apache 2.0 license.
File details
Details for the file adaptnlp-0.3.1.tar.gz.
File metadata
- Download URL: adaptnlp-0.3.1.tar.gz
- Upload date:
- Size: 54.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.3.0 pkginfo/1.4.2 requests/2.25.1 setuptools/57.0.0 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | aa8b9939c87b9ed120877d3932cd259ba5a2bee0a64e76c90c9f5752fdfdda69 |
| MD5 | 0e0db540be001e686e1c52438667e16c |
| BLAKE2b-256 | adc983de37e876f026f87c85da8744de594f5c8d58252fa1f479c1195be0e9de |
File details
Details for the file adaptnlp-0.3.1-py3-none-any.whl.
File metadata
- Download URL: adaptnlp-0.3.1-py3-none-any.whl
- Upload date:
- Size: 62.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.3.0 pkginfo/1.4.2 requests/2.25.1 setuptools/57.0.0 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 58510728c11eb8d8a7bb0e111fc1e992863d09330400b50898f53e7be80fa72c |
| MD5 | bc49325255dc9ceef13d9ccb9e8cefe5 |
| BLAKE2b-256 | 7040353ab75baa59f0590d018a0669ccb5b64aa81b058c2d0f5746efd745f676 |
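The digests listed above can be checked locally after downloading a release file. This is a generic standard-library sketch, not part of AdaptNLP itself:

```python
import hashlib

def sha256_digest(path: str) -> str:
    """Return the hex SHA256 of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest listed on PyPI, e.g.:
# sha256_digest("adaptnlp-0.3.1.tar.gz")
# should equal "aa8b9939c87b9ed120877d3932cd259ba5a2bee0a64e76c90c9f5752fdfdda69"
```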