
A simple wrapper around multiple LLM/AI providers

Project description

EveryAI

License: MIT

EveryAI aims to provide a unified interface to multiple AI services: a single API that consuming applications can use without needing to know the specifics of the underlying service.

Currently the supported backends are:

  • OpenAI (chat and embeddings)
  • Anthropic (chat only)

Usage

To create an instance of an EveryAI backend, use the init function:

import every_ai

backend = every_ai.init("openai")

Once you have a backend, the methods available depend on which backend you chose:

response = backend.chat("Tell me a joke")
embedding = backend.embed(["Embed this content"])
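
Because both backends expose the same chat method, calling code can stay provider-agnostic. The sketch below illustrates the idea; it assumes chat() returns the completion text directly, which the examples above suggest but do not guarantee:

import every_ai

def summarise(backend, text):
    # Works unchanged with any EveryAI backend that supports chat,
    # e.g. every_ai.init("openai") or every_ai.init("anthropic").
    return backend.chat(f"Summarise this in one sentence:\n{text}")

backend = every_ai.init("openai", api_key="foo-bar-baz")
print(summarise(backend, "EveryAI wraps multiple LLM providers behind one API."))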

OpenAI

The OpenAI backend supports both chat completions and embeddings. It can be initialised with the following settings:

  • api_key (required) - Your OpenAI API key
  • chat_model (default: gpt-3.5-turbo) - The OpenAI model to be used for chat completions, e.g. gpt-4, gpt-3.5-turbo.
  • embedding_model (default: text-embedding-ada-002) - The OpenAI model to be used for embeddings.

For example, to customise the OpenAI backend to use GPT-4:

import every_ai

backend = every_ai.init("openai", api_key="foo-bar-baz", chat_model="gpt-4")

Anthropic

The Anthropic backend only supports chat completions. It can be initialised with the following settings:

  • api_key (required) - Your Anthropic API key
  • chat_model (default: claude-instant-1) - The Anthropic model to be used for chat completions, e.g. claude-instant-1, claude-2.

For example, to customise the Anthropic backend to use claude-2:

import every_ai

backend = every_ai.init("anthropic", api_key="foo-bar-baz", chat_model="claude-2")

Contributing

Install

To make changes to this project, first clone this repository:

git clone https://github.com/tomusher/every-ai.git
cd every-ai

With your preferred virtualenv activated, install testing dependencies:

Using pip

python -m pip install --upgrade "pip>=21.3"
python -m pip install -e ".[testing]" -U

Using flit

python -m pip install flit
flit install

pre-commit

Note that this project uses pre-commit. It is included in the project testing requirements. To set up locally:

# go to the project directory
$ cd every-ai
# initialize pre-commit
$ pre-commit install

# Optional: run all the checks once; afterwards, pre-commit will run only on changed files
$ git ls-files --others --cached --exclude-standard | xargs pre-commit run --files

How to run tests

Now you can run tests as shown below:

tox

or you can run them for a specific environment with tox -e python3.8, or run a single test with tox -e python3.9 every-ai.tests.test_file.TestClass.test_method.

To run the test app interactively, use tox -e interactive, visit http://127.0.0.1:8020/admin/ and log in with admin/changeme.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

every_ai-1.1.0.tar.gz (7.0 kB)

Built Distribution

every_ai-1.1.0-py3-none-any.whl (7.6 kB)

File details

Details for the file every_ai-1.1.0.tar.gz.

File metadata

  • Download URL: every_ai-1.1.0.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-requests/2.31.0

File hashes

Hashes for every_ai-1.1.0.tar.gz

  • SHA256: 15846c51917406ba90449225416edb431f6c75ee4713b4d53e33dda09e04b8b3
  • MD5: a9928acbce54c71e52f5b9c81dda0b0f
  • BLAKE2b-256: 0afe225fe734b99e98bbfbf33bed78e5c225a4d72870dc79b3faf9882f58a7a7


File details

Details for the file every_ai-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: every_ai-1.1.0-py3-none-any.whl
  • Upload date:
  • Size: 7.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-requests/2.31.0

File hashes

Hashes for every_ai-1.1.0-py3-none-any.whl

  • SHA256: ad84c3537b1f1f0c8d747e5cf3b158ce128189da98bd8f3b421d05e37086c1a0
  • MD5: 925542eb00d9dc4b6feb5cc3079e6984
  • BLAKE2b-256: da5c1e2c56c00cd5d770a20f32369e8585298871503e44c6c5939e7ec7115e4c

