A simple wrapper around multiple LLM/AI providers

Project description

EveryAI

License: MIT

EveryAI intends to provide a unified interface to multiple AI services, offering a single API that consuming applications can use without needing to know the specifics of the underlying service.

Currently the supported backends are:

  • OpenAI (chat and embeddings)
  • Anthropic (chat only)

Usage

To create an instance of an EveryAI backend, use the init function:

import every_ai

backend = every_ai.init("openai")

Once you have a backend, the methods available to you depend on which backend you chose:

response = backend.chat("Tell me a joke")
embedding = backend.embed(["Embed this content"])
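
Not every backend implements every method (Anthropic, for example, is chat-only), so a consuming application may want to guard optional features. A minimal sketch, assuming that backends simply omit methods they do not support; if the library raises an error instead, adjust the guard accordingly (the per-backend settings such as api_key are described below):

import every_ai

backend = every_ai.init("anthropic", api_key="foo-bar-baz")

reply = backend.chat("Tell me a joke")

# embed() is only offered by backends that support embeddings,
# so check for it before calling rather than assuming it exists.
if hasattr(backend, "embed"):
    vectors = backend.embed(["Embed this content"])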

OpenAI

The OpenAI backend supports both chat completions and embeddings. It can be initialised with the following settings:

  • api_key (required) - Your OpenAI API key
  • chat_model (default: gpt-3.5-turbo) - The OpenAI model to be used for chat completions, e.g. gpt-4, gpt-3.5-turbo.
  • embedding_model (default: text-embedding-ada-002) - The OpenAI model to be used for embeddings.

For example, to customise the OpenAI backend to use GPT-4:

import every_ai

backend = every_ai.init("openai", api_key="foo-bar-baz", chat_model="gpt-4")

Anthropic

The Anthropic backend only supports chat completions. It can be initialised with the following settings:

  • api_key (required) - Your Anthropic API key
  • chat_model (default: claude-instant-1) - The Anthropic model to be used for chat completions, e.g. claude-instant-1, claude-2.

For example, to customise the Anthropic backend to use claude-2:

import every_ai

backend = every_ai.init("anthropic", api_key="foo-bar-baz", chat_model="claude-2")


Contributing

Install

To make changes to this project, first clone this repository:

git clone https://github.com/tomusher/every-ai.git
cd every-ai

With your preferred virtualenv activated, install testing dependencies:

Using pip

python -m pip install --upgrade "pip>=21.3"
python -m pip install -e ".[testing]" -U

Using flit

python -m pip install flit
flit install

pre-commit

Note that this project uses pre-commit. It is included in the project testing requirements. To set up locally:

# go to the project directory
$ cd every-ai
# initialize pre-commit
$ pre-commit install

# Optional: run all checks once now; afterwards, the checks will run only on changed files
$ git ls-files --others --cached --exclude-standard | xargs pre-commit run --files

How to run tests

Now you can run tests as shown below:

tox

Alternatively, you can run them for a specific environment with tox -e python3.8, or run a specific test with tox -e python3.9 every-ai.tests.test_file.TestClass.test_method.

To run the test app interactively, use tox -e interactive, then visit http://127.0.0.1:8020/admin/ and log in with admin/changeme.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

every_ai-1.1.1.tar.gz (7.0 kB)

Uploaded Source

Built Distribution

every_ai-1.1.1-py3-none-any.whl (7.6 kB)

Uploaded Python 3

File details

Details for the file every_ai-1.1.1.tar.gz.

File metadata

  • Download URL: every_ai-1.1.1.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-requests/2.31.0

File hashes

Hashes for every_ai-1.1.1.tar.gz

  • SHA256: 9abd26eb8dde21d776696291230810daaee69d00ad08e2c2295afbac1295fef3
  • MD5: 2ea0834fdfee31ee18098b76daa9bce1
  • BLAKE2b-256: d62bb0b78325927c2c49081081a38b319c631e298c2b33ece3a32f761160bf05


File details

Details for the file every_ai-1.1.1-py3-none-any.whl.

File metadata

  • Download URL: every_ai-1.1.1-py3-none-any.whl
  • Upload date:
  • Size: 7.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-requests/2.31.0

File hashes

Hashes for every_ai-1.1.1-py3-none-any.whl

  • SHA256: f75bf0c20175d46d2ddea7d2169f0d5ac684b4b18c1e56b9c0ee0b6d62e54be2
  • MD5: 947c182c1c5d65ce9e7adf585c4d4eb6
  • BLAKE2b-256: 0c4d520c361850d9ec25dd43b30816ca5cb5f14091219fa5f45966c61983a427

