# Modelz LLM
Modelz LLM is an inference server that makes it easy to run open source large language models (LLMs), such as FastChat, LLaMA, and ChatGLM, in either local or cloud-based environments, behind an OpenAI compatible API.
## Features
- OpenAI compatible API: Modelz LLM exposes an OpenAI compatible API, so you can use the OpenAI Python SDK to interact with the model.
- Self-hosted: Modelz LLM can be easily deployed on either local or cloud-based environments.
- Open source LLMs: Modelz LLM supports open source LLMs, such as FastChat, LLaMA, and ChatGLM.
- Modelz integration: Modelz LLM can be easily integrated with Modelz, which is a serverless inference platform for LLMs and other foundation models.
## Quick Start

### Install

```shell
pip install modelz-llm
# or install from source
pip install git+https://github.com/tensorchord/modelz-llm.git
```
### Run the self-hosted API server

First, start the self-hosted API server:

```shell
export MODELZ_MODEL="THUDM/chatglm-6b-int4"
modelz-llm -m $MODELZ_MODEL
```
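Once the server is up, you can sanity-check it from the command line. This is a hedged sketch: it assumes the server listens on port 8000 and exposes an OpenAI-style `/chat/completions` route (depending on the version, the path may be prefixed with `/v1`):

```shell
# Send a minimal chat request to the local server.
# The route and port are assumptions -- adjust to match your deployment.
curl http://localhost:8000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "any", "messages": [{"role": "user", "content": "Hello world"}]}'
```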
Currently, we support the following models:

| Model Name | Model (`MODELZ_MODEL`) | Docker Image |
|---|---|---|
| Vicuna 7B Delta V1.1 | `lmsys/vicuna-7b-delta-v1.1` | `modelzai/llm-vicuna-7b` |
| LLaMA 7B | `decapoda-research/llama-7b-hf` | `modelzai/llm-llama-7b` |
| ChatGLM 6B INT4 | `THUDM/chatglm-6b-int4` | `modelzai/llm-chatglm-6b-int4` |
| ChatGLM 6B | `THUDM/chatglm-6b` | `modelzai/llm-chatglm-6b` |
You can set the `MODELZ_MODEL` environment variable to specify the model and tokenizer.
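If you prefer containers, the Docker images listed above can be run directly. A minimal sketch, with the caveat that the port mapping and GPU flag are assumptions rather than documented defaults (check the image documentation for the exact invocation):

```shell
# Run the ChatGLM 6B INT4 image from the table above.
# --gpus all assumes an NVIDIA GPU with the NVIDIA Container Toolkit installed;
# -p 8000:8000 assumes the container listens on the same port as the
# pip-installed server.
docker run --rm --gpus all -p 8000:8000 modelzai/llm-chatglm-6b-int4
```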
### Use the OpenAI Python SDK

Then you can use the OpenAI Python SDK to interact with the model:

```python
import openai

openai.api_base = "http://localhost:8000"
openai.api_key = "any"

# create a chat completion
chat_completion = openai.ChatCompletion.create(
    model="any",
    messages=[{"role": "user", "content": "Hello world"}],
)
```
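The response follows the OpenAI chat-completion shape, so the reply text lives under `choices[0].message.content`. A minimal offline sketch of parsing it (the sample payload below is illustrative, not actual Modelz LLM output):

```python
# Sketch of how a chat-completion response is typically parsed.
# sample_response is a hypothetical stand-in for the server's reply.
sample_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}


def extract_reply(response: dict) -> str:
    """Pull the assistant's text out of a chat-completion response dict."""
    return response["choices"][0]["message"]["content"]


print(extract_reply(sample_response))  # → Hello! How can I help?
```

With the real SDK object above, the equivalent access is `chat_completion.choices[0].message.content`.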
## Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution: `modelz-llm-23.5.12.tar.gz` (12.8 kB)

Built Distribution: `modelz_llm-23.5.12-py3-none-any.whl`
Hashes for `modelz_llm-23.5.12-py3-none-any.whl`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d5734a50abd0845056785d0977366d6e07e3996cfed8ab0afb309a2015430460` |
| MD5 | `8b41ce0ed11cdcf1e33708de506f2a1f` |
| BLAKE2b-256 | `aa79152b00212cf33fe2821a9e02e91aca4624761d49d1549306186d80dafb87` |