

LangServe 🦜️🔗

Overview

LangServe is a library that lets developers host their LangChain runnables and call into them remotely through the Runnable interface.

Examples

For more examples, see the examples directory.

Server

#!/usr/bin/env python
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatAnthropic, ChatOpenAI
from langserve import add_routes
from typing_extensions import TypedDict


app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple API server using LangChain's Runnable interfaces",
)


add_routes(
    app,
    ChatOpenAI(),
    path="/openai",
    config_keys=[],
)

add_routes(
    app,
    ChatAnthropic(),
    path="/anthropic",
    config_keys=[],
)

# Serve a joke chain
class ChainInput(TypedDict):
    """The input to the chain."""

    topic: str
    """The topic of the joke."""

model = ChatAnthropic()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
add_routes(app, prompt | model, path="/chain", input_type=ChainInput)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=8000)
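
Each add_routes call exposes the runnable over HTTP at the given path, with endpoints such as invoke, batch, and stream. As a minimal sketch, assuming the server above is running on localhost:8000, the generated invoke endpoint can also be called directly without the LangServe client:

import requests

# Sketch of a direct HTTP call to the invoke endpoint generated for the /chain route;
# the chain input is wrapped under the "input" key of the JSON body.
response = requests.post(
    "http://localhost:8000/chain/invoke",
    json={"input": {"topic": "parrots"}},
)
print(response.json())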

Client

from langchain.schema import SystemMessage, HumanMessage
from langchain.prompts import ChatPromptTemplate
from langchain.schema.runnable import RunnableMap
from langserve import RemoteRunnable

openai = RemoteRunnable("http://localhost:8000/openai/")
anthropic = RemoteRunnable("http://localhost:8000/anthropic/")
joke_chain = RemoteRunnable("http://localhost:8000/chain/")

joke_chain.invoke({"topic": "parrots"})

# or async
await joke_chain.ainvoke({"topic": "parrots"})

prompt = [
    SystemMessage(content='Act like either a cat or a parrot.'), 
    HumanMessage(content='Hello!')
]

# Supports astream
async for msg in anthropic.astream(prompt):
    print(msg, end="", flush=True)
    
prompt = ChatPromptTemplate.from_messages(
    [("system", "Tell me a long story about {topic}")]
)
    
# Can define custom chains
chain = prompt | RunnableMap({
    "openai": openai,
    "anthropic": anthropic,
})

chain.batch([{"topic": "parrots"}, {"topic": "cats"}])
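
Since RemoteRunnable implements the standard Runnable interface, the synchronous stream and batch methods work against the same server as well. A minimal sketch, reusing the joke_chain client defined above:

# Synchronous streaming: print chunks as the server returns them
for chunk in joke_chain.stream({"topic": "parrots"}):
    print(chunk, end="", flush=True)

# Synchronous batch call over several inputs
joke_chain.batch([{"topic": "parrots"}, {"topic": "cats"}])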

Installation

pip install langserve[all]

Alternatively, use the client extra for client-only code, or the server extra for server-only code, as shown below.
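
pip install langserve[client]
pip install langserve[server]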

Features

  • Deploy runnables with FastAPI
  • Client can use remote runnables almost as if they were local
    • Supports async
    • Supports batch
    • Supports stream

Limitations

  • Chain callbacks cannot be passed from the client to the server

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

langserve-0.0.3.tar.gz (10.8 kB)

Built Distribution

langserve-0.0.3-py3-none-any.whl (11.4 kB)

File details

Details for the file langserve-0.0.3.tar.gz.

File metadata

  • Download URL: langserve-0.0.3.tar.gz
  • Upload date:
  • Size: 10.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for langserve-0.0.3.tar.gz

  • SHA256: 09507fe300b9a3e1beb83734092b5d440bd2e1f69ae56190f310b2e41ed3825f
  • MD5: d1d8265467414fc13b780bc52cddce86
  • BLAKE2b-256: 6755399fc93dab6ef321e7824c485b7489c24068732664b4ec76a67bd21390a8


File details

Details for the file langserve-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: langserve-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 11.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for langserve-0.0.3-py3-none-any.whl

  • SHA256: 3b56cb11ea4ac3f84499f93a2240e6dd3903f8e8fc750a759c848fb4ae7c7f2f
  • MD5: f9fef10f15093d36770bc75490762668
  • BLAKE2b-256: b096cb4b0f15356710bb216aed464ec1415081b26d8ace4448e24cdf99d10b07

