
LangServe 🦜️🔗

Overview

LangServe is a library that allows developers to host their LangChain runnables and call into them remotely through a runnable interface.

Examples

For more examples, see the examples directory.

Server

#!/usr/bin/env python
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatAnthropic, ChatOpenAI
from langserve import add_routes


app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple API server using LangChain's Runnable interfaces",
)


add_routes(
    app,
    ChatOpenAI(),
    path="/openai",
)

add_routes(
    app,
    ChatAnthropic(),
    path="/anthropic",
)

model = ChatAnthropic()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
add_routes(app, prompt | model, path="/chain")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=8000)
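Each add_routes path registers REST endpoints such as POST /chain/invoke, /chain/batch, and /chain/stream, so the server can also be called with any plain HTTP client, not just RemoteRunnable. A minimal sketch of the invoke payload, assuming LangServe's default wire format (the runnable's input wrapped under an "input" key, with the result returned under "output"):

```python
import json

# Assumed request body shape for POST http://localhost:8000/chain/invoke
payload = json.dumps({"input": {"topic": "parrots"}})

# With the server above running, e.g. via urllib from the standard library:
# from urllib.request import Request, urlopen
# req = Request(
#     "http://localhost:8000/chain/invoke",
#     data=payload.encode(),
#     headers={"Content-Type": "application/json"},
# )
# result = json.load(urlopen(req))  # the result is under result["output"]
print(payload)
```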

Client

from langchain.schema import SystemMessage, HumanMessage
from langchain.prompts import ChatPromptTemplate
from langchain.schema.runnable import RunnableMap
from langserve import RemoteRunnable

openai = RemoteRunnable("http://localhost:8000/openai/")
anthropic = RemoteRunnable("http://localhost:8000/anthropic/")
joke_chain = RemoteRunnable("http://localhost:8000/chain/")

joke_chain.invoke({"topic": "parrots"})

# or async
await joke_chain.ainvoke({"topic": "parrots"})

prompt = [
    SystemMessage(content="Act like either a cat or a parrot."),
    HumanMessage(content="Hello!"),
]

# Supports astream
async for msg in anthropic.astream(prompt):
    print(msg, end="", flush=True)
    
prompt = ChatPromptTemplate.from_messages(
    [("system", "Tell me a long story about {topic}")]
)
    
# Can define custom chains
chain = prompt | RunnableMap({
    "openai": openai,
    "anthropic": anthropic,
})

chain.batch([{"topic": "parrots"}, {"topic": "cats"}])

Installation

pip install "langserve[all]"

or use the client extra for client code only, and the server extra for server code only.

Features

  • Deploy runnables with FastAPI
  • Client can use remote runnables almost as if they were local
    • Supports async
    • Supports batch
    • Supports stream
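The client mirrors the Runnable interface locally. As a self-contained illustration of that surface (a hypothetical minimal class for exposition, not the actual LangChain or LangServe implementation — a real remote client would issue HTTP calls inside these methods):

```python
import asyncio
from typing import Any, Callable, Iterator, List


class MiniRunnable:
    """Hypothetical stand-in showing the interface RemoteRunnable
    mirrors: invoke/ainvoke, batch, and stream."""

    def __init__(self, fn: Callable[[Any], Any]):
        self.fn = fn

    def invoke(self, value: Any) -> Any:
        return self.fn(value)

    async def ainvoke(self, value: Any) -> Any:
        # Async variant; a remote client would await an HTTP request here.
        return self.fn(value)

    def batch(self, values: List[Any]) -> List[Any]:
        return [self.invoke(v) for v in values]

    def stream(self, value: Any) -> Iterator[Any]:
        # Yield the result in chunks; a chat model would stream tokens.
        for chunk in str(self.invoke(value)):
            yield chunk


upper = MiniRunnable(str.upper)
print(upper.invoke("hi"))                   # HI
print(upper.batch(["a", "b"]))              # ['A', 'B']
print("".join(upper.stream("ok")))          # OK
print(asyncio.run(upper.ainvoke("async")))  # ASYNC
```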

Limitations

  • Chain callbacks cannot be passed from the client to the server
