
An integration package connecting Postgres and LangChain

Project description

langchain-postgres


The langchain-postgres package contains implementations of core LangChain abstractions using Postgres.

The package is released under the MIT license.

Feel free to use the abstractions as provided, or modify and extend them as appropriate for your own application.

Requirements

The package currently only supports the psycopg3 driver.

Installation

pip install -U langchain-postgres

Usage

PostgresSaver (LangGraph Checkpointer)

The LangGraph checkpointer can be used to add memory to your LangGraph application.

PostgresSaver is an implementation of the checkpointer saver using Postgres as the backend.

Currently, only the psycopg3 driver is supported.

Sync usage:

from psycopg_pool import ConnectionPool
from langchain_postgres import (
    PostgresSaver, PickleCheckpointSerializer
)

pool = ConnectionPool(
    # Example configuration
    conninfo="postgresql://langchain:langchain@localhost:6024/langchain",
    max_size=20,
)

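# Create the tables in postgres (only needs to be done once)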
PostgresSaver.create_tables(pool)

checkpointer = PostgresSaver(
    serializer=PickleCheckpointSerializer(),
    sync_connection=pool,
)

# Set up the langgraph workflow with the checkpointer
workflow = ... # Fill in with your workflow
app = workflow.compile(checkpointer=checkpointer)

# Use with the sync methods of `app` (e.g., `app.stream()`)
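# For example (the config shape below is an assumption about LangGraph's
# checkpoint configuration): runs are keyed by a thread id passed in the config.
inputs = ... # Fill in with your graph input
config = {"configurable": {"thread_id": "example-thread"}}
for chunk in app.stream(inputs, config=config):
    print(chunk)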

pool.close() # Remember to close the connection pool.

Async usage:

from psycopg_pool import AsyncConnectionPool
from langchain_postgres import (
    PostgresSaver, PickleCheckpointSerializer
)

pool = AsyncConnectionPool(
    # Example configuration
    conninfo="postgresql://langchain:langchain@localhost:6024/langchain",
    max_size=20,
)

# Create the tables in postgres (only needs to be done once)
await PostgresSaver.acreate_tables(pool)

checkpointer = PostgresSaver(
    serializer=PickleCheckpointSerializer(),
    async_connection=pool,
)

# Set up the langgraph workflow with the checkpointer
workflow = ... # Fill in with your workflow
app = workflow.compile(checkpointer=checkpointer)

# Use with the async methods of `app` (e.g., `app.astream()`)

await pool.close() # Remember to close the connection pool.

Testing

When testing with the Postgres checkpointer, it may be useful to create the tables before the tests and drop them afterwards.

from psycopg_pool import ConnectionPool
from langchain_postgres import PostgresSaver

with ConnectionPool(
    # Example configuration
    conninfo="postgresql://langchain:langchain@localhost:6024/langchain",
    max_size=20,
) as conn:
    PostgresSaver.create_tables(conn)
    PostgresSaver.drop_tables(conn)
    # Run your unit tests with langgraph
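
For example, the create/drop calls can be wrapped in a pytest fixture so each test run starts from a clean checkpoint schema. This is a minimal sketch, not part of the package; the connection string is a placeholder.

import pytest
from psycopg_pool import ConnectionPool
from langchain_postgres import PostgresSaver

@pytest.fixture()
def checkpointer_pool():
    # Open a pool for the test database (placeholder connection info)
    with ConnectionPool(
        conninfo="postgresql://langchain:langchain@localhost:6024/langchain",
        max_size=20,
    ) as pool:
        PostgresSaver.create_tables(pool)
        yield pool
        # Drop the tables so the next run starts from a clean slate
        PostgresSaver.drop_tables(pool)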

ChatMessageHistory

The chat message history abstraction helps persist chat message history in a Postgres table.

PostgresChatMessageHistory is parameterized using a table_name and a session_id.

The table_name is the name of the table in the database where the chat messages will be stored.

The session_id is a unique identifier for the chat session. It can be assigned by the caller using uuid.uuid4().

import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ... # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)
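
As a usage sketch (not taken from the package docs), PostgresChatMessageHistory can back LangChain's RunnableWithMessageHistory so a chain automatically reads and writes the table. The chain, connection info, and message keys below are placeholders and assumptions that must match your own prompt.

import psycopg
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_postgres import PostgresChatMessageHistory

conn_info = ... # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)
table_name = "chat_history"

def get_session_history(session_id: str) -> PostgresChatMessageHistory:
    # Each session id maps to its own rows in the chat history table
    return PostgresChatMessageHistory(
        table_name,
        session_id,
        sync_connection=sync_connection,
    )

chain = ... # Fill in with your runnable (e.g., prompt | chat_model)
chain_with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",      # assumed key; must match your prompt
    history_messages_key="history",  # assumed key; must match your prompt
)

chain_with_history.invoke(
    {"input": "Hi there"},
    config={"configurable": {"session_id": "some-session-id"}},
)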

Vectorstore

See the example for the PGVector vectorstore here
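
As a rough sketch of what such an example looks like (the parameter names follow the documented PGVector API and may differ between releases; the embeddings model and connection string are placeholders):

from langchain_core.documents import Document
from langchain_postgres import PGVector

embeddings = ... # Fill in with your embeddings model
connection = "postgresql+psycopg://langchain:langchain@localhost:6024/langchain"

vectorstore = PGVector(
    embeddings=embeddings,
    collection_name="my_docs",
    connection=connection,
)

# Index a couple of documents, then query by similarity
vectorstore.add_documents([
    Document(page_content="there are cats in the pond", metadata={"id": 1}),
    Document(page_content="ducks are also found in the pond", metadata={"id": 2}),
])

print(vectorstore.similarity_search("cats", k=1))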

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

langchain_postgres-0.0.4.tar.gz (22.0 kB)

Built Distribution

langchain_postgres-0.0.4-py3-none-any.whl (22.6 kB)

File details

Details for the file langchain_postgres-0.0.4.tar.gz.

File metadata

  • Download URL: langchain_postgres-0.0.4.tar.gz
  • Upload date:
  • Size: 22.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for langchain_postgres-0.0.4.tar.gz:

  • SHA256: d7aa27cc52ec5490c714f7c6176942b096d72001826eb741f999b95d4577510b
  • MD5: 9503ea46886b144b060969f26cd9c84c
  • BLAKE2b-256: b97f8cb0e217c804285376d372d42145e08f8bc04efcc7d08b8d2ad732b56ba6

See more details on using hashes here.
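
To verify a download locally, you can recompute the digest and compare it to the value listed above. A minimal sketch using Python's hashlib; the file path assumes the sdist was downloaded to the current directory:

import hashlib

expected = "d7aa27cc52ec5490c714f7c6176942b096d72001826eb741f999b95d4577510b"

sha256 = hashlib.sha256()
with open("langchain_postgres-0.0.4.tar.gz", "rb") as f:  # path to the downloaded file
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

assert sha256.hexdigest() == expected, "SHA256 mismatch"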

File details

Details for the file langchain_postgres-0.0.4-py3-none-any.whl.

File hashes

Hashes for langchain_postgres-0.0.4-py3-none-any.whl:

  • SHA256: 793db51e2130e77c61e6311a54e84669612846ddd60418223612710777f21f93
  • MD5: 5dc2a09020e7669aac1b5dd86002538b
  • BLAKE2b-256: f90d213f57c23b0d52689472c03524db0d27e50c15f61b6281e323fe0e67a548

See more details on using hashes here.
