
langchain-ai21

This package contains the LangChain integrations for AI21 through their AI21 SDK.

Installation and Setup

  • Install the AI21 partner package:
pip install langchain-ai21
  • Get an AI21 API key and set it as an environment variable (AI21_API_KEY).
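The key can also be set from within Python via os.environ, for example in a notebook, before the integration is used. A minimal sketch (the placeholder value is illustrative, not a real key):

```python
import os

# Illustrative placeholder -- substitute your real AI21 key.
os.environ["AI21_API_KEY"] = "your-api-key"
```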

Chat Models

This package contains the ChatAI21 class, which is the recommended way to interface with AI21 Chat models.

To use it, install the requirements and configure your environment:

export AI21_API_KEY=your-api-key

Then initialize the chat model:

from langchain_core.messages import HumanMessage
from langchain_ai21.chat_models import ChatAI21

chat = ChatAI21(model="j2-ultra")
messages = [HumanMessage(content="Hello from AI21")]
chat.invoke(messages)
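invoke returns a LangChain AIMessage whose generated text lives on the .content attribute. A minimal stdlib sketch of that shape, using a dataclass stand-in and a made-up reply rather than a live API call:

```python
from dataclasses import dataclass

@dataclass
class AIMessage:
    """Stand-in for langchain_core.messages.AIMessage."""
    content: str

# In real use this would be: response = chat.invoke(messages)
response = AIMessage(content="Hello! How can I help you today?")
print(response.content)
```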

LLMs

You can use AI21's generative AI models as LangChain LLMs:

from langchain.prompts import PromptTemplate
from langchain_ai21 import AI21LLM

llm = AI21LLM(model="j2-ultra")

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)

chain = prompt | llm

question = "Which scientist discovered relativity?"
print(chain.invoke({"question": question}))
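PromptTemplate.from_template uses Python-style {brace} placeholders, so the rendered prompt the chain sends to the LLM is equivalent to plain string formatting. A stdlib sketch of that rendering step:

```python
template = """Question: {question}

Answer: Let's think step by step."""

# Equivalent to what PromptTemplate produces before the text reaches the LLM.
rendered = template.format(question="Which scientist discovered relativity?")
print(rendered)
```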

Embeddings

You can use AI21's embedding models as follows:

Query

from langchain_ai21 import AI21Embeddings

embeddings = AI21Embeddings()
embeddings.embed_query("Hello! This is some query")

Document

from langchain_ai21 import AI21Embeddings

embeddings = AI21Embeddings()
embeddings.embed_documents(["Hello! This is document 1", "And this is document 2!"])
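Both methods return plain lists of floats, so you can rank documents against a query with cosine similarity. A stdlib sketch, using made-up 3-dimensional vectors in place of real AI21Embeddings output:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Stand-ins for embed_query(...) and embed_documents(...) results.
query_vec = [0.1, 0.9, 0.2]
doc_vecs = [[0.1, 0.8, 0.3], [0.9, 0.1, 0.0]]

scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
best = max(range(len(doc_vecs)), key=lambda i: scores[i])
```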

Task-Specific Models

Contextual Answers

You can use AI21's Contextual Answers model, which receives text or a document serving as context, plus a question, and returns an answer based entirely on that context.

This means that if the answer to your question is not in the document, the model will say so instead of providing a false answer.

from langchain_ai21 import AI21ContextualAnswers

tsm = AI21ContextualAnswers()

response = tsm.invoke(input={"context": "Your context", "question": "Your question"})

You can also use it with chains, output parsers, and vector DBs:

from langchain_ai21 import AI21ContextualAnswers
from langchain_core.output_parsers import StrOutputParser

tsm = AI21ContextualAnswers()
chain = tsm | StrOutputParser()

response = chain.invoke(
    {"context": "Your context", "question": "Your question"},
)
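The | operator here builds a sequence in which each step's output becomes the next step's input. A stdlib sketch of that composition pattern, with plain functions as hypothetical stand-ins for AI21ContextualAnswers and StrOutputParser:

```python
from functools import reduce

def pipe(*steps):
    """Compose callables left to right, like LangChain's `|` operator."""
    def run(value):
        return reduce(lambda acc, step: step(acc), steps, value)
    return run

# Hypothetical stand-ins for the real runnables.
def answer_model(inputs):
    return {"answer": f"Based on: {inputs['context']}"}

def str_parser(output):
    return output["answer"]

chain = pipe(answer_model, str_parser)
result = chain({"context": "Your context", "question": "Your question"})
```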
