
An integration package connecting Google Vertex AI and LangChain

Project description

langchain-google-vertexai

This package contains the LangChain integrations for Google Cloud generative models.

Installation

pip install -U langchain-google-vertexai

Chat Models

The ChatVertexAI class exposes chat models available in Google Cloud Vertex AI, such as Gemini.

To use it, you should have a Google Cloud project with the Vertex AI API enabled and credentials configured. Initialize the model as:

from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro")
llm.invoke("Sing a ballad of LangChain.")

You can use other models, e.g. chat-bison, and pass generation parameters such as temperature:

from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="chat-bison", temperature=0.3)
llm.invoke("Sing a ballad of LangChain.")

Multimodal inputs

The Gemini vision model supports image inputs within a single chat message. Example:

from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-pro-vision")
# example
message = HumanMessage(
    content=[
        {
            "type": "text",
            "text": "What's in this image?",
        },  # You can optionally provide text parts alongside the image
        {"type": "image_url", "image_url": {"url": "https://picsum.photos/seed/picsum/200/300"}},
    ]
)
llm.invoke([message])

The value of image_url can be any of the following:

  • A public image URL
  • An accessible Google Cloud Storage file (e.g., "gs://path/to/file.png")
  • A local file path
  • A base64 encoded image (e.g., data:image/png;base64,abcd124)
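For local files, one way to build the base64 data URL yourself is a small helper using only the standard library. This is a sketch, not part of the package's API; the file path in the comment is a hypothetical example:

```python
import base64
import mimetypes


def image_to_data_url(path: str) -> str:
    """Read a local image and return it as a base64-encoded data URL."""
    mime, _ = mimetypes.guess_type(path)
    mime = mime or "image/png"  # fall back if the type can't be guessed
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")
    return f"data:{mime};base64,{encoded}"


# The resulting string can be used as the "url" value in an image_url part:
# {"type": "image_url", "image_url": {"url": image_to_data_url("photo.png")}}
```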

Embeddings

You can use Google Cloud's embedding models as:

from langchain_google_vertexai import VertexAIEmbeddings

embeddings = VertexAIEmbeddings()
embeddings.embed_query("hello, world!")
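embed_query returns a plain list of floats, so you can compute similarity between texts directly. A minimal sketch of cosine similarity between two embedding vectors (the commented calls assume the embeddings object from above; the vectors used in practice would come from the model, not be hand-written):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# e.g. compare two query embeddings:
# v1 = embeddings.embed_query("hello, world!")
# v2 = embeddings.embed_query("hi there!")
# cosine_similarity(v1, v2)
```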

LLMs

You can use Google Cloud's generative AI models as LangChain LLMs:

from langchain.prompts import PromptTemplate
from langchain_google_vertexai import VertexAI

llm = VertexAI(model_name="gemini-pro")

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)

chain = prompt | llm

question = "Who was the president in the year Justin Bieber was born?"
print(chain.invoke({"question": question}))

You can use Gemini and PaLM models, including code generation ones:

from langchain_google_vertexai import VertexAI

llm = VertexAI(model_name="code-bison", max_output_tokens=1000, temperature=0.3)

question = "Write a python function that checks if a string is a valid email address"

output = llm.invoke(question)

