
Google Generative AI high-level API client library and tools.

Project description

Google Python SDK for the Gemini API


The Google AI Python SDK enables developers to use Google's state-of-the-art generative AI models (like Gemini and PaLM) to build AI-powered features and applications. This SDK supports use cases like:

  • Generate text from text-only input
  • Generate text from text-and-image input (multimodal, Gemini models only)
  • Build multi-turn conversations (chat)
  • Embedding

For example, with just a few lines of code, you can access Gemini's multimodal capabilities to generate text from text-and-image input:

from pathlib import Path

import google.generativeai as genai

# Assumes genai.configure(api_key=...) has already been called (see below).
model = genai.GenerativeModel('gemini-pro-vision')

cookie_picture = {
    'mime_type': 'image/png',
    'data': Path('cookie.png').read_bytes()
}
prompt = "Give me a recipe for this:"

response = model.generate_content([prompt, cookie_picture])
print(response.text)

Try out the API

Install from PyPI.

pip install google-generativeai

Obtain an API key from AI Studio, then configure it as shown below.

Import the SDK and load a model.

import os

import google.generativeai as genai

genai.configure(api_key=os.environ["API_KEY"])

model = genai.GenerativeModel('gemini-pro')
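
If you're not sure which model name to use, you can enumerate the models available to your key. A minimal sketch; filtering on the generateContent method follows the documented model metadata, but check the current docs for your use case:

# List models and keep those that support the generate_content call.
for m in genai.list_models():
    if 'generateContent' in m.supported_generation_methods:
        print(m.name)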

Use GenerativeModel.generate_content to have the model complete some initial text.

response = model.generate_content("The opposite of hot is")
print(response.text)  # cold.

Use GenerativeModel.start_chat to have a discussion with a model.

chat = model.start_chat()
response = chat.send_message('Hello, what should I have for dinner?')
print(response.text)  # 'Here are some suggestions...'
response = chat.send_message("How do I cook the first one?")
print(response.text)

Installation and usage

Run pip install google-generativeai.

For detailed instructions, you can find a quickstart for the Google AI Python SDK in the Google documentation.

This quickstart describes how to add your API key and install the SDK in your app, initialize the model, and then call the API to access the model. It also describes some additional use cases and features, like streaming, embedding, counting tokens, and controlling responses.
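
As a rough sketch of a few of those features (the prompt and parameter values are illustrative, not recommendations):

model = genai.GenerativeModel('gemini-pro')

# Count tokens before sending a prompt.
print(model.count_tokens("The opposite of hot is"))

# Stream the response chunk by chunk instead of waiting for the full text.
for chunk in model.generate_content("Write a short poem about cookies.", stream=True):
    print(chunk.text, end="")

# Control the response with a generation config.
response = model.generate_content(
    "The opposite of hot is",
    generation_config=genai.types.GenerationConfig(
        max_output_tokens=20,
        temperature=0.0))
print(response.text)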

Documentation

Find complete documentation for the Google AI SDKs and the Gemini model in the Google documentation: https://ai.google.dev/docs

Contributing

See Contributing for more information on contributing to the Google AI Python SDK.

Developers who use the PaLM API

Migrate to use the Gemini API

Check our migration guide in the Google documentation.

Installation and usage for the PaLM API

Install from PyPI.

pip install google-generativeai

Obtain an API key from AI Studio, then configure it as shown below.

import os

import google.generativeai as palm

palm.configure(api_key=os.environ["PALM_API_KEY"])

Use palm.generate_text to have the model complete some initial text.

response = palm.generate_text(prompt="The opposite of hot is")
print(response.result)  # cold.

Use palm.chat to have a discussion with a model.

response = palm.chat(messages=["Hello."])
print(response.last) #  'Hello! What can I help you with?'
response.reply("Can you tell me a joke?")
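
To pick a PaLM model programmatically, the PaLM quickstart filters palm.list_models by supported generation methods; a minimal sketch along those lines:

# Find models that support generate_text and use the first one.
models = [m for m in palm.list_models()
          if 'generateText' in m.supported_generation_methods]
model_name = models[0].name  # e.g. 'models/text-bison-001'

response = palm.generate_text(model=model_name, prompt="The opposite of hot is")
print(response.result)  # cold.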

Documentation for the PaLM API

Colab magics

%pip install -q google-generativeai
%load_ext google.generativeai.notebook

Once installed, use the Python client via the %%llm Colab magic. Read the full guide here.

%%llm
The best thing since sliced bread is

License

The contents of this repository are licensed under the Apache License, version 2.0.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

google_generativeai-0.4.0-py3-none-any.whl (137.4 kB)

Uploaded Python 3

File details

Details for the file google_generativeai-0.4.0-py3-none-any.whl.

File metadata

File hashes

Hashes for google_generativeai-0.4.0-py3-none-any.whl:

  SHA256       cf53a51f7c22f0193685e39708e015119b2500626bb2e74ad8c2bf8d8568ef1e
  MD5          ee1660cc1517ebc7b47e671520236633
  BLAKE2b-256  ae24c282649e1b07cc2b8db6ce5d3293b526487fa011d7ed09fd4cac3a3f29af

See more details on using hashes here.

