
Client for the public Ginkgo AI API


Ginkgo's AI model API client

Work in progress: this repo was just made public and we are still working on integration

A Python client for Ginkgo's AI model API, used to run inference on public and Ginkgo-proprietary models. Learn more in the Model API announcement.

Prerequisites

Register at https://models.ginkgobioworks.ai/ to get credits and an API key (of the form xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx). Store the API key in the GINKGOAI_API_KEY environment variable.
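To fail early with a clear message when the key is missing, you can check the environment variable before creating a client. This is a minimal sketch; the helper name is ours, not part of the client library:

```python
import os

def get_ginkgo_api_key() -> str:
    """Read the Ginkgo AI API key from the environment, raising a
    clear error if it has not been set."""
    key = os.environ.get("GINKGOAI_API_KEY")
    if not key:
        raise RuntimeError(
            "GINKGOAI_API_KEY is not set; register at "
            "https://models.ginkgobioworks.ai/ to obtain a key."
        )
    return key
```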

Installation

Install the Python client with pip:

pip install ginkgo-ai-client

Usage

Note: this is an alpha version of the client; its interface may change in the future.

Example: masked inference with Ginkgo's AA0 model

The client requires an API key (it defaults to os.environ.get("GINKGOAI_API_KEY") if none is explicitly provided).

from ginkgo_ai_client import GinkgoAIClient, MaskedInferenceQuery

client = GinkgoAIClient()
model = "ginkgo-aa0-650M"

# SINGLE QUERY

query = MaskedInferenceQuery(sequence="MPK<mask><mask>RRL", model=model)
prediction = client.send_request(query)
# prediction.sequence == "MPKRRRRL"

# BATCH QUERY

sequences = ["MPK<mask><mask>RRL", "M<mask>RL", "MLLM<mask><mask>R"]
queries = [MaskedInferenceQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].sequence == "MPKRRRRL"
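For large numbers of sequences, it can be safer to send queries in smaller batches (whether the service enforces a batch-size limit is an assumption here; check the reference docs). A minimal chunking helper:

```python
def chunked(items, size):
    """Split a list into consecutive chunks of at most `size` items,
    preserving order."""
    return [items[i:i + size] for i in range(0, len(items), size)]
```

Each chunk can then be turned into queries and sent via `client.send_batch_request` as above.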

Changing the model parameter to esm2-650M or esm2-3b in this example will perform masked inference with the ESM2 model.

Example: embedding computation with Ginkgo's 3'UTR language model

from ginkgo_ai_client import GinkgoAIClient, MeanEmbeddingQuery

client = GinkgoAIClient()
model = "ginkgo-maskedlm-3utr-v1"

# SINGLE QUERY

query = MeanEmbeddingQuery(sequence="ATTGCG", model=model)
prediction = client.send_request(query)
# prediction.embedding == [1.05, -2.34, ...]

# BATCH QUERY

sequences = ["ATTGCG", "CAATGC", "GCGCACATGT"]
queries = [MeanEmbeddingQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].embedding == [1.05, -2.34, ...]
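Since each embedding comes back as a plain list of floats, downstream comparisons need no extra dependencies. For example, cosine similarity between two embeddings (a sketch; the vectors below are made-up placeholders, not real model output):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Placeholder vectors standing in for prediction.embedding values:
emb_a = [1.05, -2.34, 0.5]
emb_b = [0.98, -2.10, 0.7]
sim = cosine_similarity(emb_a, emb_b)
```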

Available models

See the example folder and reference docs for more details on usage and parameters.

| Model | Description | Reference | Supported queries | Versions |
|-------|-------------|-----------|-------------------|----------|
| ESM2 | Large protein language model from Meta | GitHub | Embeddings, masked inference | 3B, 650M |
| AA0 | Ginkgo's proprietary protein language model | Announcement | Embeddings, masked inference | 650M |
| 3UTR | Ginkgo's proprietary 3'UTR language model | Preprint | Embeddings, masked inference | v1 |

License

This project is licensed under the MIT License. See the LICENSE file for details.

Releases

Make sure the changelog is up to date, increment the version in pyproject.toml, create a new tag, then create a release on GitHub (publication to PyPI is automated).

