
Structured and typehinted GPT responses in Python.


gpt-json

JSON is a beautiful format. It's both human readable and machine readable, which makes it a great format for the structured output of LLMs (after all, LLMs sit somewhere in the middle). gpt-json is a wrapper around GPT that lets you declaratively define the expected output format when you're parsing results into a downstream pipeline.

Specifically it:

  • Relies on Pydantic schema definitions and type validations
  • Allows for defining both dictionaries and lists
  • Includes some lightweight manipulation of the output to remove superfluous context and fix broken JSON
  • Includes retry logic for the most common API failures
  • Adds typehinting support for both the API and the output schema

Getting Started

pip install gpt-json

Here's how to use it to generate a schema for simple tasks:

import asyncio

from gpt_json import GPTJSON, GPTMessage, GPTMessageRole
from pydantic import BaseModel

API_KEY = "sk-..."  # placeholder: your OpenAI API key

class SentimentSchema(BaseModel):
    sentiment: str

SYSTEM_PROMPT = """
Analyze the sentiment of the given text.

Respond with the following JSON schema:

{json_schema}
"""

async def runner():
    gpt_json = GPTJSON[SentimentSchema](API_KEY)
    response = await gpt_json.run(
        messages=[
            GPTMessage(
                role=GPTMessageRole.SYSTEM,
                content=SYSTEM_PROMPT,
            ),
            GPTMessage(
                role=GPTMessageRole.USER,
                content="Text: I love this product. It's the best thing ever!",
            )
        ]
    )
    print(response)
    print(f"Detected sentiment: {response.sentiment}")

asyncio.run(runner())

Which prints:

sentiment='positive'
Detected sentiment: positive

json_schema is a special keyword that is replaced with the schema definition at runtime. You should always include it in your payload to ensure the model knows how to format its results. You can, however, experiment with where to place the schema definition: in the system prompt or the user prompt, at the beginning or at the end.
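For instance, here's a minimal sketch that moves the schema placeholder into the user prompt instead. It reuses the gpt_json client and imports from the example above; the single-message layout is just an illustration, not a layout the library requires:

USER_PROMPT = """
Analyze the sentiment of the given text.

Text: I love this product. It's the best thing ever!

Respond with the following JSON schema:

{json_schema}
"""

# Inside an async function, as in runner() above.
response = await gpt_json.run(
    messages=[
        GPTMessage(
            role=GPTMessageRole.USER,
            content=USER_PROMPT,
        ),
    ]
)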

You can typehint the model either to return a single BaseSchema, or to provide a list of multiple BaseSchema. Both of these work:

gpt_json_single = GPTJSON[SentimentSchema](API_KEY)
gpt_json_multiple = GPTJSON[list[SentimentSchema]](API_KEY)
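As a sketch of consuming the list variant (this assumes the list-typed client parses the model output into a plain Python list of SentimentSchema instances, which is my assumption rather than something documented above):

# Inside an async function, reusing SentimentSchema, SYSTEM_PROMPT, and the imports above.
gpt_json_multiple = GPTJSON[list[SentimentSchema]](API_KEY)

response = await gpt_json_multiple.run(
    messages=[
        GPTMessage(
            role=GPTMessageRole.SYSTEM,
            content=SYSTEM_PROMPT,
        ),
        GPTMessage(
            role=GPTMessageRole.USER,
            content="Text 1: I love this product.\nText 2: This was a waste of money.",
        ),
    ]
)

# Assumption: response is a list of SentimentSchema instances.
for item in response:
    print(item.sentiment)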

If you want to get more specific about how you expect the model to populate a field, add hints about the value through the "description" field. This helps the model understand what you're looking for, and will help it generate better results.

from pydantic import BaseModel, Field

class SentimentSchema(BaseModel):
    sentiment: int = Field(description="Either -1, 0, or 1.")

With the same prompts as before, this now returns:

sentiment=1
Detected sentiment: 1

Other Configurations

The GPTJSON class supports other configuration parameters at initialization.

  • model (GPTModelVersion | str, default: GPTModelVersion.GPT_4): For convenience we provide the currently supported GPT model versions in the GPTModelVersion enum. You can also pass a string value if you want to use another, more specific architecture.
  • auto_trim (bool, default: False): If your input prompt is too long, perhaps because of dynamically injected content, gpt-json will automatically truncate the text to leave enough room for the model's response.
  • auto_trim_response_overhead (int, default: 0): If you're using auto_trim, the maximum number of tokens to reserve for the model's response.
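As a rough sketch, these options would be passed at initialization like so. This assumes GPTModelVersion is importable from gpt_json alongside GPTJSON; the overhead value is an arbitrary example, not a recommendation:

from gpt_json import GPTJSON, GPTModelVersion

gpt_json = GPTJSON[SentimentSchema](
    API_KEY,
    model=GPTModelVersion.GPT_4,      # or a raw model string for a more specific architecture
    auto_trim=True,                   # truncate overly long prompts automatically
    auto_trim_response_overhead=512,  # tokens to reserve for the model's response when trimming
)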

Comparison to Other Libraries

A non-exhaustive list of other libraries that address the same problem. None of them were fully compatible with my deployment (hence this library), but check them out:

jsonformer - Works with any Hugging Face model, whereas gpt-json is tailored specifically to the GPT-X family. GPT doesn't output logit probabilities or allow fixed decoder templating, so the same approach can't apply.
