Comet logger for LLMs
CometLLM is a tool to log and visualize your LLM prompts and chains. Use CometLLM to identify effective prompt strategies, streamline your troubleshooting, and ensure reproducible workflows!
⚡️ Quickstart
Install the comet_llm Python library with pip:
pip install comet_llm
If you don't already have one, create your free Comet account and grab your API key from the account settings page.
Now you are all set to log your first prompt and response:
import comet_llm

comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
    api_key="<YOUR_COMET_API_KEY>",
)
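You can also route the prompt to a specific workspace and project using the workspace and project parameters (the Python parameter names listed in the Configuration section below); a minimal sketch with placeholder values:

import comet_llm

# workspace and project are the Python parameter names listed under
# Configuration below; the values here are placeholders.
comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
    api_key="<YOUR_COMET_API_KEY>",
    workspace="<YOUR_WORKSPACE>",
    project="<YOUR_PROJECT>",
)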
🎯 Features
- Log your prompts and responses, including prompt template, variables, timestamps, duration, and any metadata that you need.
- Visualize your prompts and responses in the UI.
- Log your chain execution down to the level of granularity that you need.
- Visualize your chain execution in the UI.
- Automatically track your prompts when using OpenAI chat models.
- Track and analyze user feedback.
- Diff your prompts and chain execution in the UI.
👀 Examples
To log a single LLM call as an individual prompt, use comet_llm.log_prompt. If you require more granularity, you can log a chain of executions that may include more than one LLM call, context retrieval, or data pre- or post-processing with comet_llm.start_chain.
Log a full prompt and response
import comet_llm

comet_llm.log_prompt(
    prompt="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: What is your name?\nAnswer:",
    prompt_template="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: {{question}}?\nAnswer:",
    prompt_template_variables={"question": "What is your name?"},
    metadata={
        "usage.prompt_tokens": 7,
        "usage.completion_tokens": 5,
        "usage.total_tokens": 12,
    },
    output=" My name is Alex.",
    duration=16.598,
)
Read the full documentation for more details about logging a prompt.
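The duration in the example above is passed in as a precomputed value; in practice you would typically time the LLM call yourself. A minimal sketch of that pattern, using only the log_prompt parameters shown above (the stand-in output replaces a real model call):

import time

import comet_llm

start = time.perf_counter()
# ... call your LLM model here; this response is a stand-in ...
output = " My name is Alex."
duration = time.perf_counter() - start

comet_llm.log_prompt(
    prompt="What is your name?",
    output=output,
    duration=duration,  # elapsed time measured around the call
)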
Log an LLM chain
import datetime
from time import sleep

from comet_llm import Span, end_chain, start_chain


def retrieve_context(user_question):
    if "open" in user_question:
        return "Opening hours: 08:00 to 17:00 all days"


def llm_answering(user_question, current_time, context):
    prompt_template = """You are a helpful chatbot. You have access to the following context:
{context}
The current time is: {current_time}
Analyze the following user question and decide if you can answer it, if the question can't be answered, say "I don't know":
{user_question}
"""
    prompt = prompt_template.format(
        user_question=user_question, current_time=current_time, context=context
    )

    with Span(
        category="llm-call",
        inputs={"prompt_template": prompt_template, "prompt": prompt},
    ) as span:
        # Call your LLM model here
        sleep(0.1)
        result = "Yes we are currently open"
        usage = {"prompt_tokens": 52, "completion_tokens": 12, "total_tokens": 64}
        span.set_outputs(outputs={"result": result}, metadata={"usage": usage})

    return result


def main(user_question, current_time):
    start_chain(inputs={"user_question": user_question, "current_time": current_time})

    with Span(
        category="context-retrieval",
        name="Retrieve Context",
        inputs={"user_question": user_question},
    ) as span:
        context = retrieve_context(user_question)
        span.set_outputs(outputs={"context": context})

    with Span(
        category="llm-reasoning",
        inputs={
            "user_question": user_question,
            "current_time": current_time,
            "context": context,
        },
    ) as span:
        result = llm_answering(user_question, current_time, context)
        span.set_outputs(outputs={"result": result})

    end_chain(outputs={"result": result})


main("Are you open?", str(datetime.datetime.now().time()))
Read the full documentation for more details about logging a chain.
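The spans above sit directly under the chain; as a sketch of the finer granularity mentioned in Features, here is the same API with one span nested inside another. Assumption for illustration: nested Span context managers are recorded as sub-steps of the enclosing span.

from comet_llm import Span, end_chain, start_chain

start_chain(inputs={"user_question": "Are you open?"})

# Assumption for illustration: an inner Span opened while an outer Span is
# active is recorded as a sub-step of the outer one.
with Span(category="answering", inputs={"user_question": "Are you open?"}) as outer:
    with Span(category="pre-processing", inputs={"text": "Are you open?"}) as inner:
        normalized = "are you open?"
        inner.set_outputs(outputs={"normalized": normalized})

    result = "Yes we are currently open"
    outer.set_outputs(outputs={"result": result})

end_chain(outputs={"result": result})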
⚙️ Configuration
You can configure your Comet credentials and where your data is logged, either through Python parameters or environment variables:
| Name | Python parameter name | Environment variable name |
| --- | --- | --- |
| Comet API key | api_key | COMET_API_KEY |
| Comet workspace name | workspace | COMET_WORKSPACE |
| Comet project name | project | COMET_PROJECT_NAME |
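For example, exporting the environment variables once lets you drop the corresponding parameters from every call; a minimal sketch with placeholder values, assuming the variables are set before comet_llm reads its configuration:

import os

# Placeholder values; set these before comet_llm reads its configuration.
os.environ["COMET_API_KEY"] = "<YOUR_COMET_API_KEY>"
os.environ["COMET_WORKSPACE"] = "<YOUR_WORKSPACE>"
os.environ["COMET_PROJECT_NAME"] = "<YOUR_PROJECT>"

import comet_llm

# No api_key, workspace, or project arguments needed now.
comet_llm.log_prompt(prompt="What is your name?", output=" My name is Alex.")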
📝 License
Copyright (c) Comet 2023-present. cometLLM is free and open-source software licensed under the MIT License.