A text-based terminal client for Ollama.
oterm
The text-based terminal client for Ollama.
Features
- Intuitive and simple terminal UI: no need to run servers or frontends, just type oterm in your terminal.
- Multiple persistent chat sessions, stored together with the context embeddings in SQLite.
- Use any of the models you have pulled in Ollama, or your own custom models.
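To illustrate the persistence idea above, a chat session plus its context could be kept in SQLite roughly as follows. The table name and columns here are invented for illustration; this page does not document oterm's actual schema.

```python
import sqlite3

# Hypothetical schema -- invented for illustration only; oterm's real
# table layout is not documented on this page.
conn = sqlite3.connect(":memory:")  # oterm persists to a file on disk
conn.execute(
    "CREATE TABLE session ("
    " id INTEGER PRIMARY KEY,"
    " name TEXT,"
    " model TEXT,"
    " context TEXT"  # serialized context embeddings, e.g. JSON
    ")"
)
conn.execute(
    "INSERT INTO session (name, model, context) VALUES (?, ?, ?)",
    ("chat-1", "llama2", "[1, 2, 3]"),
)
row = conn.execute("SELECT name, model FROM session").fetchone()
```

Because the sessions live in a database rather than in memory, they survive restarts of the client.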
Installation
Using brew for macOS:
brew tap ggozad/formulas
brew install ggozad/formulas/oterm
Using pip:
pip install oterm
Using
In order to use oterm you will need to have the Ollama server running. By default it expects to find the Ollama API at http://localhost:11434/api. If you are running Ollama inside Docker or on a different host/port, use the OLLAMA_URL environment variable to customize the API URL:
OLLAMA_URL=http://host:port/api
oterm will not (yet) pull models for you; please use ollama to do that. All the models you have pulled or created will be available to oterm.
Screenshots
License
This project is licensed under the MIT License.
Download files
Source Distribution: oterm-0.1.3.tar.gz (10.7 kB)
Built Distribution: oterm-0.1.3-py3-none-any.whl (13.5 kB)