About

Run large language models locally with a simple CLI and API, supporting Llama, Mistral, and more.
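Besides the CLI, Ollama exposes its API as a local HTTP service (by default on port 11434). The snippet below is a minimal sketch of sending one prompt to that API from Python; the model name "llama3" is an assumption and stands in for whatever model you have pulled locally.

```python
# Minimal sketch: call a local Ollama server's generate endpoint.
# Assumes Ollama is running on its default port (11434) and that the
# "llama3" model has already been pulled (e.g. `ollama pull llama3`).
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",            # assumed model name; use any pulled model
        "prompt": "Why is the sky blue?",
        "stream": False,              # return one JSON object instead of a stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])    # the generated text
```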

Replaces