Ollama
Run large language models locally with a single command
Open Source
Free
Ollama makes it easy to run open-source LLMs locally on your machine. It handles model downloading, quantization, and serving through an OpenAI-compatible API, and supports Llama, Mistral, Gemma, Phi, and many other model families. It is popular for local development, testing, and offline AI applications.
Pricing: Free
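Because Ollama exposes an OpenAI-compatible endpoint on its default port (11434), any HTTP client can talk to it. A minimal sketch in Python using only the standard library, assuming Ollama is running locally and a model has been pulled (the model name `llama3` here is just an example):

```python
import json
import urllib.request
import urllib.error

# Ollama's OpenAI-compatible chat endpoint (default local port 11434).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

# Standard OpenAI-style chat payload; "llama3" is an example model name,
# substitute any model you have pulled with `ollama pull`.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": False,
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        reply = json.load(resp)
        # Response follows the OpenAI chat-completions shape.
        print(reply["choices"][0]["message"]["content"])
except (urllib.error.URLError, OSError):
    print("Ollama is not reachable; start it with `ollama serve`.")
```

Because the request and response shapes match OpenAI's chat-completions format, existing OpenAI client libraries can usually be pointed at the local URL instead of a hosted API.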
Ollama Alternatives
Explore 35 products in the Inference APIs category. View all Ollama alternatives.