Cloudflare AI Gateway
LLM proxy with caching, logging, rate limiting, and cost analytics
Cloudflare AI Gateway is a proxy layer that sits between your application and LLM providers such as OpenAI, Anthropic, and Google Gemini. It provides request logging, cost and token analytics, response caching, rate limiting, retries, and model fallback across 70+ models from 12+ providers through a single endpoint. Integration requires changing one line of code.
Pricing: free on all plans; Logpush costs $0.05 per million requests on paid plans
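The one-line integration works by pointing an existing SDK at a gateway URL instead of the provider's default endpoint. The sketch below, in Python, shows the URL construction under the documented pattern `https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/{provider}`; the account and gateway IDs are hypothetical placeholders.

```python
# Sketch of the one-line AI Gateway integration.
# Assumes the gateway URL pattern:
#   https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/{provider}
# ACCOUNT_ID and GATEWAY_ID are hypothetical placeholders — use your own values.

ACCOUNT_ID = "your-account-id"  # hypothetical placeholder
GATEWAY_ID = "my-gateway"       # hypothetical placeholder

def gateway_base_url(provider: str) -> str:
    """Build the AI Gateway base URL for a given upstream provider."""
    return (
        f"https://gateway.ai.cloudflare.com/v1/{ACCOUNT_ID}/{GATEWAY_ID}/{provider}"
    )

# With the OpenAI Python SDK, the only change is the base_url argument:
#   client = OpenAI(base_url=gateway_base_url("openai"))
# instead of the SDK's default https://api.openai.com/v1 — requests then flow
# through the gateway, where caching, logging, and rate limiting apply.
print(gateway_base_url("openai"))
```

The same substitution works for other supported providers by changing the trailing provider segment (e.g. `anthropic`), while API keys and request bodies stay exactly as the provider's SDK expects.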
Cloudflare AI Gateway Alternatives
Explore 29 products in the Observability & Analytics category.
Sentrial
Production monitoring for AI agents with automated failure detection and diagnosis
Comet Opik
Comet provides an end-to-end model evaluation platform for AI developers.
Langfuse
Traces, evals, prompt management and metrics to debug and improve your LLM application.