
torchtune Alternatives

PyTorch-native library for fine-tuning LLMs on consumer and enterprise GPUs

torchtune is a PyTorch-native fine-tuning library from Meta. It supports full fine-tuning, LoRA, and QLoRA, with memory-efficient training recipes that can run on a single consumer GPU with 24 GB of VRAM.
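As a rough sketch, a QLoRA fine-tune with torchtune is driven from its `tune` CLI. The recipe and config names below follow torchtune's documented examples but vary by release, so treat them as illustrative and run `tune ls` to see what ships with your install:

```shell
# Illustrative torchtune workflow (recipe/config names vary by version;
# run `tune ls` to list the recipes and configs in your install).

# 1. Download model weights from the Hugging Face Hub
tune download meta-llama/Meta-Llama-3.1-8B-Instruct \
  --output-dir /tmp/Meta-Llama-3.1-8B-Instruct

# 2. QLoRA fine-tune on a single consumer GPU (fits in ~24 GB of VRAM)
tune run lora_finetune_single_device \
  --config llama3_1/8B_qlora_single_device
```

Configs are plain YAML, so hyperparameters can also be overridden inline (e.g. appending `batch_size=2` to the `tune run` command) rather than editing the config file.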

Explore 20 alternatives to torchtune in 1 category. Each tool listed below shares at least one category with torchtune.

Top torchtune alternatives at a glance

  1. Amazon Bedrock. Managed API access to foundation models on AWS with built-in fine-tuning and agent tooling
  2. Anyscale. Fast, cost-efficient, serverless APIs for LLM Serving and Fine Tuning
  3. Axolotl. Open-source toolkit for fine-tuning LLMs with a single YAML config across the full training pipeline
  4. fal. Lightning-fast inference for building the next generation of creative applications
  5. FinetuneDB. Capture production data, evaluate outputs collaboratively, and fine-tune your LLM's performance

🧠 Fine-tuning
