DistillFlow is an open-source toolkit designed to simplify and scale the distillation of large language models (LLMs) into smaller, more efficient models. It provides a flexible pipeline for distillation, fine-tuning, and experimentation across multiple GPUs, with support for dynamic resource allocation and easy integration of custom techniques.
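To give a sense of what LLM distillation involves, here is a minimal sketch of a single distillation step using PyTorch and a Hugging Face-style model interface. This is an illustration of the underlying technique, not DistillFlow's actual API; the function names and the choice of a temperature-scaled KL-divergence loss are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target KL loss: pushes the student's output distribution toward the teacher's."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature**2

def distill_step(teacher, student, optimizer, input_ids, attention_mask):
    """One training step: the teacher runs frozen, the student is updated."""
    # Hypothetical helper for illustration; DistillFlow's pipeline wraps steps like this.
    with torch.no_grad():
        teacher_logits = teacher(input_ids=input_ids, attention_mask=attention_mask).logits
    student_logits = student(input_ids=input_ids, attention_mask=attention_mask).logits
    loss = distillation_loss(student_logits, teacher_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A toolkit like DistillFlow builds on this core loop by handling the surrounding concerns, such as batching data across multiple GPUs, scheduling teacher and student placement, and letting users swap in custom loss functions or fine-tuning strategies.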