Mosaic's LLM Foundry
bCloud LLC
Version 0.22.0 + Free with Support on Ubuntu 24.04
Mosaic’s LLM Foundry is an open-source framework for training, fine-tuning, and deploying large language models (LLMs) efficiently. It provides scalable, optimized pipelines for every stage of the LLM lifecycle—from data preparation and model training to evaluation and inference—while integrating with modern acceleration tools like FlashAttention and distributed training strategies.
Features of Mosaic’s LLM Foundry:
- Supports pre-training, fine-tuning, and inference of large language models with minimal configuration.
- Integrates with advanced performance optimizations such as FlashAttention, FSDP, and tensor parallelism.
- Compatible with Hugging Face tokenizers and model architectures.
- Provides ready-to-use scripts and YAML configs for reproducible experiments.
- Designed for both single-GPU setups and multi-node distributed clusters.
- Open-source and actively maintained, making it suitable for both research and production.
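Experiments in LLM Foundry are driven by YAML configuration files. The fragment below is a hypothetical minimal sketch of what such a config can look like; the exact keys and values vary by task and model, so treat it as illustrative rather than a ready-to-run recipe and consult the official example configs:

```yaml
# Illustrative sketch of an LLM Foundry-style training config (not an official example)
max_seq_len: 2048

model:
  name: mpt_causal_lm        # assumed model name; check the docs for valid values
  d_model: 768

tokenizer:
  name: EleutherAI/gpt-neox-20b   # any Hugging Face tokenizer identifier

train_loader:
  name: text

optimizer:
  name: decoupled_adamw
  lr: 6.0e-4

max_duration: 1ep            # train for one epoch
global_train_batch_size: 256
```

Keeping the full experiment definition in a single YAML file like this is what makes runs reproducible: the same config, data, and seed should yield the same training behavior.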
To check the installed version of Mosaic's LLM Foundry, run these commands in your environment (this assumes the framework is installed in a Python virtual environment named llm-foundry):
$ sudo su
$ apt update
$ source llm-foundry/bin/activate
$ python -c "import llmfoundry; print(f'LLM Foundry version: {llmfoundry.__version__}')"
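If the one-line import above fails, a small helper can report the version without raising an exception. This is a generic sketch; the import name `llmfoundry` is taken from the command above, and the helper simply returns None when the package is missing or exposes no `__version__` attribute:

```python
import importlib


def get_version(module_name):
    """Return a module's __version__ string, or None if the module is
    not installed or does not define __version__."""
    try:
        mod = importlib.import_module(module_name)
    except ImportError:
        return None
    return getattr(mod, "__version__", None)


# Prints the installed LLM Foundry version, or None if it is absent.
print(get_version("llmfoundry"))
```

This pattern is handy in setup scripts where a missing optional dependency should be reported rather than crash the script.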
Disclaimer: Mosaic’s LLM Foundry is released under the Apache 2.0 License and is maintained by the MosaicML community. Users are responsible for ensuring correct usage in their specific applications. The developers do not take responsibility for any consequences arising from its use. Always refer to the official documentation for the most accurate and up-to-date information.