BLOOM
Developed by BigScience (Hugging Face-led international collaboration)
Key Capabilities
- Trained on 46 natural languages, including many underrepresented ones
- One of the first truly multilingual large language models
- Support for 13 programming languages
- Trained on the ROOTS corpus, which is fully documented and auditable
- Strong for translation and cross-lingual tasks
VRAM Requirements by Quantization
Choose the right GPU based on your performance and quality needs.
| Model / Quantization | VRAM Required |
|---|---|
| 7.1B FP16 | 15GB |
| 176B FP16 | 352GB |
| 176B Q4 | 100GB |
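The figures above follow directly from parameter count times bytes per weight: FP16 stores 2 bytes per parameter, Q4 half a byte. A minimal estimator (a sketch; the optional overhead factor for activations and KV cache is an assumption, and real usage varies with batch size and context length):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 0.0) -> float:
    """Estimate VRAM needed to hold model weights.

    params_billion: parameter count in billions (e.g. 176 for BLOOM-176B)
    bits_per_weight: 16 for FP16, 4 for Q4 quantization
    overhead: fractional headroom for activations/KV cache (assumption)
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total * (1 + overhead) / 1e9  # convert bytes to GB

# BLOOM-176B in FP16: 176e9 params x 2 bytes = 352 GB, matching the table
print(estimate_vram_gb(176, 16))  # 352.0
# BLOOM-176B in Q4: 88 GB for weights alone; ~100 GB once runtime
# overhead is included, which is where the table's figure comes from
print(estimate_vram_gb(176, 4))   # 88.0
```

The 7.1B FP16 row works the same way: 7.1e9 x 2 bytes is about 14.2 GB, which the table rounds up to 15GB for headroom.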
Use Cases
BLOOM (560M, 1.1B, 1.7B, 3B, 7.1B, 176B) can be deployed for enterprise AI applications including document processing, code generation, data analysis, and conversational AI. License: BigScience RAIL License (responsible AI, commercial use allowed with restrictions).
Run BLOOM with Petronella
PTG deploys BLOOM for organizations needing AI in underrepresented languages (African, Southeast Asian, etc.). Among major open models, BLOOM stands out for combining 46-language support with a fully auditable training dataset.
Recommended Hardware
| Model Size | Recommended GPU |
|---|---|
| 7.1B | RTX 5080 (16GB) |
| 176B FP16 | DGX Station GB300 (384GB) or 4x RTX PRO 6000 (384GB) |
| 176B Q4 | DGX Spark (128GB) |
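The pairings above can be captured in a simple lookup; `recommend_gpu` is a hypothetical helper that mirrors this table and is not part of any BLOOM or vendor tooling:

```python
# Hypothetical lookup mirroring the hardware table above (illustration only).
RECOMMENDED_GPUS = {
    ("7.1B", "FP16"): "RTX 5080 (16GB)",
    ("176B", "FP16"): "DGX Station GB300 (384GB) or 4x RTX PRO 6000 (384GB)",
    ("176B", "Q4"): "DGX Spark (128GB)",
}

def recommend_gpu(model_size: str, quantization: str) -> str:
    """Return the recommended GPU for a BLOOM size/quantization pair."""
    try:
        return RECOMMENDED_GPUS[(model_size, quantization)]
    except KeyError:
        raise ValueError(
            f"No recommendation for {model_size} at {quantization}"
        )

print(recommend_gpu("176B", "Q4"))  # DGX Spark (128GB)
```

The key point the table encodes: quantizing 176B to Q4 drops the requirement from multi-GPU territory (352GB) to a single 128GB system.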
Deploy BLOOM On-Premises
Our team builds GPU-accelerated systems configured and optimized for BLOOM. Private, secure, and fully under your control.