Advanced mixture-of-experts model with state-of-the-art performance
Choose the hosting option that best fits your needs:
Mixtral 8x22B is a state-of-the-art open-source large language model built on a sparse mixture-of-experts (8x22B MoE) architecture, with 141 billion total parameters of which roughly 39 billion are active per token. It offers the following technical capabilities:
Our hosting service provides optimized configurations to get the most out of Mixtral 8x22B, with technical experts available to help you integrate it into your applications.
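As a rough sketch of what integration could look like, the snippet below builds a chat-completions request payload of the kind commonly used by hosted LLM endpoints. The endpoint URL and model identifier are placeholders invented for illustration, not the actual service's values:

```python
import json

# Hypothetical endpoint -- a placeholder, not the real hosting service's URL.
API_URL = "https://example-host.invalid/v1/chat/completions"

# A standard chat-completions request body for a Mixtral 8x22B deployment.
# The model identifier is an assumption; check your provider's docs for the real one.
payload = {
    "model": "mixtral-8x22b",
    "messages": [
        {"role": "user", "content": "Summarize mixture-of-experts routing in one sentence."}
    ],
    "max_tokens": 128,
    "temperature": 0.7,
}

body = json.dumps(payload)
# A real client would POST `body` to API_URL with an Authorization header, e.g.:
#   requests.post(API_URL, data=body, headers={"Authorization": f"Bearer {key}"})
print(body)
```

The exact parameter names (`max_tokens`, `temperature`) follow the widely used OpenAI-compatible request format; providers that expose a different API will differ in detail.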