mixtral-8x7B-Instruct-v0_1

Mixtral 8x7B is a high-quality Sparse Mixture-of-Experts (SMoE) large language model (LLM), excelling in efficiency, multilingual support, and competitive performance across diverse benchmarks.
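
Since the description centers on the SMoE design, here is a minimal, illustrative PyTorch sketch of how sparse top-k expert routing works (the class name `SparseMoE`, the tiny expert MLPs, and all dimensions are hypothetical, not Mixtral's actual implementation; Mixtral itself routes each token to 2 of 8 experts per layer):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Illustrative sparse Mixture-of-Experts layer: a router picks the
    top-k experts per token, and only those experts run on that token,
    so compute stays low while total parameter count is large."""
    def __init__(self, dim=64, n_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )
        self.router = nn.Linear(dim, n_experts)  # gating network
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, dim)
        logits = self.router(x)                         # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)            # renormalize over chosen experts
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)  # tokens routed to expert e
            if token_ids.numel() == 0:
                continue
            # weighted contribution of expert e to each routed token
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(x[token_ids])
        return out

x = torch.randn(10, 64)      # 10 tokens of width 64
print(SparseMoE()(x).shape)  # torch.Size([10, 64])
```

With top-2 routing over 8 experts, each token activates only a quarter of the expert parameters per layer, which is what lets an SMoE model of this kind combine a large total parameter count with inference cost closer to a much smaller dense model.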