discoLM-mixtral-8x7b-v2

DiscoLM Mixtral 8x7b alpha is an experimental 8x7b Mixture-of-Experts (MoE) language model, based on Mistral AI's Mixtral 8x7b architecture and fine-tuned on a diverse set of datasets.