discoLM-mixtral-8x7b-v2
Type: Text To Text | Stars: 1

DiscoLM Mixtral 8x7b alpha is an experimental 8x7b MoE (mixture-of-experts) language model based on Mistral AI's Mixtral 8x7b architecture, fine-tuned on diverse datasets.
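As a point of reference, here is a minimal sketch of how a Mixtral-based checkpoint like this might be loaded and queried with the Hugging Face transformers library. The repository ID DiscoResearch/DiscoLM-mixtral-8x7b-v2 is an assumption inferred from the model name above, not confirmed by this listing, and device_map="auto" requires the accelerate package.

```python
# Minimal sketch, assuming the checkpoint is published on Hugging Face as
# DiscoResearch/DiscoLM-mixtral-8x7b-v2 (inferred from the model name above).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DiscoResearch/DiscoLM-mixtral-8x7b-v2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # shard across available devices (requires accelerate)
)

prompt = "Explain mixture-of-experts language models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```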