Cybertron 7B v2 is a MistralAI-based large language model (LLM) excelling in mathematics, logic, and reasoning. It consistently ranks #1 in its category on the Hugging Face Leaderboard, enhanced by the innovative Unified Neural Alignment (UNA) technique.
The maximum number of tokens to generate. Lower limits return responses faster.
A decimal number that controls the degree of randomness in the response; higher values produce more varied output.
An alternative to sampling with temperature (nucleus sampling), where the model considers only the tokens that make up the top_p cumulative probability mass.
The top_k parameter limits sampling to the top_k most probable choices for the next token.
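The sampling parameters above interact in a fixed order: temperature reshapes the distribution, then top_k and top_p filter the candidate set before one token is drawn. A minimal Python sketch of that pipeline (illustrative only; function and variable names are assumptions, not the model's actual implementation):

```python
import math
import random

def sample_next_token(logits, temperature=0.8, top_k=50, top_p=0.95):
    """Illustrative sketch of temperature, top_k, and top_p sampling."""
    # Temperature scales the logits: lower values sharpen the distribution.
    scaled = [l / temperature for l in logits]
    # Softmax (shifted by the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # top_k: keep only the k most probable tokens.
    probs.sort(key=lambda t: t[1], reverse=True)
    probs = probs[:top_k]
    # top_p (nucleus): keep the smallest prefix whose cumulative mass >= top_p.
    kept, cumulative = [], 0.0
    for idx, p in probs:
        kept.append((idx, p))
        cumulative += p
        if cumulative >= top_p:
            break
    # Renormalize over the surviving tokens and draw one.
    mass = sum(p for _, p in kept)
    r = random.random() * mass
    for idx, p in kept:
        r -= p
        if r <= 0:
            return idx
    return kept[-1][0]
```

Setting top_k=1 (or a very low temperature) makes the draw effectively greedy, always returning the most probable token.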
Notes
ID
Model Type ID: Text To Text
Input Type: text
Output Type: text
Description: Cybertron 7B v2 is a MistralAI-based large language model (LLM) excelling in mathematics, logic, and reasoning. It consistently ranks #1 in its category on the Hugging Face Leaderboard, enhanced by the innovative Unified Neural Alignment (UNA) technique.