---
language:
- en
- fr
- de
- es
- it
- pt
- zh
- ja
- ru
- ko
base_model: mistralai/Mistral-Small-24B-Base-2501
license: apache-2.0
library_name: vllm
inference: false
extra_gated_description: >-
  If you want to learn more about how we process your personal data, please read
  our <a href="https://mistral.ai/terms/">Privacy Policy</a>.
tags:
- transformers
- exl2
---

# L3.3-Nevoria-R1-70b - EXL2 8.0bpw
This is an 8.0bpw EXL2 quant of mistralai/Mistral-Small-24B-Base-2501.

Details about the model can be found on the base model page linked above.
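As a rough usage sketch (not part of the original card), the quant can be loaded with the exllamav2 Python API once the repository has been downloaded locally. The model path below is a placeholder, and the sampler settings are arbitrary:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Placeholder path to a local copy of this EXL2 quant
config = ExLlamaV2Config()
config.model_dir = "/path/to/exl2-8.0bpw"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)          # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

# Generate a short completion from a simple prompt
print(generator.generate_simple("The quick brown fox", settings, 64))
```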
## Perplexity Scoring
Below are the perplexity scores for the EXL2 models. A lower score is better.
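Perplexity is the exponential of the mean per-token negative log-likelihood over an evaluation text, so lower values mean the model assigns higher probability to the reference data. As an illustrative sketch only (this is not necessarily the tooling used to produce the scores here, and the evaluation text and model id are placeholders), the metric can be computed with transformers using a sliding window:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: full-precision reference model and evaluation text
model_id = "mistralai/Mistral-Small-24B-Base-2501"
eval_text = "..."  # evaluation corpus goes here

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

ids = tokenizer(eval_text, return_tensors="pt").input_ids.to(model.device)

# Score the text in overlapping chunks and average the losses
stride, max_len = 512, 2048
nlls = []
for start in range(0, ids.size(1) - 1, stride):
    chunk = ids[:, start : start + max_len]
    with torch.no_grad():
        out = model(chunk, labels=chunk)  # loss = mean negative log-likelihood
    nlls.append(out.loss)

ppl = torch.exp(torch.stack(nlls).mean())
print(f"perplexity: {ppl.item():.3f}")
```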