eaddario / Dolphin3.0-R1-Mistral-24B-GGUF

Tags: Text Generation · GGUF · eaddario/imatrix-calibration · English · quant · Inference Endpoints · conversational
License: apache-2.0

Files and versions

Branch: main · 1 contributor · History: 23 commits
Latest commit: Update README.md by eaddario (2345f28, verified) · 9 days ago

| Directory | Last commit                        | Age         |
|-----------|------------------------------------|-------------|
| imatrix/  | Generate Small imatrix             | 11 days ago |
| logits/   | Generate base model logits         | 11 days ago |
| scores/   | Generate perplexity and kld scores | 11 days ago |
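
The imatrix, logits and scores directories hold the importance matrix, base-model logits, and perplexity/KL-divergence scores named in the commit messages above. Artefacts like these are typically produced with llama.cpp tooling; the sketch below drives those command-line tools from Python under stated assumptions: the binary names and flags (llama-imatrix, llama-perplexity, --kl-divergence-base, --kl-divergence) reflect recent llama.cpp releases and may differ in older ones, and the calibration, test and output file names are placeholders rather than paths taken from this repository.

```python
# Sketch of the typical llama.cpp workflow behind the imatrix/, logits/ and
# scores/ directories. Tool names, flags and file paths are assumptions and
# may vary between llama.cpp versions.
import subprocess

MODEL_F16 = "Dolphin3.0-R1-Mistral-24B-F16.gguf"   # full-precision conversion (see file list below)
CALIBRATION = "calibration.txt"                     # placeholder: text drawn from eaddario/imatrix-calibration
TEST_TEXT = "wiki.test.raw"                         # placeholder evaluation corpus

# 1. Importance matrix used to guide the IQ/K-quant generation.
subprocess.run(
    ["llama-imatrix", "-m", MODEL_F16, "-f", CALIBRATION, "-o", "imatrix/small.imatrix"],
    check=True,
)

# 2. Base-model logits, saved so quantised models can later be compared against them.
subprocess.run(
    ["llama-perplexity", "-m", MODEL_F16, "-f", TEST_TEXT,
     "--kl-divergence-base", "logits/base.kld"],
    check=True,
)

# 3. Perplexity and KL-divergence scores for a quantised model versus the base logits.
subprocess.run(
    ["llama-perplexity", "-m", "Dolphin3.0-R1-Mistral-24B-Q4_K_M.gguf",
     "--kl-divergence-base", "logits/base.kld", "--kl-divergence"],
    check=True,
)
```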

| File                                    | Size    | LFS | Last commit                | Age         |
|-----------------------------------------|---------|-----|----------------------------|-------------|
| .gitattributes                          | 1.65 kB |     | Update .gitattributes      | 11 days ago |
| .gitignore                              | 6.78 kB |     | Update .gitignore          | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-F16.gguf      | 47.2 GB | LFS | Convert to GGUF @ F16      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-IQ3_M.gguf    | 10.7 GB | LFS | Generate IQ3_M quant       | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-IQ3_S.gguf    | 10.4 GB | LFS | Generate IQ3_S quant       | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-IQ4_NL.gguf   | 13.5 GB | LFS | Generate IQ4_NL quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q3_K_L.gguf   | 12.4 GB | LFS | Generate Q3_K_L quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q3_K_M.gguf   | 11.5 GB | LFS | Generate Q3_K_M quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q3_K_S.gguf   | 10.4 GB | LFS | Generate Q3_K_M quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q4_K_M.gguf   | 14.3 GB | LFS | Generate Q3_K_M quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q4_K_S.gguf   | 13.5 GB | LFS | Generate Q4_K_S quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q5_K_M.gguf   | 16.8 GB | LFS | Generate Q5_K_M quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q5_K_S.gguf   | 16.3 GB | LFS | Generate Q5_K_S quant      | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q6_K.gguf     | 19.3 GB | LFS | Generate Q6_K quant        | 11 days ago |
| Dolphin3.0-R1-Mistral-24B-Q8_0.gguf     | 25.1 GB | LFS | Generate Q8_0 quant        | 11 days ago |
| README.md                               | 10.9 kB |     | Update README.md           | 9 days ago  |
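
To use one of the quantised files locally, a common pattern is to download it with huggingface_hub and load it with a GGUF-capable runtime. The sketch below is illustrative rather than taken from the model card: it assumes the huggingface_hub and llama-cpp-python packages are installed, picks the Q4_K_M file from the table above, and uses placeholder generation settings; any other quant in the table works the same way.

```python
# Sketch: download one GGUF quant from this repository and run a chat completion
# locally. Assumes `pip install huggingface_hub llama-cpp-python`; the chosen
# quant, context size and prompt are illustrative only.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="eaddario/Dolphin3.0-R1-Mistral-24B-GGUF",
    filename="Dolphin3.0-R1-Mistral-24B-Q4_K_M.gguf",  # 14.3 GB, see the table above
)

llm = Llama(model_path=model_path, n_ctx=4096)  # context size is an assumption

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain in one sentence what a GGUF quant is."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

Lower-bit quants such as IQ3_M trade answer quality for memory; judging by the commit messages, the perplexity and KLD scores in the scores/ directory are how this repository tracks that trade-off against the F16 base model.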