---
license: apache-2.0
base_model: deepseek-ai/DeepSeek-R1-Distill-Qwen-32B
---

Quantized from [deepseek-ai/DeepSeek-R1-Distill-Qwen-32B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B) down to 4 bits, GEMM.
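
Below is a minimal usage sketch, assuming the 4-bit GEMM weights load through Transformers' standard AWQ integration (requires `autoawq` to be installed); `model_id` is a placeholder for this repository's ID.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this repository's model ID.
model_id = "your-namespace/DeepSeek-R1-Distill-Qwen-32B-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Quantized 4-bit weights are dispatched across available GPUs.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain the advantages of 4-bit weight quantization."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```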