---
language:
- en
license: mit
base_model: deepseek-ai/DeepSeek-R1-Distill-Qwen-32B
base_model_relation: quantized
library_name: mlc-llm
pipeline_tag: text-generation
---
4-bit GPTQ quantized version of [DeepSeek-R1-Distill-Qwen-32B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B) for inference with the [Private LLM](http://privatellm.app) app.
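
The Private LLM app downloads and runs these weights on device, so no code is needed for that workflow. For reference only, below is a minimal sketch of text generation through the mlc-llm Python engine. The `HF://mlc-ai/...` model identifier is an assumed MLC-format 4-bit build of the base model, not this repository, since the weights here are packaged specifically for Private LLM.

```python
from mlc_llm import MLCEngine

# Assumed model identifier: an MLC-format q4f16_1 build of the base model.
# This repo's weights are packaged for the Private LLM app, not for MLCEngine.
model = "HF://mlc-ai/DeepSeek-R1-Distill-Qwen-32B-q4f16_1-MLC"

engine = MLCEngine(model)

# Stream a chat completion from the 4-bit quantized model.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Explain 4-bit quantization in one sentence."}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content or "", end="", flush=True)
print()

engine.terminate()
```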