Whisprell-CoT
Whisprell-DeepSeek-R1-Enhanced-1.5B is a Chain-of-Thought (CoT) reasoning-focused model developed by NexThinkLabs. It is based on DeepSeek-R1-Distill-Qwen-1.5B and has been further fine-tuned to enhance reasoning capabilities while maintaining computational efficiency.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("NexThinkLabsAI/Whisprell-DeepSeek-R1-Enhanced-1.5B")
tokenizer = AutoTokenizer.from_pretrained("NexThinkLabsAI/Whisprell-DeepSeek-R1-Enhanced-1.5B")
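DeepSeek-R1 distills emit their chain of thought inside `<think>…</think>` tags before the final answer, so it is often useful to separate the reasoning trace from the answer after generation. The sketch below is illustrative, not part of the model card: the `split_cot` and `answer` helpers and the generation settings are assumptions, and the chat-template call assumes the base model's standard template.

```python
import re

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NexThinkLabsAI/Whisprell-DeepSeek-R1-Enhanced-1.5B"


def split_cot(text: str) -> tuple[str, str]:
    """Split an R1-style completion into (reasoning, answer).

    R1-distill models wrap their chain of thought in <think>...</think>;
    everything after the closing tag is treated as the final answer.
    """
    m = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if m is None:
        # No reasoning tags found: return the whole text as the answer.
        return "", text.strip()
    return m.group(1).strip(), text[m.end():].strip()


def answer(question: str, max_new_tokens: int = 512) -> tuple[str, str]:
    """Generate a completion and return (reasoning, answer).

    Note: downloads the 1.5B checkpoint on first use (hypothetical helper).
    """
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    messages = [{"role": "user", "content": question}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    completion = tokenizer.decode(
        outputs[0][inputs.shape[-1]:], skip_special_tokens=True
    )
    return split_cot(completion)
```

`split_cot` works on any R1-style output string, so it can also be applied to completions produced by other serving stacks.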
This model is under Personal Proprietary License. The base model (deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B) is under MIT License.
We thank DeepSeek AI for their DeepSeek-R1-Distill-Qwen-1.5B model which served as the foundation for this work.
For questions and support, please:
Base model: NexThinkLabsAI/Whisprell-R1-Distill-1.5B