---
metrics:
- accuracy
base_model:
- unsloth/Qwen2.5-0.5B-Instruct
license: apache-2.0
datasets:
- openai/gsm8k
language:
- en
pipeline_tag: text-generation
library_name: transformers
---

Trained for 100 epochs; based on the trend so far, a full run of 400 epochs has the potential to reach roughly 47–48% accuracy on GSM8K.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/665c8c420a6a8196763a53f8/0X3Egp3-o6VTk0Kx9N-gY.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/665c8c420a6a8196763a53f8/rag6xErI0YB51jR2mtj6i.png)
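## Usage

A minimal inference sketch with `transformers`, assuming the model follows the Qwen2.5-Instruct chat template inherited from the base model; `<this-repo-id>` is a placeholder for this model's Hub path.

```python
# Sketch only: replace "<this-repo-id>" with this model's actual Hugging Face repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<this-repo-id>"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A GSM8K-style word problem formatted with the chat template.
messages = [
    {"role": "user", "content": "Natalia sold clips to 48 of her friends in April, "
                                "and then she sold half as many clips in May. "
                                "How many clips did Natalia sell altogether?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=256)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```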