rwmasood committed
Commit 4e0123b · verified · 1 Parent(s): 6cf3229

Update README.md

Files changed (1)
  1. README.md +23 -1
README.md CHANGED
@@ -18,7 +18,7 @@ This document presents the evaluation results of `Llama-3.1-8B-Instruct-gptq-4bit`

  ## 📊 Evaluation Summary

- | **Metric** | **Value** | **Description** | **Llama-3.1-8B-Instruct** |
+ | **Metric** | **Value** | **Description** | **[original](https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct)** |
  |----------------------|-----------|-----------------|-----------|
  | **Accuracy (acc,none)** | `47.1%` | Raw accuracy - percentage of correct answers. | `53.1%` |
  | **Standard Error (acc_stderr,none)** | `1.46%` | Uncertainty in the accuracy estimate. | `1.45%` |
@@ -73,4 +73,26 @@ This document presents the evaluation results of `Llama-3.1-8B-Instruct-gptq-4bit`

  ---

+
+ ## **Citation**
+ If you use this model in your research or project, please cite it as follows:
+
+ 📌 **Dr. Wasif Masood** (2024). *4-bit Llama-3.1-8B-Instruct*. Version 1.0.
+ Available at: [https://huggingface.co/empirischtech/Meta-Llama-3.1-8B-Instruct-gptq-4bit](https://huggingface.co/empirischtech/Meta-Llama-3.1-8B-Instruct-gptq-4bit)
+
+ ### **BibTeX:**
+ ```bibtex
+ @dataset{rwmasood2024,
+   author      = {Dr. Wasif Masood and Empirisch Tech GmbH},
+   title       = {Llama-3.1-8B-Instruct 4-bit quantized},
+   year        = {2024},
+   publisher   = {Hugging Face},
+   url         = {https://huggingface.co/empirischtech/Meta-Llama-3.1-8B-Instruct-gptq-4bit},
+   version     = {1.0},
+   license     = {llama3.1},
+   institution = {Empirisch Tech GmbH}
+ }
+ ```
+
+
  📌 Let us know if you need further analysis or model tuning! 🚀
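The metric names in the updated table (`acc,none`, `acc_stderr,none`) follow the lm-evaluation-harness reporting convention. Below is a minimal sketch of how the quantized checkpoint and the original model could be compared with that harness; the task name, batch size, and harness version are assumptions, since the commit does not record the exact evaluation setup.

```python
# Hedged sketch: compare the GPTQ 4-bit checkpoint against the original
# Llama-3.1-8B-Instruct with lm-evaluation-harness (>= 0.4 assumed).
# "arc_challenge" is a placeholder task, not necessarily the benchmark
# behind the 47.1% / 53.1% figures in the README table.
import lm_eval

MODELS = [
    "empirischtech/Meta-Llama-3.1-8B-Instruct-gptq-4bit",
    "meta-llama/Llama-3.1-8B-Instruct",
]

for model_id in MODELS:
    results = lm_eval.simple_evaluate(
        model="hf",                           # Hugging Face causal-LM backend
        model_args=f"pretrained={model_id}",  # GPTQ repos load via the checkpoint's quantization config
        tasks=["arc_challenge"],              # placeholder task (assumption)
        batch_size=8,
    )
    # Each task entry reports "acc,none" and "acc_stderr,none",
    # matching the column names used in the evaluation summary.
    print(model_id, results["results"]["arc_challenge"])
```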