salbatarni committed
Commit b41a2f8 · verified · 1 Parent(s): d516d07

End of training

Files changed (1)
  1. README.md +87 -82
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
  tags:
  - generated_from_trainer
  model-index:
- - name: arabert_cross_relevance_task7_fold1
+ - name: arabert_cross_relevance_task7_fold2
  results: []
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->
 
- # arabert_cross_relevance_task7_fold1
+ # arabert_cross_relevance_task7_fold2
 
  This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.3047
- - Qwk: 0.0332
- - Mse: 0.3047
+ - Loss: 0.3920
+ - Qwk: 0.0
+ - Mse: 0.3925
 
  ## Model description
 
@@ -45,83 +45,88 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
- |:-------------:|:------:|:----:|:---------------:|:------:|:------:|
- | No log | 0.1333 | 2 | 1.2830 | 0.0016 | 1.2830 |
- | No log | 0.2667 | 4 | 0.3134 | 0.0127 | 0.3134 |
- | No log | 0.4 | 6 | 0.1648 | 0.1104 | 0.1648 |
- | No log | 0.5333 | 8 | 0.1251 | 0.0517 | 0.1251 |
- | No log | 0.6667 | 10 | 0.2224 | 0.0017 | 0.2224 |
- | No log | 0.8 | 12 | 0.2590 | 0.0094 | 0.2590 |
- | No log | 0.9333 | 14 | 0.1796 | 0.0127 | 0.1796 |
- | No log | 1.0667 | 16 | 0.1308 | 0.0397 | 0.1308 |
- | No log | 1.2 | 18 | 0.1449 | 0.0303 | 0.1449 |
- | No log | 1.3333 | 20 | 0.1645 | 0.0288 | 0.1645 |
- | No log | 1.4667 | 22 | 0.1859 | 0.0270 | 0.1859 |
- | No log | 1.6 | 24 | 0.1923 | 0.0425 | 0.1923 |
- | No log | 1.7333 | 26 | 0.1918 | 0.0319 | 0.1918 |
- | No log | 1.8667 | 28 | 0.2199 | 0.0290 | 0.2199 |
- | No log | 2.0 | 30 | 0.2212 | 0.0273 | 0.2212 |
- | No log | 2.1333 | 32 | 0.1858 | 0.0273 | 0.1858 |
- | No log | 2.2667 | 34 | 0.1779 | 0.0270 | 0.1779 |
- | No log | 2.4 | 36 | 0.2133 | 0.0270 | 0.2133 |
- | No log | 2.5333 | 38 | 0.2467 | 0.0339 | 0.2467 |
- | No log | 2.6667 | 40 | 0.2211 | 0.0355 | 0.2211 |
- | No log | 2.8 | 42 | 0.1890 | 0.0355 | 0.1890 |
- | No log | 2.9333 | 44 | 0.2091 | 0.0270 | 0.2091 |
- | No log | 3.0667 | 46 | 0.2659 | 0.0254 | 0.2659 |
- | No log | 3.2 | 48 | 0.2479 | 0.0235 | 0.2479 |
- | No log | 3.3333 | 50 | 0.2076 | 0.0284 | 0.2076 |
- | No log | 3.4667 | 52 | 0.1978 | 0.0351 | 0.1978 |
- | No log | 3.6 | 54 | 0.2247 | 0.0334 | 0.2247 |
- | No log | 3.7333 | 56 | 0.2784 | 0.0319 | 0.2784 |
- | No log | 3.8667 | 58 | 0.2815 | 0.0217 | 0.2815 |
- | No log | 4.0 | 60 | 0.2597 | 0.0235 | 0.2597 |
- | No log | 4.1333 | 62 | 0.2093 | 0.0304 | 0.2093 |
- | No log | 4.2667 | 64 | 0.2137 | 0.0287 | 0.2137 |
- | No log | 4.4 | 66 | 0.2532 | 0.0235 | 0.2532 |
- | No log | 4.5333 | 68 | 0.2399 | 0.0251 | 0.2399 |
- | No log | 4.6667 | 70 | 0.2137 | 0.0338 | 0.2137 |
- | No log | 4.8 | 72 | 0.2516 | 0.0235 | 0.2516 |
- | No log | 4.9333 | 74 | 0.2786 | 0.0319 | 0.2786 |
- | No log | 5.0667 | 76 | 0.3017 | 0.0319 | 0.3017 |
- | No log | 5.2 | 78 | 0.2701 | 0.0235 | 0.2701 |
- | No log | 5.3333 | 80 | 0.2234 | 0.0301 | 0.2234 |
- | No log | 5.4667 | 82 | 0.2231 | 0.0301 | 0.2231 |
- | No log | 5.6 | 84 | 0.2419 | 0.0284 | 0.2419 |
- | No log | 5.7333 | 86 | 0.2839 | 0.0233 | 0.2839 |
- | No log | 5.8667 | 88 | 0.2964 | 0.0233 | 0.2964 |
- | No log | 6.0 | 90 | 0.2986 | 0.0214 | 0.2986 |
- | No log | 6.1333 | 92 | 0.2565 | 0.0317 | 0.2565 |
- | No log | 6.2667 | 94 | 0.2203 | 0.0334 | 0.2203 |
- | No log | 6.4 | 96 | 0.2553 | 0.0317 | 0.2553 |
- | No log | 6.5333 | 98 | 0.3609 | 0.0193 | 0.3609 |
- | No log | 6.6667 | 100 | 0.4216 | 0.0206 | 0.4216 |
- | No log | 6.8 | 102 | 0.3627 | 0.0225 | 0.3627 |
- | No log | 6.9333 | 104 | 0.2543 | 0.0317 | 0.2543 |
- | No log | 7.0667 | 106 | 0.2003 | 0.0312 | 0.2003 |
- | No log | 7.2 | 108 | 0.2014 | 0.0295 | 0.2014 |
- | No log | 7.3333 | 110 | 0.2374 | 0.0301 | 0.2374 |
- | No log | 7.4667 | 112 | 0.3176 | 0.0263 | 0.3176 |
- | No log | 7.6 | 114 | 0.3811 | 0.0237 | 0.3811 |
- | No log | 7.7333 | 116 | 0.3686 | 0.0285 | 0.3686 |
- | No log | 7.8667 | 118 | 0.3010 | 0.0260 | 0.3010 |
- | No log | 8.0 | 120 | 0.2467 | 0.0317 | 0.2467 |
- | No log | 8.1333 | 122 | 0.2371 | 0.0334 | 0.2371 |
- | No log | 8.2667 | 124 | 0.2613 | 0.0361 | 0.2613 |
- | No log | 8.4 | 126 | 0.2959 | 0.0310 | 0.2959 |
- | No log | 8.5333 | 128 | 0.3317 | 0.0225 | 0.3317 |
- | No log | 8.6667 | 130 | 0.3288 | 0.0240 | 0.3288 |
- | No log | 8.8 | 132 | 0.2998 | 0.0370 | 0.2998 |
- | No log | 8.9333 | 134 | 0.2797 | 0.0341 | 0.2797 |
- | No log | 9.0667 | 136 | 0.2625 | 0.0441 | 0.2625 |
- | No log | 9.2 | 138 | 0.2672 | 0.0421 | 0.2672 |
- | No log | 9.3333 | 140 | 0.2738 | 0.0421 | 0.2738 |
- | No log | 9.4667 | 142 | 0.2892 | 0.0401 | 0.2892 |
- | No log | 9.6 | 144 | 0.3017 | 0.0366 | 0.3017 |
- | No log | 9.7333 | 146 | 0.3079 | 0.0332 | 0.3079 |
- | No log | 9.8667 | 148 | 0.3065 | 0.0332 | 0.3065 |
- | No log | 10.0 | 150 | 0.3047 | 0.0332 | 0.3047 |
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
+ |:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|
+ | No log | 0.125 | 2 | 1.1351 | 0.0055 | 1.1338 |
+ | No log | 0.25 | 4 | 0.3327 | 0.0239 | 0.3327 |
+ | No log | 0.375 | 6 | 0.4966 | 0.0325 | 0.4972 |
+ | No log | 0.5 | 8 | 0.4252 | 0.0068 | 0.4257 |
+ | No log | 0.625 | 10 | 0.3269 | 0.0 | 0.3271 |
+ | No log | 0.75 | 12 | 0.3043 | 0.0 | 0.3043 |
+ | No log | 0.875 | 14 | 0.3478 | -0.0164 | 0.3479 |
+ | No log | 1.0 | 16 | 0.5150 | -0.1231 | 0.5155 |
+ | No log | 1.125 | 18 | 0.6025 | -0.0901 | 0.6032 |
+ | No log | 1.25 | 20 | 0.5883 | -0.1348 | 0.5891 |
+ | No log | 1.375 | 22 | 0.4750 | -0.0759 | 0.4757 |
+ | No log | 1.5 | 24 | 0.3601 | -0.0164 | 0.3606 |
+ | No log | 1.625 | 26 | 0.3455 | 0.0 | 0.3460 |
+ | No log | 1.75 | 28 | 0.3605 | 0.0 | 0.3611 |
+ | No log | 1.875 | 30 | 0.4021 | -0.0875 | 0.4027 |
+ | No log | 2.0 | 32 | 0.4215 | -0.0723 | 0.4221 |
+ | No log | 2.125 | 34 | 0.4273 | -0.1017 | 0.4280 |
+ | No log | 2.25 | 36 | 0.4182 | -0.0723 | 0.4188 |
+ | No log | 2.375 | 38 | 0.3673 | -0.0042 | 0.3677 |
+ | No log | 2.5 | 40 | 0.3484 | 0.0 | 0.3488 |
+ | No log | 2.625 | 42 | 0.3234 | 0.0 | 0.3237 |
+ | No log | 2.75 | 44 | 0.3167 | 0.0 | 0.3170 |
+ | No log | 2.875 | 46 | 0.3171 | 0.0 | 0.3173 |
+ | No log | 3.0 | 48 | 0.3402 | 0.0 | 0.3406 |
+ | No log | 3.125 | 50 | 0.3835 | -0.0462 | 0.3840 |
+ | No log | 3.25 | 52 | 0.3969 | -0.0631 | 0.3975 |
+ | No log | 3.375 | 54 | 0.3965 | -0.0631 | 0.3971 |
+ | No log | 3.5 | 56 | 0.3727 | -0.0085 | 0.3732 |
+ | No log | 3.625 | 58 | 0.3399 | 0.0122 | 0.3403 |
+ | No log | 3.75 | 60 | 0.3210 | 0.0122 | 0.3213 |
+ | No log | 3.875 | 62 | 0.3144 | 0.0122 | 0.3145 |
+ | No log | 4.0 | 64 | 0.3203 | 0.0122 | 0.3205 |
+ | No log | 4.125 | 66 | 0.3278 | 0.0 | 0.3281 |
+ | No log | 4.25 | 68 | 0.3425 | 0.0 | 0.3429 |
+ | No log | 4.375 | 70 | 0.3631 | -0.0085 | 0.3636 |
+ | No log | 4.5 | 72 | 0.3878 | -0.0631 | 0.3884 |
+ | No log | 4.625 | 74 | 0.3841 | -0.0631 | 0.3847 |
+ | No log | 4.75 | 76 | 0.3554 | -0.0164 | 0.3558 |
+ | No log | 4.875 | 78 | 0.3433 | 0.0 | 0.3437 |
+ | No log | 5.0 | 80 | 0.3407 | 0.0 | 0.3410 |
+ | No log | 5.125 | 82 | 0.3450 | 0.0 | 0.3453 |
+ | No log | 5.25 | 84 | 0.3491 | 0.0 | 0.3494 |
+ | No log | 5.375 | 86 | 0.3528 | 0.0 | 0.3532 |
+ | No log | 5.5 | 88 | 0.3559 | 0.0 | 0.3564 |
+ | No log | 5.625 | 90 | 0.3543 | 0.0 | 0.3547 |
+ | No log | 5.75 | 92 | 0.3489 | 0.0 | 0.3493 |
+ | No log | 5.875 | 94 | 0.3478 | 0.0 | 0.3480 |
+ | No log | 6.0 | 96 | 0.3484 | 0.0 | 0.3486 |
+ | No log | 6.125 | 98 | 0.3566 | 0.0 | 0.3568 |
+ | No log | 6.25 | 100 | 0.3544 | 0.0 | 0.3546 |
+ | No log | 6.375 | 102 | 0.3495 | 0.0 | 0.3499 |
+ | No log | 6.5 | 104 | 0.3529 | 0.0 | 0.3534 |
+ | No log | 6.625 | 106 | 0.3633 | 0.0122 | 0.3638 |
+ | No log | 6.75 | 108 | 0.3602 | 0.0122 | 0.3608 |
+ | No log | 6.875 | 110 | 0.3558 | 0.0 | 0.3563 |
+ | No log | 7.0 | 112 | 0.3542 | 0.0 | 0.3548 |
+ | No log | 7.125 | 114 | 0.3543 | 0.0122 | 0.3548 |
+ | No log | 7.25 | 116 | 0.3536 | 0.0122 | 0.3542 |
+ | No log | 7.375 | 118 | 0.3534 | 0.0122 | 0.3539 |
+ | No log | 7.5 | 120 | 0.3538 | 0.0 | 0.3543 |
+ | No log | 7.625 | 122 | 0.3553 | 0.0 | 0.3557 |
+ | No log | 7.75 | 124 | 0.3581 | 0.0 | 0.3585 |
+ | No log | 7.875 | 126 | 0.3601 | 0.0 | 0.3605 |
+ | No log | 8.0 | 128 | 0.3648 | 0.0 | 0.3652 |
+ | No log | 8.125 | 130 | 0.3658 | 0.0 | 0.3662 |
+ | No log | 8.25 | 132 | 0.3655 | 0.0 | 0.3659 |
+ | No log | 8.375 | 134 | 0.3666 | 0.0 | 0.3670 |
+ | No log | 8.5 | 136 | 0.3714 | 0.0 | 0.3718 |
+ | No log | 8.625 | 138 | 0.3763 | 0.0 | 0.3767 |
+ | No log | 8.75 | 140 | 0.3809 | 0.0 | 0.3813 |
+ | No log | 8.875 | 142 | 0.3871 | 0.0 | 0.3875 |
+ | No log | 9.0 | 144 | 0.3938 | 0.0 | 0.3941 |
+ | No log | 9.125 | 146 | 0.3971 | 0.0 | 0.3974 |
+ | No log | 9.25 | 148 | 0.3971 | 0.0 | 0.3975 |
+ | No log | 9.375 | 150 | 0.3950 | 0.0 | 0.3953 |
+ | No log | 9.5 | 152 | 0.3939 | 0.0 | 0.3943 |
+ | No log | 9.625 | 154 | 0.3921 | 0.0 | 0.3925 |
+ | No log | 9.75 | 156 | 0.3913 | 0.0 | 0.3918 |
+ | No log | 9.875 | 158 | 0.3917 | 0.0 | 0.3922 |
+ | No log | 10.0 | 160 | 0.3920 | 0.0 | 0.3925 |
 
 
  ### Framework versions
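
A note on the metrics in this card: Qwk is the quadratic weighted kappa and Mse the mean squared error on the evaluation set. Below is a minimal sketch of how these two numbers can be computed with scikit-learn; rounding continuous model outputs to integer bins for the kappa is an assumption, not something confirmed by this repository.

```python
# Hedged sketch: computing the Qwk and Mse metrics reported in this card.
# Assumption: continuous predictions are rounded to integer bins for kappa.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    # Mse compares raw regression outputs against the gold scores.
    mse = mean_squared_error(labels, preds)
    # Qwk requires discrete categories, so round both sides first.
    qwk = cohen_kappa_score(
        [round(p) for p in preds],
        [round(l) for l in labels],
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse}

print(compute_metrics([0.2, 0.9, 1.4], [0, 1, 1]))
```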
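And a hedged example of loading the checkpoint for scoring. The repo id is inferred from the committer and model name in this card and is unverified; likewise, the single-logit regression head is an assumption suggested by the Mse metric, not confirmed by the training code.

```python
# Hedged sketch: loading the fine-tuned checkpoint for inference.
# repo_id is inferred from the card (committer + model name) and unverified.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "salbatarni/arabert_cross_relevance_task7_fold2"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Score a sample Arabic sentence; assumes a single-logit regression head.
inputs = tokenizer("هذه جملة تجريبية", return_tensors="pt")
score = model(**inputs).logits.squeeze().item()
print(score)
```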