salbatarni committed on
Commit 6b855b1 · verified · 1 Parent(s): a69dc59

End of training

Files changed (1)
  1. README.md +92 -87
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
  tags:
  - generated_from_trainer
  model-index:
- - name: arabert_cross_organization_task1_fold1
+ - name: arabert_cross_organization_task1_fold2
  results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # arabert_cross_organization_task1_fold1
+ # arabert_cross_organization_task1_fold2

  This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.9947
- - Qwk: 0.0679
- - Mse: 0.9916
+ - Loss: 1.1732
+ - Qwk: 0.1337
+ - Mse: 1.1732

  ## Model description
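Both folds report the same three evaluation metrics (Loss, Qwk, Mse), so the checkpoint is presumably used as a single-output regressor whose score is compared against gold labels. A minimal usage sketch, assuming the Hub repo id `salbatarni/arabert_cross_organization_task1_fold2` and a one-logit regression head (neither is stated explicitly in this card):

```python
# Minimal usage sketch -- the repo id and the single-logit regression head are
# assumptions inferred from this card, not stated in it.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/arabert_cross_organization_task1_fold2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

text = "..."  # an Arabic text to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumed single regression output
print(score)
```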
 
@@ -45,88 +45,93 @@ The following hyperparameters were used during training:

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
- |:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|
- | No log | 0.125 | 2 | 5.2049 | -0.0008 | 5.2020 |
- | No log | 0.25 | 4 | 1.9284 | -0.0122 | 1.9256 |
- | No log | 0.375 | 6 | 1.0209 | 0.0513 | 1.0167 |
- | No log | 0.5 | 8 | 0.8405 | 0.0824 | 0.8373 |
- | No log | 0.625 | 10 | 0.8656 | 0.1060 | 0.8628 |
- | No log | 0.75 | 12 | 0.8332 | 0.1240 | 0.8306 |
- | No log | 0.875 | 14 | 0.8775 | 0.0730 | 0.8754 |
- | No log | 1.0 | 16 | 0.9183 | 0.0104 | 0.9162 |
- | No log | 1.125 | 18 | 0.9057 | 0.0508 | 0.9032 |
- | No log | 1.25 | 20 | 0.8886 | 0.1273 | 0.8859 |
- | No log | 1.375 | 22 | 0.9957 | 0.0849 | 0.9930 |
- | No log | 1.5 | 24 | 1.0595 | 0.1181 | 1.0564 |
- | No log | 1.625 | 26 | 1.2289 | 0.0182 | 1.2258 |
- | No log | 1.75 | 28 | 1.2976 | 0.0182 | 1.2948 |
- | No log | 1.875 | 30 | 0.9648 | 0.1402 | 0.9617 |
- | No log | 2.0 | 32 | 0.9714 | 0.1016 | 0.9684 |
- | No log | 2.125 | 34 | 0.9511 | 0.0710 | 0.9483 |
- | No log | 2.25 | 36 | 0.8591 | 0.1425 | 0.8564 |
- | No log | 2.375 | 38 | 0.8696 | 0.1182 | 0.8667 |
- | No log | 2.5 | 40 | 1.0662 | 0.0360 | 1.0635 |
- | No log | 2.625 | 42 | 1.1724 | 0.0360 | 1.1696 |
- | No log | 2.75 | 44 | 1.3100 | 0.0182 | 1.3071 |
- | No log | 2.875 | 46 | 1.3304 | 0.0182 | 1.3275 |
- | No log | 3.0 | 48 | 1.0676 | 0.0424 | 1.0645 |
- | No log | 3.125 | 50 | 0.9732 | 0.0668 | 0.9701 |
- | No log | 3.25 | 52 | 1.1173 | 0.0279 | 1.1143 |
- | No log | 3.375 | 54 | 1.2420 | 0.0182 | 1.2393 |
- | No log | 3.5 | 56 | 1.1410 | 0.0155 | 1.1382 |
- | No log | 3.625 | 58 | 0.9316 | 0.0268 | 0.9285 |
- | No log | 3.75 | 60 | 0.8907 | 0.1122 | 0.8876 |
- | No log | 3.875 | 62 | 1.0183 | 0.0253 | 1.0153 |
- | No log | 4.0 | 64 | 1.1271 | 0.0279 | 1.1242 |
- | No log | 4.125 | 66 | 1.1742 | 0.0300 | 1.1712 |
- | No log | 4.25 | 68 | 1.2066 | 0.0682 | 1.2034 |
- | No log | 4.375 | 70 | 1.2604 | 0.0377 | 1.2572 |
- | No log | 4.5 | 72 | 1.1679 | 0.0830 | 1.1646 |
- | No log | 4.625 | 74 | 1.1770 | 0.0966 | 1.1739 |
- | No log | 4.75 | 76 | 1.1163 | 0.0966 | 1.1131 |
- | No log | 4.875 | 78 | 0.9754 | 0.0695 | 0.9721 |
- | No log | 5.0 | 80 | 0.9489 | 0.0767 | 0.9456 |
- | No log | 5.125 | 82 | 0.9900 | 0.0994 | 0.9868 |
- | No log | 5.25 | 84 | 0.8622 | 0.0654 | 0.8588 |
- | No log | 5.375 | 86 | 0.8621 | 0.1028 | 0.8586 |
- | No log | 5.5 | 88 | 1.0043 | 0.0807 | 1.0011 |
- | No log | 5.625 | 90 | 1.0565 | 0.0448 | 1.0533 |
- | No log | 5.75 | 92 | 0.9899 | 0.0848 | 0.9866 |
- | No log | 5.875 | 94 | 1.1141 | 0.0466 | 1.1111 |
- | No log | 6.0 | 96 | 1.3040 | 0.0906 | 1.3012 |
- | No log | 6.125 | 98 | 1.2856 | 0.1112 | 1.2829 |
- | No log | 6.25 | 100 | 1.3671 | 0.0962 | 1.3644 |
- | No log | 6.375 | 102 | 1.2601 | 0.1091 | 1.2574 |
- | No log | 6.5 | 104 | 1.2039 | 0.1595 | 1.2011 |
- | No log | 6.625 | 106 | 1.1272 | 0.0913 | 1.1244 |
- | No log | 6.75 | 108 | 1.0754 | 0.0958 | 1.0725 |
- | No log | 6.875 | 110 | 1.0818 | 0.0777 | 1.0790 |
- | No log | 7.0 | 112 | 1.0175 | 0.0670 | 1.0146 |
- | No log | 7.125 | 114 | 0.9552 | 0.0569 | 0.9521 |
- | No log | 7.25 | 116 | 0.8938 | 0.1278 | 0.8906 |
- | No log | 7.375 | 118 | 0.9486 | 0.0697 | 0.9455 |
- | No log | 7.5 | 120 | 0.9351 | 0.0773 | 0.9319 |
- | No log | 7.625 | 122 | 0.8928 | 0.0870 | 0.8895 |
- | No log | 7.75 | 124 | 0.8558 | 0.1373 | 0.8524 |
- | No log | 7.875 | 126 | 0.8561 | 0.1606 | 0.8527 |
- | No log | 8.0 | 128 | 0.9205 | 0.0389 | 0.9174 |
- | No log | 8.125 | 130 | 1.0514 | 0.0941 | 1.0484 |
- | No log | 8.25 | 132 | 1.0795 | 0.1246 | 1.0765 |
- | No log | 8.375 | 134 | 1.0151 | 0.0977 | 1.0120 |
- | No log | 8.5 | 136 | 0.9815 | 0.0716 | 0.9784 |
- | No log | 8.625 | 138 | 0.9817 | 0.0668 | 0.9786 |
- | No log | 8.75 | 140 | 0.9721 | 0.0597 | 0.9690 |
- | No log | 8.875 | 142 | 0.9865 | 0.0668 | 0.9834 |
- | No log | 9.0 | 144 | 0.9956 | 0.0716 | 0.9925 |
- | No log | 9.125 | 146 | 0.9824 | 0.0807 | 0.9793 |
- | No log | 9.25 | 148 | 0.9599 | 0.0721 | 0.9568 |
- | No log | 9.375 | 150 | 0.9488 | 0.0858 | 0.9456 |
- | No log | 9.5 | 152 | 0.9443 | 0.0858 | 0.9411 |
- | No log | 9.625 | 154 | 0.9603 | 0.0721 | 0.9572 |
- | No log | 9.75 | 156 | 0.9767 | 0.0770 | 0.9735 |
- | No log | 9.875 | 158 | 0.9906 | 0.0679 | 0.9874 |
- | No log | 10.0 | 160 | 0.9947 | 0.0679 | 0.9916 |
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
+ |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
+ | No log | 0.1176 | 2 | 4.6922 | -0.0262 | 4.6922 |
+ | No log | 0.2353 | 4 | 1.9697 | -0.0193 | 1.9697 |
+ | No log | 0.3529 | 6 | 1.2474 | -0.0024 | 1.2474 |
+ | No log | 0.4706 | 8 | 1.3039 | -0.1115 | 1.3039 |
+ | No log | 0.5882 | 10 | 1.2027 | -0.0222 | 1.2027 |
+ | No log | 0.7059 | 12 | 1.1028 | 0.0290 | 1.1028 |
+ | No log | 0.8235 | 14 | 1.1603 | 0.0257 | 1.1603 |
+ | No log | 0.9412 | 16 | 1.1285 | -0.0227 | 1.1285 |
+ | No log | 1.0588 | 18 | 1.1018 | 0.0835 | 1.1018 |
+ | No log | 1.1765 | 20 | 1.1034 | -0.0341 | 1.1034 |
+ | No log | 1.2941 | 22 | 1.0966 | 0.0201 | 1.0966 |
+ | No log | 1.4118 | 24 | 1.0911 | 0.0407 | 1.0911 |
+ | No log | 1.5294 | 26 | 1.0941 | 0.0854 | 1.0941 |
+ | No log | 1.6471 | 28 | 1.1027 | 0.0379 | 1.1027 |
+ | No log | 1.7647 | 30 | 1.1067 | 0.0462 | 1.1067 |
+ | No log | 1.8824 | 32 | 1.1142 | -0.0019 | 1.1142 |
+ | No log | 2.0 | 34 | 1.1210 | 0.0205 | 1.1210 |
+ | No log | 2.1176 | 36 | 1.1209 | 0.0124 | 1.1209 |
+ | No log | 2.2353 | 38 | 1.1138 | 0.0345 | 1.1138 |
+ | No log | 2.3529 | 40 | 1.1537 | 0.0350 | 1.1537 |
+ | No log | 2.4706 | 42 | 1.1387 | 0.0164 | 1.1387 |
+ | No log | 2.5882 | 44 | 1.1467 | 0.0448 | 1.1467 |
+ | No log | 2.7059 | 46 | 1.1779 | 0.0372 | 1.1779 |
+ | No log | 2.8235 | 48 | 1.1734 | 0.0435 | 1.1734 |
+ | No log | 2.9412 | 50 | 1.0971 | 0.0562 | 1.0971 |
+ | No log | 3.0588 | 52 | 1.0868 | 0.0993 | 1.0868 |
+ | No log | 3.1765 | 54 | 1.1937 | 0.0208 | 1.1937 |
+ | No log | 3.2941 | 56 | 1.1870 | -0.0026 | 1.1870 |
+ | No log | 3.4118 | 58 | 1.1154 | 0.0605 | 1.1154 |
+ | No log | 3.5294 | 60 | 1.1425 | 0.0586 | 1.1425 |
+ | No log | 3.6471 | 62 | 1.1710 | 0.0755 | 1.1710 |
+ | No log | 3.7647 | 64 | 1.2418 | -0.0251 | 1.2418 |
+ | No log | 3.8824 | 66 | 1.2678 | -0.0157 | 1.2678 |
+ | No log | 4.0 | 68 | 1.3259 | -0.0370 | 1.3259 |
+ | No log | 4.1176 | 70 | 1.4421 | -0.0243 | 1.4421 |
+ | No log | 4.2353 | 72 | 1.3217 | -0.0265 | 1.3217 |
+ | No log | 4.3529 | 74 | 1.1731 | 0.1242 | 1.1731 |
+ | No log | 4.4706 | 76 | 1.1656 | 0.1082 | 1.1656 |
+ | No log | 4.5882 | 78 | 1.2346 | -0.0387 | 1.2346 |
+ | No log | 4.7059 | 80 | 1.3196 | 0.0306 | 1.3196 |
+ | No log | 4.8235 | 82 | 1.2508 | -0.0093 | 1.2508 |
+ | No log | 4.9412 | 84 | 1.1905 | -0.0118 | 1.1905 |
+ | No log | 5.0588 | 86 | 1.1863 | 0.0002 | 1.1863 |
+ | No log | 5.1765 | 88 | 1.2464 | 0.0063 | 1.2464 |
+ | No log | 5.2941 | 90 | 1.2248 | 0.0177 | 1.2248 |
+ | No log | 5.4118 | 92 | 1.2140 | 0.0391 | 1.2140 |
+ | No log | 5.5294 | 94 | 1.2402 | 0.0177 | 1.2402 |
+ | No log | 5.6471 | 96 | 1.1852 | 0.1192 | 1.1852 |
+ | No log | 5.7647 | 98 | 1.1973 | 0.0602 | 1.1973 |
+ | No log | 5.8824 | 100 | 1.2330 | 0.0886 | 1.2330 |
+ | No log | 6.0 | 102 | 1.1570 | 0.0504 | 1.1570 |
+ | No log | 6.1176 | 104 | 1.1259 | 0.1211 | 1.1259 |
+ | No log | 6.2353 | 106 | 1.1231 | 0.1211 | 1.1231 |
+ | No log | 6.3529 | 108 | 1.1216 | 0.0835 | 1.1216 |
+ | No log | 6.4706 | 110 | 1.1530 | 0.0907 | 1.1530 |
+ | No log | 6.5882 | 112 | 1.1410 | 0.1176 | 1.1410 |
+ | No log | 6.7059 | 114 | 1.1564 | 0.1120 | 1.1564 |
+ | No log | 6.8235 | 116 | 1.2102 | 0.0988 | 1.2102 |
+ | No log | 6.9412 | 118 | 1.2596 | 0.0681 | 1.2596 |
+ | No log | 7.0588 | 120 | 1.2375 | 0.0722 | 1.2375 |
+ | No log | 7.1765 | 122 | 1.1821 | 0.1425 | 1.1821 |
+ | No log | 7.2941 | 124 | 1.1856 | 0.1180 | 1.1856 |
+ | No log | 7.4118 | 126 | 1.1952 | 0.1542 | 1.1952 |
+ | No log | 7.5294 | 128 | 1.2010 | 0.1643 | 1.2010 |
+ | No log | 7.6471 | 130 | 1.1966 | 0.1542 | 1.1966 |
+ | No log | 7.7647 | 132 | 1.2024 | 0.1589 | 1.2024 |
+ | No log | 7.8824 | 134 | 1.2473 | 0.0820 | 1.2473 |
+ | No log | 8.0 | 136 | 1.2492 | 0.0847 | 1.2492 |
+ | No log | 8.1176 | 138 | 1.2147 | 0.1538 | 1.2147 |
+ | No log | 8.2353 | 140 | 1.1859 | 0.0946 | 1.1859 |
+ | No log | 8.3529 | 142 | 1.1767 | 0.1220 | 1.1767 |
+ | No log | 8.4706 | 144 | 1.1738 | 0.1091 | 1.1738 |
+ | No log | 8.5882 | 146 | 1.1774 | 0.1326 | 1.1774 |
+ | No log | 8.7059 | 148 | 1.1779 | 0.1337 | 1.1779 |
+ | No log | 8.8235 | 150 | 1.1724 | 0.1326 | 1.1724 |
+ | No log | 8.9412 | 152 | 1.1684 | 0.1303 | 1.1684 |
+ | No log | 9.0588 | 154 | 1.1673 | 0.1468 | 1.1673 |
+ | No log | 9.1765 | 156 | 1.1674 | 0.1468 | 1.1674 |
+ | No log | 9.2941 | 158 | 1.1677 | 0.1326 | 1.1677 |
+ | No log | 9.4118 | 160 | 1.1752 | 0.1369 | 1.1752 |
+ | No log | 9.5294 | 162 | 1.1826 | 0.1538 | 1.1826 |
+ | No log | 9.6471 | 164 | 1.1796 | 0.1229 | 1.1796 |
+ | No log | 9.7647 | 166 | 1.1767 | 0.1369 | 1.1767 |
+ | No log | 9.8824 | 168 | 1.1735 | 0.1337 | 1.1735 |
+ | No log | 10.0 | 170 | 1.1732 | 0.1337 | 1.1732 |

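For reference, the Qwk column in the tables above is presumably quadratic weighted kappa, reported alongside MSE on the validation split. A small sketch of how such values can be reproduced with scikit-learn; the toy arrays and the rounding of continuous predictions to integer labels are assumptions, since the card does not state how outputs are discretized:

```python
# Sketch: quadratic weighted kappa (QWK) and MSE, as in the table columns above.
# The toy arrays and the rounding step are illustrative assumptions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([1, 2, 3, 2, 4])            # gold scores (toy data)
y_pred = np.array([1.2, 2.4, 2.8, 2.1, 3.6])  # model outputs (toy data)

mse = mean_squared_error(y_true, y_pred)
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f}  QWK={qwk:.4f}")
```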
  ### Framework versions