salbatarni committed
Commit 383307c (verified) · Parent: 7bed870

End of training

Files changed (1)
  1. README.md +85 -85
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
 tags:
 - generated_from_trainer
 model-index:
-- name: arabert_cross_relevance_task5_fold6
+- name: arabert_cross_relevance_task6_fold0
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# arabert_cross_relevance_task5_fold6
+# arabert_cross_relevance_task6_fold0
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.3006
-- Qwk: 0.2137
-- Mse: 0.3007
+- Loss: 0.2663
+- Qwk: 0.1037
+- Mse: 0.2666
 
 ## Model description
 
@@ -47,86 +47,86 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
-| No log | 0.125 | 2 | 1.0217 | 0.0195 | 1.0225 |
-| No log | 0.25 | 4 | 0.4152 | 0.0535 | 0.4143 |
-| No log | 0.375 | 6 | 0.3599 | 0.0454 | 0.3596 |
-| No log | 0.5 | 8 | 0.3055 | 0.0426 | 0.3056 |
-| No log | 0.625 | 10 | 0.2854 | 0.0805 | 0.2861 |
-| No log | 0.75 | 12 | 0.2641 | 0.1393 | 0.2644 |
-| No log | 0.875 | 14 | 0.2524 | 0.1301 | 0.2527 |
-| No log | 1.0 | 16 | 0.2428 | 0.1915 | 0.2430 |
-| No log | 1.125 | 18 | 0.2426 | 0.1915 | 0.2428 |
-| No log | 1.25 | 20 | 0.2443 | 0.1915 | 0.2447 |
-| No log | 1.375 | 22 | 0.2471 | 0.1854 | 0.2474 |
-| No log | 1.5 | 24 | 0.2514 | 0.1890 | 0.2517 |
-| No log | 1.625 | 26 | 0.2516 | 0.1638 | 0.2519 |
-| No log | 1.75 | 28 | 0.2545 | 0.1467 | 0.2547 |
-| No log | 1.875 | 30 | 0.2544 | 0.1816 | 0.2547 |
-| No log | 2.0 | 32 | 0.2492 | 0.2200 | 0.2495 |
-| No log | 2.125 | 34 | 0.2568 | 0.2044 | 0.2572 |
-| No log | 2.25 | 36 | 0.2853 | 0.2266 | 0.2857 |
-| No log | 2.375 | 38 | 0.2827 | 0.2137 | 0.2831 |
-| No log | 2.5 | 40 | 0.2560 | 0.2018 | 0.2565 |
-| No log | 2.625 | 42 | 0.2538 | 0.2177 | 0.2543 |
-| No log | 2.75 | 44 | 0.2538 | 0.1854 | 0.2542 |
-| No log | 2.875 | 46 | 0.2581 | 0.2064 | 0.2584 |
-| No log | 3.0 | 48 | 0.2645 | 0.2064 | 0.2648 |
-| No log | 3.125 | 50 | 0.2629 | 0.1830 | 0.2631 |
-| No log | 3.25 | 52 | 0.2615 | 0.1539 | 0.2618 |
-| No log | 3.375 | 54 | 0.2644 | 0.1588 | 0.2646 |
-| No log | 3.5 | 56 | 0.2734 | 0.2119 | 0.2736 |
-| No log | 3.625 | 58 | 0.2799 | 0.2050 | 0.2801 |
-| No log | 3.75 | 60 | 0.2702 | 0.2050 | 0.2704 |
-| No log | 3.875 | 62 | 0.2549 | 0.1999 | 0.2553 |
-| No log | 4.0 | 64 | 0.2524 | 0.2274 | 0.2528 |
-| No log | 4.125 | 66 | 0.2506 | 0.2274 | 0.2510 |
-| No log | 4.25 | 68 | 0.2560 | 0.1999 | 0.2563 |
-| No log | 4.375 | 70 | 0.2850 | 0.2194 | 0.2853 |
-| No log | 4.5 | 72 | 0.3006 | 0.2282 | 0.3008 |
-| No log | 4.625 | 74 | 0.2929 | 0.2099 | 0.2931 |
-| No log | 4.75 | 76 | 0.2761 | 0.2109 | 0.2764 |
-| No log | 4.875 | 78 | 0.2669 | 0.2119 | 0.2671 |
-| No log | 5.0 | 80 | 0.2664 | 0.2057 | 0.2667 |
-| No log | 5.125 | 82 | 0.2687 | 0.2109 | 0.2689 |
-| No log | 5.25 | 84 | 0.2617 | 0.2109 | 0.2619 |
-| No log | 5.375 | 86 | 0.2588 | 0.2109 | 0.2590 |
-| No log | 5.5 | 88 | 0.2627 | 0.2050 | 0.2628 |
-| No log | 5.625 | 90 | 0.2860 | 0.2147 | 0.2861 |
-| No log | 5.75 | 92 | 0.3309 | 0.2252 | 0.3309 |
-| No log | 5.875 | 94 | 0.3340 | 0.2200 | 0.3341 |
-| No log | 6.0 | 96 | 0.3021 | 0.2083 | 0.3022 |
-| No log | 6.125 | 98 | 0.2691 | 0.2173 | 0.2694 |
-| No log | 6.25 | 100 | 0.2601 | 0.2200 | 0.2604 |
-| No log | 6.375 | 102 | 0.2632 | 0.2173 | 0.2635 |
-| No log | 6.5 | 104 | 0.2738 | 0.2220 | 0.2741 |
-| No log | 6.625 | 106 | 0.2881 | 0.2250 | 0.2884 |
-| No log | 6.75 | 108 | 0.3079 | 0.2127 | 0.3081 |
-| No log | 6.875 | 110 | 0.3154 | 0.2170 | 0.3156 |
-| No log | 7.0 | 112 | 0.3054 | 0.2137 | 0.3056 |
-| No log | 7.125 | 114 | 0.2897 | 0.2091 | 0.2899 |
-| No log | 7.25 | 116 | 0.2842 | 0.2044 | 0.2844 |
-| No log | 7.375 | 118 | 0.2794 | 0.2099 | 0.2796 |
-| No log | 7.5 | 120 | 0.2789 | 0.2099 | 0.2791 |
-| No log | 7.625 | 122 | 0.2813 | 0.2099 | 0.2814 |
-| No log | 7.75 | 124 | 0.2926 | 0.2091 | 0.2927 |
-| No log | 7.875 | 126 | 0.3068 | 0.2137 | 0.3069 |
-| No log | 8.0 | 128 | 0.3323 | 0.2118 | 0.3323 |
-| No log | 8.125 | 130 | 0.3463 | 0.2070 | 0.3463 |
-| No log | 8.25 | 132 | 0.3382 | 0.2118 | 0.3383 |
-| No log | 8.375 | 134 | 0.3313 | 0.2118 | 0.3314 |
-| No log | 8.5 | 136 | 0.3174 | 0.2137 | 0.3174 |
-| No log | 8.625 | 138 | 0.3049 | 0.2137 | 0.3050 |
-| No log | 8.75 | 140 | 0.3007 | 0.2137 | 0.3008 |
-| No log | 8.875 | 142 | 0.2970 | 0.2137 | 0.2971 |
-| No log | 9.0 | 144 | 0.2949 | 0.2137 | 0.2950 |
-| No log | 9.125 | 146 | 0.2966 | 0.2137 | 0.2967 |
-| No log | 9.25 | 148 | 0.2992 | 0.2137 | 0.2994 |
-| No log | 9.375 | 150 | 0.2997 | 0.2137 | 0.2999 |
-| No log | 9.5 | 152 | 0.3018 | 0.2137 | 0.3019 |
-| No log | 9.625 | 154 | 0.3012 | 0.2137 | 0.3013 |
-| No log | 9.75 | 156 | 0.3004 | 0.2137 | 0.3006 |
-| No log | 9.875 | 158 | 0.3005 | 0.2137 | 0.3006 |
-| No log | 10.0 | 160 | 0.3006 | 0.2137 | 0.3007 |
+| No log | 0.125 | 2 | 0.6210 | 0.0393 | 0.6211 |
+| No log | 0.25 | 4 | 0.2908 | 0.1547 | 0.2908 |
+| No log | 0.375 | 6 | 0.2874 | 0.0578 | 0.2874 |
+| No log | 0.5 | 8 | 0.4029 | 0.0871 | 0.4030 |
+| No log | 0.625 | 10 | 0.3376 | 0.0431 | 0.3380 |
+| No log | 0.75 | 12 | 0.2435 | 0.0180 | 0.2438 |
+| No log | 0.875 | 14 | 0.2523 | 0.0961 | 0.2524 |
+| No log | 1.0 | 16 | 0.2567 | 0.1388 | 0.2568 |
+| No log | 1.125 | 18 | 0.2419 | 0.0918 | 0.2421 |
+| No log | 1.25 | 20 | 0.2428 | 0.1144 | 0.2430 |
+| No log | 1.375 | 22 | 0.2547 | 0.1135 | 0.2549 |
+| No log | 1.5 | 24 | 0.2577 | 0.1333 | 0.2580 |
+| No log | 1.625 | 26 | 0.2622 | 0.1251 | 0.2625 |
+| No log | 1.75 | 28 | 0.2584 | 0.1527 | 0.2587 |
+| No log | 1.875 | 30 | 0.2572 | 0.1427 | 0.2574 |
+| No log | 2.0 | 32 | 0.2550 | 0.1355 | 0.2552 |
+| No log | 2.125 | 34 | 0.2553 | 0.1382 | 0.2554 |
+| No log | 2.25 | 36 | 0.2485 | 0.1178 | 0.2486 |
+| No log | 2.375 | 38 | 0.2485 | 0.0965 | 0.2487 |
+| No log | 2.5 | 40 | 0.2463 | 0.0999 | 0.2465 |
+| No log | 2.625 | 42 | 0.2473 | 0.1607 | 0.2475 |
+| No log | 2.75 | 44 | 0.2547 | 0.2134 | 0.2549 |
+| No log | 2.875 | 46 | 0.2547 | 0.1888 | 0.2549 |
+| No log | 3.0 | 48 | 0.2540 | 0.1555 | 0.2542 |
+| No log | 3.125 | 50 | 0.2562 | 0.1232 | 0.2565 |
+| No log | 3.25 | 52 | 0.2605 | 0.1037 | 0.2607 |
+| No log | 3.375 | 54 | 0.2527 | 0.1245 | 0.2529 |
+| No log | 3.5 | 56 | 0.2460 | 0.1932 | 0.2462 |
+| No log | 3.625 | 58 | 0.2492 | 0.1938 | 0.2494 |
+| No log | 3.75 | 60 | 0.2473 | 0.1556 | 0.2476 |
+| No log | 3.875 | 62 | 0.2559 | 0.1436 | 0.2563 |
+| No log | 4.0 | 64 | 0.2788 | 0.0972 | 0.2792 |
+| No log | 4.125 | 66 | 0.2859 | 0.0972 | 0.2863 |
+| No log | 4.25 | 68 | 0.2581 | 0.1146 | 0.2584 |
+| No log | 4.375 | 70 | 0.2414 | 0.1330 | 0.2417 |
+| No log | 4.5 | 72 | 0.2402 | 0.1344 | 0.2404 |
+| No log | 4.625 | 74 | 0.2406 | 0.1345 | 0.2408 |
+| No log | 4.75 | 76 | 0.2471 | 0.1446 | 0.2473 |
+| No log | 4.875 | 78 | 0.2567 | 0.1348 | 0.2570 |
+| No log | 5.0 | 80 | 0.2688 | 0.1201 | 0.2692 |
+| No log | 5.125 | 82 | 0.2619 | 0.1313 | 0.2622 |
+| No log | 5.25 | 84 | 0.2502 | 0.1611 | 0.2505 |
+| No log | 5.375 | 86 | 0.2499 | 0.1686 | 0.2501 |
+| No log | 5.5 | 88 | 0.2497 | 0.1609 | 0.2499 |
+| No log | 5.625 | 90 | 0.2590 | 0.1279 | 0.2592 |
+| No log | 5.75 | 92 | 0.2625 | 0.1201 | 0.2628 |
+| No log | 5.875 | 94 | 0.2585 | 0.1245 | 0.2588 |
+| No log | 6.0 | 96 | 0.2639 | 0.1100 | 0.2642 |
+| No log | 6.125 | 98 | 0.2653 | 0.1135 | 0.2656 |
+| No log | 6.25 | 100 | 0.2567 | 0.1199 | 0.2570 |
+| No log | 6.375 | 102 | 0.2499 | 0.1229 | 0.2502 |
+| No log | 6.5 | 104 | 0.2482 | 0.1311 | 0.2484 |
+| No log | 6.625 | 106 | 0.2482 | 0.1244 | 0.2485 |
+| No log | 6.75 | 108 | 0.2511 | 0.1210 | 0.2514 |
+| No log | 6.875 | 110 | 0.2518 | 0.1347 | 0.2521 |
+| No log | 7.0 | 112 | 0.2478 | 0.1381 | 0.2481 |
+| No log | 7.125 | 114 | 0.2465 | 0.1415 | 0.2468 |
+| No log | 7.25 | 116 | 0.2476 | 0.1379 | 0.2478 |
+| No log | 7.375 | 118 | 0.2505 | 0.1276 | 0.2508 |
+| No log | 7.5 | 120 | 0.2531 | 0.1381 | 0.2534 |
+| No log | 7.625 | 122 | 0.2620 | 0.1056 | 0.2623 |
+| No log | 7.75 | 124 | 0.2719 | 0.1180 | 0.2723 |
+| No log | 7.875 | 126 | 0.2767 | 0.1149 | 0.2770 |
+| No log | 8.0 | 128 | 0.2722 | 0.1095 | 0.2726 |
+| No log | 8.125 | 130 | 0.2637 | 0.1146 | 0.2640 |
+| No log | 8.25 | 132 | 0.2588 | 0.1178 | 0.2591 |
+| No log | 8.375 | 134 | 0.2564 | 0.1108 | 0.2567 |
+| No log | 8.5 | 136 | 0.2573 | 0.1042 | 0.2576 |
+| No log | 8.625 | 138 | 0.2581 | 0.0976 | 0.2584 |
+| No log | 8.75 | 140 | 0.2594 | 0.0976 | 0.2597 |
+| No log | 8.875 | 142 | 0.2629 | 0.0979 | 0.2632 |
+| No log | 9.0 | 144 | 0.2665 | 0.0946 | 0.2669 |
+| No log | 9.125 | 146 | 0.2676 | 0.0946 | 0.2679 |
+| No log | 9.25 | 148 | 0.2670 | 0.0946 | 0.2673 |
+| No log | 9.375 | 150 | 0.2667 | 0.0946 | 0.2671 |
+| No log | 9.5 | 152 | 0.2663 | 0.0979 | 0.2666 |
+| No log | 9.625 | 154 | 0.2661 | 0.0979 | 0.2664 |
+| No log | 9.75 | 156 | 0.2659 | 0.0979 | 0.2663 |
+| No log | 9.875 | 158 | 0.2661 | 0.1037 | 0.2664 |
+| No log | 10.0 | 160 | 0.2663 | 0.1037 | 0.2666 |
 
 
 ### Framework versions
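
For anyone picking up this checkpoint, here is a minimal sketch of loading it and recomputing the card's two evaluation metrics (Qwk, presumably quadratic weighted kappa, and Mse, mean squared error). The repo id, the placeholder inputs, and the single-logit regression head (num_labels=1) are assumptions for illustration, not details confirmed by this commit:

```python
# Minimal sketch, not the training pipeline from this commit.
# Assumed: the checkpoint lives at the repo id below and exposes a
# single-logit regression head; swap in real texts and gold scores.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from sklearn.metrics import cohen_kappa_score, mean_squared_error

model_id = "salbatarni/arabert_cross_relevance_task6_fold0"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

texts = ["نص المقال الأول", "نص المقال الثاني"]  # placeholder Arabic inputs
gold = [0.0, 1.0]                                # placeholder relevance scores

with torch.no_grad():
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    preds = model(**batch).logits.squeeze(-1).tolist()  # one score per text

mse = mean_squared_error(gold, preds)
# Kappa is defined over discrete ratings, so round both sides first.
qwk = cohen_kappa_score([round(g) for g in gold],
                        [round(p) for p in preds], weights="quadratic")
print(f"Mse: {mse:.4f}  Qwk: {qwk:.4f}")
```

As a sanity check on the table above: epoch 0.125 per 2 steps implies 16 steps per epoch, so 10 epochs end at step 160, matching the final row.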