End of training
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
 tags:
 - generated_from_trainer
 model-index:
-- name:
+- name: arabert_cross_relevance_task4_fold5
   results: []
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

-#
+# arabert_cross_relevance_task4_fold5

 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Qwk: 0.
-- Mse: 0.
+- Loss: 0.1959
+- Qwk: 0.3973
+- Mse: 0.1959

 ## Model description

@@ -47,96 +47,91 @@ The following hyperparameters were used during training:

 | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
 |:-------------:|:------:|:----:|:---------------:|:------:|:------:|
-| … (85 earlier rows; values lost to truncation in extraction) … |
-| No log | 9.5556 | 172 | 0.2570 | 0.3373 | 0.2570 |
-| No log | 9.6667 | 174 | 0.2566 | 0.3440 | 0.2566 |
-| No log | 9.7778 | 176 | 0.2569 | 0.3506 | 0.2569 |
-| No log | 9.8889 | 178 | 0.2572 | 0.3506 | 0.2572 |
-| No log | 10.0 | 180 | 0.2574 | 0.3506 | 0.2574 |
+| No log | 0.1176 | 2 | 0.2829 | 0.2281 | 0.2825 |
+| No log | 0.2353 | 4 | 0.3384 | 0.3607 | 0.3377 |
+| No log | 0.3529 | 6 | 0.2361 | 0.3469 | 0.2356 |
+| No log | 0.4706 | 8 | 0.2565 | 0.3434 | 0.2564 |
+| No log | 0.5882 | 10 | 0.2109 | 0.3531 | 0.2109 |
+| No log | 0.7059 | 12 | 0.1920 | 0.3266 | 0.1917 |
+| No log | 0.8235 | 14 | 0.1946 | 0.3334 | 0.1943 |
+| No log | 0.9412 | 16 | 0.1757 | 0.4574 | 0.1756 |
+| No log | 1.0588 | 18 | 0.1726 | 0.5071 | 0.1727 |
+| No log | 1.1765 | 20 | 0.1653 | 0.4079 | 0.1653 |
+| No log | 1.2941 | 22 | 0.1708 | 0.3931 | 0.1707 |
+| No log | 1.4118 | 24 | 0.1783 | 0.3512 | 0.1781 |
+| No log | 1.5294 | 26 | 0.1810 | 0.3512 | 0.1809 |
+| No log | 1.6471 | 28 | 0.1836 | 0.3467 | 0.1835 |
+| No log | 1.7647 | 30 | 0.1863 | 0.3686 | 0.1862 |
+| No log | 1.8824 | 32 | 0.1946 | 0.3724 | 0.1944 |
+| No log | 2.0 | 34 | 0.1957 | 0.3890 | 0.1956 |
+| No log | 2.1176 | 36 | 0.1833 | 0.3976 | 0.1833 |
+| No log | 2.2353 | 38 | 0.1883 | 0.3683 | 0.1884 |
+| No log | 2.3529 | 40 | 0.1889 | 0.3608 | 0.1889 |
+| No log | 2.4706 | 42 | 0.1805 | 0.3872 | 0.1805 |
+| No log | 2.5882 | 44 | 0.1951 | 0.4399 | 0.1949 |
+| No log | 2.7059 | 46 | 0.2076 | 0.4132 | 0.2074 |
+| No log | 2.8235 | 48 | 0.1906 | 0.4472 | 0.1904 |
+| No log | 2.9412 | 50 | 0.1805 | 0.4286 | 0.1806 |
+| No log | 3.0588 | 52 | 0.1962 | 0.3808 | 0.1964 |
+| No log | 3.1765 | 54 | 0.1906 | 0.3696 | 0.1907 |
+| No log | 3.2941 | 56 | 0.1903 | 0.3562 | 0.1903 |
+| No log | 3.4118 | 58 | 0.1893 | 0.3519 | 0.1893 |
+| No log | 3.5294 | 60 | 0.1793 | 0.3754 | 0.1793 |
+| No log | 3.6471 | 62 | 0.1725 | 0.3761 | 0.1726 |
+| No log | 3.7647 | 64 | 0.1732 | 0.3867 | 0.1732 |
+| No log | 3.8824 | 66 | 0.1756 | 0.3728 | 0.1757 |
+| No log | 4.0 | 68 | 0.1756 | 0.3768 | 0.1756 |
+| No log | 4.1176 | 70 | 0.1760 | 0.3788 | 0.1760 |
+| No log | 4.2353 | 72 | 0.1762 | 0.3724 | 0.1762 |
+| No log | 4.3529 | 74 | 0.1718 | 0.3623 | 0.1717 |
+| No log | 4.4706 | 76 | 0.1679 | 0.3667 | 0.1679 |
+| No log | 4.5882 | 78 | 0.1666 | 0.3816 | 0.1667 |
+| No log | 4.7059 | 80 | 0.1776 | 0.3966 | 0.1778 |
+| No log | 4.8235 | 82 | 0.1888 | 0.3969 | 0.1890 |
+| No log | 4.9412 | 84 | 0.1893 | 0.3969 | 0.1895 |
+| No log | 5.0588 | 86 | 0.1782 | 0.3768 | 0.1783 |
+| No log | 5.1765 | 88 | 0.1774 | 0.3683 | 0.1774 |
+| No log | 5.2941 | 90 | 0.1798 | 0.3744 | 0.1797 |
+| No log | 5.4118 | 92 | 0.1823 | 0.3684 | 0.1822 |
+| No log | 5.5294 | 94 | 0.1828 | 0.3752 | 0.1827 |
+| No log | 5.6471 | 96 | 0.1843 | 0.3774 | 0.1843 |
+| No log | 5.7647 | 98 | 0.1901 | 0.3698 | 0.1901 |
+| No log | 5.8824 | 100 | 0.1920 | 0.3735 | 0.1921 |
+| No log | 6.0 | 102 | 0.1844 | 0.3821 | 0.1845 |
+| No log | 6.1176 | 104 | 0.1765 | 0.4146 | 0.1766 |
+| No log | 6.2353 | 106 | 0.1765 | 0.4100 | 0.1765 |
+| No log | 6.3529 | 108 | 0.1783 | 0.4054 | 0.1783 |
+| No log | 6.4706 | 110 | 0.1769 | 0.3909 | 0.1769 |
+| No log | 6.5882 | 112 | 0.1761 | 0.3945 | 0.1762 |
+| No log | 6.7059 | 114 | 0.1788 | 0.3862 | 0.1789 |
+| No log | 6.8235 | 116 | 0.1789 | 0.3906 | 0.1789 |
+| No log | 6.9412 | 118 | 0.1803 | 0.4054 | 0.1804 |
+| No log | 7.0588 | 120 | 0.1833 | 0.3980 | 0.1834 |
+| No log | 7.1765 | 122 | 0.1902 | 0.4017 | 0.1903 |
+| No log | 7.2941 | 124 | 0.1979 | 0.4022 | 0.1981 |
+| No log | 7.4118 | 126 | 0.1996 | 0.3880 | 0.1998 |
+| No log | 7.5294 | 128 | 0.1940 | 0.3978 | 0.1941 |
+| No log | 7.6471 | 130 | 0.1917 | 0.4072 | 0.1918 |
+| No log | 7.7647 | 132 | 0.1940 | 0.3980 | 0.1940 |
+| No log | 7.8824 | 134 | 0.2019 | 0.3978 | 0.2020 |
+| No log | 8.0 | 136 | 0.2093 | 0.4067 | 0.2094 |
+| No log | 8.1176 | 138 | 0.2156 | 0.3976 | 0.2157 |
+| No log | 8.2353 | 140 | 0.2077 | 0.4067 | 0.2078 |
+| No log | 8.3529 | 142 | 0.1994 | 0.4031 | 0.1995 |
+| No log | 8.4706 | 144 | 0.1959 | 0.4118 | 0.1960 |
+| No log | 8.5882 | 146 | 0.1952 | 0.4068 | 0.1953 |
+| No log | 8.7059 | 148 | 0.1954 | 0.4021 | 0.1955 |
+| No log | 8.8235 | 150 | 0.1949 | 0.3954 | 0.1950 |
+| No log | 8.9412 | 152 | 0.1945 | 0.3906 | 0.1945 |
+| No log | 9.0588 | 154 | 0.1941 | 0.3852 | 0.1941 |
+| No log | 9.1765 | 156 | 0.1942 | 0.3755 | 0.1942 |
+| No log | 9.2941 | 158 | 0.1931 | 0.3852 | 0.1931 |
+| No log | 9.4118 | 160 | 0.1927 | 0.3852 | 0.1927 |
+| No log | 9.5294 | 162 | 0.1933 | 0.3941 | 0.1933 |
+| No log | 9.6471 | 164 | 0.1939 | 0.3941 | 0.1940 |
+| No log | 9.7647 | 166 | 0.1946 | 0.3957 | 0.1947 |
+| No log | 9.8824 | 168 | 0.1954 | 0.3973 | 0.1955 |
+| No log | 10.0 | 170 | 0.1959 | 0.3973 | 0.1959 |


 ### Framework versions
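A minimal inference sketch for the checkpoint the diff above describes, in Python. Everything in it that is not in the card is an assumption: the repo id is a placeholder, and the single-output regression head (`num_labels=1`) is only inferred from the card's identical Loss and Mse values, which point to an MSE training loss.

```python
# Hedged inference sketch, not taken from the card. Assumptions are marked:
# the repo id is a placeholder, and a single regression output (num_labels=1)
# is inferred only from Loss == Mse in the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "<org>/arabert_cross_relevance_task4_fold5"  # placeholder, org unknown

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic passage to score for relevance
batch = tokenizer(text, truncation=True, return_tensors="pt")
with torch.no_grad():
    # With a single-output head, logits has shape (1, 1): one score per input.
    score = model(**batch).logits.squeeze(-1).item()
print(f"relevance score: {score:.4f}")
```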
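The card's evaluation columns are Qwk and Mse. Assuming Qwk stands for quadratic weighted kappa (the card never spells it out), both columns could be computed along the following lines with scikit-learn; the rounding of continuous predictions to an integer rating scale, and the `num_ratings` value, are likewise assumptions rather than anything the card documents.

```python
# Hedged sketch of the card's two evaluation columns. Assumptions: "Qwk" is
# quadratic weighted kappa, and continuous predictions are rounded and clipped
# to an integer rating scale before the kappa computation.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def qwk_and_mse(preds, refs, num_ratings=2):  # num_ratings is a placeholder
    preds = np.asarray(preds, dtype=float)
    refs = np.asarray(refs, dtype=float)
    mse = mean_squared_error(refs, preds)  # the card's Mse column
    # Kappa is defined over discrete ratings, so discretize both sides.
    p = np.clip(np.rint(preds), 0, num_ratings - 1).astype(int)
    r = np.clip(np.rint(refs), 0, num_ratings - 1).astype(int)
    qwk = cohen_kappa_score(r, p, weights="quadratic")  # the card's Qwk column
    return qwk, mse

qwk, mse = qwk_and_mse(preds=[0.1, 0.9, 0.4, 1.2], refs=[0, 1, 1, 1])
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}")
```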