ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card metadata lists it as None). It achieves the following results on the evaluation set:

  • Loss: 0.8239
  • Qwk (quadratic weighted kappa): 0.6567
  • Mse (mean squared error): 0.8239
  • Rmse (root mean squared error): 0.9077
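
As a hedged illustration, the checkpoint can be loaded with the standard transformers API. The sketch below assumes a sequence-classification head (the Qwk/Mse metrics point to a regression- or ordinal-style essay-scoring task); the repository ID comes from this card, while the head type and everything else are assumptions.

```python
# Minimal inference sketch. Assumption: a standard sequence-classification
# head; the exact head configuration and label set are not documented here.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run1_AugV5_k13_task1_organization"
)
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # score vs. class interpretation depends on the head configuration
```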

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
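
As a hedged sketch, these values map onto the transformers TrainingArguments as follows, assuming the standard Trainer API was used (the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments):

```python
# Sketch only: reproduces the hyperparameters listed above; output_dir and
# any unlisted arguments (logging/eval cadence, etc.) are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",  # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```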

Training results

Training loss was logged only every 500 steps, so the first column reads "No log" for all evaluation checkpoints before step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0206 2 6.6386 0.0308 6.6386 2.5766
No log 0.0412 4 4.3409 0.0894 4.3409 2.0835
No log 0.0619 6 2.9340 0.1149 2.9340 1.7129
No log 0.0825 8 2.5329 0.0970 2.5329 1.5915
No log 0.1031 10 2.0854 0.1260 2.0854 1.4441
No log 0.1237 12 1.7133 0.2655 1.7133 1.3089
No log 0.1443 14 1.5473 0.1852 1.5473 1.2439
No log 0.1649 16 1.5422 0.2832 1.5422 1.2419
No log 0.1856 18 1.8395 0.2333 1.8395 1.3563
No log 0.2062 20 2.4245 0.0541 2.4245 1.5571
No log 0.2268 22 2.2319 0.0699 2.2319 1.4940
No log 0.2474 24 1.6828 0.2951 1.6828 1.2972
No log 0.2680 26 1.1992 0.4874 1.1992 1.0951
No log 0.2887 28 1.5225 0.2000 1.5225 1.2339
No log 0.3093 30 1.3550 0.3966 1.3550 1.1641
No log 0.3299 32 1.0281 0.6015 1.0281 1.0139
No log 0.3505 34 1.1542 0.5037 1.1542 1.0743
No log 0.3711 36 1.2467 0.5211 1.2467 1.1166
No log 0.3918 38 1.0146 0.6197 1.0146 1.0073
No log 0.4124 40 0.9003 0.64 0.9003 0.9489
No log 0.4330 42 0.8324 0.6918 0.8324 0.9123
No log 0.4536 44 0.8800 0.7329 0.8800 0.9381
No log 0.4742 46 1.3931 0.5829 1.3931 1.1803
No log 0.4948 48 1.2730 0.6036 1.2730 1.1283
No log 0.5155 50 0.9855 0.6667 0.9855 0.9927
No log 0.5361 52 1.0368 0.5946 1.0368 1.0182
No log 0.5567 54 1.1082 0.5517 1.1082 1.0527
No log 0.5773 56 1.0923 0.5882 1.0923 1.0451
No log 0.5979 58 1.0331 0.6 1.0331 1.0164
No log 0.6186 60 0.9837 0.6225 0.9837 0.9918
No log 0.6392 62 0.9091 0.6323 0.9091 0.9535
No log 0.6598 64 0.8918 0.6582 0.8918 0.9444
No log 0.6804 66 1.0270 0.6471 1.0270 1.0134
No log 0.7010 68 0.9795 0.6282 0.9795 0.9897
No log 0.7216 70 0.8182 0.7059 0.8182 0.9046
No log 0.7423 72 0.8577 0.7105 0.8577 0.9261
No log 0.7629 74 1.1304 0.5769 1.1304 1.0632
No log 0.7835 76 1.3418 0.5093 1.3418 1.1584
No log 0.8041 78 1.1096 0.6125 1.1096 1.0534
No log 0.8247 80 0.9768 0.6835 0.9768 0.9883
No log 0.8454 82 0.9448 0.6800 0.9448 0.9720
No log 0.8660 84 1.1318 0.5844 1.1318 1.0639
No log 0.8866 86 1.1446 0.5503 1.1446 1.0699
No log 0.9072 88 1.0225 0.6081 1.0225 1.0112
No log 0.9278 90 0.9466 0.6494 0.9466 0.9729
No log 0.9485 92 0.9471 0.6410 0.9471 0.9732
No log 0.9691 94 0.9464 0.6369 0.9464 0.9728
No log 0.9897 96 0.9880 0.6832 0.9880 0.9940
No log 1.0103 98 1.0383 0.6220 1.0383 1.0190
No log 1.0309 100 0.9291 0.6708 0.9291 0.9639
No log 1.0515 102 0.8811 0.7051 0.8811 0.9387
No log 1.0722 104 0.8661 0.6933 0.8661 0.9306
No log 1.0928 106 0.8505 0.7320 0.8505 0.9222
No log 1.1134 108 0.9108 0.6835 0.9108 0.9544
No log 1.1340 110 0.9671 0.6545 0.9671 0.9834
No log 1.1546 112 0.8803 0.7143 0.8803 0.9382
No log 1.1753 114 0.7077 0.7927 0.7077 0.8412
No log 1.1959 116 0.6499 0.7907 0.6499 0.8062
No log 1.2165 118 0.6344 0.7977 0.6344 0.7965
No log 1.2371 120 0.7094 0.7602 0.7094 0.8422
No log 1.2577 122 0.8116 0.7425 0.8116 0.9009
No log 1.2784 124 0.8744 0.6962 0.8744 0.9351
No log 1.2990 126 0.7462 0.7407 0.7462 0.8638
No log 1.3196 128 0.5628 0.8070 0.5628 0.7502
No log 1.3402 130 0.6567 0.7831 0.6567 0.8103
No log 1.3608 132 0.9735 0.6748 0.9735 0.9867
No log 1.3814 134 0.6423 0.8024 0.6423 0.8014
No log 1.4021 136 0.6233 0.8128 0.6233 0.7895
No log 1.4227 138 0.9567 0.7437 0.9567 0.9781
No log 1.4433 140 0.9548 0.7111 0.9548 0.9771
No log 1.4639 142 1.0082 0.6144 1.0082 1.0041
No log 1.4845 144 0.9656 0.6234 0.9656 0.9827
No log 1.5052 146 0.9097 0.6941 0.9097 0.9538
No log 1.5258 148 0.8018 0.6918 0.8018 0.8954
No log 1.5464 150 0.7871 0.6918 0.7871 0.8872
No log 1.5670 152 0.8130 0.6923 0.8130 0.9017
No log 1.5876 154 0.8739 0.6883 0.8739 0.9348
No log 1.6082 156 0.9073 0.6883 0.9073 0.9525
No log 1.6289 158 0.8979 0.6797 0.8979 0.9476
No log 1.6495 160 0.9202 0.7020 0.9202 0.9593
No log 1.6701 162 0.9533 0.6933 0.9533 0.9764
No log 1.6907 164 1.0495 0.6323 1.0495 1.0244
No log 1.7113 166 0.9693 0.6667 0.9693 0.9845
No log 1.7320 168 0.9011 0.7059 0.9011 0.9493
No log 1.7526 170 0.8788 0.6914 0.8788 0.9375
No log 1.7732 172 0.8192 0.7152 0.8192 0.9051
No log 1.7938 174 0.7623 0.7738 0.7623 0.8731
No log 1.8144 176 0.6998 0.7436 0.6998 0.8365
No log 1.8351 178 0.6744 0.775 0.6744 0.8212
No log 1.8557 180 0.6923 0.7882 0.6923 0.8321
No log 1.8763 182 1.0034 0.6889 1.0034 1.0017
No log 1.8969 184 1.1008 0.6243 1.1008 1.0492
No log 1.9175 186 0.8802 0.6962 0.8802 0.9382
No log 1.9381 188 0.6508 0.7432 0.6508 0.8067
No log 1.9588 190 0.6284 0.7821 0.6284 0.7927
No log 1.9794 192 0.6294 0.7547 0.6294 0.7933
No log 2.0 194 0.5511 0.8187 0.5511 0.7424
No log 2.0206 196 0.6162 0.8105 0.6162 0.7850
No log 2.0412 198 0.9315 0.7263 0.9315 0.9651
No log 2.0619 200 1.2573 0.6597 1.2573 1.1213
No log 2.0825 202 1.2732 0.6136 1.2732 1.1283
No log 2.1031 204 1.0180 0.6581 1.0180 1.0090
No log 2.1237 206 0.7410 0.6803 0.7410 0.8608
No log 2.1443 208 0.5425 0.7975 0.5425 0.7366
No log 2.1649 210 0.4957 0.7879 0.4957 0.7041
No log 2.1856 212 0.5009 0.8235 0.5009 0.7077
No log 2.2062 214 0.6764 0.7727 0.6764 0.8225
No log 2.2268 216 0.8855 0.7368 0.8855 0.9410
No log 2.2474 218 0.8688 0.7018 0.8688 0.9321
No log 2.2680 220 0.7828 0.7389 0.7828 0.8847
No log 2.2887 222 0.7133 0.7285 0.7133 0.8446
No log 2.3093 224 0.7029 0.7403 0.7029 0.8384
No log 2.3299 226 0.7274 0.7468 0.7274 0.8529
No log 2.3505 228 0.7905 0.7024 0.7905 0.8891
No log 2.3711 230 0.8163 0.7024 0.8163 0.9035
No log 2.3918 232 0.6817 0.7407 0.6817 0.8257
No log 2.4124 234 0.6279 0.7898 0.6279 0.7924
No log 2.4330 236 0.6566 0.7285 0.6566 0.8103
No log 2.4536 238 0.6737 0.7432 0.6737 0.8208
No log 2.4742 240 0.6652 0.7586 0.6652 0.8156
No log 2.4948 242 0.6573 0.7133 0.6573 0.8108
No log 2.5155 244 0.6947 0.7376 0.6947 0.8335
No log 2.5361 246 0.6919 0.7532 0.6919 0.8318
No log 2.5567 248 0.7783 0.7634 0.7783 0.8822
No log 2.5773 250 0.9080 0.7725 0.9080 0.9529
No log 2.5979 252 0.7760 0.7789 0.7760 0.8809
No log 2.6186 254 0.5427 0.8046 0.5427 0.7367
No log 2.6392 256 0.5157 0.8395 0.5157 0.7181
No log 2.6598 258 0.5074 0.8199 0.5074 0.7123
No log 2.6804 260 0.5683 0.7879 0.5683 0.7539
No log 2.7010 262 0.7725 0.6994 0.7725 0.8789
No log 2.7216 264 0.9238 0.6790 0.9238 0.9612
No log 2.7423 266 0.8716 0.7152 0.8716 0.9336
No log 2.7629 268 0.7049 0.7123 0.7049 0.8396
No log 2.7835 270 0.5827 0.8280 0.5827 0.7634
No log 2.8041 272 0.5649 0.8101 0.5649 0.7516
No log 2.8247 274 0.5398 0.8323 0.5398 0.7347
No log 2.8454 276 0.5942 0.8098 0.5942 0.7708
No log 2.8660 278 0.8162 0.7 0.8162 0.9035
No log 2.8866 280 1.0026 0.6341 1.0026 1.0013
No log 2.9072 282 1.0403 0.6174 1.0403 1.0199
No log 2.9278 284 0.9378 0.6968 0.9378 0.9684
No log 2.9485 286 0.7710 0.7273 0.7710 0.8781
No log 2.9691 288 0.6648 0.7758 0.6648 0.8153
No log 2.9897 290 0.6411 0.7952 0.6411 0.8007
No log 3.0103 292 0.6512 0.7719 0.6512 0.8070
No log 3.0309 294 0.6983 0.7273 0.6983 0.8356
No log 3.0515 296 0.7335 0.7051 0.7335 0.8565
No log 3.0722 298 0.7403 0.7179 0.7403 0.8604
No log 3.0928 300 0.6836 0.7179 0.6836 0.8268
No log 3.1134 302 0.6452 0.7296 0.6452 0.8033
No log 3.1340 304 0.6395 0.7297 0.6395 0.7997
No log 3.1546 306 0.6961 0.7 0.6961 0.8343
No log 3.1753 308 0.7678 0.6567 0.7678 0.8762
No log 3.1959 310 0.7398 0.6912 0.7398 0.8601
No log 3.2165 312 0.7046 0.7123 0.7046 0.8394
No log 3.2371 314 0.8046 0.7037 0.8046 0.8970
No log 3.2577 316 0.9462 0.7273 0.9462 0.9727
No log 3.2784 318 0.8458 0.7135 0.8458 0.9197
No log 3.2990 320 0.6890 0.7160 0.6890 0.8300
No log 3.3196 322 0.6354 0.7654 0.6354 0.7971
No log 3.3402 324 0.6535 0.7403 0.6535 0.8084
No log 3.3608 326 0.6979 0.7237 0.6979 0.8354
No log 3.3814 328 0.8189 0.72 0.8189 0.9049
No log 3.4021 330 0.8727 0.6714 0.8727 0.9342
No log 3.4227 332 0.8652 0.7059 0.8652 0.9302
No log 3.4433 334 0.7842 0.7286 0.7842 0.8856
No log 3.4639 336 0.7216 0.7517 0.7216 0.8495
No log 3.4845 338 0.7338 0.7215 0.7338 0.8566
No log 3.5052 340 0.7771 0.6957 0.7771 0.8815
No log 3.5258 342 0.7209 0.7305 0.7209 0.8490
No log 3.5464 344 0.6624 0.7262 0.6624 0.8139
No log 3.5670 346 0.6127 0.7879 0.6127 0.7828
No log 3.5876 348 0.6189 0.7665 0.6189 0.7867
No log 3.6082 350 0.6325 0.7456 0.6325 0.7953
No log 3.6289 352 0.6053 0.7952 0.6053 0.7780
No log 3.6495 354 0.5672 0.8323 0.5672 0.7531
No log 3.6701 356 0.5962 0.7719 0.5962 0.7721
No log 3.6907 358 0.6806 0.7650 0.6806 0.8250
No log 3.7113 360 0.7369 0.7701 0.7369 0.8585
No log 3.7320 362 0.7234 0.7514 0.7234 0.8505
No log 3.7526 364 0.6946 0.7262 0.6946 0.8334
No log 3.7732 366 0.7153 0.7375 0.7153 0.8457
No log 3.7938 368 0.8381 0.6795 0.8381 0.9155
No log 3.8144 370 0.9907 0.6467 0.9907 0.9953
No log 3.8351 372 0.9842 0.6588 0.9842 0.9921
No log 3.8557 374 0.8263 0.7030 0.8263 0.9090
No log 3.8763 376 0.6364 0.7826 0.6364 0.7978
No log 3.8969 378 0.5979 0.8129 0.5979 0.7733
No log 3.9175 380 0.6032 0.8129 0.6032 0.7766
No log 3.9381 382 0.6340 0.7821 0.6340 0.7963
No log 3.9588 384 0.6742 0.7515 0.6742 0.8211
No log 3.9794 386 0.7577 0.7066 0.7577 0.8705
No log 4.0 388 0.7956 0.6918 0.7956 0.8920
No log 4.0206 390 0.8222 0.6623 0.8222 0.9068
No log 4.0412 392 0.7831 0.6835 0.7831 0.8849
No log 4.0619 394 0.7014 0.7453 0.7014 0.8375
No log 4.0825 396 0.6360 0.7595 0.6360 0.7975
No log 4.1031 398 0.6268 0.7848 0.6268 0.7917
No log 4.1237 400 0.6348 0.7853 0.6348 0.7968
No log 4.1443 402 0.7172 0.7337 0.7172 0.8469
No log 4.1649 404 0.8698 0.7262 0.8698 0.9327
No log 4.1856 406 0.9461 0.6909 0.9461 0.9727
No log 4.2062 408 0.8941 0.6795 0.8941 0.9456
No log 4.2268 410 0.8287 0.7123 0.8287 0.9103
No log 4.2474 412 0.8115 0.7123 0.8115 0.9008
No log 4.2680 414 0.8251 0.6994 0.8251 0.9084
No log 4.2887 416 0.9469 0.7396 0.9469 0.9731
No log 4.3093 418 0.9229 0.7437 0.9229 0.9607
No log 4.3299 420 0.7387 0.7668 0.7387 0.8595
No log 4.3505 422 0.6246 0.7294 0.6246 0.7903
No log 4.3711 424 0.6276 0.7355 0.6276 0.7922
No log 4.3918 426 0.6380 0.7368 0.6380 0.7987
No log 4.4124 428 0.6312 0.7651 0.6312 0.7945
No log 4.4330 430 0.6137 0.7651 0.6137 0.7834
No log 4.4536 432 0.5521 0.8182 0.5521 0.7430
No log 4.4742 434 0.5049 0.8354 0.5049 0.7106
No log 4.4948 436 0.4706 0.8415 0.4706 0.6860
No log 4.5155 438 0.4786 0.8276 0.4786 0.6918
No log 4.5361 440 0.5208 0.8161 0.5208 0.7217
No log 4.5567 442 0.5843 0.7758 0.5843 0.7644
No log 4.5773 444 0.6199 0.7451 0.6199 0.7874
No log 4.5979 446 0.6672 0.7397 0.6672 0.8168
No log 4.6186 448 0.6637 0.7465 0.6637 0.8147
No log 4.6392 450 0.6270 0.7361 0.6270 0.7918
No log 4.6598 452 0.6211 0.7467 0.6211 0.7881
No log 4.6804 454 0.6129 0.7595 0.6129 0.7829
No log 4.7010 456 0.6952 0.7375 0.6952 0.8338
No log 4.7216 458 0.8275 0.7487 0.8275 0.9097
No log 4.7423 460 0.8799 0.7391 0.8799 0.9381
No log 4.7629 462 0.7715 0.7143 0.7715 0.8784
No log 4.7835 464 0.6494 0.7451 0.6494 0.8059
No log 4.8041 466 0.6108 0.7792 0.6108 0.7815
No log 4.8247 468 0.5989 0.8 0.5989 0.7739
No log 4.8454 470 0.5888 0.8101 0.5888 0.7673
No log 4.8660 472 0.6099 0.7643 0.6099 0.7809
No log 4.8866 474 0.6723 0.7485 0.6723 0.8199
No log 4.9072 476 0.6485 0.7826 0.6485 0.8053
No log 4.9278 478 0.5903 0.8 0.5903 0.7683
No log 4.9485 480 0.5514 0.8068 0.5514 0.7426
No log 4.9691 482 0.5580 0.8298 0.5580 0.7470
No log 4.9897 484 0.5587 0.8046 0.5587 0.7475
No log 5.0103 486 0.5812 0.7976 0.5812 0.7624
No log 5.0309 488 0.6220 0.8182 0.6220 0.7887
No log 5.0515 490 0.6713 0.7465 0.6713 0.8194
No log 5.0722 492 0.7329 0.6883 0.7329 0.8561
No log 5.0928 494 0.8174 0.7195 0.8174 0.9041
No log 5.1134 496 0.8007 0.7253 0.8007 0.8948
No log 5.1340 498 0.7276 0.7429 0.7276 0.8530
0.3973 5.1546 500 0.6351 0.7717 0.6351 0.7969
0.3973 5.1753 502 0.6046 0.8222 0.6046 0.7775
0.3973 5.1959 504 0.5989 0.8161 0.5989 0.7739
0.3973 5.2165 506 0.5907 0.8 0.5907 0.7686
0.3973 5.2371 508 0.6053 0.7746 0.6053 0.7780
0.3973 5.2577 510 0.5994 0.7929 0.5994 0.7742
0.3973 5.2784 512 0.6174 0.7927 0.6174 0.7858
0.3973 5.2990 514 0.5979 0.8121 0.5979 0.7732
0.3973 5.3196 516 0.5881 0.8049 0.5881 0.7669
0.3973 5.3402 518 0.5806 0.8199 0.5806 0.7619
0.3973 5.3608 520 0.5868 0.8228 0.5868 0.7660
0.3973 5.3814 522 0.5853 0.8302 0.5853 0.7651
0.3973 5.4021 524 0.6008 0.7826 0.6008 0.7751
0.3973 5.4227 526 0.6094 0.7826 0.6094 0.7807
0.3973 5.4433 528 0.5773 0.7853 0.5773 0.7598
0.3973 5.4639 530 0.5429 0.8050 0.5429 0.7368
0.3973 5.4845 532 0.5545 0.8050 0.5545 0.7446
0.3973 5.5052 534 0.6134 0.7925 0.6134 0.7832
0.3973 5.5258 536 0.7517 0.7067 0.7517 0.8670
0.3973 5.5464 538 0.8842 0.6621 0.8842 0.9403
0.3973 5.5670 540 0.9498 0.6377 0.9498 0.9746
0.3973 5.5876 542 0.9129 0.6569 0.9129 0.9555
0.3973 5.6082 544 0.8239 0.6567 0.8239 0.9077

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1