ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6260
  • Qwk (quadratic weighted kappa): 0.4721
  • Mse (mean squared error): 0.6260
  • Rmse (root mean squared error): 0.7912
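
For reference, here is a minimal sketch of how these metrics are commonly computed for ordinal essay scoring, using scikit-learn and NumPy. This is an assumption: the card does not include the evaluation code, and the gold scores and predictions below are hypothetical.

```python
# Hedged sketch of the Qwk / Mse / Rmse metrics for an ordinal scoring task.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # hypothetical gold organization scores
y_pred = np.array([2.8, 2.1, 3.6, 1.4, 3.2])  # hypothetical regression outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK compares discrete labels, so continuous predictions are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Mse={mse:.4f}  Rmse={rmse:.4f}  Qwk={qwk:.4f}")
```

Note that Mse equals Loss here, which suggests the model was trained with an MSE objective, i.e. as a regression head.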

Model description

More information needed

Intended uses & limitations

More information needed
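
Pending proper documentation, the following is a hypothetical inference sketch. It assumes the checkpoint loads as a single-output regression model (consistent with the MSE/RMSE metrics above); the essay text is a placeholder.

```python
# Hypothetical usage sketch; the intended use is not documented in this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

essay = "..."  # an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    # Assumed single-score regression head (one logit per input).
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.2f}")
```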

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
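
The list above maps directly onto the Hugging Face `TrainingArguments` API. A minimal reproduction sketch follows; the output directory, the regression head (`num_labels=1`), and the data pipeline are assumptions, since the card does not document them.

```python
# Hedged sketch reproducing the hyperparameters listed above.
from transformers import AutoModelForSequenceClassification, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumed regression head, consistent with the MSE loss reported
)

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
# Trainer(model=model, args=training_args, train_dataset=..., eval_dataset=...)
# would complete the setup; the datasets themselves are not documented here.
```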

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:---|:---|:---|:---|:---|:---|:---|
| No log | 0.0211 | 2 | 4.2097 | -0.0149 | 4.2097 | 2.0517 |
| No log | 0.0421 | 4 | 2.5782 | 0.0587 | 2.5782 | 1.6057 |
| No log | 0.0632 | 6 | 1.3197 | 0.0773 | 1.3197 | 1.1488 |
| No log | 0.0842 | 8 | 1.0302 | -0.0011 | 1.0302 | 1.0150 |
| No log | 0.1053 | 10 | 1.0478 | -0.0444 | 1.0478 | 1.0236 |
| No log | 0.1263 | 12 | 0.8546 | 0.1340 | 0.8546 | 0.9244 |
| No log | 0.1474 | 14 | 0.8383 | 0.1842 | 0.8383 | 0.9156 |
| No log | 0.1684 | 16 | 0.9568 | 0.0212 | 0.9568 | 0.9782 |
| No log | 0.1895 | 18 | 1.0898 | -0.0175 | 1.0898 | 1.0439 |
| No log | 0.2105 | 20 | 1.4527 | 0.0210 | 1.4527 | 1.2053 |
| No log | 0.2316 | 22 | 1.4635 | 0.0 | 1.4635 | 1.2098 |
| No log | 0.2526 | 24 | 1.1380 | -0.0548 | 1.1380 | 1.0668 |
| No log | 0.2737 | 26 | 0.9040 | 0.0616 | 0.9040 | 0.9508 |
| No log | 0.2947 | 28 | 0.8778 | 0.0795 | 0.8778 | 0.9369 |
| No log | 0.3158 | 30 | 0.8393 | 0.1340 | 0.8393 | 0.9161 |
| No log | 0.3368 | 32 | 0.8704 | 0.0573 | 0.8704 | 0.9330 |
| No log | 0.3579 | 34 | 0.8873 | 0.0642 | 0.8873 | 0.9420 |
| No log | 0.3789 | 36 | 0.9541 | 0.1455 | 0.9541 | 0.9768 |
| No log | 0.4 | 38 | 1.0408 | 0.0283 | 1.0408 | 1.0202 |
| No log | 0.4211 | 40 | 1.0647 | 0.0058 | 1.0647 | 1.0318 |
| No log | 0.4421 | 42 | 1.0126 | 0.0737 | 1.0126 | 1.0063 |
| No log | 0.4632 | 44 | 1.1037 | 0.0071 | 1.1037 | 1.0505 |
| No log | 0.4842 | 46 | 1.1039 | 0.0893 | 1.1039 | 1.0507 |
| No log | 0.5053 | 48 | 0.9551 | 0.2100 | 0.9551 | 0.9773 |
| No log | 0.5263 | 50 | 0.9332 | 0.1881 | 0.9332 | 0.9660 |
| No log | 0.5474 | 52 | 0.9323 | 0.2176 | 0.9323 | 0.9655 |
| No log | 0.5684 | 54 | 0.9447 | 0.2197 | 0.9447 | 0.9720 |
| No log | 0.5895 | 56 | 0.9117 | 0.1971 | 0.9117 | 0.9548 |
| No log | 0.6105 | 58 | 0.8912 | 0.1461 | 0.8912 | 0.9440 |
| No log | 0.6316 | 60 | 0.8505 | 0.1416 | 0.8505 | 0.9222 |
| No log | 0.6526 | 62 | 0.8079 | 0.2140 | 0.8079 | 0.8988 |
| No log | 0.6737 | 64 | 0.8848 | 0.1982 | 0.8848 | 0.9406 |
| No log | 0.6947 | 66 | 0.9672 | 0.1261 | 0.9672 | 0.9834 |
| No log | 0.7158 | 68 | 1.1036 | -0.0481 | 1.1036 | 1.0505 |
| No log | 0.7368 | 70 | 1.2673 | -0.0635 | 1.2673 | 1.1257 |
| No log | 0.7579 | 72 | 1.1015 | 0.1101 | 1.1015 | 1.0495 |
| No log | 0.7789 | 74 | 0.8036 | 0.2688 | 0.8036 | 0.8965 |
| No log | 0.8 | 76 | 0.8736 | 0.2730 | 0.8736 | 0.9347 |
| No log | 0.8211 | 78 | 0.8582 | 0.2843 | 0.8582 | 0.9264 |
| No log | 0.8421 | 80 | 0.8516 | 0.2412 | 0.8516 | 0.9228 |
| No log | 0.8632 | 82 | 0.9163 | 0.2853 | 0.9163 | 0.9572 |
| No log | 0.8842 | 84 | 0.8865 | 0.2421 | 0.8865 | 0.9415 |
| No log | 0.9053 | 86 | 0.7909 | 0.3096 | 0.7909 | 0.8893 |
| No log | 0.9263 | 88 | 0.7728 | 0.2646 | 0.7728 | 0.8791 |
| No log | 0.9474 | 90 | 0.8157 | 0.2540 | 0.8157 | 0.9032 |
| No log | 0.9684 | 92 | 0.7664 | 0.2916 | 0.7664 | 0.8754 |
| No log | 0.9895 | 94 | 0.7379 | 0.3307 | 0.7379 | 0.8590 |
| No log | 1.0105 | 96 | 0.7555 | 0.3486 | 0.7555 | 0.8692 |
| No log | 1.0316 | 98 | 0.7372 | 0.3729 | 0.7372 | 0.8586 |
| No log | 1.0526 | 100 | 1.0513 | 0.2452 | 1.0513 | 1.0253 |
| No log | 1.0737 | 102 | 1.2066 | 0.2635 | 1.2066 | 1.0984 |
| No log | 1.0947 | 104 | 0.9465 | 0.3190 | 0.9465 | 0.9729 |
| No log | 1.1158 | 106 | 0.7233 | 0.5012 | 0.7233 | 0.8505 |
| No log | 1.1368 | 108 | 0.7298 | 0.4387 | 0.7298 | 0.8543 |
| No log | 1.1579 | 110 | 0.7545 | 0.4543 | 0.7545 | 0.8686 |
| No log | 1.1789 | 112 | 1.0095 | 0.3183 | 1.0095 | 1.0048 |
| No log | 1.2 | 114 | 1.0038 | 0.3231 | 1.0038 | 1.0019 |
| No log | 1.2211 | 116 | 0.7347 | 0.4328 | 0.7347 | 0.8571 |
| No log | 1.2421 | 118 | 0.7638 | 0.2997 | 0.7638 | 0.8739 |
| No log | 1.2632 | 120 | 0.9173 | 0.2788 | 0.9173 | 0.9578 |
| No log | 1.2842 | 122 | 0.9928 | 0.2731 | 0.9928 | 0.9964 |
| No log | 1.3053 | 124 | 0.8292 | 0.3495 | 0.8292 | 0.9106 |
| No log | 1.3263 | 126 | 0.6645 | 0.3885 | 0.6645 | 0.8152 |
| No log | 1.3474 | 128 | 0.7403 | 0.4690 | 0.7403 | 0.8604 |
| No log | 1.3684 | 130 | 0.7894 | 0.4238 | 0.7894 | 0.8885 |
| No log | 1.3895 | 132 | 0.6733 | 0.3970 | 0.6733 | 0.8206 |
| No log | 1.4105 | 134 | 0.6410 | 0.4080 | 0.6410 | 0.8006 |
| No log | 1.4316 | 136 | 0.8066 | 0.3832 | 0.8066 | 0.8981 |
| No log | 1.4526 | 138 | 0.8016 | 0.3678 | 0.8016 | 0.8953 |
| No log | 1.4737 | 140 | 0.6775 | 0.4304 | 0.6775 | 0.8231 |
| No log | 1.4947 | 142 | 0.6536 | 0.5012 | 0.6536 | 0.8085 |
| No log | 1.5158 | 144 | 0.6903 | 0.4669 | 0.6903 | 0.8309 |
| No log | 1.5368 | 146 | 0.7039 | 0.4357 | 0.7039 | 0.8390 |
| No log | 1.5579 | 148 | 0.7402 | 0.3687 | 0.7403 | 0.8604 |
| No log | 1.5789 | 150 | 0.8454 | 0.3297 | 0.8454 | 0.9194 |
| No log | 1.6 | 152 | 0.8778 | 0.3284 | 0.8778 | 0.9369 |
| No log | 1.6211 | 154 | 0.9024 | 0.3144 | 0.9024 | 0.9500 |
| No log | 1.6421 | 156 | 0.7700 | 0.3676 | 0.7700 | 0.8775 |
| No log | 1.6632 | 158 | 0.7262 | 0.3446 | 0.7262 | 0.8522 |
| No log | 1.6842 | 160 | 0.7249 | 0.2797 | 0.7249 | 0.8514 |
| No log | 1.7053 | 162 | 0.7517 | 0.2716 | 0.7517 | 0.8670 |
| No log | 1.7263 | 164 | 0.7267 | 0.3062 | 0.7267 | 0.8525 |
| No log | 1.7474 | 166 | 0.6821 | 0.3244 | 0.6821 | 0.8259 |
| No log | 1.7684 | 168 | 0.6691 | 0.4585 | 0.6691 | 0.8180 |
| No log | 1.7895 | 170 | 0.6660 | 0.4781 | 0.6660 | 0.8161 |
| No log | 1.8105 | 172 | 0.6748 | 0.5159 | 0.6748 | 0.8215 |
| No log | 1.8316 | 174 | 0.6465 | 0.4474 | 0.6465 | 0.8041 |
| No log | 1.8526 | 176 | 0.6373 | 0.4537 | 0.6373 | 0.7983 |
| No log | 1.8737 | 178 | 0.6940 | 0.4529 | 0.6940 | 0.8330 |
| No log | 1.8947 | 180 | 0.6828 | 0.4578 | 0.6828 | 0.8263 |
| No log | 1.9158 | 182 | 0.6647 | 0.5162 | 0.6647 | 0.8153 |
| No log | 1.9368 | 184 | 0.6667 | 0.5309 | 0.6667 | 0.8165 |
| No log | 1.9579 | 186 | 0.7051 | 0.4672 | 0.7051 | 0.8397 |
| No log | 1.9789 | 188 | 0.8478 | 0.3844 | 0.8478 | 0.9208 |
| No log | 2.0 | 190 | 0.8585 | 0.3759 | 0.8585 | 0.9266 |
| No log | 2.0211 | 192 | 0.7694 | 0.3995 | 0.7694 | 0.8771 |
| No log | 2.0421 | 194 | 0.7231 | 0.4761 | 0.7231 | 0.8504 |
| No log | 2.0632 | 196 | 0.7431 | 0.4644 | 0.7431 | 0.8620 |
| No log | 2.0842 | 198 | 0.7901 | 0.4451 | 0.7901 | 0.8889 |
| No log | 2.1053 | 200 | 0.7577 | 0.4987 | 0.7577 | 0.8705 |
| No log | 2.1263 | 202 | 0.7762 | 0.4852 | 0.7762 | 0.8810 |
| No log | 2.1474 | 204 | 0.7244 | 0.4557 | 0.7244 | 0.8511 |
| No log | 2.1684 | 206 | 0.6841 | 0.4130 | 0.6841 | 0.8271 |
| No log | 2.1895 | 208 | 0.6894 | 0.3373 | 0.6894 | 0.8303 |
| No log | 2.2105 | 210 | 0.6650 | 0.3532 | 0.6650 | 0.8155 |
| No log | 2.2316 | 212 | 0.7266 | 0.3969 | 0.7266 | 0.8524 |
| No log | 2.2526 | 214 | 0.8106 | 0.3769 | 0.8106 | 0.9003 |
| No log | 2.2737 | 216 | 0.8297 | 0.3616 | 0.8297 | 0.9109 |
| No log | 2.2947 | 218 | 0.7389 | 0.4336 | 0.7389 | 0.8596 |
| No log | 2.3158 | 220 | 0.7121 | 0.4198 | 0.7121 | 0.8439 |
| No log | 2.3368 | 222 | 0.7115 | 0.3672 | 0.7115 | 0.8435 |
| No log | 2.3579 | 224 | 0.7399 | 0.4270 | 0.7399 | 0.8602 |
| No log | 2.3789 | 226 | 0.9752 | 0.3976 | 0.9752 | 0.9875 |
| No log | 2.4 | 228 | 1.0765 | 0.3517 | 1.0765 | 1.0375 |
| No log | 2.4211 | 230 | 0.8772 | 0.3867 | 0.8772 | 0.9366 |
| No log | 2.4421 | 232 | 0.7032 | 0.4354 | 0.7032 | 0.8386 |
| No log | 2.4632 | 234 | 0.6810 | 0.3499 | 0.6810 | 0.8252 |
| No log | 2.4842 | 236 | 0.6927 | 0.4546 | 0.6927 | 0.8323 |
| No log | 2.5053 | 238 | 0.8415 | 0.3630 | 0.8415 | 0.9173 |
| No log | 2.5263 | 240 | 0.9222 | 0.3556 | 0.9222 | 0.9603 |
| No log | 2.5474 | 242 | 0.8166 | 0.3611 | 0.8166 | 0.9037 |
| No log | 2.5684 | 244 | 0.6657 | 0.4196 | 0.6657 | 0.8159 |
| No log | 2.5895 | 246 | 0.7061 | 0.4036 | 0.7061 | 0.8403 |
| No log | 2.6105 | 248 | 0.6788 | 0.4402 | 0.6788 | 0.8239 |
| No log | 2.6316 | 250 | 0.6703 | 0.4544 | 0.6703 | 0.8187 |
| No log | 2.6526 | 252 | 0.7386 | 0.3755 | 0.7386 | 0.8594 |
| No log | 2.6737 | 254 | 0.7893 | 0.3961 | 0.7893 | 0.8884 |
| No log | 2.6947 | 256 | 0.7305 | 0.4484 | 0.7305 | 0.8547 |
| No log | 2.7158 | 258 | 0.7320 | 0.3830 | 0.7320 | 0.8556 |
| No log | 2.7368 | 260 | 0.7651 | 0.4174 | 0.7651 | 0.8747 |
| No log | 2.7579 | 262 | 0.8126 | 0.3950 | 0.8126 | 0.9014 |
| No log | 2.7789 | 264 | 0.8648 | 0.4179 | 0.8648 | 0.9299 |
| No log | 2.8 | 266 | 0.7705 | 0.3847 | 0.7705 | 0.8778 |
| No log | 2.8211 | 268 | 0.7795 | 0.3750 | 0.7795 | 0.8829 |
| No log | 2.8421 | 270 | 0.7877 | 0.4347 | 0.7877 | 0.8875 |
| No log | 2.8632 | 272 | 0.7041 | 0.4229 | 0.7041 | 0.8391 |
| No log | 2.8842 | 274 | 0.8339 | 0.3898 | 0.8339 | 0.9132 |
| No log | 2.9053 | 276 | 0.9421 | 0.3376 | 0.9421 | 0.9706 |
| No log | 2.9263 | 278 | 0.8269 | 0.3658 | 0.8269 | 0.9094 |
| No log | 2.9474 | 280 | 0.6673 | 0.4418 | 0.6673 | 0.8169 |
| No log | 2.9684 | 282 | 0.7931 | 0.3983 | 0.7931 | 0.8906 |
| No log | 2.9895 | 284 | 1.0238 | 0.2709 | 1.0238 | 1.0118 |
| No log | 3.0105 | 286 | 0.9739 | 0.2852 | 0.9739 | 0.9868 |
| No log | 3.0316 | 288 | 0.7576 | 0.4356 | 0.7576 | 0.8704 |
| No log | 3.0526 | 290 | 0.6353 | 0.3241 | 0.6353 | 0.7971 |
| No log | 3.0737 | 292 | 0.7411 | 0.4093 | 0.7411 | 0.8609 |
| No log | 3.0947 | 294 | 0.8188 | 0.3670 | 0.8188 | 0.9049 |
| No log | 3.1158 | 296 | 0.7640 | 0.3758 | 0.7640 | 0.8741 |
| No log | 3.1368 | 298 | 0.6978 | 0.3798 | 0.6978 | 0.8353 |
| No log | 3.1579 | 300 | 0.6587 | 0.3644 | 0.6587 | 0.8116 |
| No log | 3.1789 | 302 | 0.6724 | 0.3188 | 0.6724 | 0.8200 |
| No log | 3.2 | 304 | 0.6651 | 0.3199 | 0.6651 | 0.8155 |
| No log | 3.2211 | 306 | 0.6479 | 0.3360 | 0.6479 | 0.8049 |
| No log | 3.2421 | 308 | 0.6698 | 0.3703 | 0.6698 | 0.8184 |
| No log | 3.2632 | 310 | 0.7515 | 0.3932 | 0.7515 | 0.8669 |
| No log | 3.2842 | 312 | 0.8026 | 0.3988 | 0.8026 | 0.8959 |
| No log | 3.3053 | 314 | 0.7554 | 0.3873 | 0.7554 | 0.8691 |
| No log | 3.3263 | 316 | 0.7466 | 0.3813 | 0.7466 | 0.8641 |
| No log | 3.3474 | 318 | 0.6728 | 0.4561 | 0.6728 | 0.8202 |
| No log | 3.3684 | 320 | 0.7147 | 0.4179 | 0.7147 | 0.8454 |
| No log | 3.3895 | 322 | 0.6946 | 0.4211 | 0.6946 | 0.8334 |
| No log | 3.4105 | 324 | 0.6398 | 0.4372 | 0.6398 | 0.7999 |
| No log | 3.4316 | 326 | 0.7547 | 0.3873 | 0.7547 | 0.8688 |
| No log | 3.4526 | 328 | 0.8605 | 0.4005 | 0.8605 | 0.9276 |
| No log | 3.4737 | 330 | 0.7764 | 0.3781 | 0.7764 | 0.8811 |
| No log | 3.4947 | 332 | 0.6440 | 0.4467 | 0.6440 | 0.8025 |
| No log | 3.5158 | 334 | 0.6857 | 0.3943 | 0.6857 | 0.8281 |
| No log | 3.5368 | 336 | 0.7939 | 0.3472 | 0.7939 | 0.8910 |
| No log | 3.5579 | 338 | 0.7265 | 0.3717 | 0.7265 | 0.8524 |
| No log | 3.5789 | 340 | 0.6277 | 0.4242 | 0.6277 | 0.7923 |
| No log | 3.6 | 342 | 0.6947 | 0.4548 | 0.6947 | 0.8335 |
| No log | 3.6211 | 344 | 0.7877 | 0.3959 | 0.7877 | 0.8875 |
| No log | 3.6421 | 346 | 0.8363 | 0.4228 | 0.8363 | 0.9145 |
| No log | 3.6632 | 348 | 0.7590 | 0.3833 | 0.7590 | 0.8712 |
| No log | 3.6842 | 350 | 0.6667 | 0.3235 | 0.6667 | 0.8165 |
| No log | 3.7053 | 352 | 0.6595 | 0.3144 | 0.6595 | 0.8121 |
| No log | 3.7263 | 354 | 0.6638 | 0.3686 | 0.6638 | 0.8147 |
| No log | 3.7474 | 356 | 0.6393 | 0.3557 | 0.6393 | 0.7996 |
| No log | 3.7684 | 358 | 0.6384 | 0.2974 | 0.6384 | 0.7990 |
| No log | 3.7895 | 360 | 0.6508 | 0.3170 | 0.6508 | 0.8068 |
| No log | 3.8105 | 362 | 0.6482 | 0.3362 | 0.6482 | 0.8051 |
| No log | 3.8316 | 364 | 0.6736 | 0.4040 | 0.6736 | 0.8207 |
| No log | 3.8526 | 366 | 0.7332 | 0.4153 | 0.7332 | 0.8563 |
| No log | 3.8737 | 368 | 0.7148 | 0.3641 | 0.7148 | 0.8454 |
| No log | 3.8947 | 370 | 0.6840 | 0.3528 | 0.6840 | 0.8271 |
| No log | 3.9158 | 372 | 0.6721 | 0.3564 | 0.6721 | 0.8198 |
| No log | 3.9368 | 374 | 0.6549 | 0.3588 | 0.6549 | 0.8093 |
| No log | 3.9579 | 376 | 0.6560 | 0.3373 | 0.6560 | 0.8099 |
| No log | 3.9789 | 378 | 0.6607 | 0.3418 | 0.6607 | 0.8128 |
| No log | 4.0 | 380 | 0.6800 | 0.3914 | 0.6800 | 0.8246 |
| No log | 4.0211 | 382 | 0.6647 | 0.4132 | 0.6647 | 0.8153 |
| No log | 4.0421 | 384 | 0.6203 | 0.4474 | 0.6203 | 0.7876 |
| No log | 4.0632 | 386 | 0.6266 | 0.4616 | 0.6266 | 0.7916 |
| No log | 4.0842 | 388 | 0.7167 | 0.4272 | 0.7167 | 0.8466 |
| No log | 4.1053 | 390 | 0.6939 | 0.4272 | 0.6939 | 0.8330 |
| No log | 4.1263 | 392 | 0.6092 | 0.5184 | 0.6092 | 0.7805 |
| No log | 4.1474 | 394 | 0.6887 | 0.4027 | 0.6887 | 0.8299 |
| No log | 4.1684 | 396 | 0.7666 | 0.4553 | 0.7666 | 0.8755 |
| No log | 4.1895 | 398 | 0.6921 | 0.3618 | 0.6921 | 0.8319 |
| No log | 4.2105 | 400 | 0.6369 | 0.4205 | 0.6369 | 0.7980 |
| No log | 4.2316 | 402 | 0.6484 | 0.4561 | 0.6484 | 0.8052 |
| No log | 4.2526 | 404 | 0.7014 | 0.3812 | 0.7014 | 0.8375 |
| No log | 4.2737 | 406 | 0.6990 | 0.3939 | 0.6990 | 0.8361 |
| No log | 4.2947 | 408 | 0.6831 | 0.4276 | 0.6831 | 0.8265 |
| No log | 4.3158 | 410 | 0.7152 | 0.4207 | 0.7152 | 0.8457 |
| No log | 4.3368 | 412 | 0.7372 | 0.4346 | 0.7372 | 0.8586 |
| No log | 4.3579 | 414 | 0.7056 | 0.3971 | 0.7056 | 0.8400 |
| No log | 4.3789 | 416 | 0.7290 | 0.4565 | 0.7290 | 0.8538 |
| No log | 4.4 | 418 | 0.7168 | 0.4565 | 0.7168 | 0.8466 |
| No log | 4.4211 | 420 | 0.7453 | 0.4365 | 0.7453 | 0.8633 |
| No log | 4.4421 | 422 | 0.7072 | 0.4387 | 0.7072 | 0.8409 |
| No log | 4.4632 | 424 | 0.6991 | 0.4244 | 0.6991 | 0.8361 |
| No log | 4.4842 | 426 | 0.6494 | 0.3804 | 0.6494 | 0.8059 |
| No log | 4.5053 | 428 | 0.6309 | 0.3513 | 0.6309 | 0.7943 |
| No log | 4.5263 | 430 | 0.6174 | 0.4199 | 0.6174 | 0.7857 |
| No log | 4.5474 | 432 | 0.6274 | 0.4315 | 0.6274 | 0.7921 |
| No log | 4.5684 | 434 | 0.6514 | 0.4499 | 0.6514 | 0.8071 |
| No log | 4.5895 | 436 | 0.6821 | 0.4036 | 0.6821 | 0.8259 |
| No log | 4.6105 | 438 | 0.6427 | 0.4209 | 0.6427 | 0.8017 |
| No log | 4.6316 | 440 | 0.6319 | 0.3924 | 0.6319 | 0.7949 |
| No log | 4.6526 | 442 | 0.6444 | 0.3879 | 0.6444 | 0.8027 |
| No log | 4.6737 | 444 | 0.6446 | 0.3268 | 0.6446 | 0.8028 |
| No log | 4.6947 | 446 | 0.6271 | 0.3479 | 0.6271 | 0.7919 |
| No log | 4.7158 | 448 | 0.6173 | 0.3587 | 0.6173 | 0.7857 |
| No log | 4.7368 | 450 | 0.6078 | 0.3606 | 0.6078 | 0.7796 |
| No log | 4.7579 | 452 | 0.6311 | 0.4183 | 0.6311 | 0.7944 |
| No log | 4.7789 | 454 | 0.6319 | 0.4679 | 0.6319 | 0.7949 |
| No log | 4.8 | 456 | 0.6251 | 0.3980 | 0.6251 | 0.7906 |
| No log | 4.8211 | 458 | 0.6283 | 0.4543 | 0.6283 | 0.7927 |
| No log | 4.8421 | 460 | 0.6463 | 0.4833 | 0.6463 | 0.8039 |
| No log | 4.8632 | 462 | 0.6245 | 0.4248 | 0.6245 | 0.7903 |
| No log | 4.8842 | 464 | 0.6091 | 0.3541 | 0.6091 | 0.7805 |
| No log | 4.9053 | 466 | 0.6028 | 0.3867 | 0.6028 | 0.7764 |
| No log | 4.9263 | 468 | 0.6079 | 0.4127 | 0.6079 | 0.7797 |
| No log | 4.9474 | 470 | 0.6053 | 0.4378 | 0.6053 | 0.7780 |
| No log | 4.9684 | 472 | 0.6196 | 0.3928 | 0.6196 | 0.7871 |
| No log | 4.9895 | 474 | 0.6466 | 0.4690 | 0.6466 | 0.8041 |
| No log | 5.0105 | 476 | 0.6386 | 0.4685 | 0.6386 | 0.7991 |
| No log | 5.0316 | 478 | 0.6699 | 0.4705 | 0.6699 | 0.8185 |
| No log | 5.0526 | 480 | 0.6630 | 0.4924 | 0.6630 | 0.8142 |
| No log | 5.0737 | 482 | 0.6278 | 0.5329 | 0.6278 | 0.7923 |
| No log | 5.0947 | 484 | 0.6128 | 0.4639 | 0.6128 | 0.7828 |
| No log | 5.1158 | 486 | 0.5971 | 0.4528 | 0.5971 | 0.7727 |
| No log | 5.1368 | 488 | 0.6146 | 0.4476 | 0.6146 | 0.7840 |
| No log | 5.1579 | 490 | 0.6708 | 0.4629 | 0.6708 | 0.8190 |
| No log | 5.1789 | 492 | 0.7083 | 0.4242 | 0.7083 | 0.8416 |
| No log | 5.2 | 494 | 0.7294 | 0.4543 | 0.7294 | 0.8540 |
| No log | 5.2211 | 496 | 0.6301 | 0.4880 | 0.6301 | 0.7938 |
| No log | 5.2421 | 498 | 0.6068 | 0.4938 | 0.6068 | 0.7790 |
| 0.4477 | 5.2632 | 500 | 0.6301 | 0.4843 | 0.6301 | 0.7938 |
| 0.4477 | 5.2842 | 502 | 0.7158 | 0.4787 | 0.7158 | 0.8460 |
| 0.4477 | 5.3053 | 504 | 0.7108 | 0.4797 | 0.7108 | 0.8431 |
| 0.4477 | 5.3263 | 506 | 0.6243 | 0.4351 | 0.6243 | 0.7901 |
| 0.4477 | 5.3474 | 508 | 0.5973 | 0.4747 | 0.5973 | 0.7728 |
| 0.4477 | 5.3684 | 510 | 0.6260 | 0.4721 | 0.6260 | 0.7912 |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1