ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k4_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5126
  • Qwk: 0.5337
  • Mse: 0.5126
  • Rmse: 0.7160
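Here, Qwk is the quadratic weighted kappa between predicted and reference scores, and Rmse is the square root of Mse (which is also why Loss and Mse coincide when the training objective is mean squared error). A minimal pure-Python sketch of these three metrics; the function names are illustrative, not taken from this repository:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error between equal-length score sequences."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error: the square root of the MSE."""
    return math.sqrt(mse(y_true, y_pred))

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, for integer ratings in [0, n_classes)."""
    n = len(y_true)
    # Observed rating co-occurrence counts.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms; expected counts under independence are their outer product / n.
    hist_true = [y_true.count(c) for c in range(n_classes)]
    hist_pred = [y_pred.count(c) for c in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den
```

As a sanity check, the final Rmse above is sqrt(0.5126) ≈ 0.7160, matching the reported value.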

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
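Assuming this card was produced with the Hugging Face Trainer, the hyperparameters above correspond roughly to a TrainingArguments configuration like the following sketch (`output_dir` is a hypothetical placeholder, not taken from the original training script):

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; only `output_dir` is invented.
training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```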

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0833 2 4.2699 -0.0267 4.2699 2.0664
No log 0.1667 4 2.3765 0.0288 2.3765 1.5416
No log 0.25 6 1.4697 0.0058 1.4697 1.2123
No log 0.3333 8 0.9566 0.0970 0.9566 0.9781
No log 0.4167 10 0.8073 0.1769 0.8073 0.8985
No log 0.5 12 0.8574 0.1706 0.8574 0.9260
No log 0.5833 14 0.8846 0.0994 0.8846 0.9405
No log 0.6667 16 0.8850 0.1383 0.8850 0.9407
No log 0.75 18 0.9537 0.0840 0.9537 0.9766
No log 0.8333 20 1.0179 -0.0007 1.0179 1.0089
No log 0.9167 22 1.0599 0.0000 1.0599 1.0295
No log 1.0 24 1.0810 0.0051 1.0810 1.0397
No log 1.0833 26 0.9678 0.0189 0.9678 0.9837
No log 1.1667 28 0.8722 0.1588 0.8722 0.9339
No log 1.25 30 0.8195 0.1254 0.8195 0.9053
No log 1.3333 32 0.8627 0.1638 0.8627 0.9288
No log 1.4167 34 0.9176 0.0670 0.9176 0.9579
No log 1.5 36 0.8582 0.1331 0.8582 0.9264
No log 1.5833 38 0.8685 0.1440 0.8685 0.9319
No log 1.6667 40 0.8457 0.2057 0.8457 0.9196
No log 1.75 42 0.8026 0.2359 0.8026 0.8959
No log 1.8333 44 0.8976 0.0437 0.8976 0.9474
No log 1.9167 46 1.1430 0.0200 1.1430 1.0691
No log 2.0 48 1.6351 0.0361 1.6351 1.2787
No log 2.0833 50 2.0277 0.0274 2.0277 1.4240
No log 2.1667 52 2.0157 0.0770 2.0157 1.4198
No log 2.25 54 1.7500 0.0457 1.7500 1.3229
No log 2.3333 56 1.3783 0.0262 1.3783 1.1740
No log 2.4167 58 1.0526 0.0051 1.0526 1.0260
No log 2.5 60 0.8119 0.2685 0.8119 0.9010
No log 2.5833 62 0.7668 0.1210 0.7668 0.8757
No log 2.6667 64 0.8322 0.1061 0.8322 0.9122
No log 2.75 66 1.1239 -0.0166 1.1239 1.0602
No log 2.8333 68 1.3673 -0.0381 1.3673 1.1693
No log 2.9167 70 1.2713 0.0195 1.2713 1.1275
No log 3.0 72 0.9343 0.1036 0.9343 0.9666
No log 3.0833 74 0.7853 0.2155 0.7853 0.8862
No log 3.1667 76 0.7889 0.0509 0.7889 0.8882
No log 3.25 78 0.7524 0.1624 0.7524 0.8674
No log 3.3333 80 0.7264 0.2269 0.7264 0.8523
No log 3.4167 82 0.6924 0.3268 0.6924 0.8321
No log 3.5 84 0.6689 0.3771 0.6689 0.8179
No log 3.5833 86 0.6453 0.3557 0.6453 0.8033
No log 3.6667 88 0.6005 0.3378 0.6005 0.7749
No log 3.75 90 0.5949 0.3682 0.5949 0.7713
No log 3.8333 92 0.6058 0.3905 0.6058 0.7784
No log 3.9167 94 0.6726 0.3456 0.6726 0.8201
No log 4.0 96 0.6943 0.3606 0.6943 0.8333
No log 4.0833 98 0.6752 0.3663 0.6752 0.8217
No log 4.1667 100 0.6453 0.4623 0.6453 0.8033
No log 4.25 102 0.6834 0.5125 0.6834 0.8267
No log 4.3333 104 0.7523 0.4255 0.7523 0.8673
No log 4.4167 106 0.7210 0.4603 0.7210 0.8491
No log 4.5 108 0.6372 0.4772 0.6372 0.7982
No log 4.5833 110 0.6113 0.4881 0.6113 0.7819
No log 4.6667 112 0.6188 0.4731 0.6188 0.7866
No log 4.75 114 0.6416 0.4640 0.6416 0.8010
No log 4.8333 116 0.6510 0.4899 0.6510 0.8068
No log 4.9167 118 0.6378 0.4835 0.6378 0.7986
No log 5.0 120 0.6156 0.4998 0.6156 0.7846
No log 5.0833 122 0.5981 0.4998 0.5981 0.7734
No log 5.1667 124 0.6083 0.5004 0.6083 0.7800
No log 5.25 126 0.6184 0.4919 0.6184 0.7864
No log 5.3333 128 0.6456 0.4834 0.6456 0.8035
No log 5.4167 130 0.6815 0.4474 0.6815 0.8255
No log 5.5 132 0.6765 0.4623 0.6765 0.8225
No log 5.5833 134 0.6850 0.4219 0.6850 0.8276
No log 5.6667 136 0.7403 0.4799 0.7403 0.8604
No log 5.75 138 0.7806 0.4563 0.7806 0.8835
No log 5.8333 140 0.8173 0.5044 0.8173 0.9040
No log 5.9167 142 0.8810 0.5015 0.8810 0.9386
No log 6.0 144 1.0334 0.4287 1.0334 1.0166
No log 6.0833 146 1.0998 0.3680 1.0998 1.0487
No log 6.1667 148 0.8341 0.4547 0.8341 0.9133
No log 6.25 150 0.6575 0.5925 0.6575 0.8109
No log 6.3333 152 0.6764 0.5188 0.6764 0.8225
No log 6.4167 154 0.6316 0.4846 0.6316 0.7947
No log 6.5 156 0.6373 0.5253 0.6373 0.7983
No log 6.5833 158 0.6336 0.5168 0.6336 0.7960
No log 6.6667 160 0.5680 0.4808 0.5680 0.7537
No log 6.75 162 0.6051 0.4528 0.6051 0.7779
No log 6.8333 164 0.5975 0.4573 0.5975 0.7730
No log 6.9167 166 0.5773 0.4630 0.5773 0.7598
No log 7.0 168 0.5971 0.5199 0.5971 0.7727
No log 7.0833 170 0.6094 0.5347 0.6094 0.7806
No log 7.1667 172 0.5964 0.5449 0.5964 0.7723
No log 7.25 174 0.6268 0.5067 0.6268 0.7917
No log 7.3333 176 0.7133 0.3931 0.7133 0.8446
No log 7.4167 178 0.7420 0.3675 0.7420 0.8614
No log 7.5 180 0.6816 0.3293 0.6816 0.8256
No log 7.5833 182 0.6105 0.4017 0.6105 0.7813
No log 7.6667 184 0.6210 0.4161 0.6210 0.7880
No log 7.75 186 0.6494 0.4344 0.6494 0.8058
No log 7.8333 188 0.6040 0.4613 0.6040 0.7771
No log 7.9167 190 0.6924 0.5286 0.6924 0.8321
No log 8.0 192 0.7614 0.5401 0.7614 0.8726
No log 8.0833 194 0.7012 0.5498 0.7012 0.8374
No log 8.1667 196 0.6897 0.4566 0.6897 0.8305
No log 8.25 198 0.7055 0.5101 0.7055 0.8400
No log 8.3333 200 0.6595 0.4872 0.6595 0.8121
No log 8.4167 202 0.7015 0.5200 0.7015 0.8376
No log 8.5 204 0.8831 0.4709 0.8831 0.9397
No log 8.5833 206 0.8337 0.4475 0.8337 0.9131
No log 8.6667 208 0.6474 0.5447 0.6474 0.8046
No log 8.75 210 0.5901 0.4268 0.5901 0.7682
No log 8.8333 212 0.5834 0.4623 0.5834 0.7638
No log 8.9167 214 0.6285 0.5214 0.6285 0.7928
No log 9.0 216 0.6687 0.5172 0.6687 0.8178
No log 9.0833 218 0.6702 0.5280 0.6702 0.8187
No log 9.1667 220 0.6673 0.5217 0.6673 0.8169
No log 9.25 222 0.6838 0.5448 0.6838 0.8270
No log 9.3333 224 0.6926 0.5358 0.6926 0.8322
No log 9.4167 226 0.6687 0.5064 0.6687 0.8178
No log 9.5 228 0.6560 0.5140 0.6560 0.8099
No log 9.5833 230 0.6380 0.4905 0.6380 0.7988
No log 9.6667 232 0.6623 0.5114 0.6623 0.8138
No log 9.75 234 0.6773 0.5028 0.6773 0.8230
No log 9.8333 236 0.6433 0.5069 0.6433 0.8021
No log 9.9167 238 0.6407 0.5195 0.6407 0.8005
No log 10.0 240 0.6164 0.5353 0.6164 0.7851
No log 10.0833 242 0.5947 0.5536 0.5947 0.7712
No log 10.1667 244 0.5806 0.5368 0.5806 0.7620
No log 10.25 246 0.5853 0.5294 0.5853 0.7651
No log 10.3333 248 0.5988 0.5445 0.5988 0.7738
No log 10.4167 250 0.5889 0.5526 0.5889 0.7674
No log 10.5 252 0.6038 0.5447 0.6038 0.7770
No log 10.5833 254 0.6300 0.5000 0.6300 0.7937
No log 10.6667 256 0.6136 0.5404 0.6136 0.7834
No log 10.75 258 0.6412 0.5369 0.6412 0.8008
No log 10.8333 260 0.6513 0.5187 0.6513 0.8070
No log 10.9167 262 0.7303 0.5283 0.7303 0.8546
No log 11.0 264 0.7119 0.4994 0.7119 0.8438
No log 11.0833 266 0.6009 0.5434 0.6009 0.7752
No log 11.1667 268 0.5639 0.4909 0.5639 0.7510
No log 11.25 270 0.5674 0.5366 0.5674 0.7532
No log 11.3333 272 0.6210 0.5319 0.6210 0.7880
No log 11.4167 274 0.7095 0.5014 0.7095 0.8423
No log 11.5 276 0.6723 0.5065 0.6723 0.8199
No log 11.5833 278 0.5777 0.5323 0.5777 0.7601
No log 11.6667 280 0.5626 0.5005 0.5626 0.7501
No log 11.75 282 0.5732 0.4579 0.5732 0.7571
No log 11.8333 284 0.5897 0.5347 0.5897 0.7679
No log 11.9167 286 0.6429 0.5732 0.6429 0.8018
No log 12.0 288 0.7396 0.5385 0.7396 0.8600
No log 12.0833 290 0.7664 0.5390 0.7664 0.8754
No log 12.1667 292 0.7162 0.5589 0.7162 0.8463
No log 12.25 294 0.6739 0.5404 0.6739 0.8209
No log 12.3333 296 0.6605 0.5496 0.6605 0.8127
No log 12.4167 298 0.6387 0.5257 0.6387 0.7992
No log 12.5 300 0.6241 0.5353 0.6241 0.7900
No log 12.5833 302 0.6143 0.5393 0.6143 0.7838
No log 12.6667 304 0.6097 0.5486 0.6097 0.7808
No log 12.75 306 0.6351 0.5253 0.6351 0.7969
No log 12.8333 308 0.6463 0.5068 0.6463 0.8039
No log 12.9167 310 0.5780 0.5534 0.5780 0.7603
No log 13.0 312 0.5492 0.4790 0.5492 0.7411
No log 13.0833 314 0.5476 0.4923 0.5476 0.7400
No log 13.1667 316 0.5499 0.4813 0.5499 0.7415
No log 13.25 318 0.6031 0.4993 0.6031 0.7766
No log 13.3333 320 0.6182 0.5109 0.6182 0.7863
No log 13.4167 322 0.5789 0.5613 0.5789 0.7609
No log 13.5 324 0.5910 0.5239 0.5910 0.7688
No log 13.5833 326 0.6108 0.5227 0.6108 0.7815
No log 13.6667 328 0.6133 0.5309 0.6133 0.7831
No log 13.75 330 0.5857 0.5644 0.5857 0.7653
No log 13.8333 332 0.5965 0.5463 0.5965 0.7723
No log 13.9167 334 0.6582 0.5614 0.6582 0.8113
No log 14.0 336 0.6504 0.5634 0.6504 0.8065
No log 14.0833 338 0.6269 0.5635 0.6269 0.7918
No log 14.1667 340 0.5578 0.5341 0.5578 0.7469
No log 14.25 342 0.5433 0.5567 0.5433 0.7371
No log 14.3333 344 0.5553 0.5712 0.5553 0.7452
No log 14.4167 346 0.5962 0.5805 0.5962 0.7721
No log 14.5 348 0.6377 0.5631 0.6377 0.7986
No log 14.5833 350 0.6278 0.5543 0.6278 0.7924
No log 14.6667 352 0.6416 0.5340 0.6416 0.8010
No log 14.75 354 0.6642 0.5424 0.6642 0.8150
No log 14.8333 356 0.6026 0.4782 0.6026 0.7763
No log 14.9167 358 0.5541 0.4972 0.5541 0.7444
No log 15.0 360 0.6284 0.5277 0.6284 0.7927
No log 15.0833 362 0.6436 0.5411 0.6436 0.8022
No log 15.1667 364 0.6087 0.5411 0.6087 0.7802
No log 15.25 366 0.5971 0.5257 0.5971 0.7727
No log 15.3333 368 0.5540 0.5032 0.5540 0.7443
No log 15.4167 370 0.5509 0.5338 0.5509 0.7422
No log 15.5 372 0.5506 0.4972 0.5506 0.7420
No log 15.5833 374 0.5555 0.5312 0.5555 0.7453
No log 15.6667 376 0.5677 0.5087 0.5677 0.7535
No log 15.75 378 0.5554 0.5006 0.5554 0.7453
No log 15.8333 380 0.5460 0.4958 0.5460 0.7389
No log 15.9167 382 0.5501 0.4958 0.5501 0.7417
No log 16.0 384 0.5738 0.5200 0.5738 0.7575
No log 16.0833 386 0.6190 0.5241 0.6190 0.7868
No log 16.1667 388 0.6635 0.5024 0.6635 0.8146
No log 16.25 390 0.6158 0.5530 0.6158 0.7847
No log 16.3333 392 0.5838 0.5613 0.5838 0.7641
No log 16.4167 394 0.5723 0.5637 0.5723 0.7565
No log 16.5 396 0.5716 0.5731 0.5716 0.7560
No log 16.5833 398 0.5743 0.5565 0.5743 0.7579
No log 16.6667 400 0.5501 0.5500 0.5501 0.7417
No log 16.75 402 0.5231 0.5316 0.5231 0.7233
No log 16.8333 404 0.5407 0.5095 0.5407 0.7353
No log 16.9167 406 0.5324 0.5018 0.5324 0.7296
No log 17.0 408 0.5155 0.4955 0.5155 0.7180
No log 17.0833 410 0.5358 0.5428 0.5358 0.7320
No log 17.1667 412 0.5702 0.5763 0.5702 0.7551
No log 17.25 414 0.5666 0.5522 0.5666 0.7527
No log 17.3333 416 0.5555 0.5492 0.5555 0.7453
No log 17.4167 418 0.5714 0.5561 0.5714 0.7559
No log 17.5 420 0.5861 0.5300 0.5861 0.7656
No log 17.5833 422 0.6456 0.5306 0.6456 0.8035
No log 17.6667 424 0.7778 0.4863 0.7778 0.8819
No log 17.75 426 0.8152 0.4912 0.8152 0.9029
No log 17.8333 428 0.7436 0.4989 0.7436 0.8624
No log 17.9167 430 0.6497 0.5898 0.6497 0.8060
No log 18.0 432 0.6020 0.5409 0.6020 0.7759
No log 18.0833 434 0.5844 0.5263 0.5844 0.7645
No log 18.1667 436 0.5875 0.5017 0.5875 0.7665
No log 18.25 438 0.5823 0.5516 0.5823 0.7631
No log 18.3333 440 0.5704 0.5403 0.5704 0.7553
No log 18.4167 442 0.5808 0.5867 0.5808 0.7621
No log 18.5 444 0.5687 0.6046 0.5687 0.7541
No log 18.5833 446 0.5493 0.5852 0.5493 0.7412
No log 18.6667 448 0.5764 0.5452 0.5764 0.7592
No log 18.75 450 0.6407 0.4970 0.6407 0.8004
No log 18.8333 452 0.6551 0.4970 0.6551 0.8094
No log 18.9167 454 0.5991 0.5369 0.5991 0.7740
No log 19.0 456 0.5374 0.5696 0.5374 0.7331
No log 19.0833 458 0.5444 0.5823 0.5444 0.7378
No log 19.1667 460 0.5452 0.5801 0.5452 0.7384
No log 19.25 462 0.5846 0.5646 0.5846 0.7646
No log 19.3333 464 0.6659 0.4903 0.6659 0.8160
No log 19.4167 466 0.6531 0.4914 0.6531 0.8081
No log 19.5 468 0.5802 0.6046 0.5802 0.7617
No log 19.5833 470 0.5230 0.5717 0.5230 0.7232
No log 19.6667 472 0.5279 0.5534 0.5279 0.7265
No log 19.75 474 0.5288 0.5709 0.5288 0.7272
No log 19.8333 476 0.5111 0.5296 0.5111 0.7149
No log 19.9167 478 0.5170 0.5567 0.5170 0.7190
No log 20.0 480 0.5750 0.5780 0.5750 0.7583
No log 20.0833 482 0.6160 0.5611 0.6160 0.7849
No log 20.1667 484 0.5914 0.5736 0.5914 0.7690
No log 20.25 486 0.5423 0.5640 0.5423 0.7364
No log 20.3333 488 0.5176 0.5434 0.5176 0.7194
No log 20.4167 490 0.5454 0.5853 0.5454 0.7385
No log 20.5 492 0.5752 0.5734 0.5752 0.7584
No log 20.5833 494 0.5577 0.5946 0.5577 0.7468
No log 20.6667 496 0.5513 0.5421 0.5513 0.7425
No log 20.75 498 0.5901 0.5800 0.5901 0.7682
0.4318 20.8333 500 0.6190 0.5467 0.6190 0.7868
0.4318 20.9167 502 0.5812 0.5736 0.5812 0.7624
0.4318 21.0 504 0.5404 0.5758 0.5404 0.7351
0.4318 21.0833 506 0.5119 0.5461 0.5119 0.7155
0.4318 21.1667 508 0.4952 0.5395 0.4952 0.7037
0.4318 21.25 510 0.4988 0.5484 0.4988 0.7062
0.4318 21.3333 512 0.5037 0.5970 0.5037 0.7097
0.4318 21.4167 514 0.5218 0.5587 0.5218 0.7223
0.4318 21.5 516 0.5399 0.6163 0.5399 0.7348
0.4318 21.5833 518 0.5478 0.5894 0.5478 0.7401
0.4318 21.6667 520 0.5691 0.5574 0.5691 0.7544
0.4318 21.75 522 0.5927 0.5822 0.5927 0.7699
0.4318 21.8333 524 0.5617 0.5649 0.5617 0.7495
0.4318 21.9167 526 0.5343 0.5697 0.5343 0.7310
0.4318 22.0 528 0.5411 0.5471 0.5411 0.7356
0.4318 22.0833 530 0.5918 0.5654 0.5918 0.7693
0.4318 22.1667 532 0.6200 0.5565 0.6200 0.7874
0.4318 22.25 534 0.5889 0.5140 0.5889 0.7674
0.4318 22.3333 536 0.5479 0.5337 0.5479 0.7402
0.4318 22.4167 538 0.5220 0.5331 0.5220 0.7225
0.4318 22.5 540 0.5131 0.5453 0.5131 0.7163
0.4318 22.5833 542 0.5126 0.5337 0.5126 0.7160

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
