ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5843
  • Qwk (quadratic weighted kappa): 0.5123
  • Mse (mean squared error): 0.5843
  • Rmse (root mean squared error): 0.7644
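
Note that the validation Loss and Mse coincide, which suggests the model was trained with a mean-squared-error objective on a numeric organization score; Rmse is simply the square root of Mse. Qwk measures agreement between predicted and gold scores. Below is a minimal sketch of computing it with scikit-learn's cohen_kappa_score; the scores shown are hypothetical, since the actual label scale is not documented in this card.

```python
# A minimal sketch of computing quadratic weighted kappa (Qwk) with
# scikit-learn. The gold and predicted scores below are hypothetical;
# the real label scale for this task is not documented in this card.
from sklearn.metrics import cohen_kappa_score

y_true = [3, 2, 4, 3, 1]  # hypothetical gold organization scores
y_pred = [3, 3, 4, 2, 1]  # hypothetical (rounded) model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
print(f"Qwk = {qwk:.4f}")
```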

Model description

More information needed

Intended uses & limitations

More information needed
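
Although no usage guidance is given, the checkpoint can presumably be loaded like any Transformers sequence-classification model. The sketch below is a hedged illustration, assuming the head emits a single regression score (consistent with the MSE/RMSE metrics above); it is not an official usage example from the model authors.

```python
# A hedged loading sketch; the hub id matches this card's title, and
# the single-score regression head is an assumption inferred from the
# MSE/RMSE evaluation metrics, not something this card documents.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

text = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # expected shape (1, 1) under the assumption
print(logits.squeeze().item())
```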

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of an equivalent Trainer configuration follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
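
The betas and epsilon above are the Trainer defaults for Adam. In the log below, evaluation advances in steps of 2 and the training loss is first reported at step 500, which points to eval_steps=2 and logging_steps=500; the log also ends at epoch 16.29 despite num_epochs being 100, so training apparently stopped early, though the card does not say how. A minimal sketch of a matching TrainingArguments setup, with the dataset and preprocessing left out because they are undocumented here:

```python
# A hedged reconstruction of the training setup from the listed
# hyperparameters. The regression head (num_labels=1) and the
# eval/logging step values are inferences, not documented facts.
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumption: single-score regression (MSE/RMSE metrics)
)

args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,       # inferred: the table below evaluates every 2 steps
    logging_steps=500,  # inferred: "No log" until step 500
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```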

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0588 2 4.3270 -0.0205 4.3270 2.0801
No log 0.1176 4 2.4873 0.0399 2.4873 1.5771
No log 0.1765 6 1.4350 0.0058 1.4350 1.1979
No log 0.2353 8 1.2471 -0.0427 1.2471 1.1167
No log 0.2941 10 1.0039 0.0334 1.0039 1.0019
No log 0.3529 12 0.9125 0.1083 0.9125 0.9553
No log 0.4118 14 0.9187 0.0967 0.9187 0.9585
No log 0.4706 16 0.8096 0.1666 0.8096 0.8998
No log 0.5294 18 0.7824 0.1614 0.7824 0.8845
No log 0.5882 20 0.7704 0.2237 0.7704 0.8777
No log 0.6471 22 0.7342 0.2269 0.7342 0.8568
No log 0.7059 24 0.7789 0.1845 0.7789 0.8826
No log 0.7647 26 0.8208 0.1780 0.8208 0.9060
No log 0.8235 28 0.7568 0.2110 0.7568 0.8699
No log 0.8824 30 0.7812 0.1977 0.7812 0.8839
No log 0.9412 32 0.9238 0.1610 0.9238 0.9611
No log 1.0 34 1.0711 0.1760 1.0711 1.0349
No log 1.0588 36 0.9958 0.1907 0.9958 0.9979
No log 1.1176 38 0.7303 0.2401 0.7303 0.8546
No log 1.1765 40 0.7402 0.1779 0.7402 0.8603
No log 1.2353 42 0.8061 0.1754 0.8061 0.8979
No log 1.2941 44 0.7217 0.1969 0.7217 0.8495
No log 1.3529 46 0.6909 0.3380 0.6909 0.8312
No log 1.4118 48 0.8088 0.3009 0.8088 0.8993
No log 1.4706 50 0.7690 0.3074 0.7690 0.8769
No log 1.5294 52 0.6560 0.3903 0.6560 0.8099
No log 1.5882 54 0.6260 0.4270 0.6260 0.7912
No log 1.6471 56 0.6229 0.4530 0.6229 0.7892
No log 1.7059 58 0.6211 0.4910 0.6211 0.7881
No log 1.7647 60 0.6414 0.4055 0.6414 0.8009
No log 1.8235 62 0.6456 0.3924 0.6456 0.8035
No log 1.8824 64 0.6381 0.4736 0.6381 0.7988
No log 1.9412 66 0.6353 0.4529 0.6353 0.7970
No log 2.0 68 0.6595 0.4823 0.6595 0.8121
No log 2.0588 70 0.6507 0.5187 0.6507 0.8067
No log 2.1176 72 0.6593 0.4963 0.6593 0.8119
No log 2.1765 74 0.7266 0.5388 0.7266 0.8524
No log 2.2353 76 1.0348 0.4524 1.0348 1.0173
No log 2.2941 78 1.2283 0.3968 1.2283 1.1083
No log 2.3529 80 1.1604 0.3837 1.1604 1.0772
No log 2.4118 82 0.8066 0.4387 0.8066 0.8981
No log 2.4706 84 0.5875 0.5871 0.5875 0.7665
No log 2.5294 86 0.6143 0.4801 0.6143 0.7838
No log 2.5882 88 0.5901 0.5166 0.5901 0.7682
No log 2.6471 90 0.5714 0.5658 0.5714 0.7559
No log 2.7059 92 0.6649 0.4081 0.6649 0.8154
No log 2.7647 94 0.8966 0.3809 0.8966 0.9469
No log 2.8235 96 0.8507 0.3774 0.8507 0.9223
No log 2.8824 98 0.6031 0.5018 0.6031 0.7766
No log 2.9412 100 0.8444 0.4501 0.8444 0.9189
No log 3.0 102 1.1756 0.3004 1.1756 1.0842
No log 3.0588 104 1.0834 0.3449 1.0834 1.0409
No log 3.1176 106 0.7606 0.5046 0.7606 0.8721
No log 3.1765 108 0.7188 0.5488 0.7188 0.8478
No log 3.2353 110 0.7473 0.4832 0.7473 0.8645
No log 3.2941 112 0.6776 0.5322 0.6776 0.8232
No log 3.3529 114 0.6443 0.5651 0.6443 0.8027
No log 3.4118 116 0.6300 0.5469 0.6300 0.7937
No log 3.4706 118 0.6580 0.4955 0.6580 0.8112
No log 3.5294 120 0.6387 0.4709 0.6387 0.7992
No log 3.5882 122 0.6914 0.5128 0.6914 0.8315
No log 3.6471 124 0.6274 0.5261 0.6274 0.7921
No log 3.7059 126 0.7002 0.4781 0.7002 0.8368
No log 3.7647 128 0.7365 0.4775 0.7365 0.8582
No log 3.8235 130 0.6742 0.5041 0.6742 0.8211
No log 3.8824 132 0.6576 0.5334 0.6576 0.8109
No log 3.9412 134 0.7783 0.5169 0.7783 0.8822
No log 4.0 136 0.8914 0.4680 0.8914 0.9441
No log 4.0588 138 0.7767 0.5138 0.7767 0.8813
No log 4.1176 140 0.6029 0.5348 0.6029 0.7765
No log 4.1765 142 0.6403 0.4843 0.6403 0.8002
No log 4.2353 144 0.6386 0.4827 0.6386 0.7991
No log 4.2941 146 0.5831 0.5041 0.5831 0.7636
No log 4.3529 148 0.5884 0.5493 0.5884 0.7671
No log 4.4118 150 0.6069 0.5596 0.6069 0.7790
No log 4.4706 152 0.6323 0.5699 0.6323 0.7952
No log 4.5294 154 0.6457 0.5461 0.6457 0.8036
No log 4.5882 156 0.6368 0.5415 0.6368 0.7980
No log 4.6471 158 0.6352 0.5153 0.6352 0.7970
No log 4.7059 160 0.7422 0.4482 0.7422 0.8615
No log 4.7647 162 0.7473 0.4269 0.7473 0.8645
No log 4.8235 164 0.6433 0.4916 0.6433 0.8020
No log 4.8824 166 0.6502 0.4761 0.6502 0.8064
No log 4.9412 168 0.7092 0.4821 0.7092 0.8421
No log 5.0 170 0.6827 0.4738 0.6827 0.8262
No log 5.0588 172 0.6398 0.5276 0.6398 0.7999
No log 5.1176 174 0.6412 0.5132 0.6412 0.8008
No log 5.1765 176 0.6491 0.4614 0.6491 0.8057
No log 5.2353 178 0.6183 0.4874 0.6183 0.7863
No log 5.2941 180 0.6218 0.4844 0.6218 0.7886
No log 5.3529 182 0.6231 0.4844 0.6231 0.7894
No log 5.4118 184 0.6441 0.4433 0.6441 0.8025
No log 5.4706 186 0.7385 0.4357 0.7385 0.8593
No log 5.5294 188 0.7866 0.4235 0.7866 0.8869
No log 5.5882 190 0.7734 0.3900 0.7734 0.8794
No log 5.6471 192 0.6686 0.4488 0.6686 0.8177
No log 5.7059 194 0.6286 0.4708 0.6286 0.7928
No log 5.7647 196 0.6241 0.4848 0.6241 0.7900
No log 5.8235 198 0.6854 0.4102 0.6854 0.8279
No log 5.8824 200 0.7545 0.4020 0.7545 0.8686
No log 5.9412 202 0.8249 0.4051 0.8249 0.9083
No log 6.0 204 0.7867 0.4770 0.7867 0.8869
No log 6.0588 206 0.6593 0.5484 0.6593 0.8120
No log 6.1176 208 0.6340 0.5696 0.6340 0.7962
No log 6.1765 210 0.6045 0.5514 0.6045 0.7775
No log 6.2353 212 0.5936 0.5484 0.5936 0.7705
No log 6.2941 214 0.6080 0.5803 0.6080 0.7797
No log 6.3529 216 0.6109 0.5556 0.6109 0.7816
No log 6.4118 218 0.6033 0.5553 0.6033 0.7767
No log 6.4706 220 0.6030 0.5655 0.6030 0.7765
No log 6.5294 222 0.6029 0.5753 0.6029 0.7764
No log 6.5882 224 0.6022 0.5471 0.6022 0.7760
No log 6.6471 226 0.5930 0.5757 0.5930 0.7701
No log 6.7059 228 0.5860 0.4569 0.5860 0.7655
No log 6.7647 230 0.5969 0.5155 0.5969 0.7726
No log 6.8235 232 0.5737 0.4882 0.5737 0.7574
No log 6.8824 234 0.5630 0.5391 0.5630 0.7503
No log 6.9412 236 0.5510 0.5209 0.5510 0.7423
No log 7.0 238 0.5470 0.4939 0.5470 0.7396
No log 7.0588 240 0.6037 0.5155 0.6037 0.7770
No log 7.1176 242 0.6544 0.5205 0.6544 0.8090
No log 7.1765 244 0.6598 0.5368 0.6598 0.8123
No log 7.2353 246 0.6432 0.5812 0.6432 0.8020
No log 7.2941 248 0.6729 0.5556 0.6729 0.8203
No log 7.3529 250 0.6941 0.5794 0.6941 0.8331
No log 7.4118 252 0.6931 0.5811 0.6931 0.8325
No log 7.4706 254 0.6291 0.5783 0.6291 0.7932
No log 7.5294 256 0.5935 0.5935 0.5935 0.7704
No log 7.5882 258 0.5576 0.5616 0.5576 0.7467
No log 7.6471 260 0.5905 0.5083 0.5905 0.7685
No log 7.7059 262 0.6103 0.5384 0.6103 0.7812
No log 7.7647 264 0.5817 0.5166 0.5817 0.7627
No log 7.8235 266 0.5813 0.5762 0.5813 0.7624
No log 7.8824 268 0.6523 0.4969 0.6523 0.8076
No log 7.9412 270 0.6318 0.5078 0.6318 0.7949
No log 8.0 272 0.5907 0.5414 0.5907 0.7685
No log 8.0588 274 0.5885 0.5061 0.5885 0.7672
No log 8.1176 276 0.6074 0.4719 0.6074 0.7793
No log 8.1765 278 0.5947 0.5128 0.5947 0.7711
No log 8.2353 280 0.6237 0.4465 0.6237 0.7898
No log 8.2941 282 0.6197 0.5095 0.6197 0.7872
No log 8.3529 284 0.6275 0.5288 0.6275 0.7921
No log 8.4118 286 0.6359 0.5288 0.6359 0.7975
No log 8.4706 288 0.6421 0.4953 0.6421 0.8013
No log 8.5294 290 0.6693 0.4736 0.6693 0.8181
No log 8.5882 292 0.6220 0.4679 0.6220 0.7886
No log 8.6471 294 0.6014 0.4462 0.6014 0.7755
No log 8.7059 296 0.6040 0.4625 0.6040 0.7772
No log 8.7647 298 0.6092 0.4909 0.6092 0.7805
No log 8.8235 300 0.6590 0.5159 0.6590 0.8118
No log 8.8824 302 0.6548 0.5377 0.6548 0.8092
No log 8.9412 304 0.6204 0.5171 0.6204 0.7876
No log 9.0 306 0.6034 0.5156 0.6034 0.7768
No log 9.0588 308 0.5937 0.5185 0.5937 0.7705
No log 9.1176 310 0.5902 0.5678 0.5902 0.7683
No log 9.1765 312 0.5860 0.5567 0.5860 0.7655
No log 9.2353 314 0.5931 0.5378 0.5931 0.7701
No log 9.2941 316 0.6069 0.5351 0.6069 0.7790
No log 9.3529 318 0.6066 0.5391 0.6066 0.7788
No log 9.4118 320 0.6289 0.5861 0.6289 0.7930
No log 9.4706 322 0.6530 0.6088 0.6530 0.8081
No log 9.5294 324 0.6483 0.5946 0.6483 0.8052
No log 9.5882 326 0.6262 0.5135 0.6262 0.7913
No log 9.6471 328 0.6703 0.5342 0.6703 0.8187
No log 9.7059 330 0.6490 0.4981 0.6490 0.8056
No log 9.7647 332 0.6126 0.5544 0.6126 0.7827
No log 9.8235 334 0.6884 0.5322 0.6884 0.8297
No log 9.8824 336 0.7392 0.4889 0.7392 0.8598
No log 9.9412 338 0.6557 0.5368 0.6557 0.8097
No log 10.0 340 0.6231 0.5490 0.6231 0.7894
No log 10.0588 342 0.6958 0.5024 0.6958 0.8341
No log 10.1176 344 0.6866 0.5025 0.6866 0.8286
No log 10.1765 346 0.6040 0.5561 0.6040 0.7772
No log 10.2353 348 0.5589 0.5395 0.5589 0.7476
No log 10.2941 350 0.5884 0.5037 0.5884 0.7671
No log 10.3529 352 0.6088 0.5882 0.6088 0.7803
No log 10.4118 354 0.6122 0.5394 0.6122 0.7824
No log 10.4706 356 0.6781 0.5535 0.6781 0.8235
No log 10.5294 358 0.7044 0.5439 0.7044 0.8393
No log 10.5882 360 0.7217 0.5352 0.7217 0.8496
No log 10.6471 362 0.7026 0.5793 0.7026 0.8382
No log 10.7059 364 0.7313 0.6069 0.7313 0.8552
No log 10.7647 366 0.8834 0.5058 0.8834 0.9399
No log 10.8235 368 0.8703 0.4398 0.8703 0.9329
No log 10.8824 370 0.7176 0.4790 0.7176 0.8471
No log 10.9412 372 0.5611 0.5579 0.5611 0.7491
No log 11.0 374 0.5433 0.5467 0.5433 0.7371
No log 11.0588 376 0.5560 0.5816 0.5560 0.7457
No log 11.1176 378 0.5429 0.5645 0.5429 0.7368
No log 11.1765 380 0.5562 0.5453 0.5562 0.7458
No log 11.2353 382 0.5863 0.5063 0.5863 0.7657
No log 11.2941 384 0.6042 0.5184 0.6042 0.7773
No log 11.3529 386 0.5707 0.5212 0.5707 0.7554
No log 11.4118 388 0.5614 0.5409 0.5614 0.7492
No log 11.4706 390 0.5591 0.5533 0.5591 0.7477
No log 11.5294 392 0.5553 0.5441 0.5553 0.7452
No log 11.5882 394 0.5703 0.5687 0.5703 0.7552
No log 11.6471 396 0.5948 0.5741 0.5948 0.7712
No log 11.7059 398 0.6189 0.5524 0.6189 0.7867
No log 11.7647 400 0.6209 0.5554 0.6209 0.7880
No log 11.8235 402 0.6263 0.5328 0.6263 0.7914
No log 11.8824 404 0.6239 0.5314 0.6239 0.7899
No log 11.9412 406 0.6100 0.5359 0.6100 0.7811
No log 12.0 408 0.6124 0.5679 0.6124 0.7826
No log 12.0588 410 0.6195 0.5499 0.6195 0.7871
No log 12.1176 412 0.6235 0.5487 0.6235 0.7896
No log 12.1765 414 0.6342 0.5282 0.6342 0.7963
No log 12.2353 416 0.6541 0.5266 0.6541 0.8088
No log 12.2941 418 0.6629 0.5605 0.6629 0.8142
No log 12.3529 420 0.6459 0.5461 0.6459 0.8036
No log 12.4118 422 0.6398 0.5415 0.6398 0.7998
No log 12.4706 424 0.6307 0.5953 0.6307 0.7941
No log 12.5294 426 0.6123 0.5996 0.6123 0.7825
No log 12.5882 428 0.6078 0.6314 0.6078 0.7796
No log 12.6471 430 0.6275 0.5587 0.6275 0.7921
No log 12.7059 432 0.6587 0.5687 0.6587 0.8116
No log 12.7647 434 0.6397 0.6069 0.6397 0.7998
No log 12.8235 436 0.6166 0.6070 0.6166 0.7852
No log 12.8824 438 0.6019 0.6014 0.6019 0.7758
No log 12.9412 440 0.6017 0.5184 0.6017 0.7757
No log 13.0 442 0.5853 0.5415 0.5853 0.7650
No log 13.0588 444 0.5750 0.5896 0.5750 0.7583
No log 13.1176 446 0.5900 0.5725 0.5900 0.7681
No log 13.1765 448 0.6594 0.5258 0.6594 0.8120
No log 13.2353 450 0.6891 0.5336 0.6891 0.8301
No log 13.2941 452 0.6295 0.5363 0.6295 0.7934
No log 13.3529 454 0.5856 0.5899 0.5856 0.7653
No log 13.4118 456 0.5794 0.5815 0.5794 0.7612
No log 13.4706 458 0.5846 0.5933 0.5846 0.7646
No log 13.5294 460 0.5654 0.5735 0.5654 0.7519
No log 13.5882 462 0.5622 0.5456 0.5622 0.7498
No log 13.6471 464 0.6327 0.5543 0.6327 0.7954
No log 13.7059 466 0.6739 0.5133 0.6739 0.8209
No log 13.7647 468 0.6285 0.5772 0.6285 0.7928
No log 13.8235 470 0.5615 0.5724 0.5615 0.7494
No log 13.8824 472 0.5644 0.5773 0.5644 0.7512
No log 13.9412 474 0.5627 0.5773 0.5627 0.7502
No log 14.0 476 0.5689 0.5663 0.5689 0.7543
No log 14.0588 478 0.5712 0.5897 0.5712 0.7558
No log 14.1176 480 0.5840 0.5804 0.5840 0.7642
No log 14.1765 482 0.5921 0.5824 0.5921 0.7694
No log 14.2353 484 0.5830 0.5798 0.5830 0.7635
No log 14.2941 486 0.5945 0.5784 0.5945 0.7710
No log 14.3529 488 0.7076 0.5131 0.7076 0.8412
No log 14.4118 490 0.7863 0.4648 0.7863 0.8867
No log 14.4706 492 0.7465 0.4674 0.7465 0.8640
No log 14.5294 494 0.6401 0.5253 0.6401 0.8001
No log 14.5882 496 0.5754 0.5477 0.5754 0.7585
No log 14.6471 498 0.5588 0.4473 0.5588 0.7475
0.3779 14.7059 500 0.5544 0.5343 0.5544 0.7446
0.3779 14.7647 502 0.5771 0.5347 0.5771 0.7597
0.3779 14.8235 504 0.6463 0.5098 0.6463 0.8040
0.3779 14.8824 506 0.6438 0.5111 0.6438 0.8024
0.3779 14.9412 508 0.5979 0.5948 0.5979 0.7732
0.3779 15.0 510 0.5803 0.5937 0.5803 0.7618
0.3779 15.0588 512 0.6063 0.5944 0.6063 0.7787
0.3779 15.1176 514 0.6316 0.5587 0.6316 0.7947
0.3779 15.1765 516 0.6430 0.5657 0.6430 0.8019
0.3779 15.2353 518 0.6629 0.5568 0.6629 0.8142
0.3779 15.2941 520 0.6668 0.5706 0.6668 0.8166
0.3779 15.3529 522 0.6666 0.5468 0.6666 0.8165
0.3779 15.4118 524 0.6529 0.5563 0.6529 0.8080
0.3779 15.4706 526 0.6279 0.5338 0.6279 0.7924
0.3779 15.5294 528 0.5976 0.5519 0.5976 0.7730
0.3779 15.5882 530 0.5882 0.5461 0.5882 0.7670
0.3779 15.6471 532 0.5870 0.5498 0.5870 0.7662
0.3779 15.7059 534 0.5878 0.5255 0.5878 0.7666
0.3779 15.7647 536 0.5821 0.5372 0.5821 0.7629
0.3779 15.8235 538 0.5915 0.5624 0.5915 0.7691
0.3779 15.8824 540 0.6075 0.5444 0.6075 0.7794
0.3779 15.9412 542 0.6122 0.5143 0.6122 0.7824
0.3779 16.0 544 0.6251 0.5427 0.6251 0.7906
0.3779 16.0588 546 0.6262 0.5220 0.6262 0.7913
0.3779 16.1176 548 0.6027 0.4669 0.6027 0.7763
0.3779 16.1765 550 0.5872 0.5257 0.5872 0.7663
0.3779 16.2353 552 0.5806 0.5034 0.5806 0.7620
0.3779 16.2941 554 0.5843 0.5123 0.5843 0.7644

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1