ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k4_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9623
  • Qwk: 0.6423
  • Mse: 0.9623
  • Rmse: 0.9810
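For reference, metrics of this kind (Qwk, Mse, Rmse) are typically computed with scikit-learn. The sketch below uses small illustrative score vectors, not this model's actual predictions:

```python
# Sketch of how Qwk / Mse / Rmse are commonly computed for essay scoring.
# `gold` and `pred` are illustrative integer scores, NOT this model's outputs.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

gold = np.array([1, 2, 3, 4, 2, 3])
pred = np.array([1, 2, 2, 4, 3, 3])

# Quadratic Weighted Kappa: chance-corrected agreement that penalizes
# disagreements by the squared distance between labels.
qwk = cohen_kappa_score(gold, pred, weights="quadratic")  # -> 0.8182

mse = mean_squared_error(gold, pred)  # 2 errors of size 1 over 6 items -> 0.3333
rmse = np.sqrt(mse)                   # -> 0.5774
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```

Note that Mse and the validation loss coincide throughout the table below, which suggests the model was trained with an MSE regression objective.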

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
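Assuming the run used the Hugging Face Trainer (cards in this format are generated by it), the hyperparameters above map onto a TrainingArguments object roughly as follows. The output directory is hypothetical, and the dataset, tokenization, and metric function are not specified in this card:

```python
# Sketch only: mapping the listed hyperparameters to TrainingArguments.
# Dataset, preprocessing, and compute_metrics are unspecified in this card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task1-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer's default
    # optimizer, so no explicit optimizer override is needed.
)
```

A Trainer would then be constructed from this args object together with the (unspecified) model, datasets, and metric function.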

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 6.5898 0.0308 6.5898 2.5671
No log 0.1333 4 4.1002 0.1096 4.1002 2.0249
No log 0.2 6 2.8690 0.0988 2.8690 1.6938
No log 0.2667 8 2.1510 0.0839 2.1510 1.4666
No log 0.3333 10 2.0596 0.1552 2.0596 1.4351
No log 0.4 12 2.0180 0.1416 2.0180 1.4206
No log 0.4667 14 2.3763 -0.0455 2.3763 1.5415
No log 0.5333 16 2.2862 0.0458 2.2862 1.5120
No log 0.6 18 1.7609 0.2281 1.7609 1.3270
No log 0.6667 20 1.4508 0.2593 1.4508 1.2045
No log 0.7333 22 1.4121 0.2430 1.4121 1.1883
No log 0.8 24 1.3413 0.3119 1.3413 1.1582
No log 0.8667 26 1.4091 0.2593 1.4091 1.1870
No log 0.9333 28 1.5483 0.2586 1.5483 1.2443
No log 1.0 30 1.5390 0.2881 1.5390 1.2406
No log 1.0667 32 1.2379 0.4211 1.2379 1.1126
No log 1.1333 34 1.1614 0.4746 1.1614 1.0777
No log 1.2 36 1.1503 0.4959 1.1503 1.0725
No log 1.2667 38 1.0579 0.5600 1.0579 1.0285
No log 1.3333 40 1.1309 0.5806 1.1309 1.0634
No log 1.4 42 1.0754 0.5760 1.0754 1.0370
No log 1.4667 44 1.1479 0.5077 1.1479 1.0714
No log 1.5333 46 1.2002 0.4480 1.2002 1.0955
No log 1.6 48 1.0564 0.6000 1.0564 1.0278
No log 1.6667 50 1.0704 0.5758 1.0704 1.0346
No log 1.7333 52 1.0743 0.5481 1.0743 1.0365
No log 1.8 54 1.2541 0.5735 1.2541 1.1199
No log 1.8667 56 1.4963 0.4397 1.4963 1.2232
No log 1.9333 58 1.2862 0.6187 1.2862 1.1341
No log 2.0 60 1.0796 0.6197 1.0796 1.0390
No log 2.0667 62 1.0801 0.6351 1.0801 1.0393
No log 2.1333 64 1.1478 0.6358 1.1478 1.0714
No log 2.2 66 1.0344 0.6081 1.0344 1.0171
No log 2.2667 68 1.3379 0.4427 1.3379 1.1567
No log 2.3333 70 1.5250 0.3939 1.5250 1.2349
No log 2.4 72 1.0353 0.6667 1.0353 1.0175
No log 2.4667 74 1.5442 0.5730 1.5442 1.2427
No log 2.5333 76 1.6867 0.4706 1.6867 1.2987
No log 2.6 78 1.2340 0.5600 1.2340 1.1109
No log 2.6667 80 0.8541 0.6715 0.8541 0.9242
No log 2.7333 82 0.9859 0.6418 0.9859 0.9929
No log 2.8 84 0.8516 0.7015 0.8516 0.9228
No log 2.8667 86 0.8169 0.6475 0.8169 0.9038
No log 2.9333 88 0.8734 0.6377 0.8734 0.9346
No log 3.0 90 0.9176 0.6522 0.9176 0.9579
No log 3.0667 92 0.9204 0.6131 0.9204 0.9594
No log 3.1333 94 0.9179 0.6486 0.9179 0.9581
No log 3.2 96 0.9576 0.6410 0.9576 0.9786
No log 3.2667 98 0.9264 0.6923 0.9264 0.9625
No log 3.3333 100 0.8061 0.6980 0.8061 0.8978
No log 3.4 102 0.8356 0.7413 0.8356 0.9141
No log 3.4667 104 0.8203 0.7194 0.8203 0.9057
No log 3.5333 106 0.8020 0.6714 0.8020 0.8956
No log 3.6 108 0.8542 0.6429 0.8542 0.9242
No log 3.6667 110 1.0296 0.6331 1.0296 1.0147
No log 3.7333 112 1.0753 0.6755 1.0753 1.0370
No log 3.8 114 0.9588 0.6667 0.9588 0.9792
No log 3.8667 116 0.8876 0.6531 0.8876 0.9421
No log 3.9333 118 0.8772 0.6901 0.8772 0.9366
No log 4.0 120 0.9096 0.6667 0.9096 0.9537
No log 4.0667 122 0.9657 0.6755 0.9657 0.9827
No log 4.1333 124 0.8346 0.6619 0.8346 0.9136
No log 4.2 126 0.8225 0.6525 0.8225 0.9069
No log 4.2667 128 0.8480 0.6714 0.8480 0.9208
No log 4.3333 130 0.8679 0.6187 0.8679 0.9316
No log 4.4 132 0.8787 0.6757 0.8787 0.9374
No log 4.4667 134 0.8485 0.6757 0.8485 0.9212
No log 4.5333 136 0.8708 0.6753 0.8708 0.9332
No log 4.6 138 0.9105 0.6626 0.9105 0.9542
No log 4.6667 140 0.8824 0.6792 0.8824 0.9394
No log 4.7333 142 0.9089 0.7134 0.9089 0.9534
No log 4.8 144 0.8693 0.6423 0.8693 0.9324
No log 4.8667 146 0.8799 0.6466 0.8799 0.9380
No log 4.9333 148 0.9244 0.6324 0.9244 0.9615
No log 5.0 150 0.8927 0.6522 0.8927 0.9448
No log 5.0667 152 0.9597 0.6533 0.9597 0.9797
No log 5.1333 154 1.1648 0.6626 1.1648 1.0793
No log 5.2 156 1.0693 0.6541 1.0693 1.0341
No log 5.2667 158 0.7995 0.6892 0.7995 0.8941
No log 5.3333 160 0.7091 0.7050 0.7091 0.8421
No log 5.4 162 0.7041 0.7007 0.7041 0.8391
No log 5.4667 164 0.7466 0.6525 0.7466 0.8641
No log 5.5333 166 0.8796 0.6759 0.8796 0.9379
No log 5.6 168 0.9083 0.6846 0.9083 0.9530
No log 5.6667 170 0.9503 0.6968 0.9503 0.9748
No log 5.7333 172 0.8548 0.6620 0.8548 0.9246
No log 5.8 174 0.7393 0.7234 0.7393 0.8598
No log 5.8667 176 0.7585 0.7042 0.7585 0.8709
No log 5.9333 178 0.8468 0.6573 0.8468 0.9202
No log 6.0 180 0.8371 0.7000 0.8371 0.9149
No log 6.0667 182 0.7581 0.6716 0.7581 0.8707
No log 6.1333 184 0.7501 0.6963 0.7501 0.8661
No log 6.2 186 0.8186 0.6861 0.8186 0.9048
No log 6.2667 188 0.8324 0.6715 0.8324 0.9124
No log 6.3333 190 0.7225 0.6912 0.7225 0.8500
No log 6.4 192 0.6160 0.7286 0.6160 0.7849
No log 6.4667 194 0.6252 0.7286 0.6252 0.7907
No log 6.5333 196 0.7215 0.7027 0.7215 0.8494
No log 6.6 198 1.0630 0.6923 1.0630 1.0310
No log 6.6667 200 1.4818 0.5820 1.4818 1.2173
No log 6.7333 202 1.4016 0.6038 1.4016 1.1839
No log 6.8 204 1.0635 0.6286 1.0635 1.0313
No log 6.8667 206 0.8135 0.6429 0.8135 0.9019
No log 6.9333 208 0.6309 0.7692 0.6309 0.7943
No log 7.0 210 0.5882 0.7397 0.5882 0.7669
No log 7.0667 212 0.6221 0.7578 0.6221 0.7887
No log 7.1333 214 0.6629 0.7662 0.6629 0.8142
No log 7.2 216 0.7156 0.7133 0.7156 0.8459
No log 7.2667 218 0.7985 0.7067 0.7985 0.8936
No log 7.3333 220 0.8546 0.6623 0.8546 0.9245
No log 7.4 222 0.8947 0.6624 0.8947 0.9459
No log 7.4667 224 0.7794 0.6846 0.7794 0.8828
No log 7.5333 226 0.7250 0.7183 0.7250 0.8515
No log 7.6 228 0.7134 0.6906 0.7134 0.8446
No log 7.6667 230 0.7353 0.7000 0.7353 0.8575
No log 7.7333 232 0.7388 0.7143 0.7388 0.8595
No log 7.8 234 0.7667 0.6857 0.7667 0.8756
No log 7.8667 236 0.8228 0.6443 0.8228 0.9071
No log 7.9333 238 0.8296 0.6584 0.8296 0.9108
No log 8.0 240 0.8301 0.6584 0.8301 0.9111
No log 8.0667 242 0.8927 0.6918 0.8927 0.9448
No log 8.1333 244 0.9754 0.6358 0.9754 0.9876
No log 8.2 246 0.9609 0.6475 0.9609 0.9803
No log 8.2667 248 0.8882 0.6619 0.8882 0.9424
No log 8.3333 250 0.8000 0.6667 0.8000 0.8944
No log 8.4 252 0.7797 0.6759 0.7797 0.8830
No log 8.4667 254 0.8174 0.6792 0.8174 0.9041
No log 8.5333 256 0.8627 0.7000 0.8627 0.9288
No log 8.6 258 0.9078 0.6879 0.9078 0.9528
No log 8.6667 260 0.9506 0.6795 0.9506 0.9750
No log 8.7333 262 1.0135 0.6914 1.0135 1.0067
No log 8.8 264 0.9392 0.6918 0.9392 0.9691
No log 8.8667 266 0.8188 0.6187 0.8188 0.9049
No log 8.9333 268 0.7519 0.7007 0.7519 0.8671
No log 9.0 270 0.7534 0.6714 0.7534 0.8680
No log 9.0667 272 0.8321 0.6933 0.8321 0.9122
No log 9.1333 274 0.9466 0.6883 0.9466 0.9729
No log 9.2 276 0.9984 0.6797 0.9984 0.9992
No log 9.2667 278 0.9535 0.6667 0.9535 0.9765
No log 9.3333 280 0.8545 0.6471 0.8545 0.9244
No log 9.4 282 0.7851 0.6519 0.7851 0.8860
No log 9.4667 284 0.7610 0.6528 0.7610 0.8723
No log 9.5333 286 0.8096 0.7059 0.8096 0.8998
No log 9.6 288 0.8675 0.6901 0.8675 0.9314
No log 9.6667 290 0.8885 0.6711 0.8885 0.9426
No log 9.7333 292 0.8709 0.6753 0.8709 0.9332
No log 9.8 294 0.7805 0.6806 0.7805 0.8834
No log 9.8667 296 0.7907 0.6667 0.7907 0.8892
No log 9.9333 298 0.8183 0.6383 0.8183 0.9046
No log 10.0 300 0.8798 0.6800 0.8798 0.9380
No log 10.0667 302 0.9333 0.6531 0.9333 0.9661
No log 10.1333 304 0.9721 0.6389 0.9721 0.9860
No log 10.2 306 0.9860 0.6434 0.9860 0.9930
No log 10.2667 308 0.9670 0.5970 0.9670 0.9833
No log 10.3333 310 0.9505 0.5581 0.9505 0.9749
No log 10.4 312 0.9518 0.5714 0.9518 0.9756
No log 10.4667 314 0.9779 0.6250 0.9779 0.9889
No log 10.5333 316 0.9451 0.6623 0.9451 0.9722
No log 10.6 318 0.8618 0.7044 0.8618 0.9283
No log 10.6667 320 0.7837 0.6962 0.7837 0.8853
No log 10.7333 322 0.7495 0.7114 0.7495 0.8657
No log 10.8 324 0.7643 0.7368 0.7643 0.8742
No log 10.8667 326 0.9595 0.7126 0.9595 0.9795
No log 10.9333 328 1.0521 0.6821 1.0521 1.0257
No log 11.0 330 1.0081 0.6821 1.0081 1.0040
No log 11.0667 332 0.7862 0.7368 0.7862 0.8867
No log 11.1333 334 0.6482 0.7324 0.6482 0.8051
No log 11.2 336 0.6450 0.6957 0.6450 0.8031
No log 11.2667 338 0.6401 0.7273 0.6401 0.8001
No log 11.3333 340 0.7108 0.7453 0.7108 0.8431
No log 11.4 342 0.9454 0.7033 0.9454 0.9723
No log 11.4667 344 1.1103 0.6740 1.1103 1.0537
No log 11.5333 346 1.0559 0.6927 1.0559 1.0276
No log 11.6 348 0.9332 0.6909 0.9332 0.9660
No log 11.6667 350 0.8719 0.6710 0.8719 0.9337
No log 11.7333 352 0.8469 0.6710 0.8469 0.9203
No log 11.8 354 0.8208 0.6755 0.8208 0.9060
No log 11.8667 356 0.7294 0.6617 0.7294 0.8541
No log 11.9333 358 0.7217 0.7068 0.7217 0.8495
No log 12.0 360 0.7425 0.6617 0.7425 0.8617
No log 12.0667 362 0.8143 0.6471 0.8143 0.9024
No log 12.1333 364 0.8158 0.6479 0.8158 0.9032
No log 12.2 366 0.7990 0.6667 0.7990 0.8939
No log 12.2667 368 0.7183 0.6718 0.7183 0.8475
No log 12.3333 370 0.6947 0.6963 0.6947 0.8335
No log 12.4 372 0.6912 0.7042 0.6912 0.8314
No log 12.4667 374 0.7607 0.7067 0.7607 0.8722
No log 12.5333 376 0.8340 0.6962 0.8340 0.9132
No log 12.6 378 0.8934 0.7097 0.8934 0.9452
No log 12.6667 380 0.8405 0.6974 0.8405 0.9168
No log 12.7333 382 0.7367 0.7000 0.7367 0.8583
No log 12.8 384 0.7013 0.6906 0.7013 0.8374
No log 12.8667 386 0.7070 0.7234 0.7070 0.8408
No log 12.9333 388 0.7528 0.7234 0.7528 0.8677
No log 13.0 390 0.8443 0.6757 0.8443 0.9188
No log 13.0667 392 0.8975 0.6757 0.8975 0.9474
No log 13.1333 394 0.8717 0.6471 0.8717 0.9336
No log 13.2 396 0.7901 0.6618 0.7901 0.8889
No log 13.2667 398 0.7323 0.6565 0.7323 0.8557
No log 13.3333 400 0.7102 0.6763 0.7102 0.8427
No log 13.4 402 0.7029 0.6857 0.7029 0.8384
No log 13.4667 404 0.7288 0.7421 0.7288 0.8537
No log 13.5333 406 0.8497 0.7209 0.8497 0.9218
No log 13.6 408 0.8999 0.7052 0.8999 0.9486
No log 13.6667 410 0.8398 0.7168 0.8398 0.9164
No log 13.7333 412 0.7564 0.7453 0.7564 0.8697
No log 13.8 414 0.6652 0.7397 0.6652 0.8156
No log 13.8667 416 0.6202 0.7500 0.6202 0.7875
No log 13.9333 418 0.6382 0.7518 0.6382 0.7989
No log 14.0 420 0.6875 0.7383 0.6875 0.8292
No log 14.0667 422 0.7636 0.7317 0.7636 0.8739
No log 14.1333 424 0.7962 0.7239 0.7962 0.8923
No log 14.2 426 0.8544 0.7011 0.8544 0.9243
No log 14.2667 428 0.8612 0.7011 0.8612 0.9280
No log 14.3333 430 0.7854 0.7006 0.7854 0.8862
No log 14.4 432 0.6887 0.7034 0.6887 0.8299
No log 14.4667 434 0.6668 0.7413 0.6668 0.8166
No log 14.5333 436 0.6481 0.7651 0.6481 0.8050
No log 14.6 438 0.6567 0.7568 0.6567 0.8104
No log 14.6667 440 0.7198 0.7172 0.7198 0.8484
No log 14.7333 442 0.8552 0.6486 0.8552 0.9248
No log 14.8 444 0.9370 0.6623 0.9370 0.9680
No log 14.8667 446 0.9147 0.6621 0.9147 0.9564
No log 14.9333 448 0.8590 0.6331 0.8590 0.9268
No log 15.0 450 0.7815 0.6462 0.7815 0.8840
No log 15.0667 452 0.7460 0.6617 0.7460 0.8637
No log 15.1333 454 0.7534 0.6716 0.7534 0.8680
No log 15.2 456 0.7613 0.6471 0.7613 0.8725
No log 15.2667 458 0.8049 0.6621 0.8049 0.8972
No log 15.3333 460 0.8072 0.6757 0.8072 0.8984
No log 15.4 462 0.7956 0.6331 0.7956 0.8919
No log 15.4667 464 0.8230 0.6621 0.8230 0.9072
No log 15.5333 466 0.8422 0.7006 0.8422 0.9177
No log 15.6 468 0.8554 0.6839 0.8554 0.9249
No log 15.6667 470 0.8584 0.6711 0.8584 0.9265
No log 15.7333 472 0.8662 0.6338 0.8662 0.9307
No log 15.8 474 0.8628 0.6331 0.8628 0.9289
No log 15.8667 476 0.8646 0.6176 0.8646 0.9299
No log 15.9333 478 0.8689 0.6176 0.8689 0.9322
No log 16.0 480 0.9225 0.6176 0.9225 0.9605
No log 16.0667 482 1.0289 0.6187 1.0289 1.0143
No log 16.1333 484 1.0436 0.6187 1.0436 1.0216
No log 16.2 486 0.9546 0.6029 0.9546 0.9770
No log 16.2667 488 0.8682 0.6316 0.8682 0.9318
No log 16.3333 490 0.8093 0.6515 0.8093 0.8996
No log 16.4 492 0.7927 0.6765 0.7927 0.8903
No log 16.4667 494 0.7957 0.6471 0.7957 0.8920
No log 16.5333 496 0.8585 0.6957 0.8585 0.9266
No log 16.6 498 0.8677 0.7073 0.8677 0.9315
0.3285 16.6667 500 0.8179 0.6619 0.8179 0.9044
0.3285 16.7333 502 0.8098 0.6667 0.8098 0.8999
0.3285 16.8 504 0.8096 0.6718 0.8096 0.8998
0.3285 16.8667 506 0.8420 0.6515 0.8420 0.9176
0.3285 16.9333 508 0.8654 0.6515 0.8654 0.9303
0.3285 17.0 510 0.8929 0.6515 0.8929 0.9449
0.3285 17.0667 512 0.9919 0.6832 0.9919 0.9960
0.3285 17.1333 514 1.1133 0.6857 1.1133 1.0551
0.3285 17.2 516 1.0949 0.6746 1.0949 1.0464
0.3285 17.2667 518 0.9684 0.6579 0.9684 0.9841
0.3285 17.3333 520 0.8481 0.6515 0.8481 0.9209
0.3285 17.4 522 0.8142 0.6667 0.8142 0.9023
0.3285 17.4667 524 0.8133 0.6667 0.8133 0.9019
0.3285 17.5333 526 0.8190 0.6412 0.8190 0.9050
0.3285 17.6 528 0.8312 0.6515 0.8312 0.9117
0.3285 17.6667 530 0.8522 0.6667 0.8522 0.9232
0.3285 17.7333 532 0.8804 0.6933 0.8804 0.9383
0.3285 17.8 534 0.8994 0.6928 0.8994 0.9484
0.3285 17.8667 536 0.8993 0.6667 0.8993 0.9483
0.3285 17.9333 538 0.9062 0.6324 0.9062 0.9520
0.3285 18.0 540 0.8769 0.6316 0.8769 0.9364
0.3285 18.0667 542 0.8570 0.6316 0.8570 0.9258
0.3285 18.1333 544 0.8864 0.6316 0.8864 0.9415
0.3285 18.2 546 0.9364 0.6711 0.9364 0.9677
0.3285 18.2667 548 0.9597 0.6946 0.9597 0.9796
0.3285 18.3333 550 0.9022 0.7059 0.9022 0.9498
0.3285 18.4 552 0.8338 0.7108 0.8338 0.9131
0.3285 18.4667 554 0.7915 0.7089 0.7915 0.8897
0.3285 18.5333 556 0.8113 0.6711 0.8113 0.9007
0.3285 18.6 558 0.9303 0.6573 0.9303 0.9645
0.3285 18.6667 560 1.0452 0.6525 1.0452 1.0223
0.3285 18.7333 562 1.0327 0.6377 1.0327 1.0162
0.3285 18.8 564 1.0406 0.6377 1.0406 1.0201
0.3285 18.8667 566 0.9623 0.6423 0.9623 0.9810

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
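With these library versions, the checkpoint can presumably be loaded from the Hub with the standard Auto classes. The sequence-classification head type is an assumption, since the card does not state whether the task head is regression or classification:

```python
# Sketch: loading this checkpoint from the Hub (requires network access).
# AutoModelForSequenceClassification is an assumption about the head type.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
           "FineTuningAraBERT_run2_AugV5_k4_task1_organization")

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay to score goes here
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # organization score(s) for the essay
```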
Model size

135M params (Safetensors, F32)