ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (these match the final logged step, epoch 13.7368 / step 522, in the training log below):

  • Loss: 0.5574
  • Qwk (quadratic weighted kappa): 0.4587
  • Mse (mean squared error): 0.5574
  • Rmse (root mean squared error): 0.7466
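
Note that Loss and Mse coincide, which is consistent with training under a mean-squared-error objective. Below is a minimal sketch of how these metrics are typically computed for ordinal essay scores; the label values and the rounding step are illustrative assumptions, since the score scale is not documented in this card.

```python
# Sketch: computing Qwk, Mse, and Rmse for ordinal essay scores.
# The gold scores and predictions below are hypothetical.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 3])            # hypothetical gold organization scores
y_pred = np.array([2.2, 2.8, 1.4, 3.6, 3.1])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# Quadratic weighted kappa compares discrete labels, so continuous
# predictions are rounded to the nearest integer score first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")
```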

Model description

More information needed

Intended uses & limitations

More information needed
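
The model name suggests the intended task is scoring the organization trait (task2) of Arabic essays. The following is a minimal loading sketch; treating the checkpoint as a sequence-classification head used for regression-style scoring is an assumption based on the Mse/Rmse metrics above, not something documented in this card.

```python
# Sketch: loading the checkpoint for inference. Assumption: a
# sequence-classification head used as a regressor (suggested by the
# Mse/Rmse metrics); the exact head and score scale are undocumented.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

inputs = tokenizer("نص المقال العربي المراد تقييمه هنا", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze())  # predicted organization score(s)
```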

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
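
These settings map directly onto transformers.TrainingArguments. The sketch below reproduces them; output_dir is a hypothetical name, and the evaluation/logging intervals are inferred from the training log (evaluation every 2 steps, training loss first logged at step 500), not stated explicitly in this card.

```python
# Sketch: the listed hyperparameters as TrainingArguments (Transformers 4.44.2).
# Dataset, model, and compute_metrics wiring are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam betas/epsilon as listed; these match the defaults of the
    # AdamW optimizer that Transformers uses out of the box.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # inferred: the log evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,      # inferred: first training loss appears at step 500
)
```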

Training results

Although num_epochs was set to 100, the log ends at epoch 13.7368 (step 522), so training did not run for the full 100 epochs. "No log" in the Training Loss column means the training loss had not yet been reported at that step; the first logged value (0.3662) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0526 2 4.3533 -0.0170 4.3533 2.0864
No log 0.1053 4 2.1785 0.0693 2.1785 1.4760
No log 0.1579 6 1.2412 0.0258 1.2412 1.1141
No log 0.2105 8 1.0737 -0.0498 1.0737 1.0362
No log 0.2632 10 0.8612 0.2216 0.8612 0.9280
No log 0.3158 12 0.7974 0.2319 0.7974 0.8929
No log 0.3684 14 0.8692 0.1765 0.8692 0.9323
No log 0.4211 16 1.0856 -0.0560 1.0856 1.0419
No log 0.4737 18 1.1331 0.0313 1.1331 1.0645
No log 0.5263 20 1.1033 0.0153 1.1033 1.0504
No log 0.5789 22 0.8862 0.1423 0.8862 0.9414
No log 0.6316 24 0.7566 0.2397 0.7566 0.8698
No log 0.6842 26 0.7896 0.1199 0.7896 0.8886
No log 0.7368 28 0.7765 0.1671 0.7765 0.8812
No log 0.7895 30 0.7675 0.2713 0.7675 0.8761
No log 0.8421 32 0.9557 0.1186 0.9557 0.9776
No log 0.8947 34 1.3440 0.0258 1.3440 1.1593
No log 0.9474 36 1.3523 0.0055 1.3523 1.1629
No log 1.0 38 1.1376 0.0421 1.1376 1.0666
No log 1.0526 40 0.9085 0.1339 0.9085 0.9532
No log 1.1053 42 0.9344 0.2254 0.9344 0.9666
No log 1.1579 44 0.7955 0.3340 0.7955 0.8919
No log 1.2105 46 0.7030 0.3169 0.7030 0.8384
No log 1.2632 48 0.7063 0.3583 0.7063 0.8404
No log 1.3158 50 0.7188 0.2988 0.7188 0.8478
No log 1.3684 52 0.7522 0.1853 0.7522 0.8673
No log 1.4211 54 0.7311 0.3205 0.7311 0.8550
No log 1.4737 56 0.7373 0.2749 0.7373 0.8586
No log 1.5263 58 0.8022 0.2085 0.8022 0.8957
No log 1.5789 60 1.0071 0.1417 1.0071 1.0035
No log 1.6316 62 0.9699 0.1647 0.9699 0.9848
No log 1.6842 64 0.8524 0.2249 0.8524 0.9233
No log 1.7368 66 0.7678 0.2743 0.7678 0.8763
No log 1.7895 68 0.7636 0.3371 0.7636 0.8739
No log 1.8421 70 0.7623 0.3557 0.7623 0.8731
No log 1.8947 72 0.7625 0.3323 0.7625 0.8732
No log 1.9474 74 0.7888 0.3202 0.7888 0.8882
No log 2.0 76 0.7790 0.2961 0.7790 0.8826
No log 2.0526 78 0.7731 0.2647 0.7731 0.8793
No log 2.1053 80 0.8045 0.3281 0.8045 0.8969
No log 2.1579 82 0.8773 0.4052 0.8773 0.9367
No log 2.2105 84 0.7997 0.3464 0.7997 0.8943
No log 2.2632 86 0.8047 0.3263 0.8047 0.8971
No log 2.3158 88 0.8443 0.3092 0.8443 0.9189
No log 2.3684 90 0.7888 0.3986 0.7888 0.8881
No log 2.4211 92 0.9508 0.3661 0.9508 0.9751
No log 2.4737 94 0.9173 0.3192 0.9173 0.9578
No log 2.5263 96 0.7885 0.3820 0.7885 0.8880
No log 2.5789 98 0.6663 0.4376 0.6663 0.8163
No log 2.6316 100 0.6346 0.4498 0.6346 0.7966
No log 2.6842 102 0.6298 0.4384 0.6298 0.7936
No log 2.7368 104 0.6677 0.4724 0.6677 0.8172
No log 2.7895 106 0.6566 0.4623 0.6566 0.8103
No log 2.8421 108 0.6477 0.4695 0.6477 0.8048
No log 2.8947 110 0.6687 0.4785 0.6687 0.8177
No log 2.9474 112 0.6722 0.4920 0.6722 0.8199
No log 3.0 114 0.6362 0.5120 0.6362 0.7976
No log 3.0526 116 0.6580 0.5356 0.6580 0.8112
No log 3.1053 118 0.6640 0.5674 0.6640 0.8148
No log 3.1579 120 0.6765 0.5356 0.6765 0.8225
No log 3.2105 122 0.6768 0.5421 0.6768 0.8227
No log 3.2632 124 0.7224 0.4612 0.7224 0.8499
No log 3.3158 126 0.8046 0.4477 0.8046 0.8970
No log 3.3684 128 0.7139 0.5115 0.7139 0.8449
No log 3.4211 130 0.7423 0.4607 0.7423 0.8616
No log 3.4737 132 1.1024 0.3865 1.1024 1.0499
No log 3.5263 134 1.0915 0.3476 1.0915 1.0447
No log 3.5789 136 0.8338 0.4561 0.8338 0.9131
No log 3.6316 138 0.6989 0.4065 0.6989 0.8360
No log 3.6842 140 0.6135 0.4300 0.6135 0.7833
No log 3.7368 142 0.6312 0.4694 0.6312 0.7945
No log 3.7895 144 0.6269 0.4705 0.6269 0.7917
No log 3.8421 146 0.5991 0.4877 0.5991 0.7740
No log 3.8947 148 0.5890 0.4654 0.5890 0.7674
No log 3.9474 150 0.6375 0.4620 0.6375 0.7984
No log 4.0 152 0.5974 0.4611 0.5974 0.7729
No log 4.0526 154 0.6818 0.4636 0.6818 0.8257
No log 4.1053 156 0.9920 0.2853 0.9920 0.9960
No log 4.1579 158 0.9733 0.3084 0.9733 0.9866
No log 4.2105 160 0.7333 0.4452 0.7333 0.8564
No log 4.2632 162 0.6046 0.5679 0.6046 0.7775
No log 4.3158 164 0.6220 0.5858 0.6220 0.7886
No log 4.3684 166 0.6380 0.5548 0.6380 0.7988
No log 4.4211 168 0.6510 0.5372 0.6510 0.8069
No log 4.4737 170 0.6406 0.5378 0.6406 0.8004
No log 4.5263 172 0.6744 0.5271 0.6744 0.8212
No log 4.5789 174 0.7127 0.5167 0.7127 0.8442
No log 4.6316 176 0.6856 0.5140 0.6856 0.8280
No log 4.6842 178 0.6866 0.5200 0.6866 0.8286
No log 4.7368 180 0.6040 0.5224 0.6040 0.7772
No log 4.7895 182 0.5713 0.4391 0.5713 0.7559
No log 4.8421 184 0.5617 0.4479 0.5617 0.7495
No log 4.8947 186 0.5565 0.4731 0.5565 0.7460
No log 4.9474 188 0.5651 0.4792 0.5651 0.7517
No log 5.0 190 0.5655 0.5170 0.5655 0.7520
No log 5.0526 192 0.6294 0.5422 0.6294 0.7933
No log 5.1053 194 0.6244 0.5581 0.6244 0.7902
No log 5.1579 196 0.5719 0.5607 0.5719 0.7562
No log 5.2105 198 0.6166 0.4474 0.6166 0.7852
No log 5.2632 200 0.5818 0.4497 0.5818 0.7628
No log 5.3158 202 0.5798 0.5351 0.5798 0.7614
No log 5.3684 204 0.5762 0.5429 0.5762 0.7591
No log 5.4211 206 0.5666 0.5113 0.5666 0.7527
No log 5.4737 208 0.5596 0.4523 0.5596 0.7481
No log 5.5263 210 0.5647 0.5216 0.5647 0.7514
No log 5.5789 212 0.5628 0.5178 0.5628 0.7502
No log 5.6316 214 0.6056 0.5241 0.6056 0.7782
No log 5.6842 216 0.7523 0.5008 0.7523 0.8673
No log 5.7368 218 0.6531 0.5408 0.6531 0.8081
No log 5.7895 220 0.5845 0.5495 0.5845 0.7645
No log 5.8421 222 0.5986 0.5189 0.5986 0.7737
No log 5.8947 224 0.5644 0.5253 0.5644 0.7513
No log 5.9474 226 0.5779 0.5174 0.5779 0.7602
No log 6.0 228 0.6709 0.4290 0.6709 0.8191
No log 6.0526 230 0.6713 0.3953 0.6713 0.8193
No log 6.1053 232 0.5740 0.5495 0.5740 0.7577
No log 6.1579 234 0.6036 0.4692 0.6036 0.7769
No log 6.2105 236 0.5959 0.4637 0.5959 0.7719
No log 6.2632 238 0.5638 0.5106 0.5638 0.7509
No log 6.3158 240 0.5617 0.5056 0.5617 0.7495
No log 6.3684 242 0.5740 0.4774 0.5740 0.7576
No log 6.4211 244 0.5639 0.4571 0.5639 0.7509
No log 6.4737 246 0.5632 0.4844 0.5632 0.7505
No log 6.5263 248 0.5665 0.5219 0.5665 0.7527
No log 6.5789 250 0.6038 0.4972 0.6038 0.7770
No log 6.6316 252 0.7081 0.5124 0.7081 0.8415
No log 6.6842 254 0.7528 0.4817 0.7528 0.8676
No log 6.7368 256 0.6657 0.4874 0.6657 0.8159
No log 6.7895 258 0.6040 0.5036 0.6040 0.7772
No log 6.8421 260 0.6256 0.4515 0.6256 0.7909
No log 6.8947 262 0.6382 0.4938 0.6382 0.7989
No log 6.9474 264 0.5963 0.4596 0.5963 0.7722
No log 7.0 266 0.5744 0.4896 0.5744 0.7579
No log 7.0526 268 0.5808 0.4615 0.5808 0.7621
No log 7.1053 270 0.5934 0.4449 0.5934 0.7703
No log 7.1579 272 0.6067 0.4738 0.6067 0.7789
No log 7.2105 274 0.5909 0.4703 0.5909 0.7687
No log 7.2632 276 0.6116 0.4863 0.6116 0.7820
No log 7.3158 278 0.5785 0.4262 0.5785 0.7606
No log 7.3684 280 0.5865 0.4540 0.5865 0.7658
No log 7.4211 282 0.6204 0.5071 0.6204 0.7876
No log 7.4737 284 0.6101 0.5151 0.6101 0.7811
No log 7.5263 286 0.5625 0.3788 0.5625 0.7500
No log 7.5789 288 0.6337 0.4007 0.6337 0.7961
No log 7.6316 290 0.7809 0.4390 0.7809 0.8837
No log 7.6842 292 0.7341 0.4308 0.7341 0.8568
No log 7.7368 294 0.5860 0.5115 0.5860 0.7655
No log 7.7895 296 0.5585 0.5311 0.5585 0.7473
No log 7.8421 298 0.6262 0.5383 0.6262 0.7914
No log 7.8947 300 0.6290 0.5224 0.6290 0.7931
No log 7.9474 302 0.5560 0.5556 0.5560 0.7456
No log 8.0 304 0.5861 0.5109 0.5861 0.7656
No log 8.0526 306 0.7648 0.4412 0.7648 0.8745
No log 8.1053 308 0.7923 0.4831 0.7923 0.8901
No log 8.1579 310 0.6582 0.4589 0.6582 0.8113
No log 8.2105 312 0.5568 0.5652 0.5568 0.7462
No log 8.2632 314 0.6431 0.5076 0.6431 0.8020
No log 8.3158 316 0.6844 0.4698 0.6844 0.8273
No log 8.3684 318 0.6089 0.5021 0.6089 0.7803
No log 8.4211 320 0.5544 0.5253 0.5544 0.7446
No log 8.4737 322 0.6151 0.4650 0.6151 0.7843
No log 8.5263 324 0.6430 0.4878 0.6430 0.8019
No log 8.5789 326 0.5909 0.4603 0.5909 0.7687
No log 8.6316 328 0.5674 0.5182 0.5674 0.7532
No log 8.6842 330 0.5808 0.5084 0.5808 0.7621
No log 8.7368 332 0.5712 0.4977 0.5712 0.7558
No log 8.7895 334 0.5831 0.4856 0.5831 0.7636
No log 8.8421 336 0.6091 0.4644 0.6091 0.7804
No log 8.8947 338 0.5972 0.5046 0.5972 0.7728
No log 8.9474 340 0.6240 0.5040 0.6240 0.7900
No log 9.0 342 0.6815 0.5178 0.6815 0.8256
No log 9.0526 344 0.6932 0.5095 0.6932 0.8326
No log 9.1053 346 0.7149 0.5026 0.7149 0.8455
No log 9.1579 348 0.6500 0.4669 0.6500 0.8062
No log 9.2105 350 0.6266 0.4529 0.6266 0.7916
No log 9.2632 352 0.6338 0.4412 0.6338 0.7961
No log 9.3158 354 0.6361 0.4349 0.6361 0.7976
No log 9.3684 356 0.6849 0.5260 0.6849 0.8276
No log 9.4211 358 0.7929 0.4376 0.7929 0.8905
No log 9.4737 360 0.7964 0.4156 0.7964 0.8924
No log 9.5263 362 0.7236 0.4814 0.7236 0.8507
No log 9.5789 364 0.6587 0.5076 0.6587 0.8116
No log 9.6316 366 0.6356 0.4998 0.6356 0.7973
No log 9.6842 368 0.6694 0.4406 0.6694 0.8182
No log 9.7368 370 0.6717 0.4707 0.6717 0.8196
No log 9.7895 372 0.6770 0.5727 0.6770 0.8228
No log 9.8421 374 0.6824 0.5330 0.6824 0.8261
No log 9.8947 376 0.7110 0.5319 0.7110 0.8432
No log 9.9474 378 0.7264 0.5224 0.7264 0.8523
No log 10.0 380 0.6872 0.4953 0.6872 0.8290
No log 10.0526 382 0.6226 0.4473 0.6226 0.7890
No log 10.1053 384 0.6103 0.4377 0.6103 0.7812
No log 10.1579 386 0.6001 0.4173 0.6001 0.7746
No log 10.2105 388 0.6042 0.4212 0.6042 0.7773
No log 10.2632 390 0.6156 0.3986 0.6156 0.7846
No log 10.3158 392 0.6484 0.4361 0.6484 0.8052
No log 10.3684 394 0.6695 0.4532 0.6695 0.8182
No log 10.4211 396 0.6463 0.4949 0.6463 0.8039
No log 10.4737 398 0.6031 0.4505 0.6031 0.7766
No log 10.5263 400 0.6026 0.4481 0.6026 0.7763
No log 10.5789 402 0.6037 0.4664 0.6037 0.7770
No log 10.6316 404 0.6744 0.4612 0.6744 0.8212
No log 10.6842 406 0.6845 0.4705 0.6845 0.8274
No log 10.7368 408 0.6161 0.4292 0.6161 0.7849
No log 10.7895 410 0.5667 0.4302 0.5667 0.7528
No log 10.8421 412 0.5628 0.4771 0.5628 0.7502
No log 10.8947 414 0.5693 0.4841 0.5693 0.7545
No log 10.9474 416 0.6161 0.4317 0.6161 0.7849
No log 11.0 418 0.6093 0.4567 0.6093 0.7805
No log 11.0526 420 0.5878 0.4736 0.5878 0.7667
No log 11.1053 422 0.5784 0.4963 0.5784 0.7605
No log 11.1579 424 0.5772 0.4748 0.5772 0.7597
No log 11.2105 426 0.5698 0.4785 0.5698 0.7549
No log 11.2632 428 0.5657 0.4676 0.5657 0.7522
No log 11.3158 430 0.5632 0.4522 0.5632 0.7504
No log 11.3684 432 0.5690 0.4531 0.5690 0.7543
No log 11.4211 434 0.5634 0.4541 0.5634 0.7506
No log 11.4737 436 0.5650 0.4497 0.5650 0.7516
No log 11.5263 438 0.5703 0.4294 0.5703 0.7552
No log 11.5789 440 0.5748 0.4297 0.5748 0.7582
No log 11.6316 442 0.5875 0.4583 0.5875 0.7665
No log 11.6842 444 0.6270 0.5065 0.6270 0.7918
No log 11.7368 446 0.6455 0.5134 0.6455 0.8035
No log 11.7895 448 0.6371 0.5251 0.6371 0.7982
No log 11.8421 450 0.6038 0.4728 0.6038 0.7771
No log 11.8947 452 0.5832 0.4270 0.5832 0.7637
No log 11.9474 454 0.5915 0.4308 0.5915 0.7691
No log 12.0 456 0.5854 0.4405 0.5854 0.7651
No log 12.0526 458 0.5736 0.4439 0.5736 0.7574
No log 12.1053 460 0.5881 0.4455 0.5881 0.7669
No log 12.1579 462 0.5872 0.4653 0.5872 0.7663
No log 12.2105 464 0.5726 0.4573 0.5726 0.7567
No log 12.2632 466 0.5897 0.4515 0.5897 0.7679
No log 12.3158 468 0.6064 0.4370 0.6064 0.7787
No log 12.3684 470 0.5794 0.4572 0.5794 0.7612
No log 12.4211 472 0.5701 0.4570 0.5701 0.7550
No log 12.4737 474 0.5811 0.4835 0.5811 0.7623
No log 12.5263 476 0.5920 0.5041 0.5920 0.7694
No log 12.5789 478 0.5649 0.4395 0.5649 0.7516
No log 12.6316 480 0.5550 0.4539 0.5550 0.7450
No log 12.6842 482 0.5659 0.4474 0.5659 0.7523
No log 12.7368 484 0.5888 0.4587 0.5888 0.7673
No log 12.7895 486 0.5862 0.4587 0.5862 0.7656
No log 12.8421 488 0.5746 0.4618 0.5746 0.7580
No log 12.8947 490 0.5669 0.4668 0.5669 0.7529
No log 12.9474 492 0.5634 0.4538 0.5634 0.7506
No log 13.0 494 0.5699 0.4217 0.5699 0.7549
No log 13.0526 496 0.5920 0.4395 0.5920 0.7694
No log 13.1053 498 0.6046 0.4827 0.6046 0.7776
0.3662 13.1579 500 0.5881 0.4800 0.5881 0.7669
0.3662 13.2105 502 0.5660 0.4992 0.5660 0.7524
0.3662 13.2632 504 0.5679 0.4957 0.5679 0.7536
0.3662 13.3158 506 0.5680 0.5304 0.5680 0.7536
0.3662 13.3684 508 0.5582 0.4957 0.5582 0.7471
0.3662 13.4211 510 0.5487 0.5109 0.5487 0.7407
0.3662 13.4737 512 0.5520 0.4819 0.5520 0.7429
0.3662 13.5263 514 0.5829 0.4602 0.5829 0.7635
0.3662 13.5789 516 0.5764 0.4099 0.5764 0.7592
0.3662 13.6316 518 0.5580 0.4348 0.5580 0.7470
0.3662 13.6842 520 0.5543 0.4661 0.5543 0.7445
0.3662 13.7368 522 0.5574 0.4587 0.5574 0.7466

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 135M params
  • Tensor type: F32
