ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5999
  • Qwk: 0.4729
  • Mse: 0.5999
  • Rmse: 0.7745
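Qwk above is the quadratic weighted kappa, a standard agreement metric for ordinal labels such as essay scores. As a point of reference (this is not the evaluation code used for this card), a minimal pure-Python sketch of how Qwk and Rmse are typically defined:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa for ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix of true vs. predicted labels
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms, used to build the expected (chance) matrix
    hist_true = [sum(1 for t in y_true if t == i) for i in range(n_classes)]
    hist_pred = [sum(1 for p in y_pred if p == i) for i in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            # Quadratic disagreement weight: 0 on the diagonal, 1 at max distance
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error (the Rmse column is the square root of Mse)."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

In practice these are usually computed with `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` and `sklearn.metrics.mean_squared_error`; the sketch mirrors those definitions.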

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
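With lr_scheduler_type: linear, the learning rate typically warms up (if warmup is configured) and then decays linearly from learning_rate to 0 over the total number of optimizer steps. A minimal sketch of that schedule; the no-warmup default and the step count are assumptions inferred from the results table below (step 100 at epoch ~1.09 suggests roughly 92 steps per epoch), not taken from the training script:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear schedule: ramp up to base_lr over warmup_steps, then decay to 0."""
    if warmup_steps and step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = 1.0 - (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)

# Assumed: ~92 optimizer steps per epoch x 100 epochs (inferred, not from the script)
total_steps = 92 * 100
```

This mirrors what `transformers.get_linear_schedule_with_warmup` computes as a multiplier on the base learning rate.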

Training results

The table below reports validation metrics every 2 steps; the training loss column shows "No log" until step 500, the Trainer's default logging interval.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0217 2 4.4755 -0.0182 4.4755 2.1155
No log 0.0435 4 2.5785 -0.0311 2.5785 1.6058
No log 0.0652 6 1.5327 0.0107 1.5327 1.2380
No log 0.0870 8 0.9215 0.0112 0.9215 0.9600
No log 0.1087 10 0.7554 0.1989 0.7554 0.8692
No log 0.1304 12 0.7954 0.1761 0.7954 0.8918
No log 0.1522 14 0.7669 0.2863 0.7669 0.8758
No log 0.1739 16 0.7581 0.2950 0.7581 0.8707
No log 0.1957 18 0.9114 0.3127 0.9114 0.9547
No log 0.2174 20 1.5961 0.1776 1.5961 1.2634
No log 0.2391 22 1.4258 0.2409 1.4258 1.1940
No log 0.2609 24 0.9936 0.3063 0.9936 0.9968
No log 0.2826 26 0.8759 0.2610 0.8759 0.9359
No log 0.3043 28 0.9690 0.2480 0.9690 0.9844
No log 0.3261 30 0.9485 0.2807 0.9485 0.9739
No log 0.3478 32 0.9661 0.2807 0.9661 0.9829
No log 0.3696 34 0.7324 0.4511 0.7324 0.8558
No log 0.3913 36 0.7103 0.4749 0.7103 0.8428
No log 0.4130 38 1.0714 0.2719 1.0714 1.0351
No log 0.4348 40 1.3635 0.2395 1.3635 1.1677
No log 0.4565 42 1.2104 0.3335 1.2104 1.1002
No log 0.4783 44 0.8032 0.5201 0.8032 0.8962
No log 0.5 46 0.6329 0.4895 0.6329 0.7955
No log 0.5217 48 0.7703 0.3785 0.7703 0.8777
No log 0.5435 50 0.8004 0.3348 0.8004 0.8947
No log 0.5652 52 0.8132 0.4617 0.8132 0.9017
No log 0.5870 54 1.0158 0.4151 1.0158 1.0079
No log 0.6087 56 1.0812 0.3787 1.0812 1.0398
No log 0.6304 58 0.9299 0.4668 0.9299 0.9643
No log 0.6522 60 0.6429 0.4905 0.6429 0.8018
No log 0.6739 62 0.5667 0.4573 0.5667 0.7528
No log 0.6957 64 0.5821 0.5065 0.5821 0.7629
No log 0.7174 66 0.5890 0.5496 0.5890 0.7675
No log 0.7391 68 0.5501 0.5222 0.5501 0.7417
No log 0.7609 70 0.6946 0.4673 0.6946 0.8335
No log 0.7826 72 1.1268 0.4068 1.1268 1.0615
No log 0.8043 74 1.3513 0.3602 1.3513 1.1624
No log 0.8261 76 1.2013 0.4126 1.2013 1.0960
No log 0.8478 78 0.9217 0.4446 0.9217 0.9601
No log 0.8696 80 0.8619 0.4546 0.8619 0.9284
No log 0.8913 82 0.7562 0.5145 0.7562 0.8696
No log 0.9130 84 0.7483 0.5250 0.7483 0.8651
No log 0.9348 86 0.7401 0.5099 0.7401 0.8603
No log 0.9565 88 0.7755 0.4642 0.7755 0.8806
No log 0.9783 90 0.7854 0.4701 0.7854 0.8862
No log 1.0 92 0.7081 0.5145 0.7081 0.8415
No log 1.0217 94 0.6405 0.5004 0.6405 0.8003
No log 1.0435 96 0.6606 0.5402 0.6606 0.8128
No log 1.0652 98 0.7107 0.5232 0.7107 0.8431
No log 1.0870 100 0.7390 0.5717 0.7390 0.8596
No log 1.1087 102 0.7150 0.5496 0.7150 0.8456
No log 1.1304 104 0.7347 0.4854 0.7347 0.8571
No log 1.1522 106 0.7103 0.5598 0.7103 0.8428
No log 1.1739 108 0.7524 0.5357 0.7524 0.8674
No log 1.1957 110 0.8977 0.4867 0.8977 0.9475
No log 1.2174 112 0.8473 0.5447 0.8473 0.9205
No log 1.2391 114 0.7036 0.5124 0.7036 0.8388
No log 1.2609 116 0.6730 0.5322 0.6730 0.8204
No log 1.2826 118 0.6736 0.5129 0.6736 0.8207
No log 1.3043 120 0.6857 0.4898 0.6857 0.8280
No log 1.3261 122 0.6725 0.4921 0.6725 0.8200
No log 1.3478 124 0.7127 0.5077 0.7127 0.8442
No log 1.3696 126 0.6528 0.4721 0.6528 0.8079
No log 1.3913 128 0.6565 0.4880 0.6565 0.8102
No log 1.4130 130 0.6378 0.4998 0.6378 0.7986
No log 1.4348 132 0.6440 0.4779 0.6440 0.8025
No log 1.4565 134 0.6357 0.5158 0.6357 0.7973
No log 1.4783 136 0.6257 0.5162 0.6257 0.7910
No log 1.5 138 0.6453 0.4807 0.6453 0.8033
No log 1.5217 140 0.6374 0.4711 0.6374 0.7984
No log 1.5435 142 0.7102 0.5208 0.7102 0.8427
No log 1.5652 144 0.8421 0.5120 0.8421 0.9177
No log 1.5870 146 0.7337 0.5066 0.7337 0.8566
No log 1.6087 148 0.7579 0.5431 0.7579 0.8706
No log 1.6304 150 0.7645 0.5108 0.7645 0.8743
No log 1.6522 152 0.8169 0.5347 0.8169 0.9038
No log 1.6739 154 0.8666 0.5251 0.8666 0.9309
No log 1.6957 156 0.7571 0.5045 0.7571 0.8701
No log 1.7174 158 0.7130 0.5066 0.7130 0.8444
No log 1.7391 160 0.7535 0.5464 0.7535 0.8680
No log 1.7609 162 0.8011 0.5066 0.8011 0.8951
No log 1.7826 164 0.7444 0.5564 0.7444 0.8628
No log 1.8043 166 0.6537 0.5398 0.6537 0.8085
No log 1.8261 168 0.6463 0.4779 0.6463 0.8039
No log 1.8478 170 0.6609 0.5555 0.6609 0.8130
No log 1.8696 172 0.8188 0.4798 0.8188 0.9049
No log 1.8913 174 0.7376 0.4815 0.7376 0.8588
No log 1.9130 176 0.6160 0.5475 0.6160 0.7849
No log 1.9348 178 0.6283 0.5355 0.6283 0.7927
No log 1.9565 180 0.6199 0.4909 0.6199 0.7874
No log 1.9783 182 0.6708 0.5634 0.6708 0.8190
No log 2.0 184 1.0537 0.3513 1.0537 1.0265
No log 2.0217 186 1.3159 0.3190 1.3159 1.1471
No log 2.0435 188 1.1424 0.3017 1.1424 1.0688
No log 2.0652 190 0.7831 0.5185 0.7831 0.8849
No log 2.0870 192 0.6712 0.5076 0.6712 0.8193
No log 2.1087 194 0.6744 0.5248 0.6744 0.8212
No log 2.1304 196 0.7026 0.5563 0.7026 0.8382
No log 2.1522 198 0.7320 0.5479 0.7320 0.8555
No log 2.1739 200 0.6724 0.4919 0.6724 0.8200
No log 2.1957 202 0.6699 0.5289 0.6699 0.8185
No log 2.2174 204 0.7057 0.4478 0.7057 0.8401
No log 2.2391 206 0.6530 0.5306 0.6530 0.8081
No log 2.2609 208 0.6552 0.5028 0.6552 0.8095
No log 2.2826 210 0.6854 0.5179 0.6854 0.8279
No log 2.3043 212 0.6292 0.5048 0.6292 0.7932
No log 2.3261 214 0.6122 0.5378 0.6122 0.7824
No log 2.3478 216 0.5978 0.4948 0.5978 0.7732
No log 2.3696 218 0.5894 0.4576 0.5894 0.7677
No log 2.3913 220 0.5921 0.4408 0.5921 0.7695
No log 2.4130 222 0.6028 0.4726 0.6028 0.7764
No log 2.4348 224 0.6187 0.5258 0.6187 0.7866
No log 2.4565 226 0.6515 0.5408 0.6515 0.8071
No log 2.4783 228 0.6535 0.5353 0.6535 0.8084
No log 2.5 230 0.6408 0.5336 0.6408 0.8005
No log 2.5217 232 0.6459 0.5149 0.6459 0.8037
No log 2.5435 234 0.6606 0.5213 0.6606 0.8128
No log 2.5652 236 0.6701 0.5475 0.6701 0.8186
No log 2.5870 238 0.6759 0.5303 0.6759 0.8221
No log 2.6087 240 0.7164 0.5466 0.7164 0.8464
No log 2.6304 242 0.7211 0.5590 0.7211 0.8492
No log 2.6522 244 0.6597 0.5616 0.6597 0.8122
No log 2.6739 246 0.6412 0.5264 0.6412 0.8008
No log 2.6957 248 0.6380 0.5548 0.6380 0.7988
No log 2.7174 250 0.6565 0.5793 0.6565 0.8102
No log 2.7391 252 0.6687 0.5799 0.6687 0.8177
No log 2.7609 254 0.7328 0.5324 0.7328 0.8560
No log 2.7826 256 0.8840 0.4164 0.8840 0.9402
No log 2.8043 258 1.1472 0.3276 1.1472 1.0711
No log 2.8261 260 1.1223 0.3539 1.1223 1.0594
No log 2.8478 262 0.9535 0.4276 0.9535 0.9765
No log 2.8696 264 0.8000 0.4760 0.8000 0.8944
No log 2.8913 266 0.7087 0.5262 0.7087 0.8418
No log 2.9130 268 0.6779 0.5919 0.6779 0.8234
No log 2.9348 270 0.6172 0.6196 0.6172 0.7856
No log 2.9565 272 0.6113 0.6084 0.6113 0.7818
No log 2.9783 274 0.6249 0.5986 0.6249 0.7905
No log 3.0 276 0.6665 0.5590 0.6665 0.8164
No log 3.0217 278 0.7998 0.5120 0.7998 0.8943
No log 3.0435 280 0.9150 0.4323 0.9150 0.9566
No log 3.0652 282 0.8291 0.4985 0.8291 0.9105
No log 3.0870 284 0.6780 0.5186 0.6780 0.8234
No log 3.1087 286 0.5657 0.5556 0.5657 0.7521
No log 3.1304 288 0.5673 0.5026 0.5673 0.7532
No log 3.1522 290 0.5735 0.5319 0.5735 0.7573
No log 3.1739 292 0.5776 0.5063 0.5776 0.7600
No log 3.1957 294 0.6022 0.5614 0.6022 0.7760
No log 3.2174 296 0.6055 0.5356 0.6055 0.7781
No log 3.2391 298 0.6045 0.5534 0.6045 0.7775
No log 3.2609 300 0.6234 0.5201 0.6234 0.7896
No log 3.2826 302 0.6360 0.5201 0.6360 0.7975
No log 3.3043 304 0.7113 0.5133 0.7113 0.8434
No log 3.3261 306 0.6486 0.5217 0.6486 0.8053
No log 3.3478 308 0.5948 0.5595 0.5948 0.7712
No log 3.3696 310 0.6315 0.5121 0.6315 0.7947
No log 3.3913 312 0.6244 0.5528 0.6244 0.7902
No log 3.4130 314 0.6193 0.5528 0.6193 0.7870
No log 3.4348 316 0.5971 0.5587 0.5971 0.7727
No log 3.4565 318 0.6555 0.5329 0.6555 0.8096
No log 3.4783 320 0.7068 0.5569 0.7068 0.8407
No log 3.5 322 0.6325 0.5186 0.6325 0.7953
No log 3.5217 324 0.5767 0.4918 0.5767 0.7594
No log 3.5435 326 0.6252 0.5111 0.6252 0.7907
No log 3.5652 328 0.6068 0.5371 0.6068 0.7790
No log 3.5870 330 0.5819 0.5561 0.5819 0.7628
No log 3.6087 332 0.6270 0.5306 0.6270 0.7918
No log 3.6304 334 0.6816 0.5355 0.6816 0.8256
No log 3.6522 336 0.6788 0.5502 0.6788 0.8239
No log 3.6739 338 0.6662 0.5304 0.6662 0.8162
No log 3.6957 340 0.6170 0.5520 0.6170 0.7855
No log 3.7174 342 0.6180 0.5362 0.6180 0.7861
No log 3.7391 344 0.6189 0.5463 0.6189 0.7867
No log 3.7609 346 0.6201 0.5221 0.6201 0.7875
No log 3.7826 348 0.6704 0.5096 0.6704 0.8188
No log 3.8043 350 0.6256 0.5036 0.6256 0.7910
No log 3.8261 352 0.5805 0.4367 0.5805 0.7619
No log 3.8478 354 0.5856 0.4227 0.5856 0.7653
No log 3.8696 356 0.5752 0.4329 0.5752 0.7584
No log 3.8913 358 0.5738 0.4387 0.5738 0.7575
No log 3.9130 360 0.6211 0.5084 0.6211 0.7881
No log 3.9348 362 0.6213 0.5449 0.6213 0.7882
No log 3.9565 364 0.6150 0.5881 0.6150 0.7842
No log 3.9783 366 0.6006 0.5640 0.6006 0.7750
No log 4.0 368 0.5939 0.5290 0.5939 0.7706
No log 4.0217 370 0.5988 0.5723 0.5988 0.7738
No log 4.0435 372 0.6693 0.5009 0.6693 0.8181
No log 4.0652 374 0.8070 0.4601 0.8070 0.8983
No log 4.0870 376 0.8670 0.4438 0.8670 0.9311
No log 4.1087 378 0.7286 0.4644 0.7286 0.8536
No log 4.1304 380 0.5927 0.4810 0.5927 0.7699
No log 4.1522 382 0.5890 0.4772 0.5890 0.7675
No log 4.1739 384 0.6137 0.5021 0.6137 0.7834
No log 4.1957 386 0.5920 0.5403 0.5920 0.7694
No log 4.2174 388 0.6020 0.4882 0.6020 0.7759
No log 4.2391 390 0.7943 0.4940 0.7943 0.8913
No log 4.2609 392 1.0526 0.4151 1.0526 1.0259
No log 4.2826 394 1.0880 0.4066 1.0880 1.0431
No log 4.3043 396 0.9139 0.4664 0.9139 0.9560
No log 4.3261 398 0.7078 0.5172 0.7078 0.8413
No log 4.3478 400 0.6309 0.5363 0.6309 0.7943
No log 4.3696 402 0.6044 0.5556 0.6044 0.7774
No log 4.3913 404 0.5947 0.5474 0.5947 0.7712
No log 4.4130 406 0.5897 0.5107 0.5897 0.7679
No log 4.4348 408 0.6018 0.5038 0.6018 0.7758
No log 4.4565 410 0.5948 0.5164 0.5948 0.7712
No log 4.4783 412 0.5953 0.5074 0.5953 0.7715
No log 4.5 414 0.6267 0.5289 0.6267 0.7916
No log 4.5217 416 0.6291 0.5053 0.6291 0.7932
No log 4.5435 418 0.6327 0.4743 0.6327 0.7954
No log 4.5652 420 0.6297 0.4743 0.6297 0.7936
No log 4.5870 422 0.5881 0.5025 0.5881 0.7669
No log 4.6087 424 0.5662 0.4548 0.5662 0.7524
No log 4.6304 426 0.5653 0.4539 0.5653 0.7519
No log 4.6522 428 0.5775 0.4840 0.5775 0.7600
No log 4.6739 430 0.5836 0.4704 0.5836 0.7639
No log 4.6957 432 0.6074 0.4845 0.6074 0.7794
No log 4.7174 434 0.6440 0.4639 0.6440 0.8025
No log 4.7391 436 0.6787 0.4984 0.6787 0.8238
No log 4.7609 438 0.6285 0.5363 0.6285 0.7928
No log 4.7826 440 0.6145 0.5778 0.6145 0.7839
No log 4.8043 442 0.6317 0.5237 0.6317 0.7948
No log 4.8261 444 0.6222 0.5445 0.6222 0.7888
No log 4.8478 446 0.5819 0.5427 0.5819 0.7628
No log 4.8696 448 0.5689 0.5475 0.5689 0.7543
No log 4.8913 450 0.5795 0.4644 0.5795 0.7612
No log 4.9130 452 0.5949 0.4873 0.5949 0.7713
No log 4.9348 454 0.5666 0.4833 0.5666 0.7528
No log 4.9565 456 0.5647 0.4797 0.5647 0.7515
No log 4.9783 458 0.5713 0.5040 0.5713 0.7558
No log 5.0 460 0.5740 0.5621 0.5740 0.7576
No log 5.0217 462 0.5835 0.5234 0.5835 0.7639
No log 5.0435 464 0.5848 0.4957 0.5848 0.7647
No log 5.0652 466 0.5743 0.5088 0.5743 0.7578
No log 5.0870 468 0.5683 0.4489 0.5683 0.7538
No log 5.1087 470 0.5709 0.4822 0.5709 0.7556
No log 5.1304 472 0.5895 0.5071 0.5895 0.7678
No log 5.1522 474 0.5748 0.4726 0.5748 0.7582
No log 5.1739 476 0.5587 0.5316 0.5587 0.7474
No log 5.1957 478 0.5664 0.5288 0.5664 0.7526
No log 5.2174 480 0.5816 0.4794 0.5816 0.7626
No log 5.2391 482 0.5707 0.4937 0.5707 0.7554
No log 5.2609 484 0.5757 0.4876 0.5757 0.7587
No log 5.2826 486 0.5779 0.5123 0.5779 0.7602
No log 5.3043 488 0.5710 0.5299 0.5710 0.7557
No log 5.3261 490 0.5678 0.5040 0.5678 0.7536
No log 5.3478 492 0.5779 0.5058 0.5779 0.7602
No log 5.3696 494 0.6011 0.5168 0.6011 0.7753
No log 5.3913 496 0.5961 0.5168 0.5961 0.7721
No log 5.4130 498 0.5710 0.5310 0.5710 0.7557
0.352 5.4348 500 0.5853 0.5042 0.5853 0.7651
0.352 5.4565 502 0.6063 0.5179 0.6063 0.7787
0.352 5.4783 504 0.5909 0.5587 0.5909 0.7687
0.352 5.5 506 0.6180 0.4866 0.6180 0.7861
0.352 5.5217 508 0.6526 0.5113 0.6526 0.8078
0.352 5.5435 510 0.6067 0.5216 0.6067 0.7789
0.352 5.5652 512 0.5736 0.5248 0.5736 0.7574
0.352 5.5870 514 0.5770 0.5161 0.5770 0.7596
0.352 5.6087 516 0.5816 0.5135 0.5816 0.7626
0.352 5.6304 518 0.5988 0.5428 0.5988 0.7738
0.352 5.6522 520 0.5896 0.5527 0.5896 0.7678
0.352 5.6739 522 0.5792 0.4467 0.5792 0.7611
0.352 5.6957 524 0.5840 0.4251 0.5840 0.7642
0.352 5.7174 526 0.5797 0.4393 0.5797 0.7614
0.352 5.7391 528 0.5813 0.4662 0.5813 0.7624
0.352 5.7609 530 0.5999 0.4729 0.5999 0.7745

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
