ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.5385 (MSE training loss, hence identical to the MSE entry below)
  • QWK (quadratic weighted kappa): 0.5762
  • MSE: 0.5385
  • RMSE: 0.7339
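
The metrics above can be reproduced from raw predictions. Below is a sketch using NumPy only; the helper names and the toy 0–4 score scale are illustrative, not taken from the card. Continuous regression outputs are rounded to the nearest integer score level before computing kappa.

```python
# Sketch of the evaluation metrics reported above (QWK, MSE, RMSE).
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, 1 at the corners.
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected agreement under independent marginals.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

def regression_metrics(y_true, y_pred, n_classes):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    mse = float(np.mean((y_true - y_pred) ** 2))
    return {
        "qwk": quadratic_weighted_kappa(
            np.rint(y_true).astype(int), np.rint(y_pred).astype(int), n_classes
        ),
        "mse": mse,
        "rmse": float(np.sqrt(mse)),
    }

# Toy example with made-up scores on a 0-4 scale:
m = regression_metrics([1, 2, 3, 4], [1.1, 2.0, 2.9, 3.8], n_classes=5)
```

Rounding the predictions before the kappa step mirrors how a regression head is typically scored against discrete essay-organization grades.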

Model description

More information needed

Intended uses & limitations

More information needed
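
No usage notes are provided, but the checkpoint can presumably be loaded like any sequence-classification fine-tune. A minimal sketch, assuming a single-logit regression head (consistent with the MSE/RMSE metrics reported above); it downloads the model, so it is not runnable offline:

```python
# Hedged inference sketch; assumes the head outputs one regression logit.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
```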

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
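
The run can be reconstructed roughly from the hyperparameters above. This is a sketch, not the authors' script: the dataset objects, output directory, and the single-label regression head (implied by Loss equalling MSE in the results) are assumptions.

```python
# Hedged reconstruction of the training setup; `train_ds` / `eval_ds`
# are placeholders for the undocumented essay dataset.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumption: regression head, matching the MSE-based metrics
)

args = TrainingArguments(
    output_dir="arabert-task2-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default.
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   tokenizer=tokenizer)
# trainer.train()
```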

Training results

The training loss is only logged every 500 steps, so earlier rows show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0370 2 4.6666 -0.0101 4.6666 2.1602
No log 0.0741 4 2.6579 0.0165 2.6579 1.6303
No log 0.1111 6 1.6278 -0.0618 1.6278 1.2759
No log 0.1481 8 1.0826 -0.0498 1.0826 1.0405
No log 0.1852 10 0.9343 0.0228 0.9343 0.9666
No log 0.2222 12 1.0024 0.0673 1.0024 1.0012
No log 0.2593 14 0.9361 0.1284 0.9361 0.9675
No log 0.2963 16 0.7802 0.2150 0.7802 0.8833
No log 0.3333 18 0.7909 0.2436 0.7909 0.8893
No log 0.3704 20 0.9289 0.2171 0.9289 0.9638
No log 0.4074 22 1.0195 0.1287 1.0195 1.0097
No log 0.4444 24 1.0792 0.0848 1.0792 1.0389
No log 0.4815 26 1.1570 0.0262 1.1570 1.0756
No log 0.5185 28 1.0403 0.1175 1.0403 1.0199
No log 0.5556 30 0.8599 0.2097 0.8599 0.9273
No log 0.5926 32 0.8642 0.1927 0.8642 0.9296
No log 0.6296 34 0.8831 0.2085 0.8831 0.9397
No log 0.6667 36 0.8455 0.2171 0.8455 0.9195
No log 0.7037 38 0.7585 0.2213 0.7585 0.8709
No log 0.7407 40 0.7268 0.2825 0.7268 0.8525
No log 0.7778 42 0.6922 0.3785 0.6922 0.8320
No log 0.8148 44 0.7243 0.2985 0.7243 0.8511
No log 0.8519 46 0.8091 0.2551 0.8091 0.8995
No log 0.8889 48 0.7171 0.2929 0.7171 0.8468
No log 0.9259 50 0.7815 0.3420 0.7815 0.8841
No log 0.9630 52 0.7787 0.3794 0.7787 0.8824
No log 1.0 54 0.9737 0.2334 0.9737 0.9867
No log 1.0370 56 1.1437 0.2003 1.1437 1.0695
No log 1.0741 58 0.9231 0.2724 0.9231 0.9608
No log 1.1111 60 0.5911 0.5012 0.5911 0.7688
No log 1.1481 62 0.7417 0.3984 0.7417 0.8612
No log 1.1852 64 0.7534 0.4223 0.7534 0.8680
No log 1.2222 66 0.6008 0.4344 0.6008 0.7751
No log 1.2593 68 0.8308 0.3179 0.8308 0.9115
No log 1.2963 70 0.8112 0.3412 0.8112 0.9007
No log 1.3333 72 0.5699 0.5421 0.5699 0.7549
No log 1.3704 74 0.6328 0.4741 0.6328 0.7955
No log 1.4074 76 0.5622 0.5329 0.5622 0.7498
No log 1.4444 78 0.5883 0.5214 0.5883 0.7670
No log 1.4815 80 0.6002 0.5241 0.6002 0.7747
No log 1.5185 82 0.5598 0.4976 0.5598 0.7482
No log 1.5556 84 0.5503 0.5589 0.5503 0.7418
No log 1.5926 86 0.5756 0.5494 0.5756 0.7587
No log 1.6296 88 0.5861 0.5199 0.5861 0.7656
No log 1.6667 90 0.5764 0.5999 0.5764 0.7592
No log 1.7037 92 0.5850 0.5878 0.5850 0.7649
No log 1.7407 94 0.5705 0.5485 0.5705 0.7553
No log 1.7778 96 0.5761 0.5444 0.5761 0.7590
No log 1.8148 98 0.5933 0.5313 0.5933 0.7703
No log 1.8519 100 0.6172 0.5814 0.6172 0.7856
No log 1.8889 102 0.6480 0.5843 0.6480 0.8050
No log 1.9259 104 0.6541 0.5626 0.6541 0.8087
No log 1.9630 106 0.7146 0.5273 0.7146 0.8453
No log 2.0 108 0.6753 0.5378 0.6753 0.8218
No log 2.0370 110 0.6642 0.5228 0.6642 0.8150
No log 2.0741 112 0.7652 0.5275 0.7652 0.8748
No log 2.1111 114 0.9059 0.4435 0.9059 0.9518
No log 2.1481 116 0.7155 0.5278 0.7155 0.8459
No log 2.1852 118 0.6341 0.5167 0.6341 0.7963
No log 2.2222 120 0.6279 0.5461 0.6279 0.7924
No log 2.2593 122 0.6341 0.5768 0.6341 0.7963
No log 2.2963 124 0.6144 0.5274 0.6144 0.7838
No log 2.3333 126 0.8900 0.4897 0.8900 0.9434
No log 2.3704 128 1.1403 0.3719 1.1403 1.0678
No log 2.4074 130 0.8808 0.4465 0.8808 0.9385
No log 2.4444 132 0.6629 0.5581 0.6629 0.8142
No log 2.4815 134 0.6587 0.5581 0.6587 0.8116
No log 2.5185 136 0.7803 0.4881 0.7803 0.8834
No log 2.5556 138 0.8016 0.5114 0.8016 0.8953
No log 2.5926 140 0.6509 0.5821 0.6509 0.8068
No log 2.6296 142 0.6588 0.5393 0.6588 0.8117
No log 2.6667 144 0.6807 0.5450 0.6807 0.8251
No log 2.7037 146 0.6564 0.5373 0.6564 0.8102
No log 2.7407 148 0.6451 0.5349 0.6451 0.8032
No log 2.7778 150 0.6319 0.5260 0.6319 0.7949
No log 2.8148 152 0.7999 0.5532 0.7999 0.8944
No log 2.8519 154 0.9569 0.4603 0.9569 0.9782
No log 2.8889 156 0.7503 0.5255 0.7503 0.8662
No log 2.9259 158 0.6366 0.5054 0.6366 0.7979
No log 2.9630 160 0.6763 0.5160 0.6763 0.8224
No log 3.0 162 0.6524 0.5674 0.6524 0.8077
No log 3.0370 164 0.6542 0.5405 0.6542 0.8088
No log 3.0741 166 0.6845 0.5622 0.6845 0.8274
No log 3.1111 168 0.6983 0.5497 0.6983 0.8357
No log 3.1481 170 0.7137 0.6075 0.7137 0.8448
No log 3.1852 172 0.6872 0.6186 0.6872 0.8290
No log 3.2222 174 0.7287 0.5266 0.7287 0.8536
No log 3.2593 176 0.8724 0.4212 0.8724 0.9340
No log 3.2963 178 0.7356 0.4672 0.7356 0.8577
No log 3.3333 180 0.6125 0.5848 0.6125 0.7826
No log 3.3704 182 0.6139 0.5884 0.6139 0.7835
No log 3.4074 184 0.5979 0.5487 0.5979 0.7732
No log 3.4444 186 0.6362 0.5638 0.6362 0.7976
No log 3.4815 188 0.6456 0.5358 0.6456 0.8035
No log 3.5185 190 0.6026 0.5729 0.6026 0.7763
No log 3.5556 192 0.6116 0.5768 0.6116 0.7820
No log 3.5926 194 0.6156 0.5843 0.6156 0.7846
No log 3.6296 196 0.6163 0.6093 0.6163 0.7851
No log 3.6667 198 0.6544 0.6136 0.6544 0.8090
No log 3.7037 200 0.7151 0.5660 0.7151 0.8456
No log 3.7407 202 0.7535 0.5724 0.7535 0.8680
No log 3.7778 204 0.7407 0.5844 0.7407 0.8607
No log 3.8148 206 0.8255 0.5953 0.8255 0.9086
No log 3.8519 208 0.8690 0.5611 0.8690 0.9322
No log 3.8889 210 0.8343 0.5855 0.8343 0.9134
No log 3.9259 212 0.7909 0.5872 0.7909 0.8893
No log 3.9630 214 0.7680 0.5973 0.7680 0.8764
No log 4.0 216 0.7346 0.6033 0.7346 0.8571
No log 4.0370 218 0.7426 0.6105 0.7426 0.8618
No log 4.0741 220 0.8577 0.5074 0.8577 0.9261
No log 4.1111 222 0.7738 0.5987 0.7738 0.8797
No log 4.1481 224 0.6327 0.6005 0.6327 0.7954
No log 4.1852 226 0.5939 0.5930 0.5939 0.7707
No log 4.2222 228 0.5918 0.6093 0.5918 0.7693
No log 4.2593 230 0.5892 0.5803 0.5892 0.7676
No log 4.2963 232 0.6659 0.5702 0.6659 0.8160
No log 4.3333 234 0.7278 0.5725 0.7278 0.8531
No log 4.3704 236 0.7009 0.6193 0.7009 0.8372
No log 4.4074 238 0.6552 0.5853 0.6552 0.8094
No log 4.4444 240 0.7608 0.5472 0.7608 0.8723
No log 4.4815 242 0.7806 0.4995 0.7806 0.8835
No log 4.5185 244 0.6629 0.5862 0.6629 0.8142
No log 4.5556 246 0.6333 0.5864 0.6333 0.7958
No log 4.5926 248 0.6262 0.6070 0.6262 0.7913
No log 4.6296 250 0.6404 0.5953 0.6404 0.8003
No log 4.6667 252 0.6771 0.5517 0.6771 0.8229
No log 4.7037 254 0.7019 0.5242 0.7019 0.8378
No log 4.7407 256 0.6583 0.5493 0.6583 0.8114
No log 4.7778 258 0.5930 0.5625 0.5930 0.7701
No log 4.8148 260 0.6261 0.5640 0.6261 0.7913
No log 4.8519 262 0.6204 0.5732 0.6204 0.7876
No log 4.8889 264 0.5988 0.5833 0.5988 0.7738
No log 4.9259 266 0.6029 0.5606 0.6029 0.7764
No log 4.9630 268 0.5912 0.5753 0.5912 0.7689
No log 5.0 270 0.5899 0.6350 0.5899 0.7680
No log 5.0370 272 0.6150 0.5850 0.6150 0.7842
No log 5.0741 274 0.5974 0.5701 0.5974 0.7729
No log 5.1111 276 0.5816 0.6085 0.5816 0.7626
No log 5.1481 278 0.6190 0.6057 0.6190 0.7868
No log 5.1852 280 0.6779 0.5639 0.6779 0.8233
No log 5.2222 282 0.6216 0.5849 0.6216 0.7884
No log 5.2593 284 0.5624 0.6052 0.5624 0.7499
No log 5.2963 286 0.5605 0.5212 0.5605 0.7487
No log 5.3333 288 0.5447 0.4991 0.5447 0.7381
No log 5.3704 290 0.5445 0.5032 0.5445 0.7379
No log 5.4074 292 0.5518 0.4786 0.5518 0.7428
No log 5.4444 294 0.5599 0.4751 0.5599 0.7483
No log 5.4815 296 0.5752 0.4850 0.5752 0.7584
No log 5.5185 298 0.5956 0.5327 0.5956 0.7718
No log 5.5556 300 0.6236 0.5382 0.6236 0.7897
No log 5.5926 302 0.6306 0.5635 0.6306 0.7941
No log 5.6296 304 0.6319 0.5885 0.6319 0.7949
No log 5.6667 306 0.6438 0.5776 0.6438 0.8024
No log 5.7037 308 0.6311 0.5607 0.6311 0.7944
No log 5.7407 310 0.5720 0.5165 0.5720 0.7563
No log 5.7778 312 0.5507 0.5492 0.5507 0.7421
No log 5.8148 314 0.5394 0.5352 0.5394 0.7344
No log 5.8519 316 0.5505 0.5491 0.5505 0.7420
No log 5.8889 318 0.5214 0.5586 0.5214 0.7221
No log 5.9259 320 0.5339 0.5345 0.5339 0.7307
No log 5.9630 322 0.5443 0.5377 0.5443 0.7378
No log 6.0 324 0.5245 0.5778 0.5245 0.7242
No log 6.0370 326 0.5820 0.5445 0.5820 0.7629
No log 6.0741 328 0.5603 0.5642 0.5603 0.7485
No log 6.1111 330 0.5056 0.5502 0.5056 0.7111
No log 6.1481 332 0.5076 0.5440 0.5076 0.7125
No log 6.1852 334 0.5065 0.5479 0.5065 0.7117
No log 6.2222 336 0.5153 0.5422 0.5153 0.7179
No log 6.2593 338 0.5167 0.5779 0.5167 0.7188
No log 6.2963 340 0.5266 0.5868 0.5266 0.7257
No log 6.3333 342 0.5507 0.5883 0.5507 0.7421
No log 6.3704 344 0.5780 0.5631 0.5780 0.7602
No log 6.4074 346 0.5507 0.5752 0.5507 0.7421
No log 6.4444 348 0.5398 0.5349 0.5398 0.7347
No log 6.4815 350 0.5329 0.5427 0.5329 0.7300
No log 6.5185 352 0.5332 0.5820 0.5332 0.7302
No log 6.5556 354 0.5406 0.5797 0.5406 0.7352
No log 6.5926 356 0.5419 0.5733 0.5419 0.7361
No log 6.6296 358 0.5379 0.5633 0.5379 0.7334
No log 6.6667 360 0.5510 0.5560 0.5510 0.7423
No log 6.7037 362 0.5597 0.5762 0.5597 0.7481
No log 6.7407 364 0.5845 0.5772 0.5845 0.7645
No log 6.7778 366 0.6407 0.4963 0.6407 0.8004
No log 6.8148 368 0.8010 0.4578 0.8010 0.8950
No log 6.8519 370 0.8308 0.4017 0.8308 0.9115
No log 6.8889 372 0.6263 0.5680 0.6263 0.7914
No log 6.9259 374 0.5728 0.5731 0.5728 0.7568
No log 6.9630 376 0.6501 0.5108 0.6501 0.8063
No log 7.0 378 0.6053 0.5645 0.6053 0.7780
No log 7.0370 380 0.5670 0.5678 0.5670 0.7530
No log 7.0741 382 0.5684 0.5466 0.5684 0.7539
No log 7.1111 384 0.5879 0.5576 0.5879 0.7667
No log 7.1481 386 0.6668 0.5395 0.6668 0.8166
No log 7.1852 388 0.7959 0.5047 0.7959 0.8921
No log 7.2222 390 0.8071 0.4634 0.8071 0.8984
No log 7.2593 392 0.6750 0.5264 0.6750 0.8216
No log 7.2963 394 0.5774 0.5622 0.5774 0.7599
No log 7.3333 396 0.5669 0.5581 0.5669 0.7530
No log 7.3704 398 0.5955 0.5839 0.5955 0.7717
No log 7.4074 400 0.7667 0.5125 0.7667 0.8756
No log 7.4444 402 0.9009 0.4476 0.9009 0.9492
No log 7.4815 404 0.8068 0.5125 0.8068 0.8982
No log 7.5185 406 0.6169 0.5383 0.6169 0.7854
No log 7.5556 408 0.5543 0.4916 0.5543 0.7445
No log 7.5926 410 0.5802 0.5170 0.5802 0.7617
No log 7.6296 412 0.5661 0.5363 0.5661 0.7524
No log 7.6667 414 0.5507 0.5111 0.5507 0.7421
No log 7.7037 416 0.5627 0.5338 0.5627 0.7501
No log 7.7407 418 0.5613 0.5355 0.5613 0.7492
No log 7.7778 420 0.5402 0.4949 0.5402 0.7350
No log 7.8148 422 0.5678 0.5414 0.5678 0.7535
No log 7.8519 424 0.6102 0.4996 0.6102 0.7812
No log 7.8889 426 0.5985 0.5253 0.5985 0.7736
No log 7.9259 428 0.5484 0.5112 0.5484 0.7406
No log 7.9630 430 0.5439 0.5091 0.5439 0.7375
No log 8.0 432 0.5487 0.5199 0.5487 0.7407
No log 8.0370 434 0.6097 0.4983 0.6097 0.7809
No log 8.0741 436 0.5810 0.5351 0.5810 0.7622
No log 8.1111 438 0.5390 0.4786 0.5390 0.7342
No log 8.1481 440 0.5996 0.5354 0.5996 0.7744
No log 8.1852 442 0.6372 0.5484 0.6372 0.7983
No log 8.2222 444 0.5854 0.5168 0.5854 0.7651
No log 8.2593 446 0.5477 0.5130 0.5477 0.7401
No log 8.2963 448 0.5571 0.4916 0.5571 0.7464
No log 8.3333 450 0.5513 0.4916 0.5513 0.7425
No log 8.3704 452 0.5414 0.5345 0.5414 0.7358
No log 8.4074 454 0.5874 0.6055 0.5874 0.7664
No log 8.4444 456 0.6224 0.5278 0.6224 0.7889
No log 8.4815 458 0.6250 0.5678 0.6250 0.7906
No log 8.5185 460 0.5906 0.5889 0.5906 0.7685
No log 8.5556 462 0.5842 0.5799 0.5842 0.7643
No log 8.5926 464 0.5830 0.5605 0.5830 0.7636
No log 8.6296 466 0.5627 0.6006 0.5627 0.7501
No log 8.6667 468 0.5795 0.5965 0.5795 0.7612
No log 8.7037 470 0.6190 0.5677 0.6190 0.7868
No log 8.7407 472 0.6712 0.5400 0.6712 0.8192
No log 8.7778 474 0.7151 0.5267 0.7151 0.8457
No log 8.8148 476 0.7080 0.5290 0.7080 0.8415
No log 8.8519 478 0.6745 0.5462 0.6745 0.8213
No log 8.8889 480 0.6371 0.5457 0.6371 0.7982
No log 8.9259 482 0.6035 0.5516 0.6035 0.7769
No log 8.9630 484 0.5660 0.5802 0.5660 0.7523
No log 9.0 486 0.5362 0.5547 0.5362 0.7323
No log 9.0370 488 0.5325 0.5572 0.5325 0.7297
No log 9.0741 490 0.5401 0.5668 0.5401 0.7349
No log 9.1111 492 0.5361 0.6016 0.5361 0.7322
No log 9.1481 494 0.5468 0.6242 0.5468 0.7395
No log 9.1852 496 0.5587 0.6145 0.5587 0.7475
No log 9.2222 498 0.5873 0.5827 0.5873 0.7664
0.3372 9.2593 500 0.5798 0.5916 0.5798 0.7614
0.3372 9.2963 502 0.6258 0.6252 0.6258 0.7911
0.3372 9.3333 504 0.7786 0.4914 0.7786 0.8824
0.3372 9.3704 506 0.8184 0.4403 0.8184 0.9047
0.3372 9.4074 508 0.6552 0.6038 0.6552 0.8094
0.3372 9.4444 510 0.5901 0.5788 0.5901 0.7682
0.3372 9.4815 512 0.6745 0.5438 0.6745 0.8213
0.3372 9.5185 514 0.6972 0.5544 0.6972 0.8350
0.3372 9.5556 516 0.6064 0.5503 0.6064 0.7787
0.3372 9.5926 518 0.5385 0.5762 0.5385 0.7339
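
Two details worth noting from the table: epoch 1.0 falls at step 54, which pins down the approximate training-split size, and training stops at step 518 (epoch ≈ 9.6) rather than the configured 100 epochs, which suggests early stopping or best-checkpoint selection. The size arithmetic:

```python
# Back-of-envelope size of the training split implied by the log:
# epoch 1.0 is reached at step 54 with a train batch size of 8.
steps_per_epoch = 54
train_batch_size = 8
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # roughly 432 training essays
```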

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 135M parameters (Safetensors, F32 tensors)
