ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run2_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

  • Loss: 0.5166
  • Qwk: 0.5299
  • Mse: 0.5166
  • Rmse: 0.7188
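
The checkpoint is not documented beyond these numbers, so the snippet below is only a minimal inference sketch: it assumes a single-label regression head that outputs one numeric "organization" score per essay (consistent with the MSE/RMSE metrics above), and the essay text is a placeholder.

```python
# Minimal sketch, assuming a single-label regression head for essay organization scoring.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run2_AugV5_k11_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # placeholder: an Arabic essay to score
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```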

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer-based sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
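
A hedged sketch of how these settings map onto transformers.TrainingArguments is shown below; the output directory name is a placeholder, and the optimizer entry is covered by the Trainer's default AdamW configuration, which uses betas=(0.9, 0.999) and epsilon=1e-08.

```python
# Sketch only: mirrors the hyperparameters listed above using the Hugging Face Trainer API.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task2-organization",  # placeholder, not the original run's directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and eps=1e-8 matches the Trainer's default AdamW optimizer.
)
# A Trainer would then combine these arguments with aubmindlab/bert-base-arabertv02 and the
# (unpublished) train/eval datasets, e.g. Trainer(model=model, args=args, train_dataset=..., ...).
```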

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 4.4549 -0.0327 4.4549 2.1107
No log 0.0702 4 2.3982 -0.0091 2.3982 1.5486
No log 0.1053 6 1.5556 -0.0289 1.5556 1.2473
No log 0.1404 8 1.1684 0.0010 1.1684 1.0809
No log 0.1754 10 0.9518 0.0672 0.9518 0.9756
No log 0.2105 12 0.8118 0.2308 0.8118 0.9010
No log 0.2456 14 0.7682 0.3122 0.7682 0.8764
No log 0.2807 16 0.8405 0.1437 0.8405 0.9168
No log 0.3158 18 1.3727 0.0461 1.3727 1.1716
No log 0.3509 20 1.1706 0.0538 1.1706 1.0819
No log 0.3860 22 0.9043 0.2085 0.9043 0.9509
No log 0.4211 24 1.0386 0.0632 1.0386 1.0191
No log 0.4561 26 0.9969 0.1509 0.9969 0.9984
No log 0.4912 28 0.8671 0.1888 0.8671 0.9312
No log 0.5263 30 0.8462 0.2339 0.8462 0.9199
No log 0.5614 32 0.7171 0.3407 0.7171 0.8468
No log 0.5965 34 0.6873 0.3424 0.6873 0.8290
No log 0.6316 36 0.7038 0.3304 0.7038 0.8389
No log 0.6667 38 0.7994 0.2551 0.7994 0.8941
No log 0.7018 40 0.7435 0.3247 0.7435 0.8622
No log 0.7368 42 0.6997 0.3272 0.6997 0.8365
No log 0.7719 44 0.7183 0.3535 0.7183 0.8475
No log 0.8070 46 0.6750 0.3362 0.6750 0.8216
No log 0.8421 48 0.6731 0.3519 0.6731 0.8204
No log 0.8772 50 0.7335 0.3373 0.7335 0.8564
No log 0.9123 52 0.9689 0.1842 0.9689 0.9843
No log 0.9474 54 1.2964 0.2597 1.2964 1.1386
No log 0.9825 56 1.1620 0.2479 1.1620 1.0780
No log 1.0175 58 0.8838 0.3191 0.8838 0.9401
No log 1.0526 60 0.7594 0.3322 0.7594 0.8714
No log 1.0877 62 0.7029 0.3652 0.7029 0.8384
No log 1.1228 64 0.7401 0.3874 0.7401 0.8603
No log 1.1579 66 0.9522 0.3069 0.9522 0.9758
No log 1.1930 68 1.1377 0.2803 1.1377 1.0666
No log 1.2281 70 1.0244 0.2762 1.0244 1.0121
No log 1.2632 72 0.6931 0.3529 0.6931 0.8325
No log 1.2982 74 0.6048 0.4625 0.6048 0.7777
No log 1.3333 76 0.6388 0.4115 0.6388 0.7992
No log 1.3684 78 0.5982 0.4833 0.5982 0.7734
No log 1.4035 80 0.6361 0.3941 0.6361 0.7975
No log 1.4386 82 0.7411 0.3487 0.7411 0.8609
No log 1.4737 84 0.7613 0.3722 0.7613 0.8725
No log 1.5088 86 0.7625 0.3722 0.7625 0.8732
No log 1.5439 88 0.6572 0.3798 0.6572 0.8107
No log 1.5789 90 0.5902 0.4549 0.5902 0.7682
No log 1.6140 92 0.5865 0.4796 0.5865 0.7658
No log 1.6491 94 0.6313 0.4166 0.6313 0.7946
No log 1.6842 96 0.6409 0.4220 0.6409 0.8006
No log 1.7193 98 0.5950 0.4839 0.5950 0.7713
No log 1.7544 100 0.5801 0.5468 0.5801 0.7616
No log 1.7895 102 0.5896 0.5789 0.5896 0.7678
No log 1.8246 104 0.6056 0.5860 0.6056 0.7782
No log 1.8596 106 0.6970 0.5602 0.6970 0.8349
No log 1.8947 108 0.7064 0.5535 0.7064 0.8405
No log 1.9298 110 0.5695 0.5735 0.5695 0.7547
No log 1.9649 112 0.5579 0.6070 0.5579 0.7470
No log 2.0 114 0.5267 0.5700 0.5267 0.7257
No log 2.0351 116 0.5301 0.5312 0.5301 0.7281
No log 2.0702 118 0.5255 0.4644 0.5255 0.7249
No log 2.1053 120 0.5628 0.5086 0.5628 0.7502
No log 2.1404 122 0.5411 0.4990 0.5411 0.7356
No log 2.1754 124 0.5398 0.4835 0.5398 0.7347
No log 2.2105 126 0.5631 0.5540 0.5631 0.7504
No log 2.2456 128 0.5820 0.5896 0.5820 0.7629
No log 2.2807 130 0.6387 0.5284 0.6387 0.7992
No log 2.3158 132 0.6135 0.5595 0.6135 0.7832
No log 2.3509 134 0.6376 0.5525 0.6376 0.7985
No log 2.3860 136 0.7248 0.5090 0.7248 0.8513
No log 2.4211 138 0.6617 0.5054 0.6617 0.8135
No log 2.4561 140 0.5816 0.5483 0.5816 0.7626
No log 2.4912 142 0.6348 0.5002 0.6348 0.7967
No log 2.5263 144 0.9056 0.4423 0.9056 0.9516
No log 2.5614 146 0.9395 0.4442 0.9395 0.9693
No log 2.5965 148 0.6388 0.5248 0.6388 0.7992
No log 2.6316 150 0.5151 0.5434 0.5151 0.7177
No log 2.6667 152 0.5756 0.5422 0.5756 0.7587
No log 2.7018 154 0.5418 0.5687 0.5418 0.7361
No log 2.7368 156 0.5272 0.5274 0.5272 0.7261
No log 2.7719 158 0.6258 0.4536 0.6258 0.7911
No log 2.8070 160 0.6668 0.4541 0.6668 0.8166
No log 2.8421 162 0.5882 0.4969 0.5882 0.7669
No log 2.8772 164 0.5953 0.5088 0.5953 0.7716
No log 2.9123 166 0.6243 0.5368 0.6243 0.7901
No log 2.9474 168 0.6184 0.5614 0.6184 0.7864
No log 2.9825 170 0.8080 0.4634 0.8080 0.8989
No log 3.0175 172 0.9737 0.4332 0.9737 0.9868
No log 3.0526 174 0.7610 0.4693 0.7610 0.8723
No log 3.0877 176 0.5799 0.5377 0.5799 0.7615
No log 3.1228 178 0.5510 0.5086 0.5510 0.7423
No log 3.1579 180 0.5446 0.5447 0.5446 0.7380
No log 3.1930 182 0.5406 0.5441 0.5406 0.7353
No log 3.2281 184 0.5236 0.5490 0.5236 0.7236
No log 3.2632 186 0.5909 0.4837 0.5909 0.7687
No log 3.2982 188 0.5838 0.5633 0.5838 0.7641
No log 3.3333 190 0.5591 0.5633 0.5591 0.7478
No log 3.3684 192 0.5740 0.5276 0.5740 0.7576
No log 3.4035 194 0.5681 0.5838 0.5681 0.7537
No log 3.4386 196 0.5899 0.5740 0.5899 0.7680
No log 3.4737 198 0.6211 0.5950 0.6211 0.7881
No log 3.5088 200 0.6506 0.5564 0.6506 0.8066
No log 3.5439 202 0.6324 0.5460 0.6324 0.7952
No log 3.5789 204 0.6231 0.5187 0.6231 0.7893
No log 3.6140 206 0.6157 0.5234 0.6157 0.7847
No log 3.6491 208 0.6378 0.4621 0.6378 0.7986
No log 3.6842 210 0.6541 0.4727 0.6541 0.8088
No log 3.7193 212 0.6112 0.4667 0.6112 0.7818
No log 3.7544 214 0.6189 0.5676 0.6189 0.7867
No log 3.7895 216 0.6465 0.4845 0.6465 0.8040
No log 3.8246 218 0.6226 0.5372 0.6226 0.7890
No log 3.8596 220 0.5936 0.5255 0.5936 0.7704
No log 3.8947 222 0.5849 0.5338 0.5849 0.7648
No log 3.9298 224 0.5765 0.5299 0.5765 0.7592
No log 3.9649 226 0.5891 0.5892 0.5891 0.7675
No log 4.0 228 0.7262 0.4894 0.7262 0.8522
No log 4.0351 230 0.7059 0.4879 0.7059 0.8402
No log 4.0702 232 0.6420 0.5520 0.6420 0.8012
No log 4.1053 234 0.5808 0.5024 0.5808 0.7621
No log 4.1404 236 0.5898 0.5011 0.5898 0.7680
No log 4.1754 238 0.6486 0.5297 0.6486 0.8054
No log 4.2105 240 0.6681 0.5358 0.6681 0.8174
No log 4.2456 242 0.6324 0.5386 0.6324 0.7952
No log 4.2807 244 0.6513 0.5427 0.6513 0.8071
No log 4.3158 246 0.6555 0.5171 0.6555 0.8096
No log 4.3509 248 0.6394 0.5238 0.6394 0.7996
No log 4.3860 250 0.6329 0.5773 0.6329 0.7956
No log 4.4211 252 0.6119 0.5832 0.6119 0.7822
No log 4.4561 254 0.6222 0.5727 0.6222 0.7888
No log 4.4912 256 0.5933 0.6159 0.5933 0.7703
No log 4.5263 258 0.5886 0.6373 0.5886 0.7672
No log 4.5614 260 0.5809 0.5885 0.5809 0.7622
No log 4.5965 262 0.6174 0.5949 0.6174 0.7858
No log 4.6316 264 0.6006 0.5922 0.6006 0.7750
No log 4.6667 266 0.5942 0.5556 0.5942 0.7709
No log 4.7018 268 0.6108 0.5338 0.6108 0.7815
No log 4.7368 270 0.5828 0.5189 0.5828 0.7634
No log 4.7719 272 0.5744 0.4915 0.5744 0.7579
No log 4.8070 274 0.5982 0.5427 0.5982 0.7734
No log 4.8421 276 0.5803 0.5811 0.5803 0.7618
No log 4.8772 278 0.5718 0.6336 0.5718 0.7561
No log 4.9123 280 0.6287 0.5584 0.6287 0.7929
No log 4.9474 282 0.6336 0.5352 0.6336 0.7960
No log 4.9825 284 0.5510 0.5451 0.5510 0.7423
No log 5.0175 286 0.6676 0.5715 0.6676 0.8171
No log 5.0526 288 0.6725 0.5715 0.6725 0.8200
No log 5.0877 290 0.6625 0.5697 0.6625 0.8140
No log 5.1228 292 0.5610 0.5181 0.5610 0.7490
No log 5.1579 294 0.5990 0.5195 0.5990 0.7739
No log 5.1930 296 0.5928 0.5195 0.5928 0.7699
No log 5.2281 298 0.5520 0.5012 0.5520 0.7429
No log 5.2632 300 0.5801 0.5153 0.5801 0.7616
No log 5.2982 302 0.5576 0.4611 0.5576 0.7467
No log 5.3333 304 0.5817 0.4931 0.5817 0.7627
No log 5.3684 306 0.6310 0.4963 0.6310 0.7943
No log 5.4035 308 0.5812 0.4882 0.5812 0.7624
No log 5.4386 310 0.5603 0.4614 0.5603 0.7485
No log 5.4737 312 0.5815 0.5422 0.5815 0.7625
No log 5.5088 314 0.5604 0.5200 0.5604 0.7486
No log 5.5439 316 0.5887 0.5179 0.5887 0.7673
No log 5.5789 318 0.5858 0.5472 0.5858 0.7654
No log 5.6140 320 0.5842 0.5116 0.5842 0.7643
No log 5.6491 322 0.6782 0.4919 0.6782 0.8235
No log 5.6842 324 0.7056 0.4562 0.7056 0.8400
No log 5.7193 326 0.6383 0.5062 0.6383 0.7990
No log 5.7544 328 0.5845 0.4425 0.5845 0.7645
No log 5.7895 330 0.5922 0.4759 0.5922 0.7696
No log 5.8246 332 0.5962 0.4943 0.5962 0.7721
No log 5.8596 334 0.5740 0.4815 0.5740 0.7576
No log 5.8947 336 0.5718 0.5075 0.5718 0.7562
No log 5.9298 338 0.5676 0.5155 0.5676 0.7534
No log 5.9649 340 0.5395 0.5079 0.5395 0.7345
No log 6.0 342 0.5810 0.5329 0.5810 0.7622
No log 6.0351 344 0.6298 0.5430 0.6298 0.7936
No log 6.0702 346 0.5720 0.5419 0.5720 0.7563
No log 6.1053 348 0.5485 0.6074 0.5485 0.7406
No log 6.1404 350 0.5457 0.5606 0.5457 0.7387
No log 6.1754 352 0.5664 0.5787 0.5664 0.7526
No log 6.2105 354 0.5821 0.5735 0.5821 0.7629
No log 6.2456 356 0.5702 0.5911 0.5702 0.7551
No log 6.2807 358 0.5689 0.5913 0.5689 0.7543
No log 6.3158 360 0.5656 0.5589 0.5656 0.7521
No log 6.3509 362 0.5548 0.5457 0.5548 0.7448
No log 6.3860 364 0.5502 0.5122 0.5502 0.7417
No log 6.4211 366 0.5531 0.5502 0.5531 0.7437
No log 6.4561 368 0.5866 0.5348 0.5866 0.7659
No log 6.4912 370 0.6023 0.5305 0.6023 0.7761
No log 6.5263 372 0.5682 0.5565 0.5682 0.7538
No log 6.5614 374 0.6178 0.5230 0.6178 0.7860
No log 6.5965 376 0.6211 0.5109 0.6211 0.7881
No log 6.6316 378 0.6002 0.5264 0.6002 0.7747
No log 6.6667 380 0.5513 0.5823 0.5513 0.7425
No log 6.7018 382 0.5825 0.5936 0.5825 0.7632
No log 6.7368 384 0.6321 0.5054 0.6321 0.7950
No log 6.7719 386 0.5869 0.5056 0.5869 0.7661
No log 6.8070 388 0.5587 0.5271 0.5587 0.7475
No log 6.8421 390 0.5695 0.5396 0.5695 0.7547
No log 6.8772 392 0.5810 0.5206 0.5810 0.7623
No log 6.9123 394 0.6514 0.5292 0.6514 0.8071
No log 6.9474 396 0.6784 0.4986 0.6784 0.8237
No log 6.9825 398 0.6152 0.5378 0.6152 0.7844
No log 7.0175 400 0.5781 0.4576 0.5781 0.7603
No log 7.0526 402 0.5767 0.4670 0.5767 0.7594
No log 7.0877 404 0.5788 0.4928 0.5788 0.7608
No log 7.1228 406 0.5839 0.5161 0.5839 0.7641
No log 7.1579 408 0.6564 0.5503 0.6564 0.8102
No log 7.1930 410 0.7350 0.5475 0.7350 0.8573
No log 7.2281 412 0.7243 0.5471 0.7243 0.8511
No log 7.2632 414 0.6127 0.5632 0.6127 0.7828
No log 7.2982 416 0.5998 0.5730 0.5998 0.7744
No log 7.3333 418 0.6460 0.5699 0.6460 0.8038
No log 7.3684 420 0.6053 0.5621 0.6053 0.7780
No log 7.4035 422 0.5466 0.5004 0.5466 0.7393
No log 7.4386 424 0.5551 0.5081 0.5551 0.7450
No log 7.4737 426 0.6155 0.4951 0.6155 0.7845
No log 7.5088 428 0.6202 0.4717 0.6202 0.7875
No log 7.5439 430 0.5581 0.5164 0.5581 0.7471
No log 7.5789 432 0.5332 0.4781 0.5332 0.7302
No log 7.6140 434 0.5478 0.5283 0.5478 0.7401
No log 7.6491 436 0.5423 0.5203 0.5423 0.7364
No log 7.6842 438 0.5479 0.5741 0.5479 0.7402
No log 7.7193 440 0.5594 0.5875 0.5594 0.7479
No log 7.7544 442 0.5851 0.5582 0.5851 0.7649
No log 7.7895 444 0.5834 0.5102 0.5834 0.7638
No log 7.8246 446 0.5622 0.5213 0.5622 0.7498
No log 7.8596 448 0.5566 0.5182 0.5566 0.7461
No log 7.8947 450 0.5486 0.5701 0.5486 0.7407
No log 7.9298 452 0.5564 0.5628 0.5564 0.7459
No log 7.9649 454 0.5485 0.5746 0.5485 0.7406
No log 8.0 456 0.5345 0.5783 0.5345 0.7311
No log 8.0351 458 0.5310 0.5804 0.5310 0.7287
No log 8.0702 460 0.5244 0.5707 0.5244 0.7242
No log 8.1053 462 0.5215 0.5707 0.5215 0.7222
No log 8.1404 464 0.5275 0.5831 0.5275 0.7263
No log 8.1754 466 0.5489 0.6030 0.5489 0.7409
No log 8.2105 468 0.5202 0.5670 0.5202 0.7213
No log 8.2456 470 0.5099 0.5778 0.5099 0.7141
No log 8.2807 472 0.5167 0.6293 0.5167 0.7188
No log 8.3158 474 0.5222 0.6033 0.5222 0.7227
No log 8.3509 476 0.5221 0.6192 0.5221 0.7225
No log 8.3860 478 0.5299 0.6302 0.5299 0.7280
No log 8.4211 480 0.5394 0.6404 0.5394 0.7344
No log 8.4561 482 0.5773 0.6337 0.5773 0.7598
No log 8.4912 484 0.5997 0.6227 0.5997 0.7744
No log 8.5263 486 0.5795 0.6071 0.5795 0.7612
No log 8.5614 488 0.5406 0.6153 0.5406 0.7353
No log 8.5965 490 0.5328 0.5553 0.5328 0.7299
No log 8.6316 492 0.5361 0.5472 0.5361 0.7322
No log 8.6667 494 0.5579 0.5260 0.5579 0.7469
No log 8.7018 496 0.6312 0.5619 0.6312 0.7945
No log 8.7368 498 0.6532 0.5457 0.6532 0.8082
0.4026 8.7719 500 0.6204 0.5504 0.6204 0.7876
0.4026 8.8070 502 0.5891 0.5250 0.5891 0.7675
0.4026 8.8421 504 0.5807 0.5415 0.5807 0.7621
0.4026 8.8772 506 0.5954 0.5787 0.5954 0.7716
0.4026 8.9123 508 0.5744 0.5824 0.5744 0.7579
0.4026 8.9474 510 0.5502 0.4792 0.5502 0.7418
0.4026 8.9825 512 0.5467 0.4506 0.5467 0.7394
0.4026 9.0175 514 0.5550 0.4015 0.5550 0.7450
0.4026 9.0526 516 0.5328 0.4383 0.5328 0.7299
0.4026 9.0877 518 0.5166 0.5299 0.5166 0.7188
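
For reference, the reported metrics can be reproduced with standard tooling. The sketch below uses scikit-learn's cohen_kappa_score with quadratic weights for QWK and derives RMSE as the square root of MSE (consistent with the table, e.g. sqrt(0.5166) ≈ 0.7188); the label arrays are placeholders, and rounding predictions before computing QWK is an assumption about the evaluation setup.

```python
# Sketch of the evaluation metrics reported above; y_true / y_pred are placeholders.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4])          # gold organization scores (placeholder)
y_pred = np.array([2.2, 2.8, 1.4, 3.6])  # model predictions (placeholder)

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
# QWK compares discrete labels, so continuous predictions are rounded first (an assumption).
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print({"qwk": round(qwk, 4), "mse": round(mse, 4), "rmse": round(rmse, 4)})
```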

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1