Arabic_FineTuningAraBERT_AugV4_k2_task1_organization_fold1

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the list):

  • Loss: 0.6295
  • Qwk: 0.7853
  • Mse: 0.6295
  • Rmse: 0.7934
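
The checkpoint can be loaded with the standard transformers classes. The sketch below assumes the repository id MayBashendy/Arabic_FineTuningAraBERT_AugV4_k2_task1_organization_fold1 and a single-output regression head (suggested by the MSE/RMSE metrics reported above); both assumptions should be checked against the uploaded config before use.

```python
# Minimal sketch: load the fine-tuned checkpoint and score one Arabic sentence.
# Assumptions (not confirmed by this card): the repo id below and a 1-label
# regression head, inferred from the MSE/RMSE/QWK metrics listed above.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/Arabic_FineTuningAraBERT_AugV4_k2_task1_organization_fold1"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Example Arabic input ("an Arabic text for evaluation")
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted score
print(score)
```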

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
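
The settings above map onto a Hugging Face TrainingArguments configuration roughly like the following. This is a sketch under the listed hyperparameters only; the datasets, the regression head, the evaluation interval (inferred from the results table below), and the compute_metrics wiring are not documented in this card and are assumptions.

```python
# Sketch of a Trainer setup matching the hyperparameters listed above.
# The train/eval datasets and the metric function are placeholders (assumptions).
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,            # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",
    eval_steps=2,              # assumption: inferred from the results table below
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```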

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0328 2 2.9853 0.0162 2.9853 1.7278
No log 0.0656 4 1.8248 0.0398 1.8248 1.3508
No log 0.0984 6 1.1350 -0.0448 1.1350 1.0653
No log 0.1311 8 0.7698 0.2009 0.7698 0.8774
No log 0.1639 10 0.7870 0.5625 0.7870 0.8871
No log 0.1967 12 0.7479 0.5333 0.7479 0.8648
No log 0.2295 14 0.7392 0.5625 0.7392 0.8597
No log 0.2623 16 0.8045 0.6237 0.8045 0.8969
No log 0.2951 18 0.5964 0.4776 0.5964 0.7723
No log 0.3279 20 0.5325 0.4251 0.5325 0.7298
No log 0.3607 22 0.6515 0.5435 0.6515 0.8072
No log 0.3934 24 0.9643 0.4940 0.9643 0.9820
No log 0.4262 26 1.3324 0.4560 1.3324 1.1543
No log 0.4590 28 1.1402 0.4286 1.1402 1.0678
No log 0.4918 30 0.5148 0.6419 0.5148 0.7175
No log 0.5246 32 0.4997 0.4878 0.4997 0.7069
No log 0.5574 34 0.4645 0.5267 0.4645 0.6815
No log 0.5902 36 0.5464 0.5051 0.5464 0.7392
No log 0.6230 38 0.8865 0.5930 0.8865 0.9415
No log 0.6557 40 0.9210 0.5444 0.9210 0.9597
No log 0.6885 42 0.7804 0.56 0.7804 0.8834
No log 0.7213 44 0.5783 0.5051 0.5783 0.7604
No log 0.7541 46 0.5713 0.5767 0.5713 0.7559
No log 0.7869 48 0.6810 0.4972 0.6810 0.8252
No log 0.8197 50 0.9239 0.5444 0.9239 0.9612
No log 0.8525 52 1.1282 0.4928 1.1282 1.0622
No log 0.8852 54 0.9613 0.6516 0.9613 0.9804
No log 0.9180 56 0.8232 0.6578 0.8232 0.9073
No log 0.9508 58 0.5109 0.5172 0.5109 0.7147
No log 0.9836 60 0.5462 0.5524 0.5462 0.7390
No log 1.0164 62 0.6688 0.5224 0.6688 0.8178
No log 1.0492 64 0.4696 0.6228 0.4696 0.6853
No log 1.0820 66 0.6338 0.6160 0.6338 0.7961
No log 1.1148 68 0.9894 0.7368 0.9894 0.9947
No log 1.1475 70 0.9964 0.7368 0.9964 0.9982
No log 1.1803 72 0.8761 0.5586 0.8761 0.9360
No log 1.2131 74 0.6508 0.6419 0.6508 0.8067
No log 1.2459 76 0.5969 0.5249 0.5969 0.7726
No log 1.2787 78 0.5614 0.5882 0.5614 0.7493
No log 1.3115 80 0.6076 0.5882 0.6076 0.7795
No log 1.3443 82 0.7124 0.6375 0.7124 0.8440
No log 1.3770 84 0.6798 0.6375 0.6798 0.8245
No log 1.4098 86 0.6025 0.6209 0.6025 0.7762
No log 1.4426 88 0.6167 0.6209 0.6167 0.7853
No log 1.4754 90 0.6997 0.6375 0.6997 0.8365
No log 1.5082 92 0.9169 0.6244 0.9169 0.9575
No log 1.5410 94 1.0666 0.4324 1.0666 1.0328
No log 1.5738 96 0.9853 0.4787 0.9853 0.9926
No log 1.6066 98 0.7919 0.5806 0.7919 0.8899
No log 1.6393 100 0.5909 0.5000 0.5909 0.7687
No log 1.6721 102 0.4942 0.5659 0.4942 0.7030
No log 1.7049 104 0.4893 0.5758 0.4893 0.6995
No log 1.7377 106 0.4794 0.5659 0.4794 0.6924
No log 1.7705 108 0.6436 0.6142 0.6436 0.8022
No log 1.8033 110 0.8995 0.5789 0.8995 0.9484
No log 1.8361 112 1.1165 0.6255 1.1165 1.0567
No log 1.8689 114 0.9947 0.6290 0.9947 0.9974
No log 1.9016 116 0.7021 0.7048 0.7021 0.8379
No log 1.9344 118 0.5857 0.6755 0.5857 0.7653
No log 1.9672 120 0.5400 0.6392 0.5400 0.7348
No log 2.0 122 0.6076 0.7948 0.6076 0.7795
No log 2.0328 124 0.6114 0.7240 0.6114 0.7819
No log 2.0656 126 0.4548 0.7220 0.4548 0.6744
No log 2.0984 128 0.4054 0.5977 0.4054 0.6367
No log 2.1311 130 0.4112 0.6842 0.4112 0.6412
No log 2.1639 132 0.5051 0.6866 0.5051 0.7107
No log 2.1967 134 0.5804 0.6977 0.5804 0.7618
No log 2.2295 136 0.8923 0.7336 0.8923 0.9446
No log 2.2623 138 0.9691 0.7336 0.9691 0.9844
No log 2.2951 140 0.7198 0.7660 0.7198 0.8484
No log 2.3279 142 0.5497 0.6120 0.5497 0.7414
No log 2.3607 144 0.5435 0.6120 0.5435 0.7373
No log 2.3934 146 0.5864 0.7705 0.5864 0.7658
No log 2.4262 148 0.7011 0.7660 0.7011 0.8373
No log 2.4590 150 0.7394 0.7423 0.7394 0.8599
No log 2.4918 152 0.6681 0.7660 0.6681 0.8174
No log 2.5246 154 0.7074 0.7390 0.7074 0.8411
No log 2.5574 156 0.7161 0.7390 0.7161 0.8462
No log 2.5902 158 0.7172 0.7535 0.7172 0.8469
No log 2.6230 160 0.7751 0.7260 0.7751 0.8804
No log 2.6557 162 0.7207 0.7260 0.7207 0.8489
No log 2.6885 164 0.5841 0.7290 0.5841 0.7642
No log 2.7213 166 0.5031 0.6744 0.5031 0.7093
No log 2.7541 168 0.5072 0.6744 0.5072 0.7122
No log 2.7869 170 0.5150 0.6426 0.5150 0.7176
No log 2.8197 172 0.5184 0.6744 0.5184 0.7200
No log 2.8525 174 0.6507 0.7692 0.6507 0.8066
No log 2.8852 176 0.8258 0.6851 0.8258 0.9088
No log 2.9180 178 0.7666 0.6978 0.7666 0.8756
No log 2.9508 180 0.7073 0.7407 0.7073 0.8410
No log 2.9836 182 0.5460 0.5914 0.5460 0.7389
No log 3.0164 184 0.4783 0.4936 0.4783 0.6916
No log 3.0492 186 0.4813 0.5914 0.4813 0.6937
No log 3.0820 188 0.4742 0.6423 0.4742 0.6886
No log 3.1148 190 0.5596 0.6580 0.5596 0.7481
No log 3.1475 192 0.6937 0.7111 0.6937 0.8329
No log 3.1803 194 0.8550 0.7336 0.8550 0.9247
No log 3.2131 196 0.8812 0.7552 0.8812 0.9387
No log 3.2459 198 0.7880 0.7336 0.7880 0.8877
No log 3.2787 200 0.5960 0.6028 0.5960 0.7720
No log 3.3115 202 0.5371 0.6111 0.5371 0.7329
No log 3.3443 204 0.5572 0.6488 0.5572 0.7464
No log 3.3770 206 0.6643 0.7660 0.6643 0.8150
No log 3.4098 208 0.7965 0.7712 0.7965 0.8925
No log 3.4426 210 0.7372 0.7961 0.7372 0.8586
No log 3.4754 212 0.5720 0.7358 0.5720 0.7563
No log 3.5082 214 0.4918 0.7036 0.4918 0.7013
No log 3.5410 216 0.4924 0.7036 0.4924 0.7017
No log 3.5738 218 0.4667 0.7036 0.4667 0.6831
No log 3.6066 220 0.4806 0.7036 0.4806 0.6933
No log 3.6393 222 0.4662 0.7036 0.4662 0.6828
No log 3.6721 224 0.4550 0.6597 0.4550 0.6745
No log 3.7049 226 0.4980 0.6387 0.4980 0.7057
No log 3.7377 228 0.5937 0.7358 0.5937 0.7705
No log 3.7705 230 0.6906 0.7660 0.6906 0.8310
No log 3.8033 232 0.6775 0.7660 0.6775 0.8231
No log 3.8361 234 0.6712 0.7660 0.6712 0.8193
No log 3.8689 236 0.6741 0.7660 0.6741 0.8211
No log 3.9016 238 0.7698 0.7336 0.7698 0.8774
No log 3.9344 240 0.7350 0.6691 0.7350 0.8573
No log 3.9672 242 0.5885 0.6932 0.5885 0.7672
No log 4.0 244 0.4615 0.6316 0.4615 0.6793
No log 4.0328 246 0.4310 0.6097 0.4310 0.6565
No log 4.0656 248 0.4498 0.6097 0.4498 0.6707
No log 4.0984 250 0.5190 0.7298 0.5190 0.7204
No log 4.1311 252 0.6948 0.7308 0.6948 0.8335
No log 4.1639 254 0.8024 0.7308 0.8024 0.8958
No log 4.1967 256 0.8300 0.7308 0.8300 0.9110
No log 4.2295 258 0.7374 0.7660 0.7374 0.8587
No log 4.2623 260 0.5950 0.6182 0.5950 0.7713
No log 4.2951 262 0.5481 0.5692 0.5481 0.7403
No log 4.3279 264 0.5302 0.6067 0.5302 0.7282
No log 4.3607 266 0.5516 0.6899 0.5516 0.7427
No log 4.3934 268 0.6537 0.7358 0.6537 0.8085
No log 4.4262 270 0.8451 0.7255 0.8451 0.9193
No log 4.4590 272 0.8501 0.7063 0.8501 0.9220
No log 4.4918 274 0.6752 0.7697 0.6752 0.8217
No log 4.5246 276 0.5061 0.7298 0.5061 0.7114
No log 4.5574 278 0.4724 0.7298 0.4724 0.6873
No log 4.5902 280 0.4389 0.7298 0.4389 0.6625
No log 4.6230 282 0.4827 0.7298 0.4827 0.6948
No log 4.6557 284 0.6064 0.7805 0.6064 0.7787
No log 4.6885 286 0.7938 0.7667 0.7938 0.8910
No log 4.7213 288 0.7934 0.7667 0.7934 0.8907
No log 4.7541 290 0.6361 0.7853 0.6361 0.7976
No log 4.7869 292 0.4802 0.6839 0.4802 0.6929
No log 4.8197 294 0.4433 0.7342 0.4433 0.6658
No log 4.8525 296 0.4713 0.6839 0.4713 0.6865
No log 4.8852 298 0.5977 0.6977 0.5977 0.7731
No log 4.9180 300 0.8032 0.72 0.8032 0.8962
No log 4.9508 302 0.8613 0.6912 0.8613 0.9281
No log 4.9836 304 0.7760 0.7921 0.7760 0.8809
No log 5.0164 306 0.6560 0.6859 0.6560 0.8100
No log 5.0492 308 0.5322 0.6839 0.5322 0.7295
No log 5.0820 310 0.5377 0.7165 0.5377 0.7333
No log 5.1148 312 0.5984 0.7048 0.5984 0.7735
No log 5.1475 314 0.7122 0.7048 0.7122 0.8439
No log 5.1803 316 0.8262 0.7853 0.8262 0.9090
No log 5.2131 318 0.7961 0.7853 0.7961 0.8922
No log 5.2459 320 0.7537 0.7853 0.7537 0.8681
No log 5.2787 322 0.6458 0.7660 0.6458 0.8036
No log 5.3115 324 0.5010 0.6420 0.5010 0.7078
No log 5.3443 326 0.4562 0.6974 0.4562 0.6754
No log 5.3770 328 0.4852 0.6839 0.4852 0.6966
No log 5.4098 330 0.5468 0.7660 0.5468 0.7394
No log 5.4426 332 0.6275 0.7853 0.6275 0.7921
No log 5.4754 334 0.6158 0.7660 0.6158 0.7847
No log 5.5082 336 0.5542 0.7660 0.5542 0.7444
No log 5.5410 338 0.5584 0.7660 0.5584 0.7473
No log 5.5738 340 0.5173 0.7660 0.5173 0.7192
No log 5.6066 342 0.4900 0.7290 0.4900 0.7000
No log 5.6393 344 0.5020 0.7799 0.5020 0.7085
No log 5.6721 346 0.5918 0.7660 0.5918 0.7693
No log 5.7049 348 0.6695 0.7660 0.6695 0.8183
No log 5.7377 350 0.6620 0.7660 0.6620 0.8136
No log 5.7705 352 0.6004 0.7660 0.6004 0.7749
No log 5.8033 354 0.5212 0.7165 0.5212 0.7219
No log 5.8361 356 0.5039 0.6957 0.5039 0.7098
No log 5.8689 358 0.5536 0.7799 0.5536 0.7440
No log 5.9016 360 0.6016 0.7660 0.6016 0.7756
No log 5.9344 362 0.5971 0.7660 0.5971 0.7728
No log 5.9672 364 0.6022 0.7660 0.6022 0.7760
No log 6.0 366 0.6095 0.7660 0.6095 0.7807
No log 6.0328 368 0.6123 0.7660 0.6123 0.7825
No log 6.0656 370 0.5892 0.7799 0.5892 0.7676
No log 6.0984 372 0.5380 0.6872 0.5380 0.7335
No log 6.1311 374 0.4849 0.6376 0.4849 0.6963
No log 6.1639 376 0.4778 0.6461 0.4778 0.6912
No log 6.1967 378 0.4778 0.6571 0.4778 0.6912
No log 6.2295 380 0.5369 0.7492 0.5369 0.7328
No log 6.2623 382 0.6685 0.7660 0.6685 0.8176
No log 6.2951 384 0.8414 0.7961 0.8414 0.9173
No log 6.3279 386 0.9174 0.7375 0.9174 0.9578
No log 6.3607 388 0.8507 0.7961 0.8507 0.9223
No log 6.3934 390 0.7447 0.7853 0.7447 0.8630
No log 6.4262 392 0.6332 0.7853 0.6332 0.7957
No log 6.4590 394 0.5289 0.7799 0.5289 0.7272
No log 6.4918 396 0.5023 0.6510 0.5023 0.7087
No log 6.5246 398 0.5171 0.7799 0.5171 0.7191
No log 6.5574 400 0.5153 0.7799 0.5153 0.7179
No log 6.5902 402 0.5191 0.7799 0.5191 0.7205
No log 6.6230 404 0.5519 0.7799 0.5519 0.7429
No log 6.6557 406 0.6037 0.7799 0.6037 0.7770
No log 6.6885 408 0.6432 0.7853 0.6432 0.8020
No log 6.7213 410 0.6038 0.8 0.6038 0.7770
No log 6.7541 412 0.5172 0.7799 0.5172 0.7191
No log 6.7869 414 0.4696 0.7217 0.4696 0.6853
No log 6.8197 416 0.4841 0.7217 0.4841 0.6958
No log 6.8525 418 0.5467 0.8 0.5467 0.7394
No log 6.8852 420 0.5967 0.8 0.5967 0.7725
No log 6.9180 422 0.6295 0.7853 0.6295 0.7934
No log 6.9508 424 0.6321 0.7853 0.6321 0.7951
No log 6.9836 426 0.6771 0.7853 0.6771 0.8229
No log 7.0164 428 0.6710 0.7853 0.6710 0.8191
No log 7.0492 430 0.6420 0.7853 0.6420 0.8013
No log 7.0820 432 0.5988 0.7451 0.5988 0.7739
No log 7.1148 434 0.5451 0.7515 0.5451 0.7383
No log 7.1475 436 0.5375 0.7515 0.5375 0.7332
No log 7.1803 438 0.5673 0.7515 0.5673 0.7532
No log 7.2131 440 0.5879 0.7799 0.5879 0.7667
No log 7.2459 442 0.6104 0.7660 0.6104 0.7813
No log 7.2787 444 0.6115 0.7853 0.6115 0.7820
No log 7.3115 446 0.6126 0.7853 0.6126 0.7827
No log 7.3443 448 0.5635 0.7290 0.5635 0.7507
No log 7.3770 450 0.5187 0.7290 0.5187 0.7202
No log 7.4098 452 0.5160 0.7290 0.5160 0.7183
No log 7.4426 454 0.5346 0.7290 0.5346 0.7312
No log 7.4754 456 0.5878 0.7290 0.5878 0.7667
No log 7.5082 458 0.6642 0.8 0.6642 0.8150
No log 7.5410 460 0.7359 0.7961 0.7359 0.8579
No log 7.5738 462 0.7502 0.7961 0.7502 0.8661
No log 7.6066 464 0.7494 0.7853 0.7494 0.8657
No log 7.6393 466 0.7149 0.7660 0.7149 0.8455
No log 7.6721 468 0.7185 0.7660 0.7185 0.8476
No log 7.7049 470 0.6875 0.7048 0.6875 0.8291
No log 7.7377 472 0.6582 0.7048 0.6582 0.8113
No log 7.7705 474 0.6068 0.7290 0.6068 0.7790
No log 7.8033 476 0.5752 0.7290 0.5752 0.7584
No log 7.8361 478 0.5840 0.7290 0.5840 0.7642
No log 7.8689 480 0.6174 0.7048 0.6174 0.7858
No log 7.9016 482 0.6425 0.7048 0.6425 0.8015
No log 7.9344 484 0.6592 0.7660 0.6592 0.8119
No log 7.9672 486 0.6432 0.7048 0.6432 0.8020
No log 8.0 488 0.6129 0.7048 0.6129 0.7829
No log 8.0328 490 0.5823 0.7165 0.5823 0.7631
No log 8.0656 492 0.5905 0.7165 0.5905 0.7685
No log 8.0984 494 0.6252 0.7660 0.6252 0.7907
No log 8.1311 496 0.6848 0.7853 0.6848 0.8275
No log 8.1639 498 0.7161 0.7853 0.7161 0.8462
0.3667 8.1967 500 0.7085 0.7853 0.7085 0.8418
0.3667 8.2295 502 0.6616 0.7853 0.6616 0.8134
0.3667 8.2623 504 0.5925 0.8158 0.5925 0.7697
0.3667 8.2951 506 0.5359 0.7290 0.5359 0.7320
0.3667 8.3279 508 0.4899 0.6597 0.4899 0.6999
0.3667 8.3607 510 0.4802 0.6597 0.4802 0.6930
0.3667 8.3934 512 0.4862 0.6597 0.4862 0.6973
0.3667 8.4262 514 0.5031 0.7290 0.5031 0.7093
0.3667 8.4590 516 0.5375 0.7290 0.5375 0.7332
0.3667 8.4918 518 0.5962 0.7799 0.5962 0.7721
0.3667 8.5246 520 0.6515 0.7853 0.6515 0.8071
0.3667 8.5574 522 0.7036 0.7853 0.7036 0.8388
0.3667 8.5902 524 0.7329 0.7853 0.7329 0.8561
0.3667 8.6230 526 0.7186 0.7853 0.7186 0.8477
0.3667 8.6557 528 0.6812 0.7853 0.6812 0.8253
0.3667 8.6885 530 0.6408 0.7853 0.6407 0.8005
0.3667 8.7213 532 0.6196 0.7853 0.6196 0.7871
0.3667 8.7541 534 0.6145 0.7853 0.6145 0.7839
0.3667 8.7869 536 0.6196 0.7853 0.6196 0.7871
0.3667 8.8197 538 0.6166 0.7853 0.6166 0.7853
0.3667 8.8525 540 0.6028 0.7853 0.6028 0.7764
0.3667 8.8852 542 0.5942 0.7799 0.5942 0.7709
0.3667 8.9180 544 0.5816 0.7948 0.5816 0.7626
0.3667 8.9508 546 0.5805 0.7290 0.5805 0.7619
0.3667 8.9836 548 0.5871 0.7799 0.5871 0.7662
0.3667 9.0164 550 0.6065 0.7660 0.6065 0.7788
0.3667 9.0492 552 0.6185 0.7853 0.6185 0.7865
0.3667 9.0820 554 0.6198 0.7853 0.6198 0.7873
0.3667 9.1148 556 0.6246 0.7853 0.6246 0.7903
0.3667 9.1475 558 0.6428 0.7853 0.6428 0.8018
0.3667 9.1803 560 0.6476 0.7853 0.6476 0.8047
0.3667 9.2131 562 0.6449 0.7853 0.6449 0.8030
0.3667 9.2459 564 0.6364 0.7853 0.6364 0.7978
0.3667 9.2787 566 0.6291 0.7853 0.6291 0.7932
0.3667 9.3115 568 0.6177 0.7853 0.6177 0.7860
0.3667 9.3443 570 0.6058 0.7660 0.6058 0.7783
0.3667 9.3770 572 0.6031 0.7660 0.6031 0.7766
0.3667 9.4098 574 0.6027 0.7853 0.6027 0.7763
0.3667 9.4426 576 0.6041 0.7853 0.6041 0.7773
0.3667 9.4754 578 0.6131 0.7853 0.6131 0.7830
0.3667 9.5082 580 0.6242 0.7853 0.6242 0.7901
0.3667 9.5410 582 0.6275 0.7853 0.6275 0.7922
0.3667 9.5738 584 0.6330 0.7853 0.6330 0.7956
0.3667 9.6066 586 0.6381 0.7853 0.6381 0.7988
0.3667 9.6393 588 0.6442 0.7853 0.6442 0.8026
0.3667 9.6721 590 0.6429 0.7853 0.6429 0.8018
0.3667 9.7049 592 0.6357 0.7853 0.6357 0.7973
0.3667 9.7377 594 0.6294 0.7853 0.6294 0.7933
0.3667 9.7705 596 0.6250 0.7853 0.6250 0.7906
0.3667 9.8033 598 0.6247 0.7853 0.6247 0.7904
0.3667 9.8361 600 0.6239 0.7853 0.6239 0.7899
0.3667 9.8689 602 0.6250 0.7853 0.6250 0.7906
0.3667 9.9016 604 0.6261 0.7853 0.6261 0.7912
0.3667 9.9344 606 0.6275 0.7853 0.6275 0.7922
0.3667 9.9672 608 0.6288 0.7853 0.6288 0.7930
0.3667 10.0 610 0.6295 0.7853 0.6295 0.7934
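
The Qwk, Mse, and Rmse columns above are quadratic weighted kappa, mean squared error, and root mean squared error on the validation set. A hedged sketch of a compute_metrics function that would produce values of this shape is shown below; rounding the continuous predictions and labels to integer classes before the kappa computation is an assumption, not something documented in this card.

```python
# Sketch: metrics matching the columns above (Qwk, Mse, Rmse).
# Assumes a single-output regression head whose predictions are rounded to the
# nearest integer label before computing quadratic weighted kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    predictions = predictions.squeeze()
    mse = mean_squared_error(labels, predictions)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(predictions).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```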

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1