Arabic_FineTuningAraBERT_AugV0_k10_task1_organization_fold1

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 0.3837
  • QWK (quadratic weighted kappa): 0.7956
  • MSE: 0.3837
  • RMSE: 0.6194
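
The loss equals the MSE (0.3837), which indicates a regression-style (mean-squared-error) training objective. QWK is Cohen's kappa with quadratic weights w_ij = (i - j)^2 / (N - 1)^2, so larger disagreements between predicted and true scores are penalized more. The evaluation code is not published; the following is a minimal sketch of how these metrics are commonly computed, assuming scikit-learn and rounding of continuous model outputs to integer scores:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred_scores):
    """Sketch (assumed, not the author's code): QWK/MSE/RMSE as reported above."""
    # QWK requires discrete labels; rounding continuous regression outputs
    # to the nearest integer score is an assumption of this sketch.
    y_pred = np.rint(y_pred_scores).astype(int)
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred_scores)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```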

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
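
For reference, a minimal TrainingArguments sketch mirroring these values; output_dir is a hypothetical placeholder, and the dataset and Trainer wiring are not part of this card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; the Adam betas/epsilon and the
# linear scheduler match the transformers defaults reported in this card.
training_args = TrainingArguments(
    output_dir="arabert_task1_organization_fold1",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```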

Training results

Training loss appears as "No log" until the Trainer's first logging step (500 by default), which is why a value (0.2822) is only shown from step 500 onward.

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0333 2 5.3272 -0.0229 5.3272 2.3081
No log 0.0667 4 2.4707 0.0272 2.4707 1.5718
No log 0.1 6 1.2300 0.0978 1.2300 1.1091
No log 0.1333 8 0.8324 0.2473 0.8324 0.9124
No log 0.1667 10 0.9418 -0.0736 0.9418 0.9705
No log 0.2 12 1.0142 0.0242 1.0142 1.0071
No log 0.2333 14 1.0150 0.0 1.0150 1.0075
No log 0.2667 16 0.9577 0.0 0.9577 0.9786
No log 0.3 18 0.8847 0.0 0.8847 0.9406
No log 0.3333 20 0.8436 0.0 0.8436 0.9185
No log 0.3667 22 0.7170 0.3253 0.7170 0.8467
No log 0.4 24 0.6143 0.56 0.6143 0.7838
No log 0.4333 26 0.5680 0.56 0.5680 0.7536
No log 0.4667 28 0.5526 0.5930 0.5526 0.7434
No log 0.5 30 0.8593 0.1667 0.8593 0.9270
No log 0.5333 32 0.9161 0.1667 0.9161 0.9571
No log 0.5667 34 0.7690 0.1529 0.7689 0.8769
No log 0.6 36 0.6763 0.3253 0.6763 0.8224
No log 0.6333 38 0.6226 0.3708 0.6226 0.7891
No log 0.6667 40 0.5855 0.4615 0.5855 0.7652
No log 0.7 42 0.5754 0.3277 0.5754 0.7585
No log 0.7333 44 0.4691 0.5333 0.4691 0.6849
No log 0.7667 46 0.4788 0.4615 0.4788 0.6919
No log 0.8 48 0.5416 0.4096 0.5416 0.7359
No log 0.8333 50 0.5221 0.5930 0.5221 0.7226
No log 0.8667 52 0.4854 0.5444 0.4854 0.6967
No log 0.9 54 0.4692 0.5444 0.4692 0.6850
No log 0.9333 56 0.4940 0.5444 0.4940 0.7028
No log 0.9667 58 0.6791 0.3368 0.6791 0.8241
No log 1.0 60 0.9198 0.3383 0.9198 0.9591
No log 1.0333 62 0.9387 0.3383 0.9387 0.9689
No log 1.0667 64 0.8887 0.3396 0.8887 0.9427
No log 1.1 66 0.6818 0.4979 0.6818 0.8257
No log 1.1333 68 0.4038 0.6535 0.4038 0.6354
No log 1.1667 70 0.4140 0.6280 0.4140 0.6434
No log 1.2 72 0.4483 0.6280 0.4483 0.6696
No log 1.2333 74 0.3798 0.5435 0.3798 0.6163
No log 1.2667 76 0.3812 0.7319 0.3812 0.6174
No log 1.3 78 0.5829 0.6500 0.5829 0.7635
No log 1.3333 80 0.7160 0.5484 0.7160 0.8462
No log 1.3667 82 0.6768 0.3824 0.6768 0.8227
No log 1.4 84 0.6115 0.3563 0.6115 0.7820
No log 1.4333 86 0.6212 0.3253 0.6212 0.7881
No log 1.4667 88 0.5771 0.4615 0.5771 0.7597
No log 1.5 90 0.5324 0.4724 0.5324 0.7296
No log 1.5333 92 0.5188 0.5333 0.5188 0.7203
No log 1.5667 94 0.4680 0.5991 0.4680 0.6841
No log 1.6 96 0.4323 0.6778 0.4323 0.6575
No log 1.6333 98 0.4168 0.6778 0.4168 0.6456
No log 1.6667 100 0.4192 0.6385 0.4192 0.6474
No log 1.7 102 0.4639 0.5333 0.4639 0.6811
No log 1.7333 104 0.4942 0.5236 0.4942 0.7030
No log 1.7667 106 0.6060 0.2125 0.6060 0.7784
No log 1.8 108 0.7025 0.2125 0.7025 0.8381
No log 1.8333 110 0.7285 0.2699 0.7285 0.8535
No log 1.8667 112 0.6746 0.2125 0.6746 0.8213
No log 1.9 114 0.5491 0.4251 0.5491 0.7410
No log 1.9333 116 0.4894 0.5249 0.4894 0.6996
No log 1.9667 118 0.4749 0.6723 0.4749 0.6891
No log 2.0 120 0.4278 0.7154 0.4278 0.6541
No log 2.0333 122 0.3553 0.7287 0.3553 0.5961
No log 2.0667 124 0.3283 0.7266 0.3283 0.5730
No log 2.1 126 0.3337 0.7529 0.3337 0.5777
No log 2.1333 128 0.4231 0.7605 0.4231 0.6505
No log 2.1667 130 0.5480 0.7154 0.5480 0.7403
No log 2.2 132 0.5200 0.6379 0.5200 0.7211
No log 2.2333 134 0.3948 0.7222 0.3948 0.6283
No log 2.2667 136 0.3938 0.6351 0.3938 0.6275
No log 2.3 138 0.4639 0.6 0.4639 0.6811
No log 2.3333 140 0.6642 0.5415 0.6642 0.8150
No log 2.3667 142 0.7573 0.3814 0.7573 0.8703
No log 2.4 144 0.7295 0.3814 0.7295 0.8541
No log 2.4333 146 0.6086 0.4862 0.6086 0.7801
No log 2.4667 148 0.5096 0.6908 0.5096 0.7139
No log 2.5 150 0.5600 0.6908 0.5600 0.7483
No log 2.5333 152 0.5866 0.7159 0.5866 0.7659
No log 2.5667 154 0.4327 0.7605 0.4327 0.6578
No log 2.6 156 0.2880 0.7482 0.2880 0.5367
No log 2.6333 158 0.3852 0.75 0.3852 0.6206
No log 2.6667 160 0.3564 0.7846 0.3564 0.5970
No log 2.7 162 0.2749 0.7050 0.2749 0.5244
No log 2.7333 164 0.4500 0.7308 0.4500 0.6708
No log 2.7667 166 0.5873 0.4979 0.5873 0.7664
No log 2.8 168 0.6276 0.4979 0.6276 0.7922
No log 2.8333 170 0.6444 0.4979 0.6444 0.8028
No log 2.8667 172 0.5932 0.5917 0.5932 0.7702
No log 2.9 174 0.5010 0.6111 0.5010 0.7078
No log 2.9333 176 0.4910 0.6111 0.4910 0.7007
No log 2.9667 178 0.5376 0.5172 0.5376 0.7332
No log 3.0 180 0.5996 0.5415 0.5996 0.7743
No log 3.0333 182 0.7195 0.5484 0.7195 0.8482
No log 3.0667 184 0.7186 0.5484 0.7186 0.8477
No log 3.1 186 0.5675 0.5415 0.5675 0.7533
No log 3.1333 188 0.4600 0.5845 0.4600 0.6783
No log 3.1667 190 0.4597 0.6111 0.4597 0.6780
No log 3.2 192 0.5296 0.5679 0.5296 0.7277
No log 3.2333 194 0.6972 0.5563 0.6972 0.8350
No log 3.2667 196 0.7460 0.5563 0.7460 0.8637
No log 3.3 198 0.6051 0.6260 0.6051 0.7779
No log 3.3333 200 0.4807 0.6255 0.4807 0.6933
No log 3.3667 202 0.4120 0.6128 0.4120 0.6419
No log 3.4 204 0.3912 0.7287 0.3912 0.6254
No log 3.4333 206 0.4889 0.5917 0.4889 0.6992
No log 3.4667 208 0.5522 0.5917 0.5522 0.7431
No log 3.5 210 0.4959 0.6831 0.4959 0.7042
No log 3.5333 212 0.4247 0.7386 0.4247 0.6517
No log 3.5667 214 0.3866 0.7131 0.3866 0.6218
No log 3.6 216 0.4096 0.7386 0.4096 0.6400
No log 3.6333 218 0.4293 0.7059 0.4293 0.6552
No log 3.6667 220 0.4317 0.7529 0.4317 0.6570
No log 3.7 222 0.3652 0.8123 0.3652 0.6043
No log 3.7333 224 0.3200 0.8123 0.3200 0.5657
No log 3.7667 226 0.3056 0.8123 0.3056 0.5528
No log 3.8 228 0.3841 0.7829 0.3841 0.6197
No log 3.8333 230 0.5948 0.7036 0.5948 0.7712
No log 3.8667 232 0.6784 0.6621 0.6784 0.8237
No log 3.9 234 0.5592 0.7549 0.5592 0.7478
No log 3.9333 236 0.3694 0.7705 0.3694 0.6078
No log 3.9667 238 0.3332 0.8063 0.3332 0.5772
No log 4.0 240 0.3424 0.7449 0.3424 0.5851
No log 4.0333 242 0.4364 0.7154 0.4364 0.6606
No log 4.0667 244 0.5033 0.6255 0.5033 0.7094
No log 4.1 246 0.5120 0.6255 0.5120 0.7155
No log 4.1333 248 0.4381 0.6255 0.4381 0.6619
No log 4.1667 250 0.4425 0.6016 0.4425 0.6652
No log 4.2 252 0.4633 0.6255 0.4633 0.6806
No log 4.2333 254 0.5072 0.6255 0.5072 0.7122
No log 4.2667 256 0.6083 0.6375 0.6083 0.7799
No log 4.3 258 0.5754 0.5917 0.5754 0.7585
No log 4.3333 260 0.5201 0.6255 0.5201 0.7212
No log 4.3667 262 0.4204 0.6908 0.4204 0.6484
No log 4.4 264 0.4086 0.6908 0.4086 0.6392
No log 4.4333 266 0.4639 0.7154 0.4639 0.6811
No log 4.4667 268 0.4899 0.7154 0.4899 0.7000
No log 4.5 270 0.4793 0.7154 0.4793 0.6923
No log 4.5333 272 0.4708 0.7154 0.4708 0.6862
No log 4.5667 274 0.3748 0.7368 0.3748 0.6122
No log 4.6 276 0.2967 0.7705 0.2967 0.5447
No log 4.6333 278 0.2852 0.7705 0.2852 0.5340
No log 4.6667 280 0.3155 0.7529 0.3155 0.5617
No log 4.7 282 0.4066 0.7956 0.4066 0.6376
No log 4.7333 284 0.4458 0.7635 0.4458 0.6677
No log 4.7667 286 0.3529 0.7569 0.3529 0.5940
No log 4.8 288 0.2637 0.7529 0.2637 0.5136
No log 4.8333 290 0.2642 0.7829 0.2642 0.5140
No log 4.8667 292 0.3204 0.7368 0.3204 0.5661
No log 4.9 294 0.3225 0.7368 0.3225 0.5679
No log 4.9333 296 0.2846 0.7658 0.2846 0.5335
No log 4.9667 298 0.2787 0.8016 0.2787 0.5279
No log 5.0 300 0.2917 0.7658 0.2917 0.5401
No log 5.0333 302 0.3794 0.7368 0.3794 0.6160
No log 5.0667 304 0.4266 0.7956 0.4266 0.6532
No log 5.1 306 0.4227 0.7956 0.4227 0.6502
No log 5.1333 308 0.3602 0.7658 0.3602 0.6001
No log 5.1667 310 0.3117 0.776 0.3117 0.5583
No log 5.2 312 0.3215 0.776 0.3215 0.5670
No log 5.2333 314 0.3297 0.7829 0.3297 0.5742
No log 5.2667 316 0.3672 0.7658 0.3672 0.6059
No log 5.3 318 0.4508 0.7956 0.4508 0.6714
No log 5.3333 320 0.5150 0.7956 0.5150 0.7176
No log 5.3667 322 0.4737 0.7956 0.4737 0.6883
No log 5.4 324 0.4551 0.7956 0.4551 0.6746
No log 5.4333 326 0.3946 0.7956 0.3946 0.6282
No log 5.4667 328 0.3028 0.7658 0.3028 0.5503
No log 5.5 330 0.2915 0.7658 0.2915 0.5399
No log 5.5333 332 0.3268 0.7658 0.3268 0.5717
No log 5.5667 334 0.4061 0.8000 0.4061 0.6373
No log 5.6 336 0.4354 0.8000 0.4354 0.6599
No log 5.6333 338 0.4549 0.7726 0.4549 0.6745
No log 5.6667 340 0.4729 0.7956 0.4729 0.6877
No log 5.7 342 0.4471 0.7726 0.4471 0.6686
No log 5.7333 344 0.4925 0.6934 0.4925 0.7018
No log 5.7667 346 0.5541 0.7159 0.5541 0.7444
No log 5.8 348 0.5142 0.7159 0.5142 0.7171
No log 5.8333 350 0.4914 0.6934 0.4914 0.7010
No log 5.8667 352 0.4603 0.6540 0.4603 0.6785
No log 5.9 354 0.4230 0.6111 0.4230 0.6504
No log 5.9333 356 0.4183 0.6224 0.4183 0.6468
No log 5.9667 358 0.4338 0.6111 0.4338 0.6586
No log 6.0 360 0.4975 0.6769 0.4975 0.7054
No log 6.0333 362 0.5502 0.6343 0.5502 0.7418
No log 6.0667 364 0.5120 0.7159 0.5120 0.7156
No log 6.1 366 0.4009 0.7368 0.4009 0.6332
No log 6.1333 368 0.3302 0.7829 0.3302 0.5746
No log 6.1667 370 0.3167 0.7586 0.3167 0.5628
No log 6.2 372 0.3118 0.7586 0.3118 0.5584
No log 6.2333 374 0.3433 0.7368 0.3433 0.5859
No log 6.2667 376 0.4088 0.7726 0.4088 0.6394
No log 6.3 378 0.4783 0.7789 0.4783 0.6916
No log 6.3333 380 0.4528 0.7789 0.4528 0.6729
No log 6.3667 382 0.3579 0.7726 0.3579 0.5983
No log 6.4 384 0.3167 0.7658 0.3167 0.5628
No log 6.4333 386 0.3254 0.8000 0.3254 0.5704
No log 6.4667 388 0.3517 0.8000 0.3517 0.5930
No log 6.5 390 0.3881 0.7956 0.3881 0.6229
No log 6.5333 392 0.4145 0.7956 0.4145 0.6438
No log 6.5667 394 0.3831 0.7956 0.3831 0.6190
No log 6.6 396 0.4029 0.7956 0.4029 0.6347
No log 6.6333 398 0.4239 0.7956 0.4239 0.6511
No log 6.6667 400 0.4392 0.7956 0.4392 0.6627
No log 6.7 402 0.4047 0.7956 0.4047 0.6362
No log 6.7333 404 0.3862 0.7956 0.3862 0.6215
No log 6.7667 406 0.4092 0.7956 0.4092 0.6397
No log 6.8 408 0.3856 0.7956 0.3856 0.6210
No log 6.8333 410 0.3644 0.7726 0.3644 0.6036
No log 6.8667 412 0.3262 0.7658 0.3262 0.5711
No log 6.9 414 0.3147 0.7829 0.3147 0.5609
No log 6.9333 416 0.3346 0.7368 0.3346 0.5784
No log 6.9667 418 0.3912 0.7956 0.3912 0.6255
No log 7.0 420 0.4878 0.7789 0.4878 0.6984
No log 7.0333 422 0.5251 0.7789 0.5251 0.7247
No log 7.0667 424 0.4830 0.7789 0.4830 0.6950
No log 7.1 426 0.3969 0.7956 0.3969 0.6300
No log 7.1333 428 0.3551 0.7726 0.3551 0.5959
No log 7.1667 430 0.3511 0.7726 0.3511 0.5925
No log 7.2 432 0.3588 0.7726 0.3588 0.5990
No log 7.2333 434 0.3671 0.7726 0.3671 0.6059
No log 7.2667 436 0.3792 0.7569 0.3792 0.6158
No log 7.3 438 0.3519 0.7726 0.3519 0.5932
No log 7.3333 440 0.3213 0.7368 0.3213 0.5669
No log 7.3667 442 0.3169 0.7368 0.3169 0.5630
No log 7.4 444 0.3411 0.7726 0.3411 0.5841
No log 7.4333 446 0.4070 0.7789 0.4070 0.6380
No log 7.4667 448 0.4527 0.7789 0.4527 0.6729
No log 7.5 450 0.4499 0.7789 0.4499 0.6708
No log 7.5333 452 0.4132 0.7789 0.4132 0.6428
No log 7.5667 454 0.3848 0.7956 0.3848 0.6203
No log 7.6 456 0.3901 0.7956 0.3901 0.6246
No log 7.6333 458 0.4138 0.7956 0.4138 0.6433
No log 7.6667 460 0.4520 0.7549 0.4520 0.6723
No log 7.7 462 0.5068 0.7388 0.5068 0.7119
No log 7.7333 464 0.5143 0.7388 0.5143 0.7171
No log 7.7667 466 0.4887 0.7388 0.4887 0.6990
No log 7.8 468 0.4707 0.7549 0.4707 0.6861
No log 7.8333 470 0.4580 0.7846 0.4580 0.6767
No log 7.8667 472 0.4729 0.7388 0.4729 0.6877
No log 7.9 474 0.4957 0.7388 0.4957 0.7041
No log 7.9333 476 0.5342 0.7388 0.5342 0.7309
No log 7.9667 478 0.5576 0.7388 0.5576 0.7467
No log 8.0 480 0.5802 0.7789 0.5802 0.7617
No log 8.0333 482 0.5627 0.7789 0.5627 0.7501
No log 8.0667 484 0.5264 0.7789 0.5264 0.7255
No log 8.1 486 0.4980 0.7789 0.4980 0.7057
No log 8.1333 488 0.4650 0.7956 0.4650 0.6819
No log 8.1667 490 0.4352 0.7956 0.4352 0.6597
No log 8.2 492 0.4130 0.8231 0.4130 0.6426
No log 8.2333 494 0.4236 0.7956 0.4236 0.6508
No log 8.2667 496 0.4448 0.7956 0.4448 0.6670
No log 8.3 498 0.4526 0.7956 0.4526 0.6728
0.2822 8.3333 500 0.4484 0.7956 0.4484 0.6697
0.2822 8.3667 502 0.4343 0.7956 0.4343 0.6590
0.2822 8.4 504 0.3914 0.7956 0.3914 0.6256
0.2822 8.4333 506 0.3709 0.7956 0.3709 0.6090
0.2822 8.4667 508 0.3650 0.7956 0.3650 0.6041
0.2822 8.5 510 0.3891 0.7956 0.3891 0.6237
0.2822 8.5333 512 0.4027 0.7956 0.4027 0.6346
0.2822 8.5667 514 0.4171 0.7789 0.4171 0.6458
0.2822 8.6 516 0.4328 0.7789 0.4328 0.6579
0.2822 8.6333 518 0.4336 0.7789 0.4336 0.6585
0.2822 8.6667 520 0.4475 0.7789 0.4475 0.6690
0.2822 8.7 522 0.4557 0.7789 0.4557 0.6750
0.2822 8.7333 524 0.4500 0.7789 0.4500 0.6709
0.2822 8.7667 526 0.4493 0.7789 0.4493 0.6703
0.2822 8.8 528 0.4347 0.7789 0.4347 0.6593
0.2822 8.8333 530 0.4237 0.7789 0.4237 0.6509
0.2822 8.8667 532 0.3966 0.7956 0.3966 0.6298
0.2822 8.9 534 0.3758 0.7956 0.3758 0.6131
0.2822 8.9333 536 0.3630 0.7956 0.3630 0.6025
0.2822 8.9667 538 0.3647 0.7956 0.3647 0.6039
0.2822 9.0 540 0.3786 0.7956 0.3786 0.6153
0.2822 9.0333 542 0.3937 0.7956 0.3937 0.6275
0.2822 9.0667 544 0.4096 0.7956 0.4096 0.6400
0.2822 9.1 546 0.4053 0.7956 0.4053 0.6367
0.2822 9.1333 548 0.3887 0.7956 0.3887 0.6234
0.2822 9.1667 550 0.3697 0.7956 0.3697 0.6080
0.2822 9.2 552 0.3558 0.7956 0.3558 0.5965
0.2822 9.2333 554 0.3546 0.7956 0.3546 0.5955
0.2822 9.2667 556 0.3634 0.7956 0.3634 0.6028
0.2822 9.3 558 0.3709 0.7956 0.3709 0.6090
0.2822 9.3333 560 0.3798 0.7956 0.3798 0.6163
0.2822 9.3667 562 0.3919 0.7956 0.3919 0.6261
0.2822 9.4 564 0.4106 0.7789 0.4106 0.6408
0.2822 9.4333 566 0.4264 0.7789 0.4264 0.6530
0.2822 9.4667 568 0.4329 0.7789 0.4329 0.6580
0.2822 9.5 570 0.4316 0.7789 0.4316 0.6570
0.2822 9.5333 572 0.4315 0.7789 0.4315 0.6569
0.2822 9.5667 574 0.4326 0.7789 0.4326 0.6577
0.2822 9.6 576 0.4234 0.7789 0.4234 0.6507
0.2822 9.6333 578 0.4124 0.7789 0.4124 0.6422
0.2822 9.6667 580 0.4092 0.7789 0.4092 0.6397
0.2822 9.7 582 0.4059 0.7789 0.4059 0.6371
0.2822 9.7333 584 0.3996 0.7956 0.3996 0.6322
0.2822 9.7667 586 0.3960 0.7956 0.3960 0.6293
0.2822 9.8 588 0.3948 0.7956 0.3948 0.6283
0.2822 9.8333 590 0.3937 0.7956 0.3937 0.6274
0.2822 9.8667 592 0.3912 0.7956 0.3912 0.6255
0.2822 9.9 594 0.3879 0.7956 0.3879 0.6228
0.2822 9.9333 596 0.3854 0.7956 0.3854 0.6208
0.2822 9.9667 598 0.3844 0.7956 0.3844 0.6200
0.2822 10.0 600 0.3837 0.7956 0.3837 0.6194

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
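
Usage: a minimal loading and inference sketch. The sequence-classification head and the example input are assumptions; check the repository's config.json for the actual head and label setup.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repository id taken from this model card.
model_id = "MayBashendy/Arabic_FineTuningAraBERT_AugV0_k10_task1_organization_fold1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Hypothetical input: an Arabic sentence to be scored.
inputs = tokenizer("هذا مثال لنص عربي.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```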
Model size: 135M parameters (F32, stored as safetensors).