ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run1_AugV5_k17_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6216
  • QWK (quadratic weighted kappa): 0.4473
  • MSE (mean squared error): 0.6216
  • RMSE (root mean squared error): 0.7884
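
QWK measures agreement between predicted and gold ordinal scores, penalizing large disagreements quadratically, and RMSE is simply the square root of MSE (0.7884 ≈ √0.6216 above). The card does not include its evaluation code, so the following pure-Python sketch of the three metrics is illustrative only; it assumes integer labels in the range [0, n_classes):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK for integer labels in [0, n_classes)."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal label histograms (rows = gold, columns = predicted)
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / n      # chance-agreement count
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 1, 3, 2, 2]
print(quadratic_weighted_kappa(y_true, y_pred, n_classes=4))
print(mse(y_true, y_pred), math.sqrt(mse(y_true, y_pred)))  # RMSE = sqrt(MSE)
```

In practice the same numbers can be obtained from sklearn.metrics.cohen_kappa_score(..., weights="quadratic") and sklearn.metrics.mean_squared_error.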

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
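
No warmup steps are listed, so with lr_scheduler_type: linear the learning rate presumably decays linearly from 2e-05 toward 0 over the training run, as the standard linear schedule in transformers does. A minimal sketch of that schedule (the 1000-step horizon is an invented illustration, not this run's actual step count):

```python
def linear_lr(step, base_lr=2e-05, total_steps=1000, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, midpoint, and end of training
print(linear_lr(0), linear_lr(500), linear_lr(1000))
```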

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0227 2 4.1099 -0.0156 4.1099 2.0273
No log 0.0455 4 2.2908 0.0624 2.2908 1.5135
No log 0.0682 6 1.2693 0.0258 1.2693 1.1266
No log 0.0909 8 1.0996 -0.0865 1.0996 1.0486
No log 0.1136 10 0.8345 0.2010 0.8345 0.9135
No log 0.1364 12 0.8154 0.1624 0.8154 0.9030
No log 0.1591 14 0.8399 0.1744 0.8399 0.9164
No log 0.1818 16 0.8794 0.1994 0.8794 0.9378
No log 0.2045 18 1.0899 0.0906 1.0899 1.0440
No log 0.2273 20 1.3842 -0.0568 1.3842 1.1765
No log 0.25 22 1.3723 0.0152 1.3723 1.1715
No log 0.2727 24 1.5472 0.0756 1.5472 1.2439
No log 0.2955 26 1.5717 0.0323 1.5717 1.2537
No log 0.3182 28 1.3298 -0.0290 1.3298 1.1532
No log 0.3409 30 1.0326 0.0574 1.0326 1.0162
No log 0.3636 32 0.9026 0.1496 0.9026 0.9500
No log 0.3864 34 0.8495 0.2239 0.8495 0.9217
No log 0.4091 36 0.9157 0.1867 0.9157 0.9569
No log 0.4318 38 0.9796 0.2049 0.9796 0.9897
No log 0.4545 40 1.1976 0.1888 1.1976 1.0943
No log 0.4773 42 1.1977 0.2031 1.1977 1.0944
No log 0.5 44 1.1920 0.1449 1.1920 1.0918
No log 0.5227 46 1.0787 0.1547 1.0787 1.0386
No log 0.5455 48 1.0565 0.0904 1.0565 1.0279
No log 0.5682 50 0.9601 0.1748 0.9601 0.9798
No log 0.5909 52 0.9102 0.2748 0.9102 0.9541
No log 0.6136 54 0.9617 0.2325 0.9617 0.9806
No log 0.6364 56 1.1189 0.2104 1.1189 1.0578
No log 0.6591 58 1.1292 0.1752 1.1292 1.0627
No log 0.6818 60 1.1897 0.0957 1.1897 1.0907
No log 0.7045 62 1.2487 0.1669 1.2487 1.1174
No log 0.7273 64 1.0565 0.1367 1.0565 1.0279
No log 0.75 66 0.9177 0.2267 0.9177 0.9579
No log 0.7727 68 0.9339 0.2267 0.9339 0.9664
No log 0.7955 70 0.9933 0.2169 0.9933 0.9966
No log 0.8182 72 1.0842 0.1652 1.0842 1.0412
No log 0.8409 74 1.0223 0.1928 1.0223 1.0111
No log 0.8636 76 1.0365 0.1872 1.0365 1.0181
No log 0.8864 78 1.1340 0.1805 1.1340 1.0649
No log 0.9091 80 1.0687 0.1805 1.0687 1.0338
No log 0.9318 82 0.8867 0.2571 0.8867 0.9417
No log 0.9545 84 0.7166 0.3409 0.7166 0.8465
No log 0.9773 86 0.6814 0.4413 0.6814 0.8254
No log 1.0 88 0.6562 0.4523 0.6562 0.8101
No log 1.0227 90 0.6544 0.4402 0.6544 0.8090
No log 1.0455 92 0.6549 0.4298 0.6549 0.8092
No log 1.0682 94 0.6617 0.4465 0.6617 0.8134
No log 1.0909 96 0.7562 0.4125 0.7562 0.8696
No log 1.1136 98 0.8636 0.3595 0.8636 0.9293
No log 1.1364 100 0.8333 0.3570 0.8333 0.9129
No log 1.1591 102 0.7301 0.4488 0.7301 0.8545
No log 1.1818 104 0.6612 0.4588 0.6612 0.8131
No log 1.2045 106 0.6500 0.4618 0.6500 0.8062
No log 1.2273 108 0.6944 0.3940 0.6944 0.8333
No log 1.25 110 0.6808 0.4496 0.6808 0.8251
No log 1.2727 112 0.6634 0.4389 0.6634 0.8145
No log 1.2955 114 0.6472 0.4614 0.6472 0.8045
No log 1.3182 116 0.6588 0.4396 0.6588 0.8117
No log 1.3409 118 0.6621 0.4482 0.6621 0.8137
No log 1.3636 120 0.6750 0.4827 0.6750 0.8216
No log 1.3864 122 0.6975 0.4072 0.6975 0.8352
No log 1.4091 124 0.6982 0.4856 0.6982 0.8356
No log 1.4318 126 0.7534 0.4575 0.7534 0.8680
No log 1.4545 128 1.0781 0.3302 1.0781 1.0383
No log 1.4773 130 0.9862 0.3447 0.9862 0.9931
No log 1.5 132 0.6761 0.48 0.6761 0.8223
No log 1.5227 134 0.6454 0.4151 0.6454 0.8034
No log 1.5455 136 0.6692 0.4261 0.6692 0.8180
No log 1.5682 138 0.6318 0.3661 0.6318 0.7949
No log 1.5909 140 0.8353 0.3908 0.8353 0.9140
No log 1.6136 142 1.0756 0.2009 1.0756 1.0371
No log 1.6364 144 0.8907 0.3191 0.8907 0.9438
No log 1.6591 146 0.6391 0.4784 0.6391 0.7994
No log 1.6818 148 0.6220 0.4102 0.6220 0.7887
No log 1.7045 150 0.6233 0.4284 0.6233 0.7895
No log 1.7273 152 0.7015 0.5507 0.7015 0.8376
No log 1.75 154 0.8576 0.4136 0.8576 0.9260
No log 1.7727 156 0.7904 0.4821 0.7904 0.8891
No log 1.7955 158 0.7411 0.4902 0.7411 0.8609
No log 1.8182 160 0.6802 0.4532 0.6802 0.8248
No log 1.8409 162 0.6132 0.4506 0.6132 0.7831
No log 1.8636 164 0.6203 0.4372 0.6203 0.7876
No log 1.8864 166 0.6264 0.4635 0.6264 0.7914
No log 1.9091 168 0.7215 0.4097 0.7215 0.8494
No log 1.9318 170 0.7232 0.4210 0.7232 0.8504
No log 1.9545 172 0.6766 0.4873 0.6766 0.8226
No log 1.9773 174 0.7119 0.5016 0.7119 0.8437
No log 2.0 176 0.8294 0.4151 0.8294 0.9107
No log 2.0227 178 0.9000 0.4199 0.9000 0.9487
No log 2.0455 180 0.9700 0.3984 0.9700 0.9849
No log 2.0682 182 0.8547 0.3862 0.8547 0.9245
No log 2.0909 184 0.7254 0.4838 0.7254 0.8517
No log 2.1136 186 0.6819 0.4387 0.6819 0.8258
No log 2.1364 188 0.6777 0.4476 0.6777 0.8232
No log 2.1591 190 0.6811 0.4970 0.6811 0.8253
No log 2.1818 192 0.7233 0.4467 0.7233 0.8505
No log 2.2045 194 0.7651 0.4584 0.7651 0.8747
No log 2.2273 196 0.6909 0.4964 0.6909 0.8312
No log 2.25 198 0.6619 0.4456 0.6619 0.8136
No log 2.2727 200 0.6885 0.4337 0.6885 0.8298
No log 2.2955 202 0.6699 0.4403 0.6699 0.8185
No log 2.3182 204 0.7049 0.4729 0.7049 0.8396
No log 2.3409 206 0.8128 0.4883 0.8128 0.9015
No log 2.3636 208 0.8613 0.4552 0.8613 0.9281
No log 2.3864 210 0.7691 0.4709 0.7691 0.8770
No log 2.4091 212 0.6917 0.5145 0.6917 0.8317
No log 2.4318 214 0.6983 0.4881 0.6983 0.8356
No log 2.4545 216 0.7654 0.4848 0.7654 0.8748
No log 2.4773 218 0.7111 0.4758 0.7111 0.8432
No log 2.5 220 0.6531 0.5053 0.6531 0.8081
No log 2.5227 222 0.6413 0.4714 0.6413 0.8008
No log 2.5455 224 0.6437 0.4851 0.6437 0.8023
No log 2.5682 226 0.6444 0.4490 0.6444 0.8027
No log 2.5909 228 0.6822 0.4722 0.6822 0.8259
No log 2.6136 230 0.7484 0.4467 0.7484 0.8651
No log 2.6364 232 0.7224 0.4744 0.7224 0.8499
No log 2.6591 234 0.6703 0.4895 0.6703 0.8187
No log 2.6818 236 0.6716 0.4608 0.6716 0.8195
No log 2.7045 238 0.6863 0.4228 0.6863 0.8284
No log 2.7273 240 0.7090 0.4871 0.7090 0.8420
No log 2.75 242 0.7463 0.4566 0.7463 0.8639
No log 2.7727 244 0.7124 0.4092 0.7124 0.8441
No log 2.7955 246 0.7213 0.4640 0.7213 0.8493
No log 2.8182 248 0.7497 0.4701 0.7497 0.8659
No log 2.8409 250 0.6819 0.4488 0.6819 0.8258
No log 2.8636 252 0.6575 0.4915 0.6575 0.8108
No log 2.8864 254 0.6840 0.4256 0.6840 0.8271
No log 2.9091 256 0.6749 0.4404 0.6749 0.8215
No log 2.9318 258 0.6717 0.5094 0.6717 0.8196
No log 2.9545 260 0.6992 0.4584 0.6992 0.8362
No log 2.9773 262 0.6968 0.4672 0.6968 0.8347
No log 3.0 264 0.7322 0.4705 0.7322 0.8557
No log 3.0227 266 0.6922 0.5066 0.6922 0.8320
No log 3.0455 268 0.6578 0.4930 0.6578 0.8110
No log 3.0682 270 0.6723 0.4548 0.6723 0.8199
No log 3.0909 272 0.6949 0.4233 0.6949 0.8336
No log 3.1136 274 0.6477 0.4813 0.6477 0.8048
No log 3.1364 276 0.6615 0.5041 0.6615 0.8133
No log 3.1591 278 0.6933 0.5109 0.6933 0.8327
No log 3.1818 280 0.6746 0.4884 0.6746 0.8214
No log 3.2045 282 0.6926 0.5223 0.6926 0.8322
No log 3.2273 284 0.7483 0.5112 0.7483 0.8651
No log 3.25 286 0.7441 0.5009 0.7441 0.8626
No log 3.2727 288 0.6650 0.5017 0.6650 0.8155
No log 3.2955 290 0.6669 0.3999 0.6669 0.8166
No log 3.3182 292 0.6960 0.4484 0.6960 0.8343
No log 3.3409 294 0.8392 0.5019 0.8392 0.9161
No log 3.3636 296 0.8603 0.4950 0.8603 0.9275
No log 3.3864 298 0.7032 0.4862 0.7032 0.8386
No log 3.4091 300 0.6496 0.4490 0.6496 0.8060
No log 3.4318 302 0.6590 0.4762 0.6590 0.8118
No log 3.4545 304 0.7671 0.4546 0.7671 0.8759
No log 3.4773 306 0.7460 0.4673 0.7460 0.8637
No log 3.5 308 0.6498 0.4586 0.6498 0.8061
No log 3.5227 310 0.6290 0.4260 0.6290 0.7931
No log 3.5455 312 0.6519 0.4024 0.6519 0.8074
No log 3.5682 314 0.6548 0.4289 0.6548 0.8092
No log 3.5909 316 0.6627 0.4578 0.6627 0.8140
No log 3.6136 318 0.6822 0.4648 0.6822 0.8260
No log 3.6364 320 0.7080 0.4549 0.7080 0.8415
No log 3.6591 322 0.7123 0.4649 0.7123 0.8440
No log 3.6818 324 0.7332 0.4347 0.7332 0.8563
No log 3.7045 326 0.7917 0.4308 0.7917 0.8898
No log 3.7273 328 0.7630 0.4033 0.7630 0.8735
No log 3.75 330 0.7321 0.4 0.7321 0.8556
No log 3.7727 332 0.7887 0.4180 0.7887 0.8881
No log 3.7955 334 0.8181 0.3937 0.8181 0.9045
No log 3.8182 336 0.7745 0.4180 0.7745 0.8800
No log 3.8409 338 0.7049 0.4443 0.7049 0.8396
No log 3.8636 340 0.7031 0.4306 0.7031 0.8385
No log 3.8864 342 0.6960 0.4071 0.6960 0.8342
No log 3.9091 344 0.7023 0.3894 0.7023 0.8380
No log 3.9318 346 0.6835 0.4041 0.6835 0.8268
No log 3.9545 348 0.6914 0.3920 0.6914 0.8315
No log 3.9773 350 0.7011 0.3954 0.7011 0.8373
No log 4.0 352 0.7412 0.4004 0.7412 0.8609
No log 4.0227 354 0.7582 0.4296 0.7582 0.8707
No log 4.0455 356 0.6931 0.4304 0.6931 0.8325
No log 4.0682 358 0.6891 0.4472 0.6891 0.8301
No log 4.0909 360 0.7184 0.4385 0.7184 0.8476
No log 4.1136 362 0.6909 0.4094 0.6909 0.8312
No log 4.1364 364 0.6648 0.3852 0.6648 0.8153
No log 4.1591 366 0.6788 0.4066 0.6788 0.8239
No log 4.1818 368 0.7118 0.4253 0.7118 0.8437
No log 4.2045 370 0.8165 0.4142 0.8165 0.9036
No log 4.2273 372 0.7992 0.4410 0.7992 0.8940
No log 4.25 374 0.7454 0.4354 0.7454 0.8634
No log 4.2727 376 0.7081 0.4248 0.7081 0.8415
No log 4.2955 378 0.6704 0.4430 0.6704 0.8188
No log 4.3182 380 0.6615 0.4430 0.6615 0.8133
No log 4.3409 382 0.6623 0.4165 0.6623 0.8138
No log 4.3636 384 0.6580 0.4610 0.6580 0.8112
No log 4.3864 386 0.6574 0.4610 0.6574 0.8108
No log 4.4091 388 0.6630 0.4513 0.6630 0.8142
No log 4.4318 390 0.6745 0.3740 0.6745 0.8213
No log 4.4545 392 0.7388 0.4238 0.7388 0.8595
No log 4.4773 394 0.7531 0.4062 0.7531 0.8678
No log 4.5 396 0.7151 0.3776 0.7151 0.8456
No log 4.5227 398 0.7306 0.4633 0.7306 0.8547
No log 4.5455 400 0.7953 0.4595 0.7953 0.8918
No log 4.5682 402 0.7517 0.4500 0.7517 0.8670
No log 4.5909 404 0.6715 0.4652 0.6715 0.8195
No log 4.6136 406 0.6722 0.4144 0.6722 0.8199
No log 4.6364 408 0.6873 0.4175 0.6873 0.8290
No log 4.6591 410 0.6485 0.4414 0.6485 0.8053
No log 4.6818 412 0.6671 0.4439 0.6671 0.8168
No log 4.7045 414 0.6982 0.4009 0.6982 0.8356
No log 4.7273 416 0.6752 0.3564 0.6752 0.8217
No log 4.75 418 0.7144 0.4409 0.7144 0.8452
No log 4.7727 420 0.7788 0.4761 0.7788 0.8825
No log 4.7955 422 0.7315 0.4893 0.7315 0.8553
No log 4.8182 424 0.6960 0.5116 0.6960 0.8343
No log 4.8409 426 0.6478 0.5175 0.6478 0.8049
No log 4.8636 428 0.6307 0.4552 0.6307 0.7942
No log 4.8864 430 0.6356 0.4552 0.6356 0.7972
No log 4.9091 432 0.6558 0.5104 0.6558 0.8098
No log 4.9318 434 0.6563 0.4512 0.6563 0.8101
No log 4.9545 436 0.6508 0.4605 0.6508 0.8067
No log 4.9773 438 0.6669 0.4489 0.6669 0.8166
No log 5.0 440 0.7254 0.5010 0.7254 0.8517
No log 5.0227 442 0.7170 0.4920 0.7170 0.8468
No log 5.0455 444 0.6645 0.4589 0.6645 0.8152
No log 5.0682 446 0.6340 0.4069 0.6340 0.7963
No log 5.0909 448 0.6355 0.4034 0.6355 0.7972
No log 5.1136 450 0.6422 0.3816 0.6422 0.8013
No log 5.1364 452 0.7194 0.4697 0.7194 0.8482
No log 5.1591 454 0.7931 0.4784 0.7931 0.8906
No log 5.1818 456 0.7231 0.4583 0.7231 0.8503
No log 5.2045 458 0.6598 0.3633 0.6598 0.8123
No log 5.2273 460 0.6538 0.3407 0.6538 0.8086
No log 5.25 462 0.6666 0.3823 0.6666 0.8165
No log 5.2727 464 0.6949 0.4055 0.6949 0.8336
No log 5.2955 466 0.6988 0.4023 0.6988 0.8360
No log 5.3182 468 0.6973 0.3928 0.6973 0.8351
No log 5.3409 470 0.7257 0.4026 0.7257 0.8519
No log 5.3636 472 0.7213 0.3810 0.7213 0.8493
No log 5.3864 474 0.6966 0.3327 0.6966 0.8346
No log 5.4091 476 0.6982 0.3935 0.6982 0.8356
No log 5.4318 478 0.6976 0.4427 0.6976 0.8352
No log 5.4545 480 0.6914 0.4557 0.6914 0.8315
No log 5.4773 482 0.6623 0.4612 0.6623 0.8138
No log 5.5 484 0.6369 0.3782 0.6369 0.7981
No log 5.5227 486 0.6329 0.3758 0.6329 0.7955
No log 5.5455 488 0.6511 0.4512 0.6511 0.8069
No log 5.5682 490 0.7272 0.4779 0.7272 0.8528
No log 5.5909 492 0.7207 0.4717 0.7207 0.8489
No log 5.6136 494 0.6907 0.4940 0.6907 0.8311
No log 5.6364 496 0.6384 0.4982 0.6384 0.7990
No log 5.6591 498 0.6265 0.3600 0.6265 0.7915
0.3618 5.6818 500 0.6339 0.4499 0.6339 0.7962
0.3618 5.7045 502 0.6275 0.3691 0.6275 0.7921
0.3618 5.7273 504 0.6202 0.3803 0.6202 0.7875
0.3618 5.75 506 0.6399 0.4618 0.6399 0.7999
0.3618 5.7727 508 0.6272 0.4598 0.6272 0.7920
0.3618 5.7955 510 0.6046 0.4311 0.6046 0.7776
0.3618 5.8182 512 0.6047 0.3685 0.6047 0.7776
0.3618 5.8409 514 0.6051 0.3934 0.6051 0.7779
0.3618 5.8636 516 0.6723 0.4926 0.6723 0.8199
0.3618 5.8864 518 0.7544 0.5022 0.7544 0.8686
0.3618 5.9091 520 0.7264 0.4467 0.7264 0.8523
0.3618 5.9318 522 0.7133 0.4169 0.7133 0.8446
0.3618 5.9545 524 0.7436 0.4554 0.7436 0.8623
0.3618 5.9773 526 0.7340 0.3909 0.7340 0.8567
0.3618 6.0 528 0.7557 0.4029 0.7557 0.8693
0.3618 6.0227 530 0.8114 0.4301 0.8114 0.9008
0.3618 6.0455 532 0.7770 0.4721 0.7770 0.8815
0.3618 6.0682 534 0.7087 0.4586 0.7087 0.8419
0.3618 6.0909 536 0.6736 0.4255 0.6736 0.8207
0.3618 6.1136 538 0.6666 0.4250 0.6666 0.8164
0.3618 6.1364 540 0.7008 0.4733 0.7008 0.8371
0.3618 6.1591 542 0.7102 0.5017 0.7102 0.8427
0.3618 6.1818 544 0.7393 0.4970 0.7393 0.8598
0.3618 6.2045 546 0.7602 0.4639 0.7602 0.8719
0.3618 6.2273 548 0.7329 0.4697 0.7329 0.8561
0.3618 6.25 550 0.7079 0.4929 0.7079 0.8414
0.3618 6.2727 552 0.6993 0.4481 0.6993 0.8362
0.3618 6.2955 554 0.7050 0.4266 0.7050 0.8396
0.3618 6.3182 556 0.7176 0.4350 0.7176 0.8471
0.3618 6.3409 558 0.7902 0.4339 0.7902 0.8890
0.3618 6.3636 560 0.7959 0.4516 0.7959 0.8921
0.3618 6.3864 562 0.6976 0.4053 0.6976 0.8352
0.3618 6.4091 564 0.6501 0.4139 0.6501 0.8063
0.3618 6.4318 566 0.6532 0.4284 0.6532 0.8082
0.3618 6.4545 568 0.6911 0.4650 0.6911 0.8313
0.3618 6.4773 570 0.7375 0.4380 0.7375 0.8588
0.3618 6.5 572 0.7097 0.4285 0.7097 0.8424
0.3618 6.5227 574 0.6629 0.4562 0.6629 0.8142
0.3618 6.5455 576 0.6801 0.4462 0.6801 0.8247
0.3618 6.5682 578 0.6980 0.4553 0.6980 0.8355
0.3618 6.5909 580 0.7465 0.5169 0.7465 0.8640
0.3618 6.6136 582 0.7345 0.5027 0.7345 0.8570
0.3618 6.6364 584 0.7065 0.5015 0.7065 0.8405
0.3618 6.6591 586 0.6669 0.4940 0.6669 0.8167
0.3618 6.6818 588 0.6539 0.4864 0.6539 0.8086
0.3618 6.7045 590 0.6542 0.4437 0.6542 0.8088
0.3618 6.7273 592 0.6299 0.4756 0.6299 0.7937
0.3618 6.75 594 0.6388 0.4459 0.6388 0.7992
0.3618 6.7727 596 0.6488 0.4428 0.6488 0.8055
0.3618 6.7955 598 0.6340 0.4644 0.6340 0.7962
0.3618 6.8182 600 0.6274 0.4564 0.6274 0.7921
0.3618 6.8409 602 0.6242 0.4473 0.6242 0.7901
0.3618 6.8636 604 0.6225 0.4446 0.6225 0.7890
0.3618 6.8864 606 0.6256 0.4607 0.6256 0.7909
0.3618 6.9091 608 0.6334 0.4498 0.6334 0.7959
0.3618 6.9318 610 0.6374 0.4644 0.6374 0.7984
0.3618 6.9545 612 0.6589 0.4483 0.6589 0.8117
0.3618 6.9773 614 0.6576 0.4452 0.6576 0.8109
0.3618 7.0 616 0.6338 0.4432 0.6338 0.7961
0.3618 7.0227 618 0.6434 0.4583 0.6434 0.8021
0.3618 7.0455 620 0.6411 0.4592 0.6411 0.8007
0.3618 7.0682 622 0.6224 0.4204 0.6224 0.7889
0.3618 7.0909 624 0.6330 0.3887 0.6330 0.7956
0.3618 7.1136 626 0.6367 0.3848 0.6367 0.7980
0.3618 7.1364 628 0.6243 0.4267 0.6243 0.7901
0.3618 7.1591 630 0.6216 0.4473 0.6216 0.7884

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

135M params (F32 tensors, safetensors format)

Model tree for MayBashendy/ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run1_AugV5_k17_task2_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.