ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run3_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5691
  • Qwk: 0.4368
  • Mse: 0.5691
  • Rmse: 0.7544
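The reported metrics can be reproduced from raw predictions. Below is a minimal sketch, assuming integer-valued gold scores and rounded model outputs, that computes MSE, RMSE, and quadratic weighted kappa (QWK) with NumPy; the arrays shown are illustrative, not the actual evaluation data.

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Quadratic weighted kappa between two integer rating vectors."""
    y_true = np.asarray(y_true, dtype=int)
    y_pred = np.asarray(y_pred, dtype=int)
    # Observed agreement matrix (confusion matrix).
    observed = np.zeros((num_classes, num_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Expected matrix from the marginal rating histograms,
    # scaled so it sums to the same total as `observed`.
    hist_true = np.bincount(y_true, minlength=num_classes)
    hist_pred = np.bincount(y_pred, minlength=num_classes)
    expected = np.outer(hist_true, hist_pred) / len(y_true)
    # Quadratic disagreement weights: (i - j)^2 / (N - 1)^2.
    idx = np.arange(num_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (num_classes - 1) ** 2
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Illustrative scores, not the model's actual outputs.
gold = [0, 0, 1, 1]
pred = [0, 0, 1, 2]
mse = float(np.mean((np.array(gold) - np.array(pred)) ** 2))
rmse = float(np.sqrt(mse))
qwk = quadratic_weighted_kappa(gold, pred, num_classes=3)
```

Note that RMSE is simply the square root of MSE, which is why the Loss and Mse columns coincide when training with a mean-squared-error objective.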

Model description

More information needed

Intended uses & limitations

More information needed
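While detailed usage notes are pending, the checkpoint can be loaded like any Hugging Face sequence-classification model. The sketch below is an assumption-laden example: it presumes the model carries a single-output regression head for essay-organization scoring (consistent with the MSE/RMSE metrics above) and uses the repo id from the title.

```python
# Hypothetical usage sketch; the single-output regression head is an assumption.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run3_AugV5_k8_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

essay = "..."  # an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```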

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
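These settings map directly onto transformers' TrainingArguments. The following configuration fragment mirrors the listed values; the output directory is a placeholder, and the Adam betas/epsilon shown are the defaults reported above.

```python
# Configuration sketch reproducing the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",          # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

Note that although num_epochs is set to 100, the results table below ends at epoch 14.43 (step 606), so training appears to have been stopped early.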

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0476 2 4.4110 -0.0315 4.4110 2.1002
No log 0.0952 4 2.7729 -0.0339 2.7729 1.6652
No log 0.1429 6 3.0646 -0.1292 3.0646 1.7506
No log 0.1905 8 2.4227 -0.1660 2.4227 1.5565
No log 0.2381 10 1.1109 -0.0016 1.1109 1.0540
No log 0.2857 12 0.9411 0.1196 0.9411 0.9701
No log 0.3333 14 1.9100 -0.0262 1.9100 1.3820
No log 0.3810 16 1.7786 -0.0139 1.7786 1.3336
No log 0.4286 18 1.0136 0.1363 1.0136 1.0068
No log 0.4762 20 0.8147 0.1844 0.8147 0.9026
No log 0.5238 22 0.8085 0.1058 0.8085 0.8992
No log 0.5714 24 0.7938 0.0516 0.7938 0.8910
No log 0.6190 26 0.7743 0.1481 0.7743 0.8799
No log 0.6667 28 0.7352 0.2979 0.7352 0.8574
No log 0.7143 30 0.7992 0.2109 0.7992 0.8940
No log 0.7619 32 0.7787 0.2751 0.7787 0.8824
No log 0.8095 34 0.6847 0.3874 0.6847 0.8275
No log 0.8571 36 0.6881 0.4241 0.6881 0.8295
No log 0.9048 38 0.8796 0.3289 0.8796 0.9379
No log 0.9524 40 1.4598 0.2257 1.4598 1.2082
No log 1.0 42 1.7806 0.1536 1.7806 1.3344
No log 1.0476 44 1.7099 0.1636 1.7099 1.3076
No log 1.0952 46 1.3750 0.2000 1.3750 1.1726
No log 1.1429 48 0.8878 0.3972 0.8878 0.9422
No log 1.1905 50 0.6731 0.4633 0.6731 0.8204
No log 1.2381 52 0.6397 0.4580 0.6397 0.7998
No log 1.2857 54 0.6173 0.3843 0.6173 0.7857
No log 1.3333 56 0.6143 0.4830 0.6143 0.7838
No log 1.3810 58 0.6994 0.3708 0.6994 0.8363
No log 1.4286 60 0.7905 0.2947 0.7905 0.8891
No log 1.4762 62 0.7185 0.3815 0.7185 0.8476
No log 1.5238 64 0.7025 0.3668 0.7025 0.8382
No log 1.5714 66 0.8251 0.3588 0.8251 0.9083
No log 1.6190 68 0.9942 0.3359 0.9942 0.9971
No log 1.6667 70 0.9806 0.3589 0.9806 0.9902
No log 1.7143 72 0.8719 0.4230 0.8719 0.9337
No log 1.7619 74 0.8163 0.5045 0.8163 0.9035
No log 1.8095 76 0.8229 0.4796 0.8229 0.9071
No log 1.8571 78 0.8342 0.4561 0.8342 0.9133
No log 1.9048 80 0.6901 0.5048 0.6901 0.8307
No log 1.9524 82 0.6153 0.5671 0.6153 0.7844
No log 2.0 84 0.6721 0.4417 0.6721 0.8198
No log 2.0476 86 0.7187 0.4287 0.7187 0.8478
No log 2.0952 88 0.6973 0.5634 0.6973 0.8350
No log 2.1429 90 0.7681 0.4925 0.7681 0.8764
No log 2.1905 92 0.9891 0.4079 0.9891 0.9945
No log 2.2381 94 1.0057 0.4104 1.0057 1.0028
No log 2.2857 96 0.8528 0.4719 0.8528 0.9235
No log 2.3333 98 0.7600 0.5655 0.7600 0.8718
No log 2.3810 100 0.6747 0.5596 0.6747 0.8214
No log 2.4286 102 0.6728 0.5034 0.6728 0.8202
No log 2.4762 104 0.6992 0.5748 0.6992 0.8362
No log 2.5238 106 0.9037 0.4206 0.9037 0.9506
No log 2.5714 108 0.9558 0.3944 0.9558 0.9776
No log 2.6190 110 0.8604 0.4449 0.8604 0.9276
No log 2.6667 112 0.8482 0.4536 0.8482 0.9210
No log 2.7143 114 0.8869 0.4449 0.8869 0.9418
No log 2.7619 116 0.7814 0.4861 0.7814 0.8840
No log 2.8095 118 0.6860 0.5565 0.6860 0.8283
No log 2.8571 120 0.6634 0.4968 0.6634 0.8145
No log 2.9048 122 0.7012 0.5426 0.7012 0.8374
No log 2.9524 124 0.7346 0.5525 0.7346 0.8571
No log 3.0 126 0.7372 0.5501 0.7372 0.8586
No log 3.0476 128 0.8362 0.5038 0.8362 0.9145
No log 3.0952 130 0.9898 0.3876 0.9898 0.9949
No log 3.1429 132 0.8730 0.4965 0.8730 0.9343
No log 3.1905 134 0.7123 0.5392 0.7123 0.8440
No log 3.2381 136 0.6788 0.5327 0.6788 0.8239
No log 3.2857 138 0.6856 0.5287 0.6856 0.8280
No log 3.3333 140 0.7216 0.5447 0.7216 0.8495
No log 3.3810 142 0.8615 0.4556 0.8615 0.9282
No log 3.4286 144 0.9581 0.3627 0.9581 0.9788
No log 3.4762 146 0.9120 0.4539 0.9120 0.9550
No log 3.5238 148 0.8047 0.5434 0.8047 0.8971
No log 3.5714 150 0.7919 0.5647 0.7919 0.8899
No log 3.6190 152 0.7872 0.5647 0.7872 0.8872
No log 3.6667 154 0.7719 0.5580 0.7719 0.8786
No log 3.7143 156 0.8896 0.4615 0.8896 0.9432
No log 3.7619 158 0.9534 0.3902 0.9534 0.9764
No log 3.8095 160 0.8634 0.4760 0.8634 0.9292
No log 3.8571 162 0.8254 0.4675 0.8254 0.9085
No log 3.9048 164 0.8056 0.4659 0.8056 0.8975
No log 3.9524 166 0.7602 0.5055 0.7602 0.8719
No log 4.0 168 0.6733 0.5709 0.6733 0.8206
No log 4.0476 170 0.6417 0.5141 0.6417 0.8011
No log 4.0952 172 0.6451 0.5291 0.6451 0.8032
No log 4.1429 174 0.7263 0.5307 0.7263 0.8522
No log 4.1905 176 0.7048 0.5324 0.7048 0.8395
No log 4.2381 178 0.6957 0.5682 0.6957 0.8341
No log 4.2857 180 0.6717 0.4946 0.6717 0.8196
No log 4.3333 182 0.7068 0.4927 0.7068 0.8407
No log 4.3810 184 0.7268 0.5027 0.7268 0.8525
No log 4.4286 186 0.7161 0.5021 0.7161 0.8462
No log 4.4762 188 0.7441 0.4535 0.7441 0.8626
No log 4.5238 190 0.7175 0.4496 0.7175 0.8470
No log 4.5714 192 0.6791 0.4671 0.6791 0.8241
No log 4.6190 194 0.7338 0.4800 0.7338 0.8566
No log 4.6667 196 0.9203 0.3898 0.9203 0.9593
No log 4.7143 198 0.9761 0.3844 0.9761 0.9880
No log 4.7619 200 0.8031 0.4735 0.8031 0.8962
No log 4.8095 202 0.6451 0.4173 0.6451 0.8032
No log 4.8571 204 0.6182 0.4057 0.6182 0.7863
No log 4.9048 206 0.6396 0.4371 0.6396 0.7998
No log 4.9524 208 0.6271 0.4882 0.6271 0.7919
No log 5.0 210 0.6490 0.4967 0.6490 0.8056
No log 5.0476 212 0.6752 0.5176 0.6752 0.8217
No log 5.0952 214 0.7118 0.5337 0.7118 0.8437
No log 5.1429 216 0.7129 0.5264 0.7129 0.8443
No log 5.1905 218 0.6931 0.5505 0.6931 0.8325
No log 5.2381 220 0.6894 0.5029 0.6894 0.8303
No log 5.2857 222 0.6577 0.4904 0.6577 0.8110
No log 5.3333 224 0.6165 0.4763 0.6165 0.7852
No log 5.3810 226 0.6798 0.5009 0.6798 0.8245
No log 5.4286 228 0.7446 0.4799 0.7446 0.8629
No log 5.4762 230 0.7287 0.5294 0.7287 0.8536
No log 5.5238 232 0.6714 0.5484 0.6714 0.8194
No log 5.5714 234 0.6874 0.5081 0.6874 0.8291
No log 5.6190 236 0.6983 0.4976 0.6983 0.8357
No log 5.6667 238 0.7152 0.5409 0.7152 0.8457
No log 5.7143 240 0.7180 0.5365 0.7180 0.8473
No log 5.7619 242 0.6763 0.5029 0.6763 0.8223
No log 5.8095 244 0.6612 0.4627 0.6612 0.8132
No log 5.8571 246 0.6596 0.4787 0.6596 0.8122
No log 5.9048 248 0.6611 0.4708 0.6611 0.8131
No log 5.9524 250 0.6585 0.4373 0.6585 0.8115
No log 6.0 252 0.6778 0.4542 0.6778 0.8233
No log 6.0476 254 0.7054 0.4915 0.7054 0.8399
No log 6.0952 256 0.7219 0.4899 0.7219 0.8497
No log 6.1429 258 0.7749 0.5507 0.7749 0.8803
No log 6.1905 260 0.8156 0.5257 0.8156 0.9031
No log 6.2381 262 0.7928 0.5050 0.7928 0.8904
No log 6.2857 264 0.8015 0.5268 0.8015 0.8953
No log 6.3333 266 0.7113 0.5334 0.7113 0.8434
No log 6.3810 268 0.7308 0.4815 0.7308 0.8549
No log 6.4286 270 0.7995 0.4567 0.7995 0.8942
No log 6.4762 272 0.8030 0.4337 0.8030 0.8961
No log 6.5238 274 0.7575 0.4288 0.7575 0.8704
No log 6.5714 276 0.7263 0.4301 0.7263 0.8522
No log 6.6190 278 0.7189 0.4336 0.7189 0.8479
No log 6.6667 280 0.6943 0.4386 0.6943 0.8333
No log 6.7143 282 0.6800 0.4834 0.6800 0.8246
No log 6.7619 284 0.6627 0.4808 0.6627 0.8141
No log 6.8095 286 0.6325 0.4498 0.6325 0.7953
No log 6.8571 288 0.6210 0.5264 0.6210 0.7880
No log 6.9048 290 0.6637 0.5714 0.6637 0.8147
No log 6.9524 292 0.7879 0.5013 0.7879 0.8877
No log 7.0 294 0.8184 0.4582 0.8184 0.9047
No log 7.0476 296 0.8020 0.4793 0.8020 0.8956
No log 7.0952 298 0.7170 0.5902 0.7170 0.8467
No log 7.1429 300 0.6457 0.5449 0.6457 0.8036
No log 7.1905 302 0.6488 0.5078 0.6488 0.8055
No log 7.2381 304 0.6383 0.5316 0.6383 0.7989
No log 7.2857 306 0.6550 0.5327 0.6550 0.8093
No log 7.3333 308 0.6852 0.5161 0.6852 0.8278
No log 7.3810 310 0.6490 0.4712 0.6490 0.8056
No log 7.4286 312 0.6305 0.4857 0.6305 0.7941
No log 7.4762 314 0.6060 0.5228 0.6060 0.7785
No log 7.5238 316 0.6344 0.4588 0.6344 0.7965
No log 7.5714 318 0.6620 0.4433 0.6620 0.8136
No log 7.6190 320 0.6436 0.4513 0.6436 0.8023
No log 7.6667 322 0.6191 0.5290 0.6191 0.7869
No log 7.7143 324 0.6118 0.5083 0.6118 0.7821
No log 7.7619 326 0.6637 0.5348 0.6637 0.8147
No log 7.8095 328 0.6986 0.5247 0.6986 0.8358
No log 7.8571 330 0.6663 0.5199 0.6663 0.8163
No log 7.9048 332 0.6403 0.4914 0.6403 0.8002
No log 7.9524 334 0.6472 0.4659 0.6472 0.8045
No log 8.0 336 0.6499 0.4714 0.6499 0.8061
No log 8.0476 338 0.6670 0.5248 0.6670 0.8167
No log 8.0952 340 0.6754 0.4978 0.6754 0.8218
No log 8.1429 342 0.6458 0.5191 0.6458 0.8036
No log 8.1905 344 0.6427 0.4667 0.6427 0.8017
No log 8.2381 346 0.6495 0.4711 0.6495 0.8059
No log 8.2857 348 0.6122 0.4321 0.6122 0.7825
No log 8.3333 350 0.5956 0.4763 0.5956 0.7717
No log 8.3810 352 0.6444 0.4599 0.6444 0.8028
No log 8.4286 354 0.6947 0.4990 0.6947 0.8335
No log 8.4762 356 0.6912 0.5114 0.6912 0.8314
No log 8.5238 358 0.6920 0.5236 0.6920 0.8319
No log 8.5714 360 0.6974 0.5515 0.6974 0.8351
No log 8.6190 362 0.6826 0.5140 0.6826 0.8262
No log 8.6667 364 0.6482 0.5079 0.6482 0.8051
No log 8.7143 366 0.6220 0.5095 0.6220 0.7887
No log 8.7619 368 0.6193 0.4811 0.6193 0.7869
No log 8.8095 370 0.6334 0.4691 0.6334 0.7958
No log 8.8571 372 0.6180 0.4654 0.6180 0.7862
No log 8.9048 374 0.6123 0.4277 0.6123 0.7825
No log 8.9524 376 0.6085 0.4377 0.6085 0.7800
No log 9.0 378 0.6595 0.4164 0.6595 0.8121
No log 9.0476 380 0.7242 0.4704 0.7242 0.8510
No log 9.0952 382 0.7213 0.4704 0.7213 0.8493
No log 9.1429 384 0.6340 0.4630 0.6340 0.7962
No log 9.1905 386 0.6274 0.5259 0.6274 0.7921
No log 9.2381 388 0.6537 0.5392 0.6537 0.8085
No log 9.2857 390 0.6655 0.5473 0.6655 0.8158
No log 9.3333 392 0.6719 0.5473 0.6719 0.8197
No log 9.3810 394 0.6549 0.5129 0.6549 0.8093
No log 9.4286 396 0.6733 0.5288 0.6733 0.8206
No log 9.4762 398 0.6678 0.4824 0.6678 0.8172
No log 9.5238 400 0.6175 0.4092 0.6175 0.7858
No log 9.5714 402 0.5990 0.4025 0.5990 0.7739
No log 9.6190 404 0.5990 0.3920 0.5990 0.7740
No log 9.6667 406 0.6163 0.4648 0.6163 0.7850
No log 9.7143 408 0.6386 0.4437 0.6386 0.7991
No log 9.7619 410 0.6561 0.4430 0.6561 0.8100
No log 9.8095 412 0.6543 0.4449 0.6543 0.8089
No log 9.8571 414 0.6608 0.4358 0.6608 0.8129
No log 9.9048 416 0.6437 0.4309 0.6437 0.8023
No log 9.9524 418 0.6111 0.4315 0.6111 0.7818
No log 10.0 420 0.6399 0.4367 0.6399 0.7999
No log 10.0476 422 0.6892 0.4649 0.6892 0.8302
No log 10.0952 424 0.6832 0.4649 0.6832 0.8266
No log 10.1429 426 0.6212 0.4379 0.6212 0.7881
No log 10.1905 428 0.6103 0.4326 0.6103 0.7812
No log 10.2381 430 0.5943 0.4536 0.5943 0.7709
No log 10.2857 432 0.5915 0.4619 0.5915 0.7691
No log 10.3333 434 0.6149 0.5044 0.6149 0.7841
No log 10.3810 436 0.6158 0.4614 0.6158 0.7847
No log 10.4286 438 0.6485 0.4838 0.6485 0.8053
No log 10.4762 440 0.7008 0.5305 0.7008 0.8371
No log 10.5238 442 0.6752 0.4886 0.6752 0.8217
No log 10.5714 444 0.6533 0.4980 0.6533 0.8083
No log 10.6190 446 0.6497 0.4932 0.6497 0.8060
No log 10.6667 448 0.6484 0.5271 0.6484 0.8052
No log 10.7143 450 0.6516 0.4829 0.6516 0.8072
No log 10.7619 452 0.6399 0.5274 0.6399 0.7999
No log 10.8095 454 0.6209 0.4798 0.6209 0.7880
No log 10.8571 456 0.6117 0.4805 0.6117 0.7821
No log 10.9048 458 0.6121 0.4891 0.6121 0.7824
No log 10.9524 460 0.6189 0.4755 0.6189 0.7867
No log 11.0 462 0.5965 0.4465 0.5965 0.7724
No log 11.0476 464 0.5837 0.4355 0.5837 0.7640
No log 11.0952 466 0.5811 0.4322 0.5811 0.7623
No log 11.1429 468 0.5785 0.4332 0.5785 0.7606
No log 11.1905 470 0.5852 0.4388 0.5852 0.7650
No log 11.2381 472 0.6064 0.4535 0.6064 0.7787
No log 11.2857 474 0.6242 0.4932 0.6242 0.7901
No log 11.3333 476 0.6406 0.4684 0.6406 0.8004
No log 11.3810 478 0.6521 0.5035 0.6521 0.8075
No log 11.4286 480 0.6285 0.4827 0.6285 0.7927
No log 11.4762 482 0.6067 0.4467 0.6067 0.7789
No log 11.5238 484 0.5998 0.4352 0.5998 0.7744
No log 11.5714 486 0.6080 0.4648 0.6080 0.7797
No log 11.6190 488 0.6182 0.5044 0.6182 0.7863
No log 11.6667 490 0.6225 0.5219 0.6225 0.7890
No log 11.7143 492 0.6150 0.4999 0.6150 0.7842
No log 11.7619 494 0.6165 0.5171 0.6165 0.7852
No log 11.8095 496 0.6359 0.4797 0.6359 0.7975
No log 11.8571 498 0.6528 0.5096 0.6528 0.8079
0.3541 11.9048 500 0.6462 0.4952 0.6462 0.8038
0.3541 11.9524 502 0.6317 0.5251 0.6317 0.7948
0.3541 12.0 504 0.6315 0.5306 0.6315 0.7946
0.3541 12.0476 506 0.6491 0.5060 0.6491 0.8057
0.3541 12.0952 508 0.7027 0.5260 0.7027 0.8383
0.3541 12.1429 510 0.7254 0.4822 0.7254 0.8517
0.3541 12.1905 512 0.7062 0.5198 0.7062 0.8403
0.3541 12.2381 514 0.6362 0.5450 0.6362 0.7976
0.3541 12.2857 516 0.5988 0.5588 0.5988 0.7738
0.3541 12.3333 518 0.6215 0.5321 0.6215 0.7884
0.3541 12.3810 520 0.6836 0.4913 0.6836 0.8268
0.3541 12.4286 522 0.6731 0.5074 0.6731 0.8204
0.3541 12.4762 524 0.6611 0.5286 0.6611 0.8131
0.3541 12.5238 526 0.5888 0.5613 0.5888 0.7673
0.3541 12.5714 528 0.5626 0.5177 0.5626 0.7501
0.3541 12.6190 530 0.5474 0.5193 0.5474 0.7398
0.3541 12.6667 532 0.5588 0.5219 0.5588 0.7475
0.3541 12.7143 534 0.5849 0.5641 0.5849 0.7648
0.3541 12.7619 536 0.6104 0.5344 0.6104 0.7813
0.3541 12.8095 538 0.6462 0.5176 0.6462 0.8039
0.3541 12.8571 540 0.6140 0.5153 0.6140 0.7836
0.3541 12.9048 542 0.5762 0.5397 0.5762 0.7591
0.3541 12.9524 544 0.6247 0.5449 0.6247 0.7904
0.3541 13.0 546 0.6899 0.4751 0.6899 0.8306
0.3541 13.0476 548 0.6733 0.4996 0.6733 0.8205
0.3541 13.0952 550 0.6166 0.5218 0.6166 0.7852
0.3541 13.1429 552 0.6457 0.5193 0.6457 0.8036
0.3541 13.1905 554 0.6854 0.4915 0.6854 0.8279
0.3541 13.2381 556 0.6578 0.5151 0.6578 0.8110
0.3541 13.2857 558 0.6328 0.5330 0.6328 0.7955
0.3541 13.3333 560 0.6088 0.5075 0.6088 0.7803
0.3541 13.3810 562 0.6171 0.5301 0.6171 0.7856
0.3541 13.4286 564 0.5989 0.4956 0.5989 0.7739
0.3541 13.4762 566 0.6183 0.4721 0.6183 0.7863
0.3541 13.5238 568 0.6309 0.4235 0.6309 0.7943
0.3541 13.5714 570 0.6026 0.4598 0.6026 0.7763
0.3541 13.6190 572 0.5861 0.5051 0.5861 0.7656
0.3541 13.6667 574 0.5880 0.5018 0.5880 0.7668
0.3541 13.7143 576 0.6023 0.5773 0.6023 0.7761
0.3541 13.7619 578 0.6053 0.5228 0.6053 0.7780
0.3541 13.8095 580 0.6431 0.5363 0.6431 0.8020
0.3541 13.8571 582 0.6448 0.4931 0.6448 0.8030
0.3541 13.9048 584 0.6269 0.5184 0.6269 0.7918
0.3541 13.9524 586 0.6082 0.5103 0.6082 0.7799
0.3541 14.0 588 0.6500 0.4939 0.6500 0.8062
0.3541 14.0476 590 0.6818 0.4825 0.6818 0.8257
0.3541 14.0952 592 0.6633 0.4941 0.6633 0.8145
0.3541 14.1429 594 0.6280 0.5347 0.6280 0.7925
0.3541 14.1905 596 0.6173 0.5048 0.6173 0.7857
0.3541 14.2381 598 0.6117 0.4744 0.6117 0.7821
0.3541 14.2857 600 0.5854 0.4241 0.5854 0.7651
0.3541 14.3333 602 0.5693 0.4562 0.5693 0.7546
0.3541 14.3810 604 0.5661 0.4725 0.5661 0.7524
0.3541 14.4286 606 0.5691 0.4368 0.5691 0.7544

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1