ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5599
  • Qwk: 0.4358
  • Mse: 0.5599
  • Rmse: 0.7483
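
Qwk is Quadratic Weighted Kappa, the usual agreement metric for ordinal essay scores; Mse and Rmse are (root) mean squared error. A minimal sketch of how these metrics can be computed with scikit-learn (rounding the model's continuous outputs to integer scores before computing kappa is an assumption of this sketch, not something this card confirms):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate_scores(y_true, y_pred):
    """Compute QWK, MSE, and RMSE for ordinal essay scores.

    y_true: gold integer scores; y_pred: model outputs (may be continuous).
    """
    # QWK compares discrete labels, so round continuous predictions first.
    qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```

Note that when the evaluation loss is MSE (as for a single-output regression head), "Loss" and "Mse" coincide, which matches the identical 0.5599 values above.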

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
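
The `linear` scheduler decays the learning rate linearly from its initial value to zero over the course of training. A minimal stdlib sketch of that shape (the `warmup_steps` parameter is hypothetical; this card does not state that warmup was used):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Learning rate at a given step under a linear schedule.

    Ramps up linearly during warmup (if any), then decays linearly to zero
    at total_steps -- the shape of the Hugging Face 'linear' scheduler.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

With base_lr=2e-5 and no warmup, the rate starts at 2e-5, reaches 1e-5 at the halfway point, and hits zero on the final step.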

Training results

The training loss first appears at step 500 (its logging interval), so earlier evaluation rows show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0225 2 4.2850 -0.0290 4.2850 2.0700
No log 0.0449 4 2.3492 0.0242 2.3492 1.5327
No log 0.0674 6 1.6120 -0.0421 1.6120 1.2696
No log 0.0899 8 1.1552 -0.0837 1.1552 1.0748
No log 0.1124 10 0.8626 0.1666 0.8626 0.9287
No log 0.1348 12 0.8249 0.1904 0.8249 0.9082
No log 0.1573 14 0.8308 0.1904 0.8308 0.9115
No log 0.1798 16 0.8278 0.2052 0.8278 0.9098
No log 0.2022 18 0.8268 0.2111 0.8268 0.9093
No log 0.2247 20 1.0178 -0.0325 1.0178 1.0088
No log 0.2472 22 0.9914 -0.0533 0.9914 0.9957
No log 0.2697 24 0.9064 0.0675 0.9064 0.9521
No log 0.2921 26 0.8167 0.2155 0.8167 0.9037
No log 0.3146 28 1.0507 0.2071 1.0507 1.0250
No log 0.3371 30 1.1920 0.2146 1.1920 1.0918
No log 0.3596 32 1.0570 0.2286 1.0570 1.0281
No log 0.3820 34 0.9414 0.0916 0.9414 0.9703
No log 0.4045 36 0.9345 0.0710 0.9345 0.9667
No log 0.4270 38 0.9005 0.0602 0.9005 0.9489
No log 0.4494 40 1.3443 0.0358 1.3443 1.1594
No log 0.4719 42 1.7260 -0.0131 1.7260 1.3138
No log 0.4944 44 1.6694 0.0307 1.6694 1.2921
No log 0.5169 46 1.3198 0.0262 1.3198 1.1488
No log 0.5393 48 1.1206 0.0520 1.1206 1.0586
No log 0.5618 50 0.8457 0.1572 0.8457 0.9196
No log 0.5843 52 0.7546 0.2597 0.7546 0.8687
No log 0.6067 54 0.7351 0.2256 0.7351 0.8574
No log 0.6292 56 0.7337 0.2183 0.7337 0.8565
No log 0.6517 58 0.7175 0.2793 0.7175 0.8471
No log 0.6742 60 0.7632 0.2648 0.7632 0.8736
No log 0.6966 62 0.9105 0.1509 0.9105 0.9542
No log 0.7191 64 0.9177 0.0957 0.9177 0.9580
No log 0.7416 66 0.9389 0.0740 0.9389 0.9690
No log 0.7640 68 1.1845 0.0872 1.1845 1.0883
No log 0.7865 70 1.0935 0.1029 1.0935 1.0457
No log 0.8090 72 0.8092 0.2740 0.8092 0.8996
No log 0.8315 74 0.6714 0.3240 0.6714 0.8194
No log 0.8539 76 0.7795 0.3445 0.7795 0.8829
No log 0.8764 78 0.7717 0.2748 0.7717 0.8785
No log 0.8989 80 0.7247 0.3309 0.7247 0.8513
No log 0.9213 82 0.6840 0.4090 0.6840 0.8271
No log 0.9438 84 0.6735 0.3917 0.6735 0.8207
No log 0.9663 86 0.6714 0.3016 0.6714 0.8194
No log 0.9888 88 0.6764 0.3422 0.6764 0.8224
No log 1.0112 90 0.6822 0.3626 0.6822 0.8260
No log 1.0337 92 0.6739 0.3695 0.6739 0.8209
No log 1.0562 94 0.6643 0.4741 0.6643 0.8150
No log 1.0787 96 0.6609 0.4796 0.6609 0.8130
No log 1.1011 98 0.6413 0.4337 0.6413 0.8008
No log 1.1236 100 0.6415 0.4689 0.6415 0.8009
No log 1.1461 102 0.6824 0.3696 0.6824 0.8261
No log 1.1685 104 0.6863 0.3665 0.6863 0.8284
No log 1.1910 106 0.7428 0.3964 0.7428 0.8618
No log 1.2135 108 0.8756 0.2779 0.8756 0.9357
No log 1.2360 110 1.0186 0.2189 1.0186 1.0093
No log 1.2584 112 0.9316 0.2569 0.9316 0.9652
No log 1.2809 114 0.6762 0.3601 0.6762 0.8223
No log 1.3034 116 0.6314 0.4284 0.6314 0.7946
No log 1.3258 118 0.7602 0.3478 0.7602 0.8719
No log 1.3483 120 0.6713 0.3941 0.6713 0.8193
No log 1.3708 122 0.6439 0.4241 0.6439 0.8024
No log 1.3933 124 0.9615 0.3263 0.9615 0.9806
No log 1.4157 126 1.0530 0.3698 1.0530 1.0262
No log 1.4382 128 0.9506 0.4605 0.9506 0.9750
No log 1.4607 130 0.7867 0.4520 0.7867 0.8869
No log 1.4831 132 0.7971 0.4377 0.7971 0.8928
No log 1.5056 134 0.9719 0.4372 0.9719 0.9859
No log 1.5281 136 1.0757 0.3751 1.0757 1.0371
No log 1.5506 138 0.8825 0.4583 0.8825 0.9394
No log 1.5730 140 0.6734 0.4901 0.6734 0.8206
No log 1.5955 142 0.6998 0.4347 0.6998 0.8366
No log 1.6180 144 0.7189 0.3911 0.7189 0.8479
No log 1.6404 146 0.6553 0.4181 0.6553 0.8095
No log 1.6629 148 0.7100 0.4726 0.7100 0.8426
No log 1.6854 150 0.7947 0.4274 0.7947 0.8914
No log 1.7079 152 0.7201 0.4478 0.7201 0.8486
No log 1.7303 154 0.6795 0.4862 0.6795 0.8243
No log 1.7528 156 0.9566 0.3805 0.9566 0.9781
No log 1.7753 158 1.2026 0.2745 1.2026 1.0966
No log 1.7978 160 1.0963 0.3446 1.0963 1.0470
No log 1.8202 162 0.8118 0.3660 0.8118 0.9010
No log 1.8427 164 0.6855 0.4622 0.6855 0.8279
No log 1.8652 166 0.8377 0.4211 0.8377 0.9153
No log 1.8876 168 0.8999 0.4111 0.8999 0.9486
No log 1.9101 170 0.7904 0.4353 0.7904 0.8891
No log 1.9326 172 0.7085 0.4291 0.7085 0.8418
No log 1.9551 174 0.6445 0.4048 0.6445 0.8028
No log 1.9775 176 0.6008 0.4059 0.6008 0.7751
No log 2.0 178 0.6276 0.3515 0.6276 0.7922
No log 2.0225 180 0.7185 0.4346 0.7185 0.8476
No log 2.0449 182 0.7180 0.3990 0.7180 0.8473
No log 2.0674 184 0.6410 0.3437 0.6410 0.8006
No log 2.0899 186 0.6056 0.3936 0.6056 0.7782
No log 2.1124 188 0.6280 0.4090 0.6280 0.7925
No log 2.1348 190 0.6390 0.4273 0.6390 0.7994
No log 2.1573 192 0.6330 0.4370 0.6330 0.7956
No log 2.1798 194 0.6565 0.4476 0.6565 0.8103
No log 2.2022 196 0.7223 0.4589 0.7223 0.8499
No log 2.2247 198 0.7167 0.4714 0.7167 0.8466
No log 2.2472 200 0.6602 0.4965 0.6602 0.8125
No log 2.2697 202 0.6372 0.4722 0.6372 0.7983
No log 2.2921 204 0.6205 0.4483 0.6205 0.7877
No log 2.3146 206 0.6246 0.4874 0.6246 0.7903
No log 2.3371 208 0.7312 0.4835 0.7312 0.8551
No log 2.3596 210 0.8261 0.4812 0.8261 0.9089
No log 2.3820 212 0.7791 0.4729 0.7791 0.8827
No log 2.4045 214 0.6573 0.4951 0.6573 0.8107
No log 2.4270 216 0.6010 0.4592 0.6010 0.7752
No log 2.4494 218 0.6035 0.4854 0.6035 0.7769
No log 2.4719 220 0.6290 0.4559 0.6290 0.7931
No log 2.4944 222 0.6743 0.4859 0.6743 0.8211
No log 2.5169 224 0.6862 0.4859 0.6862 0.8283
No log 2.5393 226 0.6538 0.4743 0.6538 0.8086
No log 2.5618 228 0.6310 0.3841 0.6310 0.7943
No log 2.5843 230 0.6622 0.4465 0.6622 0.8138
No log 2.6067 232 0.6709 0.4182 0.6709 0.8191
No log 2.6292 234 0.6353 0.3775 0.6353 0.7970
No log 2.6517 236 0.6483 0.4596 0.6483 0.8052
No log 2.6742 238 0.7698 0.4690 0.7698 0.8774
No log 2.6966 240 0.8071 0.4599 0.8071 0.8984
No log 2.7191 242 0.7253 0.4276 0.7253 0.8517
No log 2.7416 244 0.6151 0.4416 0.6151 0.7843
No log 2.7640 246 0.6034 0.3808 0.6034 0.7768
No log 2.7865 248 0.6065 0.4310 0.6065 0.7788
No log 2.8090 250 0.6213 0.4423 0.6213 0.7882
No log 2.8315 252 0.6909 0.4726 0.6909 0.8312
No log 2.8539 254 0.7137 0.4986 0.7137 0.8448
No log 2.8764 256 0.6894 0.4873 0.6894 0.8303
No log 2.8989 258 0.6678 0.4965 0.6678 0.8172
No log 2.9213 260 0.6167 0.3877 0.6167 0.7853
No log 2.9438 262 0.6329 0.4112 0.6329 0.7955
No log 2.9663 264 0.6754 0.4362 0.6754 0.8218
No log 2.9888 266 0.6831 0.4416 0.6831 0.8265
No log 3.0112 268 0.6769 0.4245 0.6769 0.8227
No log 3.0337 270 0.6454 0.4178 0.6454 0.8033
No log 3.0562 272 0.6347 0.4842 0.6347 0.7967
No log 3.0787 274 0.6497 0.4973 0.6497 0.8060
No log 3.1011 276 0.6346 0.4899 0.6346 0.7966
No log 3.1236 278 0.6362 0.5206 0.6362 0.7976
No log 3.1461 280 0.5924 0.4693 0.5924 0.7697
No log 3.1685 282 0.5663 0.4344 0.5663 0.7525
No log 3.1910 284 0.5611 0.5020 0.5611 0.7491
No log 3.2135 286 0.5637 0.5124 0.5637 0.7508
No log 3.2360 288 0.5748 0.5445 0.5748 0.7581
No log 3.2584 290 0.5746 0.5333 0.5746 0.7580
No log 3.2809 292 0.5721 0.5325 0.5721 0.7564
No log 3.3034 294 0.5777 0.4864 0.5777 0.7601
No log 3.3258 296 0.6161 0.4958 0.6161 0.7849
No log 3.3483 298 0.7161 0.5162 0.7161 0.8463
No log 3.3708 300 0.7244 0.5233 0.7244 0.8511
No log 3.3933 302 0.6232 0.5465 0.6232 0.7894
No log 3.4157 304 0.5695 0.5225 0.5695 0.7546
No log 3.4382 306 0.5666 0.4965 0.5666 0.7528
No log 3.4607 308 0.5716 0.5108 0.5716 0.7560
No log 3.4831 310 0.5790 0.5269 0.5790 0.7609
No log 3.5056 312 0.5823 0.4679 0.5823 0.7631
No log 3.5281 314 0.5782 0.4842 0.5782 0.7604
No log 3.5506 316 0.5827 0.4971 0.5827 0.7634
No log 3.5730 318 0.6266 0.5528 0.6266 0.7916
No log 3.5955 320 0.6555 0.5183 0.6555 0.8097
No log 3.6180 322 0.6307 0.4739 0.6307 0.7942
No log 3.6404 324 0.5794 0.4339 0.5794 0.7612
No log 3.6629 326 0.5764 0.3987 0.5764 0.7592
No log 3.6854 328 0.5596 0.3967 0.5596 0.7481
No log 3.7079 330 0.5576 0.3967 0.5576 0.7467
No log 3.7303 332 0.5516 0.4329 0.5516 0.7427
No log 3.7528 334 0.5658 0.4888 0.5658 0.7522
No log 3.7753 336 0.5654 0.4932 0.5654 0.7519
No log 3.7978 338 0.5774 0.5290 0.5774 0.7598
No log 3.8202 340 0.5843 0.5064 0.5843 0.7644
No log 3.8427 342 0.5726 0.5121 0.5726 0.7567
No log 3.8652 344 0.5837 0.5213 0.5837 0.7640
No log 3.8876 346 0.5837 0.4719 0.5837 0.7640
No log 3.9101 348 0.6308 0.4867 0.6308 0.7942
No log 3.9326 350 0.6970 0.5034 0.6970 0.8349
No log 3.9551 352 0.6579 0.4675 0.6579 0.8111
No log 3.9775 354 0.6209 0.4396 0.6209 0.7880
No log 4.0 356 0.6265 0.4273 0.6265 0.7915
No log 4.0225 358 0.6152 0.4088 0.6152 0.7844
No log 4.0449 360 0.6301 0.4546 0.6301 0.7938
No log 4.0674 362 0.6283 0.4568 0.6283 0.7927
No log 4.0899 364 0.6051 0.4057 0.6051 0.7779
No log 4.1124 366 0.6008 0.3971 0.6008 0.7751
No log 4.1348 368 0.5983 0.3971 0.5983 0.7735
No log 4.1573 370 0.5981 0.4036 0.5981 0.7734
No log 4.1798 372 0.5991 0.4270 0.5991 0.7740
No log 4.2022 374 0.6122 0.3981 0.6122 0.7824
No log 4.2247 376 0.6392 0.3958 0.6392 0.7995
No log 4.2472 378 0.6299 0.4122 0.6299 0.7937
No log 4.2697 380 0.6056 0.3827 0.6056 0.7782
No log 4.2921 382 0.6079 0.3857 0.6079 0.7797
No log 4.3146 384 0.6158 0.3972 0.6158 0.7848
No log 4.3371 386 0.6174 0.4110 0.6174 0.7857
No log 4.3596 388 0.6188 0.4305 0.6188 0.7866
No log 4.3820 390 0.6232 0.4328 0.6232 0.7894
No log 4.4045 392 0.6369 0.4232 0.6369 0.7981
No log 4.4270 394 0.6776 0.3725 0.6776 0.8232
No log 4.4494 396 0.6616 0.3733 0.6616 0.8134
No log 4.4719 398 0.6212 0.4482 0.6212 0.7882
No log 4.4944 400 0.6177 0.4912 0.6177 0.7859
No log 4.5169 402 0.6681 0.4913 0.6681 0.8174
No log 4.5393 404 0.6826 0.5033 0.6826 0.8262
No log 4.5618 406 0.6401 0.5387 0.6401 0.8001
No log 4.5843 408 0.5906 0.4796 0.5906 0.7685
No log 4.6067 410 0.6101 0.4258 0.6101 0.7811
No log 4.6292 412 0.6161 0.4166 0.6161 0.7849
No log 4.6517 414 0.6369 0.4339 0.6369 0.7981
No log 4.6742 416 0.6187 0.4787 0.6187 0.7866
No log 4.6966 418 0.6789 0.5182 0.6789 0.8240
No log 4.7191 420 0.7086 0.5139 0.7086 0.8418
No log 4.7416 422 0.6608 0.5113 0.6608 0.8129
No log 4.7640 424 0.6345 0.4349 0.6345 0.7965
No log 4.7865 426 0.6362 0.4330 0.6362 0.7976
No log 4.8090 428 0.6201 0.4169 0.6201 0.7874
No log 4.8315 430 0.6141 0.4169 0.6141 0.7837
No log 4.8539 432 0.6423 0.4694 0.6423 0.8014
No log 4.8764 434 0.7369 0.4568 0.7369 0.8584
No log 4.8989 436 0.7239 0.4568 0.7239 0.8508
No log 4.9213 438 0.6153 0.4811 0.6153 0.7844
No log 4.9438 440 0.5985 0.4126 0.5985 0.7736
No log 4.9663 442 0.6959 0.4921 0.6959 0.8342
No log 4.9888 444 0.7749 0.4929 0.7749 0.8803
No log 5.0112 446 0.7062 0.4968 0.7062 0.8404
No log 5.0337 448 0.5966 0.4320 0.5966 0.7724
No log 5.0562 450 0.5846 0.4279 0.5846 0.7646
No log 5.0787 452 0.5802 0.4141 0.5802 0.7617
No log 5.1011 454 0.5966 0.3986 0.5966 0.7724
No log 5.1236 456 0.6337 0.4594 0.6337 0.7961
No log 5.1461 458 0.7217 0.4516 0.7217 0.8496
No log 5.1685 460 0.7357 0.4561 0.7357 0.8577
No log 5.1910 462 0.6542 0.4632 0.6542 0.8088
No log 5.2135 464 0.5665 0.4213 0.5665 0.7527
No log 5.2360 466 0.5794 0.4455 0.5794 0.7612
No log 5.2584 468 0.5765 0.4455 0.5765 0.7593
No log 5.2809 470 0.5630 0.4579 0.5630 0.7503
No log 5.3034 472 0.5581 0.4680 0.5581 0.7471
No log 5.3258 474 0.5573 0.4893 0.5573 0.7465
No log 5.3483 476 0.5760 0.5021 0.5760 0.7590
No log 5.3708 478 0.5840 0.4899 0.5840 0.7642
No log 5.3933 480 0.5762 0.5331 0.5762 0.7591
No log 5.4157 482 0.5639 0.5527 0.5639 0.7509
No log 5.4382 484 0.5677 0.4937 0.5677 0.7535
No log 5.4607 486 0.5795 0.4794 0.5795 0.7613
No log 5.4831 488 0.5598 0.5263 0.5598 0.7482
No log 5.5056 490 0.5493 0.5097 0.5493 0.7411
No log 5.5281 492 0.5455 0.4914 0.5455 0.7386
No log 5.5506 494 0.5630 0.4284 0.5630 0.7503
No log 5.5730 496 0.6080 0.5134 0.6080 0.7797
No log 5.5955 498 0.6030 0.5128 0.6030 0.7765
0.4206 5.6180 500 0.5927 0.4962 0.5927 0.7699
0.4206 5.6404 502 0.5900 0.5010 0.5900 0.7681
0.4206 5.6629 504 0.5920 0.4922 0.5920 0.7694
0.4206 5.6854 506 0.5919 0.5031 0.5919 0.7694
0.4206 5.7079 508 0.6197 0.5468 0.6197 0.7872
0.4206 5.7303 510 0.6549 0.5048 0.6549 0.8092
0.4206 5.7528 512 0.6261 0.5551 0.6261 0.7913
0.4206 5.7753 514 0.5879 0.5038 0.5879 0.7667
0.4206 5.7978 516 0.5857 0.5192 0.5857 0.7653
0.4206 5.8202 518 0.5974 0.5093 0.5974 0.7729
0.4206 5.8427 520 0.5935 0.5222 0.5935 0.7704
0.4206 5.8652 522 0.5956 0.5083 0.5956 0.7718
0.4206 5.8876 524 0.5951 0.5123 0.5951 0.7714
0.4206 5.9101 526 0.6029 0.5291 0.6029 0.7765
0.4206 5.9326 528 0.5935 0.4850 0.5935 0.7704
0.4206 5.9551 530 0.5781 0.4418 0.5781 0.7603
0.4206 5.9775 532 0.5987 0.4899 0.5987 0.7738
0.4206 6.0 534 0.6322 0.5116 0.6322 0.7951
0.4206 6.0225 536 0.6754 0.5123 0.6754 0.8218
0.4206 6.0449 538 0.6516 0.5434 0.6516 0.8072
0.4206 6.0674 540 0.5774 0.5558 0.5774 0.7598
0.4206 6.0899 542 0.5554 0.5323 0.5554 0.7453
0.4206 6.1124 544 0.5738 0.5503 0.5738 0.7575
0.4206 6.1348 546 0.5698 0.5354 0.5698 0.7549
0.4206 6.1573 548 0.5582 0.5177 0.5582 0.7472
0.4206 6.1798 550 0.5773 0.5668 0.5773 0.7598
0.4206 6.2022 552 0.6561 0.5887 0.6561 0.8100
0.4206 6.2247 554 0.6784 0.5713 0.6784 0.8236
0.4206 6.2472 556 0.6450 0.5696 0.6450 0.8031
0.4206 6.2697 558 0.6104 0.5544 0.6104 0.7813
0.4206 6.2921 560 0.5785 0.4921 0.5785 0.7606
0.4206 6.3146 562 0.5716 0.4950 0.5716 0.7560
0.4206 6.3371 564 0.5700 0.4741 0.5700 0.7550
0.4206 6.3596 566 0.5716 0.4753 0.5716 0.7560
0.4206 6.3820 568 0.5874 0.5665 0.5874 0.7664
0.4206 6.4045 570 0.6208 0.5831 0.6208 0.7879
0.4206 6.4270 572 0.6735 0.5525 0.6735 0.8207
0.4206 6.4494 574 0.6873 0.5374 0.6873 0.8290
0.4206 6.4719 576 0.6221 0.5452 0.6221 0.7888
0.4206 6.4944 578 0.5616 0.4796 0.5616 0.7494
0.4206 6.5169 580 0.5658 0.4337 0.5658 0.7522
0.4206 6.5393 582 0.5919 0.4529 0.5919 0.7694
0.4206 6.5618 584 0.5793 0.4335 0.5793 0.7611
0.4206 6.5843 586 0.5599 0.4358 0.5599 0.7483

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Repository: MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task2_organization
  • Model size: 135M params (F32, Safetensors)
  • Finetuned from: aubmindlab/bert-base-arabertv02