ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 1.7865
  • Qwk: 0.2500
  • Mse: 1.7865
  • Rmse: 1.3366
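For reference, Qwk is quadratic weighted kappa (Cohen's kappa with quadratic disagreement weights, standard for ordinal essay-scoring labels) and Rmse is simply the square root of Mse (1.7865 → 1.3366). Below is a minimal NumPy-only sketch of both; the exact metric implementation used during training is not included in this card, so this is an illustrative reconstruction:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Observed rating matrix (confusion matrix of true vs. predicted labels)
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, 1 at maximum disagreement
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected matrix if true and predicted labels were independent
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# RMSE is just the square root of the reported MSE:
print(round(1.7865 ** 0.5, 4))  # 1.3366
```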

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
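The hyperparameters above map directly onto Hugging Face `TrainingArguments` fields. The sketch below expresses them as a plain dict using those field names; the actual training script is not included in this card, so this is an assumption about the wiring, not the original code:

```python
# Hyperparameters from the card, with TrainingArguments-style key names.
# Illustrative reconstruction only; the original training script is not given.
training_config = {
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,       # "Adam with betas=(0.9, 0.999)"
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```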

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0571 2 7.4539 -0.0368 7.4539 2.7302
No log 0.1143 4 4.1722 0.0769 4.1722 2.0426
No log 0.1714 6 2.9799 0.0123 2.9799 1.7262
No log 0.2286 8 3.2861 -0.0118 3.2861 1.8128
No log 0.2857 10 2.4038 0.0 2.4038 1.5504
No log 0.3429 12 1.8532 0.0755 1.8532 1.3613
No log 0.4 14 1.8616 0.1538 1.8616 1.3644
No log 0.4571 16 2.2720 0.0333 2.2720 1.5073
No log 0.5143 18 2.3673 -0.0800 2.3673 1.5386
No log 0.5714 20 2.1610 0.0357 2.1610 1.4700
No log 0.6286 22 2.0420 0.1905 2.0420 1.4290
No log 0.6857 24 2.0083 0.0962 2.0083 1.4171
No log 0.7429 26 1.9601 0.2281 1.9601 1.4000
No log 0.8 28 1.9969 0.3140 1.9969 1.4131
No log 0.8571 30 2.1474 0.1168 2.1474 1.4654
No log 0.9143 32 2.1438 0.0863 2.1438 1.4642
No log 0.9714 34 2.1005 0.1594 2.1005 1.4493
No log 1.0286 36 1.8629 0.3770 1.8629 1.3649
No log 1.0857 38 1.6025 0.2407 1.6025 1.2659
No log 1.1429 40 1.4960 0.2075 1.4960 1.2231
No log 1.2 42 1.5067 0.1714 1.5067 1.2275
No log 1.2571 44 1.5243 0.2407 1.5243 1.2346
No log 1.3143 46 1.5518 0.2752 1.5518 1.2457
No log 1.3714 48 1.5925 0.1905 1.5925 1.2619
No log 1.4286 50 1.7204 0.1607 1.7204 1.3117
No log 1.4857 52 1.8678 0.2295 1.8678 1.3667
No log 1.5429 54 1.8467 0.24 1.8467 1.3590
No log 1.6 56 1.7856 0.2521 1.7856 1.3362
No log 1.6571 58 1.6179 0.1835 1.6179 1.2720
No log 1.7143 60 1.4905 0.2243 1.4905 1.2209
No log 1.7714 62 1.3842 0.2778 1.3842 1.1765
No log 1.8286 64 1.4082 0.2909 1.4082 1.1867
No log 1.8857 66 1.3495 0.3860 1.3495 1.1617
No log 1.9429 68 1.3248 0.4667 1.3248 1.1510
No log 2.0 70 1.2129 0.4615 1.2129 1.1013
No log 2.0571 72 1.2055 0.4576 1.2055 1.0979
No log 2.1143 74 1.1821 0.3966 1.1821 1.0872
No log 2.1714 76 1.3515 0.3423 1.3515 1.1625
No log 2.2286 78 1.3057 0.3063 1.3057 1.1427
No log 2.2857 80 1.2254 0.3186 1.2254 1.1070
No log 2.3429 82 1.2078 0.4590 1.2078 1.0990
No log 2.4 84 1.1821 0.528 1.1821 1.0872
No log 2.4571 86 1.1923 0.4959 1.1923 1.0919
No log 2.5143 88 1.2153 0.4500 1.2153 1.1024
No log 2.5714 90 1.2518 0.4211 1.2518 1.1188
No log 2.6286 92 1.2799 0.4174 1.2799 1.1313
No log 2.6857 94 1.2450 0.4561 1.2450 1.1158
No log 2.7429 96 1.2207 0.4655 1.2207 1.1049
No log 2.8 98 1.3279 0.5469 1.3279 1.1523
No log 2.8571 100 1.5492 0.4375 1.5492 1.2447
No log 2.9143 102 1.5490 0.3817 1.5490 1.2446
No log 2.9714 104 1.6897 0.3333 1.6897 1.2999
No log 3.0286 106 1.7939 0.2667 1.7939 1.3394
No log 3.0857 108 1.5947 0.3235 1.5947 1.2628
No log 3.1429 110 1.5781 0.3134 1.5781 1.2562
No log 3.2 112 1.8691 0.2222 1.8691 1.3672
No log 3.2571 114 2.2072 0.0645 2.2072 1.4857
No log 3.3143 116 2.3394 -0.0500 2.3394 1.5295
No log 3.3714 118 2.1207 0.0820 2.1207 1.4563
No log 3.4286 120 1.6673 0.2794 1.6673 1.2912
No log 3.4857 122 1.3625 0.4848 1.3625 1.1672
No log 3.5429 124 1.4134 0.4328 1.4134 1.1889
No log 3.6 126 1.3915 0.4885 1.3915 1.1796
No log 3.6571 128 1.4774 0.4 1.4774 1.2155
No log 3.7143 130 1.4748 0.4265 1.4748 1.2144
No log 3.7714 132 1.5066 0.4203 1.5066 1.2274
No log 3.8286 134 1.3738 0.5113 1.3738 1.1721
No log 3.8857 136 1.3155 0.5312 1.3155 1.1470
No log 3.9429 138 1.3786 0.5197 1.3786 1.1741
No log 4.0 140 1.3995 0.4298 1.3995 1.1830
No log 4.0571 142 1.3231 0.4746 1.3231 1.1502
No log 4.1143 144 1.2274 0.3860 1.2274 1.1079
No log 4.1714 146 1.2348 0.4483 1.2348 1.1112
No log 4.2286 148 1.3332 0.4706 1.3332 1.1546
No log 4.2857 150 1.4793 0.4627 1.4793 1.2163
No log 4.3429 152 1.3875 0.4030 1.3875 1.1779
No log 4.4 154 1.3198 0.4923 1.3198 1.1488
No log 4.4571 156 1.4046 0.4179 1.4046 1.1852
No log 4.5143 158 1.6824 0.3521 1.6824 1.2971
No log 4.5714 160 1.6558 0.3453 1.6558 1.2868
No log 4.6286 162 1.4075 0.4275 1.4075 1.1864
No log 4.6857 164 1.2563 0.4839 1.2563 1.1209
No log 4.7429 166 1.2313 0.4628 1.2313 1.1096
No log 4.8 168 1.3226 0.4426 1.3226 1.1500
No log 4.8571 170 1.5774 0.375 1.5774 1.2560
No log 4.9143 172 1.6308 0.3566 1.6308 1.2770
No log 4.9714 174 1.4480 0.4882 1.4480 1.2033
No log 5.0286 176 1.1699 0.5891 1.1699 1.0816
No log 5.0857 178 1.0929 0.5556 1.0929 1.0454
No log 5.1429 180 1.1045 0.5312 1.1045 1.0509
No log 5.2 182 1.3474 0.4545 1.3474 1.1608
No log 5.2571 184 1.6831 0.3459 1.6831 1.2973
No log 5.3143 186 1.8116 0.2121 1.8116 1.3460
No log 5.3714 188 1.8024 0.2258 1.8024 1.3425
No log 5.4286 190 1.6981 0.3511 1.6981 1.3031
No log 5.4857 192 1.5473 0.4186 1.5473 1.2439
No log 5.5429 194 1.4987 0.4211 1.4987 1.2242
No log 5.6 196 1.5572 0.4348 1.5572 1.2479
No log 5.6571 198 1.6660 0.3309 1.6660 1.2907
No log 5.7143 200 1.5331 0.3971 1.5331 1.2382
No log 5.7714 202 1.3255 0.4961 1.3255 1.1513
No log 5.8286 204 1.3445 0.4923 1.3445 1.1595
No log 5.8857 206 1.5750 0.3971 1.5750 1.2550
No log 5.9429 208 1.7113 0.2774 1.7113 1.3082
No log 6.0 210 1.6616 0.3030 1.6616 1.2890
No log 6.0571 212 1.5352 0.496 1.5352 1.2390
No log 6.1143 214 1.3989 0.4839 1.3989 1.1828
No log 6.1714 216 1.3937 0.4426 1.3937 1.1806
No log 6.2286 218 1.4770 0.4298 1.4770 1.2153
No log 6.2857 220 1.5963 0.3382 1.5963 1.2634
No log 6.3429 222 1.7069 0.3188 1.7069 1.3065
No log 6.4 224 1.7468 0.3022 1.7468 1.3217
No log 6.4571 226 1.6892 0.3165 1.6892 1.2997
No log 6.5143 228 1.5597 0.3636 1.5597 1.2489
No log 6.5714 230 1.5301 0.3636 1.5301 1.2370
No log 6.6286 232 1.3924 0.4308 1.3924 1.1800
No log 6.6857 234 1.4424 0.4242 1.4424 1.2010
No log 6.7429 236 1.6475 0.3768 1.6475 1.2835
No log 6.8 238 1.8042 0.2774 1.8042 1.3432
No log 6.8571 240 1.7203 0.3478 1.7203 1.3116
No log 6.9143 242 1.5689 0.4060 1.5689 1.2526
No log 6.9714 244 1.4053 0.3906 1.4053 1.1855
No log 7.0286 246 1.3346 0.4724 1.3346 1.1552
No log 7.0857 248 1.4422 0.4394 1.4422 1.2009
No log 7.1429 250 1.6539 0.3212 1.6539 1.2860
No log 7.2 252 1.7528 0.3043 1.7528 1.3239
No log 7.2571 254 1.8018 0.2667 1.8018 1.3423
No log 7.3143 256 1.7345 0.3358 1.7345 1.3170
No log 7.3714 258 1.5924 0.4 1.5924 1.2619
No log 7.4286 260 1.5349 0.4672 1.5349 1.2389
No log 7.4857 262 1.6351 0.3433 1.6351 1.2787
No log 7.5429 264 1.6705 0.3088 1.6705 1.2925
No log 7.6 266 1.5412 0.4265 1.5412 1.2414
No log 7.6571 268 1.3942 0.4559 1.3942 1.1807
No log 7.7143 270 1.3313 0.4776 1.3313 1.1538
No log 7.7714 272 1.3550 0.4925 1.3550 1.1640
No log 7.8286 274 1.3966 0.4265 1.3966 1.1818
No log 7.8857 276 1.5375 0.3704 1.5375 1.2399
No log 7.9429 278 1.5776 0.3158 1.5776 1.2560
No log 8.0 280 1.5705 0.3433 1.5705 1.2532
No log 8.0571 282 1.5823 0.3433 1.5823 1.2579
No log 8.1143 284 1.4536 0.4031 1.4536 1.2056
No log 8.1714 286 1.2902 0.4844 1.2902 1.1359
No log 8.2286 288 1.2571 0.5231 1.2571 1.1212
No log 8.2857 290 1.3889 0.4394 1.3889 1.1785
No log 8.3429 292 1.6167 0.3650 1.6167 1.2715
No log 8.4 294 1.7558 0.2695 1.7558 1.3251
No log 8.4571 296 1.7715 0.2695 1.7715 1.3310
No log 8.5143 298 1.8135 0.2429 1.8135 1.3467
No log 8.5714 300 1.7607 0.2222 1.7607 1.3269
No log 8.6286 302 1.6361 0.3437 1.6361 1.2791
No log 8.6857 304 1.6172 0.3636 1.6172 1.2717
No log 8.7429 306 1.6462 0.3009 1.6462 1.2831
No log 8.8 308 1.6544 0.3214 1.6544 1.2862
No log 8.8571 310 1.5885 0.3509 1.5885 1.2603
No log 8.9143 312 1.4996 0.4409 1.4996 1.2246
No log 8.9714 314 1.5454 0.4427 1.5454 1.2431
No log 9.0286 316 1.5423 0.4889 1.5423 1.2419
No log 9.0857 318 1.6121 0.3796 1.6121 1.2697
No log 9.1429 320 1.5165 0.4627 1.5165 1.2315
No log 9.2 322 1.3753 0.4885 1.3753 1.1727
No log 9.2571 324 1.4123 0.4885 1.4123 1.1884
No log 9.3143 326 1.6182 0.3478 1.6182 1.2721
No log 9.3714 328 1.7031 0.2590 1.7031 1.3050
No log 9.4286 330 1.6189 0.3382 1.6189 1.2724
No log 9.4857 332 1.4124 0.5 1.4124 1.1884
No log 9.5429 334 1.3400 0.4793 1.3400 1.1576
No log 9.6 336 1.3745 0.5 1.3745 1.1724
No log 9.6571 338 1.4883 0.4697 1.4883 1.2200
No log 9.7143 340 1.5742 0.2920 1.5742 1.2547
No log 9.7714 342 1.5091 0.3852 1.5091 1.2284
No log 9.8286 344 1.3635 0.4341 1.3635 1.1677
No log 9.8857 346 1.3318 0.4409 1.3318 1.1540
No log 9.9429 348 1.3897 0.4094 1.3897 1.1788
No log 10.0 350 1.4745 0.3846 1.4745 1.2143
No log 10.0571 352 1.6068 0.3407 1.6068 1.2676
No log 10.1143 354 1.6030 0.3382 1.6030 1.2661
No log 10.1714 356 1.4081 0.4511 1.4081 1.1866
No log 10.2286 358 1.2643 0.5191 1.2643 1.1244
No log 10.2857 360 1.2712 0.5191 1.2712 1.1275
No log 10.3429 362 1.4061 0.4651 1.4061 1.1858
No log 10.4 364 1.4416 0.4444 1.4416 1.2007
No log 10.4571 366 1.3610 0.4333 1.3610 1.1666
No log 10.5143 368 1.2781 0.4754 1.2781 1.1305
No log 10.5714 370 1.2848 0.5 1.2848 1.1335
No log 10.6286 372 1.3658 0.4567 1.3658 1.1687
No log 10.6857 374 1.5306 0.3088 1.5306 1.2372
No log 10.7429 376 1.6535 0.3407 1.6535 1.2859
No log 10.8 378 1.7475 0.2963 1.7475 1.3219
No log 10.8571 380 1.7174 0.3158 1.7174 1.3105
No log 10.9143 382 1.6407 0.3810 1.6407 1.2809
No log 10.9714 384 1.5522 0.3594 1.5522 1.2459
No log 11.0286 386 1.5482 0.3485 1.5482 1.2443
No log 11.0857 388 1.6978 0.3188 1.6978 1.3030
No log 11.1429 390 1.8451 0.2714 1.8451 1.3583
No log 11.2 392 1.8179 0.3188 1.8179 1.3483
No log 11.2571 394 1.6028 0.3433 1.6028 1.2660
No log 11.3143 396 1.4291 0.4 1.4291 1.1954
No log 11.3714 398 1.4258 0.3910 1.4258 1.1941
No log 11.4286 400 1.6230 0.3188 1.6230 1.2740
No log 11.4857 402 1.8430 0.2411 1.8430 1.3576
No log 11.5429 404 1.8798 0.1958 1.8798 1.3711
No log 11.6 406 1.7177 0.2899 1.7177 1.3106
No log 11.6571 408 1.5180 0.4094 1.5180 1.2321
No log 11.7143 410 1.3836 0.5354 1.3836 1.1763
No log 11.7714 412 1.3521 0.5781 1.3521 1.1628
No log 11.8286 414 1.4255 0.4733 1.4255 1.1939
No log 11.8857 416 1.6166 0.3284 1.6166 1.2715
No log 11.9429 418 1.7459 0.2899 1.7459 1.3213
No log 12.0 420 1.8009 0.2590 1.8009 1.3420
No log 12.0571 422 1.7620 0.2920 1.7620 1.3274
No log 12.1143 424 1.6390 0.3284 1.6390 1.2802
No log 12.1714 426 1.5120 0.3664 1.5120 1.2296
No log 12.2286 428 1.4668 0.4776 1.4668 1.2111
No log 12.2857 430 1.5290 0.3939 1.5290 1.2365
No log 12.3429 432 1.6194 0.3088 1.6194 1.2726
No log 12.4 434 1.7131 0.3022 1.7131 1.3089
No log 12.4571 436 1.6298 0.3731 1.6298 1.2766
No log 12.5143 438 1.5423 0.4275 1.5423 1.2419
No log 12.5714 440 1.5301 0.4275 1.5301 1.2370
No log 12.6286 442 1.6680 0.3358 1.6680 1.2915
No log 12.6857 444 1.9500 0.1958 1.9500 1.3964
No log 12.7429 446 2.1884 0.1690 2.1884 1.4793
No log 12.8 448 2.1721 0.1690 2.1721 1.4738
No log 12.8571 450 1.9721 0.1690 1.9721 1.4043
No log 12.9143 452 1.6610 0.3788 1.6610 1.2888
No log 12.9714 454 1.4530 0.4341 1.4530 1.2054
No log 13.0286 456 1.3925 0.4733 1.3925 1.1800
No log 13.0857 458 1.5077 0.4030 1.5077 1.2279
No log 13.1429 460 1.6979 0.2878 1.6979 1.3030
No log 13.2 462 1.7224 0.2878 1.7224 1.3124
No log 13.2571 464 1.6515 0.2774 1.6515 1.2851
No log 13.3143 466 1.5742 0.3382 1.5742 1.2547
No log 13.3714 468 1.5932 0.3650 1.5932 1.2622
No log 13.4286 470 1.5879 0.3623 1.5879 1.2601
No log 13.4857 472 1.6817 0.3143 1.6817 1.2968
No log 13.5429 474 1.7223 0.3099 1.7223 1.3124
No log 13.6 476 1.6144 0.3662 1.6144 1.2706
No log 13.6571 478 1.4906 0.4286 1.4906 1.2209
No log 13.7143 480 1.4711 0.4672 1.4711 1.2129
No log 13.7714 482 1.5151 0.4593 1.5151 1.2309
No log 13.8286 484 1.6085 0.3910 1.6085 1.2683
No log 13.8857 486 1.6251 0.3910 1.6251 1.2748
No log 13.9429 488 1.5486 0.3788 1.5486 1.2444
No log 14.0 490 1.3603 0.5038 1.3603 1.1663
No log 14.0571 492 1.2230 0.5116 1.2230 1.1059
No log 14.1143 494 1.2044 0.5385 1.2044 1.0974
No log 14.1714 496 1.3083 0.5191 1.3083 1.1438
No log 14.2286 498 1.4467 0.4615 1.4467 1.2028
0.3939 14.2857 500 1.6312 0.2748 1.6312 1.2772
0.3939 14.3429 502 1.7407 0.2748 1.7407 1.3194
0.3939 14.4 504 1.8077 0.2748 1.8077 1.3445
0.3939 14.4571 506 1.8017 0.2748 1.8017 1.3423
0.3939 14.5143 508 1.6813 0.2879 1.6813 1.2966
0.3939 14.5714 510 1.6008 0.3206 1.6008 1.2652
0.3939 14.6286 512 1.6823 0.2812 1.6823 1.2970
0.3939 14.6857 514 1.7432 0.2769 1.7432 1.3203
0.3939 14.7429 516 1.6914 0.2941 1.6914 1.3005
0.3939 14.8 518 1.6368 0.3212 1.6368 1.2794
0.3939 14.8571 520 1.4627 0.4091 1.4627 1.2094
0.3939 14.9143 522 1.3668 0.4848 1.3668 1.1691
0.3939 14.9714 524 1.3300 0.4962 1.3300 1.1532
0.3939 15.0286 526 1.3776 0.4964 1.3776 1.1737
0.3939 15.0857 528 1.5495 0.4113 1.5495 1.2448
0.3939 15.1429 530 1.8496 0.2464 1.8496 1.3600
0.3939 15.2 532 1.9511 0.1630 1.9511 1.3968
0.3939 15.2571 534 1.8094 0.2344 1.8094 1.3452
0.3939 15.3143 536 1.5732 0.3968 1.5732 1.2543
0.3939 15.3714 538 1.3796 0.5077 1.3796 1.1746
0.3939 15.4286 540 1.2978 0.4923 1.2978 1.1392
0.3939 15.4857 542 1.3601 0.5113 1.3601 1.1662
0.3939 15.5429 544 1.4992 0.4493 1.4992 1.2244
0.3939 15.6 546 1.6473 0.4173 1.6473 1.2835
0.3939 15.6571 548 1.7520 0.2695 1.7520 1.3236
0.3939 15.7143 550 1.7343 0.2857 1.7343 1.3169
0.3939 15.7714 552 1.5700 0.3971 1.5700 1.2530
0.3939 15.8286 554 1.4052 0.4923 1.4052 1.1854
0.3939 15.8857 556 1.3283 0.4882 1.3283 1.1525
0.3939 15.9429 558 1.3217 0.4882 1.3217 1.1496
0.3939 16.0 560 1.4146 0.4496 1.4146 1.1894
0.3939 16.0571 562 1.4939 0.4580 1.4939 1.2223
0.3939 16.1143 564 1.4745 0.4478 1.4745 1.2143
0.3939 16.1714 566 1.4484 0.4559 1.4484 1.2035
0.3939 16.2286 568 1.5033 0.4559 1.5033 1.2261
0.3939 16.2857 570 1.7408 0.3008 1.7408 1.3194
0.3939 16.3429 572 1.9629 0.1940 1.9629 1.4010
0.3939 16.4 574 2.0332 0.1606 2.0332 1.4259
0.3939 16.4571 576 1.9532 0.2059 1.9532 1.3976
0.3939 16.5143 578 1.8788 0.2239 1.8788 1.3707
0.3939 16.5714 580 1.7865 0.25 1.7865 1.3366
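A quick sanity check on the log above: step 70 falls at epoch 2.0, so one epoch is 35 optimizer steps, which with a train batch size of 8 suggests roughly 280 training examples (approximate, since the final batch of an epoch may be smaller). The "No log" entries in the first column reflect training loss not yet being reported at those steps; the first logged value (0.3939) appears at step 500.

```python
# Read off the table: step 70 corresponds to epoch 2.0
steps, epochs = 70, 2.0
steps_per_epoch = steps / epochs
train_batch_size = 8  # from the hyperparameters above
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)  # 35.0 280.0
```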

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task1_organization
