ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the list):

  • Loss: 1.1682
  • Qwk: 0.5663
  • Mse: 1.1682
  • Rmse: 1.0808
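
The sketch below shows one way to load the checkpoint for scoring; it is only an illustration, not code from this card. The repository id is taken from the hosting page, and the single-output regression head is an assumption based on the Loss/Mse/Rmse values reported above (the loss equals the MSE, which is consistent with MSE-based regression).

```python
# Minimal inference sketch (assumptions: repository id and a single-output
# regression head; adjust num_labels / problem_type to the actual config).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task1_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "نص مقال عربي لتقييم تنظيمه"  # placeholder Arabic essay text
inputs = tokenizer(text, truncation=True, return_tensors="pt")

with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```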

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
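
As a rough reconstruction, the hyperparameters above map onto the standard Trainer API as sketched below. Only the values listed in this card are taken from the source; the output directory, the regression head, and the dataset objects are hypothetical placeholders.

```python
# Sketch of the training configuration; the dataset and exact task head are
# not documented in this card, so they are left as placeholders.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          TrainingArguments)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=1 is an assumption (MSE/RMSE reporting suggests a regression head).
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

args = TrainingArguments(
    output_dir="arabert-task1-organization",  # hypothetical output directory
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,  # matches the 2-step evaluation cadence in the results table
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
)

# A Trainer would then be built with the (undocumented) train/eval datasets:
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...,
#                   tokenizer=tokenizer)
# trainer.train()
```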

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 6.8411 0.0290 6.8411 2.6155
No log 0.1333 4 4.5055 0.0794 4.5055 2.1226
No log 0.2 6 2.7652 0.0870 2.7652 1.6629
No log 0.2667 8 2.0825 0.1594 2.0825 1.4431
No log 0.3333 10 1.7389 0.2264 1.7389 1.3187
No log 0.4 12 1.6371 0.1905 1.6371 1.2795
No log 0.4667 14 1.6820 0.1495 1.6820 1.2969
No log 0.5333 16 1.7122 0.3548 1.7122 1.3085
No log 0.6 18 1.7297 0.2645 1.7297 1.3152
No log 0.6667 20 1.7308 0.2456 1.7308 1.3156
No log 0.7333 22 1.9098 0.1565 1.9098 1.3819
No log 0.8 24 1.9363 0.0847 1.9363 1.3915
No log 0.8667 26 1.7360 0.1667 1.7360 1.3176
No log 0.9333 28 1.4894 0.1947 1.4894 1.2204
No log 1.0 30 1.4493 0.3740 1.4493 1.2039
No log 1.0667 32 1.5090 0.3333 1.5090 1.2284
No log 1.1333 34 1.6048 0.2835 1.6048 1.2668
No log 1.2 36 1.5969 0.3333 1.5969 1.2637
No log 1.2667 38 1.5607 0.3256 1.5607 1.2493
No log 1.3333 40 1.5836 0.3053 1.5836 1.2584
No log 1.4 42 1.5397 0.3382 1.5397 1.2408
No log 1.4667 44 1.5278 0.3913 1.5278 1.2361
No log 1.5333 46 1.4911 0.4160 1.4911 1.2211
No log 1.6 48 1.3562 0.4286 1.3562 1.1645
No log 1.6667 50 1.3315 0.4769 1.3315 1.1539
No log 1.7333 52 1.7800 0.2901 1.7800 1.3342
No log 1.8 54 1.6626 0.3077 1.6626 1.2894
No log 1.8667 56 1.2296 0.4426 1.2296 1.1089
No log 1.9333 58 1.1936 0.3571 1.1936 1.0925
No log 2.0 60 1.2190 0.4407 1.2190 1.1041
No log 2.0667 62 1.2760 0.4961 1.2760 1.1296
No log 2.1333 64 1.3799 0.4462 1.3799 1.1747
No log 2.2 66 1.4315 0.4857 1.4315 1.1964
No log 2.2667 68 1.4433 0.4242 1.4433 1.2014
No log 2.3333 70 1.3643 0.4648 1.3643 1.1681
No log 2.4 72 1.4124 0.4460 1.4124 1.1884
No log 2.4667 74 1.3953 0.4148 1.3953 1.1812
No log 2.5333 76 1.4697 0.4444 1.4697 1.2123
No log 2.6 78 1.4942 0.4211 1.4942 1.2224
No log 2.6667 80 1.3468 0.4806 1.3468 1.1605
No log 2.7333 82 1.1612 0.5512 1.1612 1.0776
No log 2.8 84 1.0878 0.528 1.0878 1.0430
No log 2.8667 86 1.1252 0.6119 1.1252 1.0608
No log 2.9333 88 1.1308 0.5778 1.1308 1.0634
No log 3.0 90 1.1582 0.5143 1.1582 1.0762
No log 3.0667 92 1.3523 0.5576 1.3523 1.1629
No log 3.1333 94 1.4646 0.4815 1.4646 1.2102
No log 3.2 96 1.3034 0.4783 1.3034 1.1417
No log 3.2667 98 1.2189 0.5538 1.2189 1.1041
No log 3.3333 100 1.1976 0.5954 1.1976 1.0943
No log 3.4 102 1.2458 0.5 1.2458 1.1162
No log 3.4667 104 1.1685 0.5294 1.1685 1.0810
No log 3.5333 106 1.0799 0.5735 1.0799 1.0392
No log 3.6 108 0.9817 0.6277 0.9817 0.9908
No log 3.6667 110 0.9840 0.6619 0.9840 0.9920
No log 3.7333 112 1.0359 0.6571 1.0359 1.0178
No log 3.8 114 1.1027 0.6 1.1027 1.0501
No log 3.8667 116 1.1618 0.5734 1.1618 1.0779
No log 3.9333 118 1.3260 0.5325 1.3260 1.1515
No log 4.0 120 1.5194 0.5031 1.5194 1.2326
No log 4.0667 122 1.4042 0.5 1.4042 1.1850
No log 4.1333 124 1.2303 0.5395 1.2303 1.1092
No log 4.2 126 1.1265 0.5547 1.1265 1.0614
No log 4.2667 128 1.0583 0.6567 1.0583 1.0288
No log 4.3333 130 1.0741 0.6212 1.0741 1.0364
No log 4.4 132 1.1067 0.6107 1.1067 1.0520
No log 4.4667 134 1.1166 0.6165 1.1166 1.0567
No log 4.5333 136 1.2331 0.5175 1.2331 1.1105
No log 4.6 138 1.2278 0.5068 1.2278 1.1081
No log 4.6667 140 1.1185 0.5755 1.1185 1.0576
No log 4.7333 142 1.0165 0.6418 1.0165 1.0082
No log 4.8 144 0.9863 0.6619 0.9863 0.9931
No log 4.8667 146 1.0320 0.6174 1.0320 1.0159
No log 4.9333 148 1.1882 0.5897 1.1882 1.0901
No log 5.0 150 1.1527 0.5987 1.1527 1.0736
No log 5.0667 152 1.0692 0.6577 1.0692 1.0340
No log 5.1333 154 0.9798 0.6761 0.9798 0.9899
No log 5.2 156 1.0669 0.6423 1.0669 1.0329
No log 5.2667 158 1.2467 0.5135 1.2467 1.1166
No log 5.3333 160 1.3934 0.4845 1.3934 1.1804
No log 5.4 162 1.3303 0.4935 1.3303 1.1534
No log 5.4667 164 1.1582 0.5333 1.1582 1.0762
No log 5.5333 166 1.0258 0.6131 1.0258 1.0128
No log 5.6 168 0.9112 0.6567 0.9112 0.9546
No log 5.6667 170 0.8672 0.6716 0.8672 0.9312
No log 5.7333 172 0.8603 0.6861 0.8603 0.9275
No log 5.8 174 0.9279 0.6370 0.9279 0.9633
No log 5.8667 176 1.0725 0.6099 1.0725 1.0356
No log 5.9333 178 1.1306 0.5612 1.1306 1.0633
No log 6.0 180 1.1694 0.5714 1.1694 1.0814
No log 6.0667 182 1.1824 0.5571 1.1824 1.0874
No log 6.1333 184 1.1753 0.5547 1.1753 1.0841
No log 6.2 186 1.1645 0.5734 1.1645 1.0791
No log 6.2667 188 1.0933 0.5588 1.0933 1.0456
No log 6.3333 190 1.0788 0.6131 1.0788 1.0387
No log 6.4 192 1.0192 0.6423 1.0192 1.0096
No log 6.4667 194 0.9612 0.6667 0.9612 0.9804
No log 6.5333 196 1.0047 0.6667 1.0047 1.0023
No log 6.6 198 1.0488 0.6423 1.0488 1.0241
No log 6.6667 200 1.1262 0.5578 1.1262 1.0612
No log 6.7333 202 1.1493 0.5625 1.1493 1.0720
No log 6.8 204 1.0030 0.6241 1.0030 1.0015
No log 6.8667 206 0.7999 0.6857 0.7999 0.8944
No log 6.9333 208 0.6923 0.7153 0.6923 0.8320
No log 7.0 210 0.7225 0.7101 0.7225 0.8500
No log 7.0667 212 0.7961 0.6765 0.7961 0.8922
No log 7.1333 214 0.9400 0.6418 0.9400 0.9695
No log 7.2 216 1.1221 0.5758 1.1221 1.0593
No log 7.2667 218 1.1054 0.5954 1.1054 1.0514
No log 7.3333 220 1.1653 0.6119 1.1653 1.0795
No log 7.4 222 1.2698 0.4966 1.2698 1.1268
No log 7.4667 224 1.3247 0.4969 1.3247 1.1509
No log 7.5333 226 1.1623 0.6053 1.1623 1.0781
No log 7.6 228 1.0819 0.6301 1.0819 1.0401
No log 7.6667 230 0.9733 0.6525 0.9733 0.9866
No log 7.7333 232 1.1109 0.6207 1.1109 1.0540
No log 7.8 234 1.3954 0.5562 1.3954 1.1813
No log 7.8667 236 1.5114 0.5059 1.5114 1.2294
No log 7.9333 238 1.3694 0.525 1.3694 1.1702
No log 8.0 240 1.1268 0.5906 1.1268 1.0615
No log 8.0667 242 0.9815 0.6993 0.9815 0.9907
No log 8.1333 244 0.9961 0.6667 0.9961 0.9980
No log 8.2 246 1.1060 0.5578 1.1060 1.0517
No log 8.2667 248 1.1312 0.5616 1.1312 1.0636
No log 8.3333 250 1.0797 0.6241 1.0797 1.0391
No log 8.4 252 1.0239 0.6277 1.0239 1.0119
No log 8.4667 254 0.9819 0.6763 0.9819 0.9909
No log 8.5333 256 0.9788 0.6667 0.9788 0.9893
No log 8.6 258 1.0502 0.6418 1.0502 1.0248
No log 8.6667 260 1.1559 0.6029 1.1559 1.0751
No log 8.7333 262 1.2523 0.5109 1.2523 1.1191
No log 8.8 264 1.2236 0.5109 1.2236 1.1062
No log 8.8667 266 1.1008 0.5865 1.1008 1.0492
No log 8.9333 268 1.0131 0.5865 1.0131 1.0065
No log 9.0 270 0.9567 0.6316 0.9567 0.9781
No log 9.0667 272 0.9227 0.6277 0.9227 0.9606
No log 9.1333 274 0.9190 0.6522 0.9190 0.9586
No log 9.2 276 0.9918 0.6405 0.9918 0.9959
No log 9.2667 278 1.0030 0.6667 1.0030 1.0015
No log 9.3333 280 0.8790 0.6324 0.8790 0.9375
No log 9.4 282 0.8730 0.6324 0.8730 0.9343
No log 9.4667 284 0.9293 0.6187 0.9293 0.9640
No log 9.5333 286 0.9091 0.6324 0.9091 0.9535
No log 9.6 288 0.8488 0.6615 0.8488 0.9213
No log 9.6667 290 0.8677 0.6815 0.8677 0.9315
No log 9.7333 292 0.9591 0.6812 0.9591 0.9794
No log 9.8 294 1.1017 0.5882 1.1017 1.0496
No log 9.8667 296 1.2618 0.5509 1.2618 1.1233
No log 9.9333 298 1.3770 0.5325 1.3770 1.1734
No log 10.0 300 1.3422 0.5238 1.3422 1.1585
No log 10.0667 302 1.1776 0.5578 1.1776 1.0852
No log 10.1333 304 0.9751 0.6716 0.9751 0.9875
No log 10.2 306 0.8607 0.6406 0.8607 0.9277
No log 10.2667 308 0.8602 0.5984 0.8602 0.9275
No log 10.3333 310 0.8909 0.625 0.8909 0.9439
No log 10.4 312 0.9768 0.6906 0.9768 0.9883
No log 10.4667 314 1.1032 0.6225 1.1032 1.0503
No log 10.5333 316 1.1913 0.6265 1.1913 1.0915
No log 10.6 318 1.1459 0.5860 1.1459 1.0705
No log 10.6667 320 0.9666 0.6892 0.9666 0.9831
No log 10.7333 322 0.8932 0.6269 0.8932 0.9451
No log 10.8 324 0.8935 0.6269 0.8935 0.9452
No log 10.8667 326 0.9379 0.6912 0.9379 0.9684
No log 10.9333 328 1.0794 0.6309 1.0794 1.0389
No log 11.0 330 1.1860 0.6296 1.1860 1.0891
No log 11.0667 332 1.1891 0.6429 1.1891 1.0904
No log 11.1333 334 1.0631 0.6581 1.0631 1.0311
No log 11.2 336 0.9568 0.6763 0.9568 0.9782
No log 11.2667 338 0.9640 0.6324 0.9640 0.9818
No log 11.3333 340 0.9778 0.6324 0.9778 0.9889
No log 11.4 342 1.0438 0.6519 1.0438 1.0217
No log 11.4667 344 1.2155 0.5655 1.2155 1.1025
No log 11.5333 346 1.3660 0.5256 1.3660 1.1688
No log 11.6 348 1.2987 0.5325 1.2987 1.1396
No log 11.6667 350 1.0831 0.6029 1.0831 1.0407
No log 11.7333 352 0.9538 0.6260 0.9538 0.9766
No log 11.8 354 0.9300 0.6269 0.9300 0.9644
No log 11.8667 356 0.9813 0.6324 0.9813 0.9906
No log 11.9333 358 1.1209 0.6232 1.1209 1.0587
No log 12.0 360 1.3313 0.5170 1.3313 1.1538
No log 12.0667 362 1.4482 0.4196 1.4482 1.2034
No log 12.1333 364 1.3954 0.4265 1.3954 1.1813
No log 12.2 366 1.2580 0.5294 1.2580 1.1216
No log 12.2667 368 1.1873 0.5882 1.1873 1.0896
No log 12.3333 370 1.1016 0.6466 1.1016 1.0496
No log 12.4 372 1.0789 0.6519 1.0789 1.0387
No log 12.4667 374 1.1158 0.5915 1.1158 1.0563
No log 12.5333 376 1.2124 0.5806 1.2124 1.1011
No log 12.6 378 1.2120 0.5974 1.2120 1.1009
No log 12.6667 380 1.1104 0.6207 1.1104 1.0538
No log 12.7333 382 1.0596 0.6197 1.0596 1.0294
No log 12.8 384 1.0240 0.6197 1.0240 1.0119
No log 12.8667 386 1.0451 0.6197 1.0451 1.0223
No log 12.9333 388 1.0846 0.6197 1.0846 1.0415
No log 13.0 390 1.0829 0.6241 1.0829 1.0406
No log 13.0667 392 1.0886 0.6241 1.0886 1.0434
No log 13.1333 394 1.0738 0.6197 1.0738 1.0362
No log 13.2 396 1.1437 0.6144 1.1437 1.0694
No log 13.2667 398 1.1341 0.6174 1.1341 1.0649
No log 13.3333 400 1.0499 0.6197 1.0499 1.0246
No log 13.4 402 1.0067 0.6277 1.0067 1.0033
No log 13.4667 404 1.0055 0.6061 1.0055 1.0027
No log 13.5333 406 1.0464 0.6061 1.0464 1.0229
No log 13.6 408 1.0631 0.5909 1.0631 1.0311
No log 13.6667 410 1.0617 0.6061 1.0617 1.0304
No log 13.7333 412 1.0727 0.6260 1.0727 1.0357
No log 13.8 414 1.1313 0.6119 1.1313 1.0636
No log 13.8667 416 1.2340 0.5503 1.2340 1.1109
No log 13.9333 418 1.3273 0.5350 1.3273 1.1521
No log 14.0 420 1.3200 0.5385 1.3200 1.1489
No log 14.0667 422 1.3247 0.5223 1.3247 1.1510
No log 14.1333 424 1.2907 0.5170 1.2907 1.1361
No log 14.2 426 1.2642 0.5468 1.2642 1.1244
No log 14.2667 428 1.2646 0.5401 1.2646 1.1245
No log 14.3333 430 1.1767 0.5909 1.1767 1.0848
No log 14.4 432 1.1276 0.5802 1.1276 1.0619
No log 14.4667 434 1.0721 0.5512 1.0721 1.0354
No log 14.5333 436 1.0481 0.5649 1.0481 1.0237
No log 14.6 438 1.0613 0.6131 1.0613 1.0302
No log 14.6667 440 1.1536 0.5811 1.1536 1.0741
No log 14.7333 442 1.3342 0.5395 1.3342 1.1551
No log 14.8 444 1.3475 0.5548 1.3475 1.1608
No log 14.8667 446 1.2069 0.5714 1.2069 1.0986
No log 14.9333 448 1.0777 0.6232 1.0777 1.0381
No log 15.0 450 0.9441 0.6277 0.9441 0.9717
No log 15.0667 452 0.8969 0.6212 0.8969 0.9470
No log 15.1333 454 0.9045 0.6569 0.9045 0.9511
No log 15.2 456 1.0174 0.6395 1.0174 1.0086
No log 15.2667 458 1.2439 0.6182 1.2439 1.1153
No log 15.3333 460 1.3134 0.6024 1.3134 1.1460
No log 15.4 462 1.2133 0.5897 1.2133 1.1015
No log 15.4667 464 1.0636 0.6525 1.0636 1.0313
No log 15.5333 466 0.9961 0.6316 0.9961 0.9981
No log 15.6 468 1.0037 0.6316 1.0037 1.0019
No log 15.6667 470 1.0793 0.6569 1.0793 1.0389
No log 15.7333 472 1.2025 0.6099 1.2025 1.0966
No log 15.8 474 1.2963 0.5890 1.2963 1.1386
No log 15.8667 476 1.2393 0.5899 1.2393 1.1132
No log 15.9333 478 1.1624 0.6324 1.1624 1.0781
No log 16.0 480 1.1372 0.6119 1.1372 1.0664
No log 16.0667 482 1.1336 0.6131 1.1336 1.0647
No log 16.1333 484 1.0423 0.6331 1.0423 1.0209
No log 16.2 486 0.9645 0.6620 0.9645 0.9821
No log 16.2667 488 0.9296 0.6755 0.9296 0.9641
No log 16.3333 490 0.8918 0.7006 0.8918 0.9444
No log 16.4 492 0.8814 0.7 0.8814 0.9388
No log 16.4667 494 0.9504 0.6424 0.9504 0.9749
No log 16.5333 496 1.0877 0.6509 1.0877 1.0429
No log 16.6 498 1.0546 0.6220 1.0546 1.0270
0.372 16.6667 500 0.9086 0.6667 0.9086 0.9532
0.372 16.7333 502 0.8080 0.6806 0.8080 0.8989
0.372 16.8 504 0.8219 0.7206 0.8219 0.9066
0.372 16.8667 506 0.8541 0.6667 0.8541 0.9242
0.372 16.9333 508 0.8858 0.6667 0.8858 0.9412
0.372 17.0 510 0.9483 0.6119 0.9483 0.9738
0.372 17.0667 512 1.0789 0.6438 1.0789 1.0387
0.372 17.1333 514 1.1682 0.5663 1.1682 1.0808

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1