ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6630
  • QWK (quadratic weighted kappa): 0.6716
  • MSE: 0.6630
  • RMSE: 0.8143
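
Since Loss and MSE are identical, the model was most likely trained as a single-output regressor scored with mean squared error, with QWK computed on rounded predictions. Below is a minimal sketch of how these metrics could be reproduced; the rounding step is an assumption, since the card does not document the evaluation code.

```python
# Sketch: reproduce QWK / MSE / RMSE from model predictions and gold scores.
# Rounding continuous outputs to integer score bins is an assumption here;
# the card does not document the exact evaluation procedure.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        labels.astype(int),
        np.rint(preds).astype(int),  # round continuous outputs to score bins
        weights="quadratic",         # quadratic weighting -> QWK
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```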

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
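
For reference, these settings correspond roughly to the following transformers TrainingArguments; this is a reconstruction from the list above, not the actual training script, and output_dir plus any unlisted options are assumptions.

```python
# Sketch: the hyperparameters above expressed as transformers TrainingArguments.
# output_dir and any option not listed in the card are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",  # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```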

Training results

The training loss is only logged from step 500 onward, so earlier rows show "No log" in the first column; the remaining columns report metrics on the evaluation set.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0299 2 7.1893 0.0110 7.1893 2.6813
No log 0.0597 4 4.7827 0.0690 4.7827 2.1869
No log 0.0896 6 3.2390 0.0791 3.2390 1.7997
No log 0.1194 8 2.2809 0.0658 2.2809 1.5103
No log 0.1493 10 1.8933 0.2202 1.8933 1.3760
No log 0.1791 12 1.6864 0.2075 1.6864 1.2986
No log 0.2090 14 1.6286 0.1682 1.6286 1.2762
No log 0.2388 16 1.6563 0.2523 1.6563 1.2870
No log 0.2687 18 1.6830 0.3000 1.6830 1.2973
No log 0.2985 20 1.6707 0.1880 1.6707 1.2925
No log 0.3284 22 1.6463 0.2479 1.6462 1.2831
No log 0.3582 24 1.4037 0.3158 1.4037 1.1848
No log 0.3881 26 1.8463 0.0727 1.8463 1.3588
No log 0.4179 28 2.0703 0.0909 2.0703 1.4389
No log 0.4478 30 1.9843 0.1062 1.9843 1.4086
No log 0.4776 32 1.4785 0.1667 1.4785 1.2159
No log 0.5075 34 1.2724 0.3761 1.2724 1.1280
No log 0.5373 36 1.8881 0.3684 1.8881 1.3741
No log 0.5672 38 1.6554 0.4681 1.6554 1.2866
No log 0.5970 40 1.1611 0.3717 1.1611 1.0775
No log 0.6269 42 1.2712 0.3423 1.2712 1.1275
No log 0.6567 44 1.3940 0.3276 1.3940 1.1807
No log 0.6866 46 1.5588 0.3360 1.5588 1.2485
No log 0.7164 48 1.4551 0.4 1.4551 1.2063
No log 0.7463 50 1.1576 0.4390 1.1576 1.0759
No log 0.7761 52 0.9234 0.6316 0.9234 0.9610
No log 0.8060 54 0.9126 0.6154 0.9126 0.9553
No log 0.8358 56 1.0456 0.4844 1.0456 1.0226
No log 0.8657 58 1.3072 0.5113 1.3072 1.1433
No log 0.8955 60 1.2204 0.5735 1.2204 1.1047
No log 0.9254 62 0.9685 0.5865 0.9685 0.9841
No log 0.9552 64 0.8994 0.6324 0.8994 0.9484
No log 0.9851 66 0.9722 0.6142 0.9722 0.9860
No log 1.0149 68 1.0479 0.5512 1.0479 1.0237
No log 1.0448 70 1.0999 0.5037 1.0999 1.0488
No log 1.0746 72 1.0616 0.5113 1.0616 1.0304
No log 1.1045 74 1.0027 0.5839 1.0027 1.0014
No log 1.1343 76 0.9874 0.6014 0.9874 0.9937
No log 1.1642 78 1.0178 0.6486 1.0178 1.0089
No log 1.1940 80 0.9829 0.7034 0.9829 0.9914
No log 1.2239 82 0.8566 0.6906 0.8566 0.9256
No log 1.2537 84 0.9080 0.6324 0.9080 0.9529
No log 1.2836 86 1.0107 0.6286 1.0107 1.0053
No log 1.3134 88 0.9816 0.6165 0.9816 0.9907
No log 1.3433 90 0.9207 0.6412 0.9207 0.9595
No log 1.3731 92 1.0391 0.6107 1.0391 1.0194
No log 1.4030 94 1.0859 0.6087 1.0859 1.0421
No log 1.4328 96 1.1279 0.5775 1.1279 1.0620
No log 1.4627 98 1.0552 0.6027 1.0552 1.0272
No log 1.4925 100 1.0665 0.5874 1.0665 1.0327
No log 1.5224 102 1.0193 0.6479 1.0193 1.0096
No log 1.5522 104 1.0236 0.6216 1.0236 1.0117
No log 1.5821 106 1.0121 0.6164 1.0121 1.0060
No log 1.6119 108 0.9635 0.6443 0.9635 0.9816
No log 1.6418 110 0.8874 0.6883 0.8874 0.9420
No log 1.6716 112 0.8407 0.7125 0.8407 0.9169
No log 1.7015 114 0.7822 0.7329 0.7822 0.8844
No log 1.7313 116 0.7411 0.7516 0.7411 0.8608
No log 1.7612 118 0.8219 0.6853 0.8219 0.9066
No log 1.7910 120 0.9243 0.5755 0.9243 0.9614
No log 1.8209 122 0.9469 0.6119 0.9469 0.9731
No log 1.8507 124 0.8437 0.6429 0.8437 0.9185
No log 1.8806 126 0.7713 0.6980 0.7713 0.8783
No log 1.9104 128 0.7624 0.7051 0.7624 0.8731
No log 1.9403 130 0.8307 0.7006 0.8307 0.9114
No log 1.9701 132 0.8725 0.7294 0.8725 0.9341
No log 2.0 134 0.8432 0.6971 0.8432 0.9182
No log 2.0299 136 0.9823 0.6941 0.9823 0.9911
No log 2.0597 138 1.1380 0.6667 1.1380 1.0668
No log 2.0896 140 1.0319 0.6788 1.0319 1.0158
No log 2.1194 142 0.9708 0.6620 0.9708 0.9853
No log 2.1493 144 0.8914 0.6232 0.8914 0.9442
No log 2.1791 146 0.8622 0.6316 0.8622 0.9285
No log 2.2090 148 0.8625 0.6569 0.8625 0.9287
No log 2.2388 150 0.8581 0.6345 0.8581 0.9263
No log 2.2687 152 0.8900 0.6621 0.8900 0.9434
No log 2.2985 154 0.9169 0.6577 0.9169 0.9576
No log 2.3284 156 0.8987 0.6621 0.8987 0.9480
No log 2.3582 158 0.8252 0.6522 0.8252 0.9084
No log 2.3881 160 0.8714 0.6980 0.8714 0.9335
No log 2.4179 162 0.9280 0.7013 0.9280 0.9633
No log 2.4478 164 0.8683 0.7285 0.8683 0.9318
No log 2.4776 166 0.8855 0.6667 0.8855 0.9410
No log 2.5075 168 0.9761 0.6623 0.9761 0.9880
No log 2.5373 170 0.9321 0.6623 0.9321 0.9654
No log 2.5672 172 0.7965 0.6755 0.7965 0.8924
No log 2.5970 174 0.8126 0.7468 0.8126 0.9014
No log 2.6269 176 0.8144 0.7389 0.8144 0.9024
No log 2.6567 178 0.7397 0.7075 0.7397 0.8600
No log 2.6866 180 0.7606 0.7347 0.7606 0.8721
No log 2.7164 182 0.8908 0.6933 0.8908 0.9438
No log 2.7463 184 0.8721 0.7114 0.8721 0.9339
No log 2.7761 186 0.8073 0.7183 0.8073 0.8985
No log 2.8060 188 0.7805 0.7310 0.7805 0.8835
No log 2.8358 190 0.8064 0.7590 0.8064 0.8980
No log 2.8657 192 0.8708 0.7191 0.8708 0.9332
No log 2.8955 194 0.8454 0.7444 0.8454 0.9194
No log 2.9254 196 0.8459 0.7444 0.8459 0.9197
No log 2.9552 198 0.8870 0.7273 0.8870 0.9418
No log 2.9851 200 0.7262 0.7222 0.7262 0.8522
No log 3.0149 202 0.6719 0.7042 0.6719 0.8197
No log 3.0448 204 0.6872 0.6957 0.6872 0.8290
No log 3.0746 206 0.6878 0.7324 0.6878 0.8293
No log 3.1045 208 0.7630 0.7 0.7630 0.8735
No log 3.1343 210 0.9272 0.6933 0.9272 0.9629
No log 3.1642 212 0.9683 0.7013 0.9683 0.9840
No log 3.1940 214 0.7845 0.7143 0.7845 0.8857
No log 3.2239 216 0.7000 0.7273 0.7000 0.8366
No log 3.2537 218 0.6961 0.6759 0.6961 0.8343
No log 3.2836 220 0.6849 0.7152 0.6849 0.8276
No log 3.3134 222 0.6756 0.76 0.6756 0.8219
No log 3.3433 224 0.7654 0.7333 0.7654 0.8749
No log 3.3731 226 0.8079 0.7143 0.8079 0.8989
No log 3.4030 228 0.8007 0.7027 0.8007 0.8948
No log 3.4328 230 0.7293 0.7133 0.7293 0.8540
No log 3.4627 232 0.7137 0.6849 0.7137 0.8448
No log 3.4925 234 0.6998 0.7448 0.6998 0.8365
No log 3.5224 236 0.7393 0.72 0.7393 0.8598
No log 3.5522 238 0.8904 0.7337 0.8904 0.9436
No log 3.5821 240 0.9118 0.7326 0.9118 0.9549
No log 3.6119 242 0.7450 0.7261 0.7450 0.8631
No log 3.6418 244 0.6538 0.7285 0.6538 0.8086
No log 3.6716 246 0.6726 0.8025 0.6726 0.8201
No log 3.7015 248 0.6817 0.7651 0.6817 0.8256
No log 3.7313 250 0.7176 0.7324 0.7176 0.8471
No log 3.7612 252 0.7234 0.7273 0.7234 0.8505
No log 3.7910 254 0.6836 0.7324 0.6836 0.8268
No log 3.8209 256 0.6706 0.7101 0.6706 0.8189
No log 3.8507 258 0.7216 0.7143 0.7216 0.8495
No log 3.8806 260 0.6764 0.7361 0.6764 0.8224
No log 3.9104 262 0.6619 0.7261 0.6619 0.8135
No log 3.9403 264 0.6271 0.75 0.6271 0.7919
No log 3.9701 266 0.5752 0.8121 0.5752 0.7584
No log 4.0 268 0.5776 0.8171 0.5776 0.7600
No log 4.0299 270 0.5929 0.7771 0.5929 0.7700
No log 4.0597 272 0.7293 0.7456 0.7293 0.8540
No log 4.0896 274 0.7770 0.7349 0.7770 0.8815
No log 4.1194 276 0.6471 0.7361 0.6471 0.8044
No log 4.1493 278 0.6642 0.7692 0.6642 0.8150
No log 4.1791 280 0.6624 0.775 0.6624 0.8139
No log 4.2090 282 0.6455 0.7484 0.6455 0.8034
No log 4.2388 284 0.7262 0.7273 0.7262 0.8522
No log 4.2687 286 0.8020 0.7657 0.8020 0.8955
No log 4.2985 288 0.7632 0.725 0.7632 0.8736
No log 4.3284 290 0.7108 0.7050 0.7108 0.8431
No log 4.3582 292 0.7042 0.7000 0.7042 0.8392
No log 4.3881 294 0.7336 0.7172 0.7336 0.8565
No log 4.4179 296 0.6801 0.7260 0.6801 0.8247
No log 4.4478 298 0.7143 0.7020 0.7143 0.8452
No log 4.4776 300 0.8235 0.7305 0.8235 0.9075
No log 4.5075 302 0.8410 0.7024 0.8410 0.9171
No log 4.5373 304 0.6792 0.7222 0.6792 0.8241
No log 4.5672 306 0.6212 0.7172 0.6212 0.7882
No log 4.5970 308 0.6272 0.7172 0.6272 0.7920
No log 4.6269 310 0.6523 0.75 0.6523 0.8077
No log 4.6567 312 0.6612 0.7143 0.6612 0.8132
No log 4.6866 314 0.6279 0.7333 0.6279 0.7924
No log 4.7164 316 0.6154 0.7651 0.6154 0.7844
No log 4.7463 318 0.6173 0.7050 0.6173 0.7857
No log 4.7761 320 0.6866 0.7234 0.6866 0.8286
No log 4.8060 322 0.7240 0.6944 0.7240 0.8509
No log 4.8358 324 0.7257 0.7027 0.7257 0.8519
No log 4.8657 326 0.6657 0.7183 0.6657 0.8159
No log 4.8955 328 0.6882 0.6667 0.6882 0.8296
No log 4.9254 330 0.7069 0.6715 0.7069 0.8408
No log 4.9552 332 0.7498 0.6861 0.7498 0.8659
No log 4.9851 334 0.8058 0.6765 0.8058 0.8977
No log 5.0149 336 0.7562 0.6861 0.7562 0.8696
No log 5.0448 338 0.7126 0.7101 0.7126 0.8442
No log 5.0746 340 0.6470 0.7338 0.6470 0.8044
No log 5.1045 342 0.5940 0.7639 0.5940 0.7707
No log 5.1343 344 0.5835 0.8052 0.5835 0.7639
No log 5.1642 346 0.6046 0.7867 0.6046 0.7775
No log 5.1940 348 0.6763 0.7152 0.6763 0.8224
No log 5.2239 350 0.7312 0.7273 0.7312 0.8551
No log 5.2537 352 0.7201 0.6957 0.7201 0.8486
No log 5.2836 354 0.7294 0.6423 0.7294 0.8540
No log 5.3134 356 0.7045 0.6423 0.7045 0.8394
No log 5.3433 358 0.6672 0.6950 0.6672 0.8168
No log 5.3731 360 0.7641 0.7394 0.7641 0.8741
No log 5.4030 362 0.8114 0.7485 0.8114 0.9008
No log 5.4328 364 0.6769 0.7226 0.6769 0.8227
No log 5.4627 366 0.6301 0.7297 0.6301 0.7938
No log 5.4925 368 0.5890 0.7273 0.5890 0.7675
No log 5.5224 370 0.5806 0.7619 0.5806 0.7620
No log 5.5522 372 0.5850 0.7368 0.5850 0.7649
No log 5.5821 374 0.6451 0.7368 0.6451 0.8032
No log 5.6119 376 0.6941 0.7246 0.6941 0.8331
No log 5.6418 378 0.7211 0.6912 0.7211 0.8492
No log 5.6716 380 0.7038 0.6716 0.7038 0.8389
No log 5.7015 382 0.6677 0.7324 0.6677 0.8171
No log 5.7313 384 0.6588 0.7417 0.6588 0.8117
No log 5.7612 386 0.6868 0.7067 0.6868 0.8288
No log 5.7910 388 0.7058 0.7089 0.7058 0.8401
No log 5.8209 390 0.7104 0.6667 0.7104 0.8429
No log 5.8507 392 0.7292 0.6716 0.7292 0.8539
No log 5.8806 394 0.7703 0.6716 0.7703 0.8776
No log 5.9104 396 0.7651 0.6917 0.7651 0.8747
No log 5.9403 398 0.7331 0.6565 0.7331 0.8562
No log 5.9701 400 0.6728 0.7234 0.6728 0.8203
No log 6.0 402 0.6132 0.8 0.6132 0.7831
No log 6.0299 404 0.5748 0.8166 0.5748 0.7581
No log 6.0597 406 0.5824 0.8268 0.5824 0.7632
No log 6.0896 408 0.5892 0.7758 0.5892 0.7676
No log 6.1194 410 0.6181 0.725 0.6181 0.7862
No log 6.1493 412 0.6783 0.6857 0.6783 0.8236
No log 6.1791 414 0.6971 0.6765 0.6971 0.8349
No log 6.2090 416 0.7002 0.6718 0.7002 0.8368
No log 6.2388 418 0.6979 0.6308 0.6979 0.8354
No log 6.2687 420 0.6931 0.6917 0.6931 0.8325
No log 6.2985 422 0.6692 0.6667 0.6692 0.8180
No log 6.3284 424 0.6244 0.6765 0.6244 0.7902
No log 6.3582 426 0.6261 0.6939 0.6261 0.7912
No log 6.3881 428 0.6095 0.7578 0.6095 0.7807
No log 6.4179 430 0.6247 0.7516 0.6247 0.7904
No log 6.4478 432 0.6200 0.7007 0.6200 0.7874
No log 6.4776 434 0.6153 0.7391 0.6153 0.7844
No log 6.5075 436 0.6413 0.7313 0.6413 0.8008
No log 6.5373 438 0.6668 0.6866 0.6668 0.8166
No log 6.5672 440 0.6726 0.6866 0.6726 0.8201
No log 6.5970 442 0.6741 0.7007 0.6741 0.8211
No log 6.6269 444 0.6863 0.7133 0.6863 0.8284
No log 6.6567 446 0.7141 0.7516 0.7141 0.8451
No log 6.6866 448 0.6477 0.7613 0.6477 0.8048
No log 6.7164 450 0.5985 0.7792 0.5985 0.7737
No log 6.7463 452 0.6231 0.76 0.6231 0.7893
No log 6.7761 454 0.6179 0.7682 0.6179 0.7860
No log 6.8060 456 0.6522 0.7222 0.6522 0.8076
No log 6.8358 458 0.7465 0.7407 0.7465 0.8640
No log 6.8657 460 0.7213 0.75 0.7213 0.8493
No log 6.8955 462 0.6368 0.7361 0.6368 0.7980
No log 6.9254 464 0.6201 0.7083 0.6201 0.7875
No log 6.9552 466 0.6109 0.7586 0.6109 0.7816
No log 6.9851 468 0.6043 0.7586 0.6043 0.7773
No log 7.0149 470 0.6063 0.7639 0.6063 0.7787
No log 7.0448 472 0.6138 0.6957 0.6138 0.7835
No log 7.0746 474 0.6451 0.7083 0.6451 0.8032
No log 7.1045 476 0.7149 0.7425 0.7149 0.8455
No log 7.1343 478 0.7533 0.7456 0.7533 0.8679
No log 7.1642 480 0.6937 0.7425 0.6937 0.8329
No log 7.1940 482 0.6003 0.7451 0.6003 0.7748
No log 7.2239 484 0.5653 0.7517 0.5653 0.7519
No log 7.2537 486 0.5578 0.7898 0.5578 0.7468
No log 7.2836 488 0.6107 0.7927 0.6107 0.7815
No log 7.3134 490 0.6130 0.775 0.6130 0.7830
No log 7.3433 492 0.6215 0.7432 0.6215 0.7884
No log 7.3731 494 0.6216 0.7632 0.6216 0.7884
No log 7.4030 496 0.5922 0.7432 0.5922 0.7695
No log 7.4328 498 0.5882 0.75 0.5882 0.7669
0.4632 7.4627 500 0.6101 0.7083 0.6101 0.7811
0.4632 7.4925 502 0.6300 0.6957 0.6300 0.7937
0.4632 7.5224 504 0.6499 0.6861 0.6499 0.8062
0.4632 7.5522 506 0.6945 0.7050 0.6945 0.8334
0.4632 7.5821 508 0.6862 0.6815 0.6862 0.8284
0.4632 7.6119 510 0.6630 0.6716 0.6630 0.8143
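
The log ends at epoch 7.61 (step 510), and the evaluation results reported at the top of this card match this final row; the run appears to have stopped well before the configured 100 epochs.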

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
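
A minimal loading sketch for this checkpoint follows, assuming a single-output sequence-classification (regression) head; the score range and any preprocessing used during training are not documented in this card.

```python
# Sketch: load the checkpoint from the Hub and score one essay.
# Assumes a single-output regression head; not documented in this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)  # placeholder essay text
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # continuous organization score
```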