ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8517
  • QWK (quadratic weighted kappa): 0.3700
  • MSE: 0.8517
  • RMSE: 0.9229
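The reported QWK is the quadratic weighted kappa commonly used for ordinal essay-scoring tasks, and RMSE is simply the square root of MSE (hence 0.9229 ≈ √0.8517 above). A minimal pure-Python sketch of the metrics; the score lists and class count below are hypothetical illustrations, not values from this run:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for ordinal labels in [0, n_classes)."""
    # Observed agreement matrix: observed[i][j] counts gold=i, predicted=j.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            num += w * observed[i][j]                 # observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n  # chance disagreement
    return 1.0 - num / den

# Hypothetical gold scores and model predictions:
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 0]

qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)           # 0.6875
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)  # 0.4
rmse = math.sqrt(mse)
```

Perfect agreement gives QWK = 1, chance-level agreement gives 0, and disagreements are penalized by the squared distance between classes.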

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
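These settings map directly onto the `transformers.TrainingArguments` API; a sketch collecting them as a plain dict that could be passed as `TrainingArguments(output_dir=..., **hparams)`. The argument names follow the TrainingArguments API and are the only assumption here:

```python
# Hyperparameters listed above, keyed by their TrainingArguments names.
hparams = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,    # Adam betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```

Note that although 100 epochs were configured, the results table below stops at epoch ~11.59 (step 510), where the reported evaluation loss of 0.8517 was reached.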

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0455 2 4.3835 -0.0156 4.3835 2.0937
No log 0.0909 4 2.5692 0.0369 2.5692 1.6029
No log 0.1364 6 1.3236 0.0457 1.3236 1.1505
No log 0.1818 8 0.9170 0.0610 0.9170 0.9576
No log 0.2273 10 0.7958 0.2222 0.7958 0.8921
No log 0.2727 12 1.1018 0.0508 1.1018 1.0496
No log 0.3182 14 2.1526 0.0463 2.1526 1.4672
No log 0.3636 16 2.2904 0.0526 2.2904 1.5134
No log 0.4091 18 1.6850 0.0440 1.6850 1.2981
No log 0.4545 20 1.0355 0.0 1.0355 1.0176
No log 0.5 22 0.7964 0.2923 0.7964 0.8924
No log 0.5455 24 0.8113 0.0869 0.8113 0.9007
No log 0.5909 26 0.8513 0.0509 0.8513 0.9227
No log 0.6364 28 0.8455 0.0364 0.8455 0.9195
No log 0.6818 30 0.8208 0.0655 0.8208 0.9060
No log 0.7273 32 0.8253 0.0868 0.8253 0.9085
No log 0.7727 34 0.8301 0.1624 0.8301 0.9111
No log 0.8182 36 0.9030 0.1252 0.9030 0.9503
No log 0.8636 38 0.9515 0.0462 0.9515 0.9755
No log 0.9091 40 0.9483 0.0465 0.9483 0.9738
No log 0.9545 42 0.8961 0.1412 0.8961 0.9466
No log 1.0 44 0.8553 0.2853 0.8553 0.9248
No log 1.0455 46 0.8452 0.2289 0.8452 0.9194
No log 1.0909 48 0.8243 0.2982 0.8243 0.9079
No log 1.1364 50 0.7719 0.3072 0.7719 0.8786
No log 1.1818 52 0.7536 0.1819 0.7536 0.8681
No log 1.2273 54 0.7386 0.1758 0.7386 0.8594
No log 1.2727 56 0.7311 0.2023 0.7311 0.8550
No log 1.3182 58 0.7180 0.2946 0.7180 0.8474
No log 1.3636 60 0.7201 0.3947 0.7201 0.8486
No log 1.4091 62 0.7252 0.3844 0.7252 0.8516
No log 1.4545 64 0.6957 0.3370 0.6957 0.8341
No log 1.5 66 0.7031 0.2126 0.7031 0.8385
No log 1.5455 68 0.6938 0.2711 0.6938 0.8329
No log 1.5909 70 0.6889 0.2961 0.6889 0.8300
No log 1.6364 72 0.6904 0.4060 0.6904 0.8309
No log 1.6818 74 0.7638 0.2998 0.7638 0.8740
No log 1.7273 76 0.7415 0.3380 0.7415 0.8611
No log 1.7727 78 0.6965 0.3772 0.6965 0.8346
No log 1.8182 80 0.7657 0.3064 0.7657 0.8751
No log 1.8636 82 0.9503 0.1922 0.9503 0.9748
No log 1.9091 84 0.8851 0.1976 0.8851 0.9408
No log 1.9545 86 0.7015 0.3915 0.7015 0.8376
No log 2.0 88 0.8719 0.2690 0.8719 0.9338
No log 2.0455 90 1.1031 0.2649 1.1031 1.0503
No log 2.0909 92 0.9964 0.2784 0.9964 0.9982
No log 2.1364 94 0.7387 0.2616 0.7387 0.8595
No log 2.1818 96 0.7395 0.3559 0.7395 0.8599
No log 2.2273 98 0.7474 0.3141 0.7474 0.8645
No log 2.2727 100 0.7909 0.2672 0.7909 0.8893
No log 2.3182 102 0.7805 0.2707 0.7805 0.8834
No log 2.3636 104 0.7224 0.3436 0.7224 0.8499
No log 2.4091 106 0.7079 0.3281 0.7079 0.8414
No log 2.4545 108 0.7130 0.3281 0.7130 0.8444
No log 2.5 110 0.7484 0.3951 0.7484 0.8651
No log 2.5455 112 0.7736 0.2529 0.7736 0.8795
No log 2.5909 114 0.7673 0.2899 0.7673 0.8760
No log 2.6364 116 0.7610 0.2790 0.7610 0.8723
No log 2.6818 118 0.8298 0.3135 0.8298 0.9109
No log 2.7273 120 0.8733 0.3376 0.8733 0.9345
No log 2.7727 122 0.8024 0.3230 0.8024 0.8958
No log 2.8182 124 0.7338 0.2935 0.7338 0.8566
No log 2.8636 126 0.7680 0.3894 0.7680 0.8764
No log 2.9091 128 0.7510 0.3580 0.7510 0.8666
No log 2.9545 130 0.7125 0.3323 0.7125 0.8441
No log 3.0 132 0.7220 0.3240 0.7220 0.8497
No log 3.0455 134 0.7936 0.3476 0.7936 0.8908
No log 3.0909 136 0.8055 0.3449 0.8055 0.8975
No log 3.1364 138 0.7650 0.3628 0.7650 0.8746
No log 3.1818 140 0.8470 0.3706 0.8470 0.9203
No log 3.2273 142 0.7279 0.3904 0.7279 0.8532
No log 3.2727 144 0.8381 0.2649 0.8381 0.9155
No log 3.3182 146 0.9320 0.2336 0.9320 0.9654
No log 3.3636 148 0.8159 0.2419 0.8159 0.9032
No log 3.4091 150 0.7174 0.2942 0.7174 0.8470
No log 3.4545 152 0.7099 0.3557 0.7099 0.8426
No log 3.5 154 0.7085 0.4067 0.7085 0.8417
No log 3.5455 156 0.7057 0.4779 0.7057 0.8401
No log 3.5909 158 0.7109 0.3523 0.7109 0.8432
No log 3.6364 160 0.6892 0.3552 0.6892 0.8302
No log 3.6818 162 0.6827 0.3405 0.6827 0.8263
No log 3.7273 164 0.6949 0.3943 0.6949 0.8336
No log 3.7727 166 0.7034 0.3452 0.7034 0.8387
No log 3.8182 168 0.7843 0.3335 0.7843 0.8856
No log 3.8636 170 0.8337 0.2976 0.8337 0.9130
No log 3.9091 172 0.7465 0.3271 0.7465 0.8640
No log 3.9545 174 0.6937 0.3541 0.6937 0.8329
No log 4.0 176 0.7146 0.4061 0.7146 0.8453
No log 4.0455 178 0.6867 0.3847 0.6867 0.8287
No log 4.0909 180 0.6806 0.3345 0.6806 0.8250
No log 4.1364 182 0.7246 0.2632 0.7246 0.8512
No log 4.1818 184 0.6587 0.3337 0.6587 0.8116
No log 4.2273 186 0.6705 0.4598 0.6705 0.8189
No log 4.2727 188 0.6757 0.4448 0.6757 0.8220
No log 4.3182 190 0.6561 0.3412 0.6561 0.8100
No log 4.3636 192 0.7368 0.2695 0.7368 0.8584
No log 4.4091 194 0.7121 0.2934 0.7121 0.8439
No log 4.4545 196 0.6736 0.3649 0.6736 0.8207
No log 4.5 198 0.6871 0.3913 0.6871 0.8289
No log 4.5455 200 0.6877 0.3600 0.6877 0.8293
No log 4.5909 202 0.7147 0.3319 0.7147 0.8454
No log 4.6364 204 0.7462 0.3082 0.7462 0.8638
No log 4.6818 206 0.8064 0.3068 0.8064 0.8980
No log 4.7273 208 0.7396 0.3504 0.7396 0.8600
No log 4.7727 210 0.7323 0.3544 0.7323 0.8557
No log 4.8182 212 0.7695 0.3322 0.7695 0.8772
No log 4.8636 214 0.7286 0.3292 0.7286 0.8536
No log 4.9091 216 0.7981 0.3566 0.7981 0.8934
No log 4.9545 218 0.8278 0.3452 0.8278 0.9098
No log 5.0 220 0.7591 0.3074 0.7591 0.8712
No log 5.0455 222 0.7409 0.3260 0.7409 0.8608
No log 5.0909 224 0.7950 0.3139 0.7950 0.8916
No log 5.1364 226 0.7350 0.3406 0.7350 0.8573
No log 5.1818 228 0.7210 0.3722 0.7210 0.8491
No log 5.2273 230 0.7125 0.3289 0.7125 0.8441
No log 5.2727 232 0.6983 0.3868 0.6983 0.8357
No log 5.3182 234 0.6966 0.3792 0.6966 0.8346
No log 5.3636 236 0.6998 0.4089 0.6998 0.8366
No log 5.4091 238 0.7000 0.4023 0.7000 0.8367
No log 5.4545 240 0.6770 0.4335 0.6770 0.8228
No log 5.5 242 0.6754 0.3896 0.6754 0.8218
No log 5.5455 244 0.6815 0.4120 0.6815 0.8255
No log 5.5909 246 0.6875 0.2827 0.6875 0.8292
No log 5.6364 248 0.6987 0.2778 0.6987 0.8359
No log 5.6818 250 0.6901 0.3375 0.6901 0.8307
No log 5.7273 252 0.6991 0.2974 0.6991 0.8361
No log 5.7727 254 0.7011 0.3380 0.7011 0.8373
No log 5.8182 256 0.7015 0.4268 0.7015 0.8376
No log 5.8636 258 0.7120 0.3392 0.7120 0.8438
No log 5.9091 260 0.7275 0.3646 0.7275 0.8529
No log 5.9545 262 0.7523 0.3879 0.7523 0.8673
No log 6.0 264 0.7364 0.3861 0.7364 0.8581
No log 6.0455 266 0.7302 0.3696 0.7302 0.8545
No log 6.0909 268 0.7154 0.3671 0.7154 0.8458
No log 6.1364 270 0.7157 0.3701 0.7157 0.8460
No log 6.1818 272 0.7283 0.3767 0.7283 0.8534
No log 6.2273 274 0.7825 0.4457 0.7825 0.8846
No log 6.2727 276 0.7648 0.4118 0.7648 0.8745
No log 6.3182 278 0.7310 0.3797 0.7310 0.8550
No log 6.3636 280 0.7022 0.3971 0.7022 0.8380
No log 6.4091 282 0.7002 0.3545 0.7002 0.8368
No log 6.4545 284 0.7635 0.3938 0.7635 0.8738
No log 6.5 286 0.8213 0.4045 0.8213 0.9063
No log 6.5455 288 0.7945 0.4218 0.7945 0.8913
No log 6.5909 290 0.7414 0.3939 0.7414 0.8611
No log 6.6364 292 0.7133 0.3815 0.7133 0.8446
No log 6.6818 294 0.6963 0.3569 0.6963 0.8344
No log 6.7273 296 0.6826 0.3264 0.6826 0.8262
No log 6.7727 298 0.6915 0.4030 0.6915 0.8316
No log 6.8182 300 0.7166 0.4476 0.7166 0.8465
No log 6.8636 302 0.7637 0.4317 0.7637 0.8739
No log 6.9091 304 0.7796 0.4707 0.7796 0.8830
No log 6.9545 306 0.7562 0.3973 0.7562 0.8696
No log 7.0 308 0.7586 0.4221 0.7586 0.8710
No log 7.0455 310 0.7570 0.4307 0.7570 0.8701
No log 7.0909 312 0.7469 0.4111 0.7469 0.8642
No log 7.1364 314 0.7531 0.3657 0.7531 0.8678
No log 7.1818 316 0.7597 0.3603 0.7597 0.8716
No log 7.2273 318 0.7809 0.3820 0.7809 0.8837
No log 7.2727 320 0.7594 0.3357 0.7594 0.8715
No log 7.3182 322 0.7278 0.3176 0.7278 0.8531
No log 7.3636 324 0.7320 0.2511 0.7320 0.8556
No log 7.4091 326 0.7954 0.2207 0.7954 0.8918
No log 7.4545 328 0.7743 0.2371 0.7743 0.8799
No log 7.5 330 0.7040 0.3522 0.7040 0.8391
No log 7.5455 332 0.6999 0.3905 0.6999 0.8366
No log 7.5909 334 0.7024 0.3717 0.7024 0.8381
No log 7.6364 336 0.7408 0.4330 0.7408 0.8607
No log 7.6818 338 0.8220 0.4182 0.8220 0.9066
No log 7.7273 340 0.7948 0.4077 0.7948 0.8915
No log 7.7727 342 0.7225 0.4500 0.7225 0.8500
No log 7.8182 344 0.6824 0.3929 0.6824 0.8261
No log 7.8636 346 0.6878 0.4246 0.6878 0.8293
No log 7.9091 348 0.6761 0.4724 0.6761 0.8223
No log 7.9545 350 0.6688 0.4118 0.6688 0.8178
No log 8.0 352 0.6822 0.4622 0.6822 0.8259
No log 8.0455 354 0.6910 0.4370 0.6910 0.8312
No log 8.0909 356 0.6612 0.3740 0.6612 0.8132
No log 8.1364 358 0.6670 0.3877 0.6670 0.8167
No log 8.1818 360 0.7156 0.3529 0.7156 0.8459
No log 8.2273 362 0.8350 0.4182 0.8350 0.9138
No log 8.2727 364 0.8123 0.3898 0.8123 0.9013
No log 8.3182 366 0.7217 0.3114 0.7217 0.8495
No log 8.3636 368 0.7003 0.3992 0.7003 0.8368
No log 8.4091 370 0.7184 0.3477 0.7184 0.8476
No log 8.4545 372 0.6825 0.3742 0.6825 0.8261
No log 8.5 374 0.6933 0.2780 0.6933 0.8326
No log 8.5455 376 0.7395 0.3679 0.7395 0.8600
No log 8.5909 378 0.7328 0.3846 0.7328 0.8561
No log 8.6364 380 0.7175 0.3570 0.7175 0.8471
No log 8.6818 382 0.6899 0.3409 0.6899 0.8306
No log 8.7273 384 0.6803 0.3059 0.6803 0.8248
No log 8.7727 386 0.6862 0.3498 0.6862 0.8284
No log 8.8182 388 0.6849 0.3498 0.6849 0.8276
No log 8.8636 390 0.7117 0.3748 0.7117 0.8436
No log 8.9091 392 0.6843 0.3368 0.6843 0.8273
No log 8.9545 394 0.6891 0.3174 0.6891 0.8301
No log 9.0 396 0.7179 0.3997 0.7179 0.8473
No log 9.0455 398 0.7308 0.3997 0.7308 0.8549
No log 9.0909 400 0.7143 0.3821 0.7143 0.8451
No log 9.1364 402 0.7036 0.3352 0.7036 0.8388
No log 9.1818 404 0.7310 0.3864 0.7310 0.8550
No log 9.2273 406 0.8361 0.3788 0.8361 0.9144
No log 9.2727 408 0.8723 0.3845 0.8723 0.9340
No log 9.3182 410 0.7724 0.3388 0.7724 0.8789
No log 9.3636 412 0.7210 0.3523 0.7210 0.8491
No log 9.4091 414 0.6879 0.3281 0.6879 0.8294
No log 9.4545 416 0.6782 0.4096 0.6782 0.8235
No log 9.5 418 0.6974 0.3604 0.6974 0.8351
No log 9.5455 420 0.8132 0.3807 0.8132 0.9018
No log 9.5909 422 0.8383 0.3330 0.8383 0.9156
No log 9.6364 424 0.7456 0.4215 0.7456 0.8635
No log 9.6818 426 0.7191 0.3564 0.7191 0.8480
No log 9.7273 428 0.7095 0.3770 0.7095 0.8423
No log 9.7727 430 0.7538 0.3670 0.7538 0.8682
No log 9.8182 432 0.8142 0.3126 0.8142 0.9023
No log 9.8636 434 0.7640 0.2559 0.7640 0.8741
No log 9.9091 436 0.7081 0.3146 0.7081 0.8415
No log 9.9545 438 0.6685 0.4124 0.6685 0.8176
No log 10.0 440 0.6707 0.3571 0.6707 0.8190
No log 10.0455 442 0.7040 0.4552 0.7040 0.8390
No log 10.0909 444 0.7192 0.4295 0.7192 0.8480
No log 10.1364 446 0.6962 0.4079 0.6962 0.8344
No log 10.1818 448 0.6789 0.4084 0.6789 0.8240
No log 10.2273 450 0.6654 0.3532 0.6654 0.8157
No log 10.2727 452 0.6774 0.3778 0.6774 0.8230
No log 10.3182 454 0.6881 0.3942 0.6881 0.8295
No log 10.3636 456 0.7031 0.4310 0.7031 0.8385
No log 10.4091 458 0.7679 0.4051 0.7679 0.8763
No log 10.4545 460 0.7571 0.3906 0.7571 0.8701
No log 10.5 462 0.7001 0.4351 0.7001 0.8367
No log 10.5455 464 0.6856 0.4330 0.6856 0.8280
No log 10.5909 466 0.6693 0.3687 0.6693 0.8181
No log 10.6364 468 0.6557 0.3803 0.6557 0.8098
No log 10.6818 470 0.6764 0.3987 0.6764 0.8225
No log 10.7273 472 0.6885 0.4135 0.6885 0.8298
No log 10.7727 474 0.7302 0.4612 0.7302 0.8545
No log 10.8182 476 0.8298 0.4046 0.8298 0.9109
No log 10.8636 478 0.8811 0.4004 0.8811 0.9387
No log 10.9091 480 0.8006 0.3626 0.8006 0.8948
No log 10.9545 482 0.7373 0.3745 0.7373 0.8586
No log 11.0 484 0.7083 0.4203 0.7083 0.8416
No log 11.0455 486 0.6984 0.4075 0.6984 0.8357
No log 11.0909 488 0.7156 0.3082 0.7156 0.8460
No log 11.1364 490 0.7878 0.2922 0.7878 0.8876
No log 11.1818 492 0.8698 0.3776 0.8698 0.9326
No log 11.2273 494 0.8400 0.3865 0.8400 0.9165
No log 11.2727 496 0.7362 0.3217 0.7362 0.8580
No log 11.3182 498 0.6884 0.4181 0.6884 0.8297
0.3793 11.3636 500 0.7310 0.4003 0.7310 0.8550
0.3793 11.4091 502 0.7068 0.3965 0.7068 0.8407
0.3793 11.4545 504 0.6686 0.3840 0.6686 0.8177
0.3793 11.5 506 0.7124 0.3844 0.7124 0.8441
0.3793 11.5455 508 0.8183 0.3556 0.8183 0.9046
0.3793 11.5909 510 0.8517 0.3700 0.8517 0.9229

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 135M parameters (F32, Safetensors)