ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k14_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9146
  • Qwk: 0.6056
  • Mse: 0.9146
  • Rmse: 0.9563
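For reference, Rmse here is just the square root of Mse (√0.9146 ≈ 0.9563, matching the values above), and Qwk is Cohen's kappa with quadratic weights, which penalizes predictions by the squared distance between predicted and true scores. A minimal pure-Python sketch of both metrics (the function names and example labels are illustrative, not from the training script):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric reported above)."""
    n = len(y_true)
    # Observed agreement counts
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    # Expected counts under chance agreement, scaled to the same total n
    expected = [[hist_true[i] * hist_pred[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic penalty grows with the squared distance between scores
    weight = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
              for i in range(n_classes)]
    num = sum(weight[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(weight[i][j] * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root, as in the Mse/Rmse columns."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Because Rmse is derived from Mse, the two columns always agree: `math.sqrt(0.9146)` rounds to 0.9563.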

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
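With lr_scheduler_type: linear and no warmup steps reported, the learning rate decays linearly from 2e-05 toward 0 over the scheduled training steps. A minimal sketch of that schedule (the function name and the no-warmup assumption are mine, not taken from the training script):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

For example, halfway through the scheduled steps the learning rate would be 1e-05.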

Training results

The training-loss column reads "No log" before step 500 because the trainer only records training loss every 500 steps, while evaluation runs every 2 steps.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0190 2 6.5517 0.0308 6.5517 2.5596
No log 0.0381 4 4.7869 0.0766 4.7869 2.1879
No log 0.0571 6 2.8272 0.0988 2.8272 1.6814
No log 0.0762 8 2.5314 0.0940 2.5314 1.5910
No log 0.0952 10 2.1412 0.2319 2.1412 1.4633
No log 0.1143 12 1.5841 0.1667 1.5841 1.2586
No log 0.1333 14 1.6739 0.1165 1.6739 1.2938
No log 0.1524 16 1.7918 0.1538 1.7918 1.3386
No log 0.1714 18 1.7448 0.1714 1.7448 1.3209
No log 0.1905 20 1.6869 0.2857 1.6869 1.2988
No log 0.2095 22 1.5199 0.2883 1.5199 1.2328
No log 0.2286 24 1.3379 0.2430 1.3379 1.1567
No log 0.2476 26 1.1810 0.4444 1.1810 1.0867
No log 0.2667 28 1.0280 0.6383 1.0280 1.0139
No log 0.2857 30 1.0030 0.6622 1.0030 1.0015
No log 0.3048 32 0.9486 0.6846 0.9486 0.9740
No log 0.3238 34 0.9023 0.7075 0.9023 0.9499
No log 0.3429 36 0.9863 0.6857 0.9863 0.9931
No log 0.3619 38 1.1702 0.4333 1.1702 1.0818
No log 0.3810 40 1.3630 0.3243 1.3630 1.1675
No log 0.4 42 1.2735 0.3509 1.2735 1.1285
No log 0.4190 44 1.1222 0.4463 1.1222 1.0593
No log 0.4381 46 0.9889 0.5410 0.9889 0.9945
No log 0.4571 48 0.9494 0.6667 0.9494 0.9744
No log 0.4762 50 0.9571 0.6667 0.9571 0.9783
No log 0.4952 52 0.9909 0.5926 0.9909 0.9954
No log 0.5143 54 0.9477 0.6763 0.9477 0.9735
No log 0.5333 56 0.9166 0.6667 0.9166 0.9574
No log 0.5524 58 0.9143 0.6620 0.9143 0.9562
No log 0.5714 60 0.8606 0.7632 0.8606 0.9277
No log 0.5905 62 0.8959 0.6538 0.8959 0.9465
No log 0.6095 64 0.8634 0.7044 0.8634 0.9292
No log 0.6286 66 0.8516 0.7389 0.8516 0.9228
No log 0.6476 68 1.0183 0.5899 1.0183 1.0091
No log 0.6667 70 1.3112 0.5109 1.3112 1.1451
No log 0.6857 72 1.3810 0.5070 1.3810 1.1751
No log 0.7048 74 1.0817 0.5655 1.0817 1.0400
No log 0.7238 76 1.0033 0.5674 1.0033 1.0017
No log 0.7429 78 0.9485 0.6286 0.9485 0.9739
No log 0.7619 80 0.9563 0.6131 0.9563 0.9779
No log 0.7810 82 1.0053 0.6222 1.0053 1.0027
No log 0.8 84 1.0341 0.5625 1.0341 1.0169
No log 0.8190 86 1.0190 0.5970 1.0190 1.0094
No log 0.8381 88 0.9736 0.5985 0.9736 0.9867
No log 0.8571 90 0.9543 0.5957 0.9543 0.9769
No log 0.8762 92 1.0101 0.6331 1.0101 1.0050
No log 0.8952 94 1.0883 0.5735 1.0883 1.0432
No log 0.9143 96 1.1082 0.5571 1.1082 1.0527
No log 0.9333 98 1.0794 0.6309 1.0794 1.0389
No log 0.9524 100 1.0070 0.6443 1.0070 1.0035
No log 0.9714 102 1.0181 0.6267 1.0181 1.0090
No log 0.9905 104 0.9894 0.6788 0.9894 0.9947
No log 1.0095 106 0.9548 0.7215 0.9548 0.9771
No log 1.0286 108 1.0065 0.6790 1.0065 1.0033
No log 1.0476 110 1.0297 0.6452 1.0297 1.0147
No log 1.0667 112 1.0482 0.6622 1.0482 1.0238
No log 1.0857 114 1.0103 0.6914 1.0103 1.0052
No log 1.1048 116 0.8010 0.7456 0.8010 0.8950
No log 1.1238 118 0.7259 0.7394 0.7259 0.8520
No log 1.1429 120 0.7342 0.75 0.7342 0.8568
No log 1.1619 122 0.7965 0.7412 0.7965 0.8925
No log 1.1810 124 0.7931 0.7412 0.7931 0.8906
No log 1.2 126 0.7182 0.7349 0.7182 0.8474
No log 1.2190 128 0.6559 0.7532 0.6559 0.8099
No log 1.2381 130 0.6494 0.8025 0.6494 0.8058
No log 1.2571 132 0.6930 0.7515 0.6930 0.8324
No log 1.2762 134 0.9638 0.6957 0.9638 0.9817
No log 1.2952 136 1.1964 0.6272 1.1964 1.0938
No log 1.3143 138 1.1772 0.6740 1.1772 1.0850
No log 1.3333 140 1.0636 0.6778 1.0636 1.0313
No log 1.3524 142 0.7139 0.7514 0.7139 0.8449
No log 1.3714 144 0.6140 0.8023 0.6140 0.7836
No log 1.3905 146 0.6483 0.7845 0.6483 0.8052
No log 1.4095 148 0.8337 0.75 0.8337 0.9130
No log 1.4286 150 1.1290 0.6798 1.1290 1.0625
No log 1.4476 152 1.0199 0.7059 1.0199 1.0099
No log 1.4667 154 0.7289 0.7545 0.7289 0.8538
No log 1.4857 156 0.5757 0.8077 0.5757 0.7588
No log 1.5048 158 0.5842 0.8 0.5842 0.7643
No log 1.5238 160 0.6127 0.7848 0.6127 0.7827
No log 1.5429 162 0.6933 0.7342 0.6933 0.8327
No log 1.5619 164 0.7437 0.6933 0.7437 0.8624
No log 1.5810 166 0.8044 0.6575 0.8044 0.8969
No log 1.6 168 0.8255 0.6395 0.8255 0.9086
No log 1.6190 170 0.6713 0.7534 0.6713 0.8193
No log 1.6381 172 0.6388 0.7843 0.6388 0.7992
No log 1.6571 174 0.6740 0.7484 0.6740 0.8210
No log 1.6762 176 0.5706 0.8375 0.5706 0.7554
No log 1.6952 178 0.5815 0.7643 0.5815 0.7626
No log 1.7143 180 0.6285 0.7532 0.6285 0.7928
No log 1.7333 182 0.6614 0.7532 0.6614 0.8133
No log 1.7524 184 0.7129 0.7152 0.7129 0.8443
No log 1.7714 186 0.7282 0.7361 0.7282 0.8533
No log 1.7905 188 0.7503 0.7286 0.7503 0.8662
No log 1.8095 190 0.8556 0.7 0.8556 0.9250
No log 1.8286 192 0.8324 0.6906 0.8324 0.9123
No log 1.8476 194 0.7918 0.7050 0.7918 0.8898
No log 1.8667 196 0.6338 0.7639 0.6338 0.7961
No log 1.8857 198 0.5462 0.8289 0.5462 0.7390
No log 1.9048 200 0.4957 0.8387 0.4957 0.7041
No log 1.9238 202 0.4710 0.8395 0.4710 0.6863
No log 1.9429 204 0.4777 0.8571 0.4777 0.6911
No log 1.9619 206 0.4964 0.8478 0.4964 0.7046
No log 1.9810 208 0.6313 0.8085 0.6313 0.7945
No log 2.0 210 0.9157 0.73 0.9157 0.9569
No log 2.0190 212 0.8711 0.7320 0.8711 0.9333
No log 2.0381 214 0.8205 0.7368 0.8205 0.9058
No log 2.0571 216 0.6370 0.7802 0.6370 0.7981
No log 2.0762 218 0.5574 0.8205 0.5574 0.7466
No log 2.0952 220 0.6803 0.7432 0.6803 0.8248
No log 2.1143 222 0.6350 0.7785 0.6350 0.7968
No log 2.1333 224 0.6395 0.7333 0.6395 0.7997
No log 2.1524 226 0.7550 0.7630 0.7550 0.8689
No log 2.1714 228 0.8245 0.7630 0.8245 0.9080
No log 2.1905 230 0.7387 0.7545 0.7387 0.8595
No log 2.2095 232 0.6800 0.7297 0.6800 0.8246
No log 2.2286 234 0.6665 0.7465 0.6665 0.8164
No log 2.2476 236 0.6284 0.7755 0.6284 0.7927
No log 2.2667 238 0.5438 0.7974 0.5438 0.7375
No log 2.2857 240 0.5226 0.8171 0.5226 0.7229
No log 2.3048 242 0.5855 0.7955 0.5855 0.7652
No log 2.3238 244 0.6964 0.7709 0.6964 0.8345
No log 2.3429 246 0.7315 0.7647 0.7315 0.8553
No log 2.3619 248 0.6716 0.7389 0.6716 0.8195
No log 2.3810 250 0.6520 0.7778 0.6520 0.8074
No log 2.4 252 0.7122 0.7606 0.7122 0.8439
No log 2.4190 254 0.6975 0.7101 0.6975 0.8352
No log 2.4381 256 0.7627 0.6667 0.7627 0.8733
No log 2.4571 258 0.9079 0.6790 0.9079 0.9528
No log 2.4762 260 0.9225 0.7086 0.9225 0.9604
No log 2.4952 262 0.7960 0.7381 0.7960 0.8922
No log 2.5143 264 0.6119 0.7722 0.6119 0.7822
No log 2.5333 266 0.6536 0.8158 0.6536 0.8085
No log 2.5524 268 0.6367 0.8079 0.6367 0.7979
No log 2.5714 270 0.6376 0.7432 0.6376 0.7985
No log 2.5905 272 0.8925 0.6667 0.8925 0.9447
No log 2.6095 274 1.0205 0.65 1.0205 1.0102
No log 2.6286 276 0.9694 0.6395 0.9694 0.9846
No log 2.6476 278 0.8762 0.6324 0.8762 0.9360
No log 2.6667 280 0.7778 0.6357 0.7778 0.8819
No log 2.6857 282 0.7090 0.7007 0.7090 0.8420
No log 2.7048 284 0.6955 0.7067 0.6955 0.8340
No log 2.7238 286 0.7898 0.6875 0.7898 0.8887
No log 2.7429 288 0.8869 0.6788 0.8869 0.9418
No log 2.7619 290 0.7946 0.7044 0.7946 0.8914
No log 2.7810 292 0.6416 0.7582 0.6416 0.8010
No log 2.8 294 0.6091 0.7808 0.6091 0.7804
No log 2.8190 296 0.6335 0.8 0.6335 0.7959
No log 2.8381 298 0.6439 0.7552 0.6439 0.8024
No log 2.8571 300 0.8212 0.6667 0.8212 0.9062
No log 2.8762 302 1.0236 0.6303 1.0236 1.0118
No log 2.8952 304 0.9824 0.6173 0.9824 0.9912
No log 2.9143 306 0.7579 0.7162 0.7579 0.8706
No log 2.9333 308 0.7078 0.7429 0.7078 0.8413
No log 2.9524 310 0.7385 0.6957 0.7385 0.8594
No log 2.9714 312 0.7357 0.6815 0.7357 0.8577
No log 2.9905 314 0.8328 0.6522 0.8328 0.9126
No log 3.0095 316 0.8631 0.6525 0.8631 0.9290
No log 3.0286 318 0.8203 0.6667 0.8203 0.9057
No log 3.0476 320 0.7236 0.7183 0.7236 0.8506
No log 3.0667 322 0.6998 0.7234 0.6998 0.8365
No log 3.0857 324 0.6989 0.6716 0.6989 0.8360
No log 3.1048 326 0.7365 0.6815 0.7365 0.8582
No log 3.1238 328 0.7779 0.6316 0.7779 0.8820
No log 3.1429 330 0.8528 0.6232 0.8528 0.9235
No log 3.1619 332 0.9292 0.5915 0.9292 0.9639
No log 3.1810 334 0.8466 0.6164 0.8466 0.9201
No log 3.2 336 0.7558 0.6395 0.7558 0.8694
No log 3.2190 338 0.8008 0.6538 0.8008 0.8949
No log 3.2381 340 0.7753 0.6389 0.7753 0.8805
No log 3.2571 342 0.7108 0.6812 0.7108 0.8431
No log 3.2762 344 0.7193 0.6906 0.7193 0.8481
No log 3.2952 346 0.8056 0.6528 0.8056 0.8976
No log 3.3143 348 0.8307 0.6438 0.8307 0.9114
No log 3.3333 350 0.7422 0.6761 0.7422 0.8615
No log 3.3524 352 0.6941 0.7153 0.6941 0.8331
No log 3.3714 354 0.6970 0.75 0.6970 0.8349
No log 3.3905 356 0.7666 0.6715 0.7666 0.8755
No log 3.4095 358 1.0083 0.64 1.0083 1.0041
No log 3.4286 360 1.1237 0.6234 1.1237 1.0600
No log 3.4476 362 1.0229 0.6197 1.0229 1.0114
No log 3.4667 364 0.9055 0.6269 0.9055 0.9516
No log 3.4857 366 0.7624 0.6970 0.7624 0.8731
No log 3.5048 368 0.7026 0.7391 0.7026 0.8382
No log 3.5238 370 0.7007 0.7050 0.7007 0.8371
No log 3.5429 372 0.6979 0.6950 0.6979 0.8354
No log 3.5619 374 0.6456 0.7871 0.6456 0.8035
No log 3.5810 376 0.6281 0.7651 0.6281 0.7925
No log 3.6 378 0.6547 0.7755 0.6547 0.8091
No log 3.6190 380 0.7007 0.7413 0.7007 0.8371
No log 3.6381 382 0.8162 0.6986 0.8162 0.9034
No log 3.6571 384 0.8806 0.6483 0.8806 0.9384
No log 3.6762 386 0.7733 0.6892 0.7733 0.8794
No log 3.6952 388 0.6396 0.7347 0.6396 0.7997
No log 3.7143 390 0.5970 0.7974 0.5970 0.7727
No log 3.7333 392 0.5712 0.8052 0.5712 0.7558
No log 3.7524 394 0.6488 0.7456 0.6488 0.8055
No log 3.7714 396 0.8190 0.7052 0.8190 0.9050
No log 3.7905 398 0.7703 0.7159 0.7703 0.8777
No log 3.8095 400 0.6076 0.7955 0.6076 0.7795
No log 3.8286 402 0.5383 0.8075 0.5383 0.7337
No log 3.8476 404 0.5664 0.8158 0.5664 0.7526
No log 3.8667 406 0.6323 0.7947 0.6323 0.7952
No log 3.8857 408 0.7062 0.6803 0.7062 0.8404
No log 3.9048 410 0.7800 0.6708 0.7800 0.8832
No log 3.9238 412 0.7652 0.7052 0.7652 0.8748
No log 3.9429 414 0.7030 0.7284 0.7030 0.8385
No log 3.9619 416 0.7322 0.6573 0.7322 0.8557
No log 3.9810 418 0.7349 0.6569 0.7349 0.8573
No log 4.0 420 0.7013 0.7338 0.7013 0.8374
No log 4.0190 422 0.7515 0.7111 0.7515 0.8669
No log 4.0381 424 0.8635 0.6154 0.8635 0.9293
No log 4.0571 426 0.9056 0.6107 0.9056 0.9516
No log 4.0762 428 0.8175 0.6418 0.8175 0.9042
No log 4.0952 430 0.7210 0.7059 0.7210 0.8491
No log 4.1143 432 0.6674 0.7606 0.6674 0.8169
No log 4.1333 434 0.6269 0.7632 0.6269 0.7918
No log 4.1524 436 0.7054 0.7470 0.7054 0.8399
No log 4.1714 438 0.8457 0.7045 0.8457 0.9196
No log 4.1905 440 0.8643 0.6936 0.8643 0.9297
No log 4.2095 442 0.7451 0.7170 0.7451 0.8632
No log 4.2286 444 0.6519 0.7552 0.6519 0.8074
No log 4.2476 446 0.6529 0.7429 0.6529 0.8080
No log 4.2667 448 0.6996 0.6897 0.6996 0.8364
No log 4.2857 450 0.8656 0.6708 0.8656 0.9304
No log 4.3048 452 0.9727 0.6667 0.9727 0.9863
No log 4.3238 454 0.8830 0.6897 0.8830 0.9397
No log 4.3429 456 0.6980 0.7251 0.6980 0.8355
No log 4.3619 458 0.6019 0.7922 0.6019 0.7759
No log 4.3810 460 0.6335 0.7619 0.6335 0.7959
No log 4.4 462 0.7198 0.6620 0.7198 0.8484
No log 4.4190 464 0.8223 0.6575 0.8223 0.9068
No log 4.4381 466 0.8710 0.6901 0.8710 0.9333
No log 4.4571 468 0.7842 0.7066 0.7842 0.8856
No log 4.4762 470 0.6315 0.7632 0.6315 0.7947
No log 4.4952 472 0.5915 0.7413 0.5915 0.7691
No log 4.5143 474 0.5864 0.7536 0.5864 0.7657
No log 4.5333 476 0.5953 0.7536 0.5953 0.7715
No log 4.5524 478 0.5925 0.75 0.5925 0.7698
No log 4.5714 480 0.7016 0.7284 0.7016 0.8376
No log 4.5905 482 0.7668 0.7262 0.7668 0.8757
No log 4.6095 484 0.6970 0.7284 0.6970 0.8349
No log 4.6286 486 0.5920 0.7671 0.5920 0.7694
No log 4.6476 488 0.5912 0.7571 0.5912 0.7689
No log 4.6667 490 0.6231 0.7606 0.6231 0.7894
No log 4.6857 492 0.6429 0.7606 0.6429 0.8018
No log 4.7048 494 0.6700 0.7376 0.6700 0.8185
No log 4.7238 496 0.7438 0.7050 0.7438 0.8625
No log 4.7429 498 0.7963 0.6715 0.7963 0.8924
0.4394 4.7619 500 0.7589 0.7050 0.7589 0.8712
0.4394 4.7810 502 0.6613 0.7391 0.6613 0.8132
0.4394 4.8 504 0.6204 0.7606 0.6204 0.7876
0.4394 4.8190 506 0.6510 0.7517 0.6510 0.8069
0.4394 4.8381 508 0.8332 0.6909 0.8332 0.9128
0.4394 4.8571 510 0.9099 0.7018 0.9099 0.9539
0.4394 4.8762 512 0.8080 0.7073 0.8080 0.8989
0.4394 4.8952 514 0.6414 0.7682 0.6414 0.8009
0.4394 4.9143 516 0.6112 0.7755 0.6112 0.7818
0.4394 4.9333 518 0.6284 0.7945 0.6284 0.7927
0.4394 4.9524 520 0.6639 0.7536 0.6639 0.8148
0.4394 4.9714 522 0.6816 0.6767 0.6816 0.8256
0.4394 4.9905 524 0.6757 0.7083 0.6757 0.8220
0.4394 5.0095 526 0.6646 0.7027 0.6646 0.8152
0.4394 5.0286 528 0.6667 0.7647 0.6667 0.8165
0.4394 5.0476 530 0.6703 0.7665 0.6703 0.8187
0.4394 5.0667 532 0.7150 0.6939 0.7150 0.8456
0.4394 5.0857 534 0.7512 0.6806 0.7512 0.8667
0.4394 5.1048 536 0.7374 0.6866 0.7374 0.8587
0.4394 5.1238 538 0.6841 0.7482 0.6841 0.8271
0.4394 5.1429 540 0.6228 0.7376 0.6228 0.7892
0.4394 5.1619 542 0.5970 0.7571 0.5970 0.7727
0.4394 5.1810 544 0.6386 0.7413 0.6386 0.7991
0.4394 5.2 546 0.7615 0.6528 0.7615 0.8726
0.4394 5.2190 548 0.8097 0.6875 0.8097 0.8999
0.4394 5.2381 550 0.7129 0.6842 0.7129 0.8444
0.4394 5.2571 552 0.6130 0.7429 0.6130 0.7829
0.4394 5.2762 554 0.6435 0.7445 0.6435 0.8022
0.4394 5.2952 556 0.6872 0.7007 0.6872 0.8290
0.4394 5.3143 558 0.7784 0.6714 0.7784 0.8823
0.4394 5.3333 560 0.8203 0.6383 0.8203 0.9057
0.4394 5.3524 562 0.8303 0.6383 0.8303 0.9112
0.4394 5.3714 564 0.9146 0.6056 0.9146 0.9563
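Although 100 epochs were scheduled, the log above stops at epoch 5.3714 (step 564), and its final row matches the headline metrics (loss 0.9146, Qwk 0.6056), so those appear to come from the last step rather than the best checkpoint: validation loss bottoms out around epoch 1.9238 (loss 0.4710, Qwk 0.8395) and drifts upward afterwards. Scanning such a log for the best checkpoint can be sketched as follows (the rows below are a hand-picked subset of the table):

```python
# (epoch, validation_loss, qwk) rows sampled from the training log above
rows = [
    (0.0190, 6.5517, 0.0308),
    (1.9238, 0.4710, 0.8395),
    (2.2857, 0.5226, 0.8171),
    (5.3714, 0.9146, 0.6056),
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_qwk = max(rows, key=lambda r: r[2])   # highest agreement with raters
print(best_by_loss)  # the epoch-1.9238 checkpoint wins on both criteria
```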

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 135M params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k14_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4222 fine-tunes of that base model).