ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.5664
  • QWK: 0.4173
  • MSE: 0.5664
  • RMSE: 0.7526
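The Loss and MSE columns coincide, which suggests a mean-squared-error training objective, and RMSE is simply its square root; QWK (quadratic weighted kappa) measures agreement between predicted and gold scores. Below is a minimal sketch of how these metrics could be computed with scikit-learn; it assumes the gold organization scores are integers and that the model's continuous outputs are rounded before computing QWK, neither of which is documented in this card.

```python
# Minimal sketch (not the training script): computing QWK, MSE, and RMSE
# from regression predictions and gold scores with scikit-learn.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    """preds: raw regression outputs; labels: gold organization scores (assumed integer-valued)."""
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # QWK compares discrete ratings, so round the continuous outputs first (assumption).
    qwk = cohen_kappa_score(
        labels.round().astype(int),
        preds.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Dummy example:
print(eval_metrics(np.array([2.1, 3.4, 1.0]), np.array([2.0, 3.0, 1.0])))
```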

Model description

More information needed. From the repository name and the metrics above, this appears to be bert-base-arabertv02 fine-tuned to score the organization dimension (task 2) of Arabic essays, evaluated with quadratic weighted kappa (QWK) alongside MSE/RMSE.

Intended uses & limitations

More information needed
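No usage guidance is provided, so the following is only a minimal inference sketch under stated assumptions: the checkpoint exposes a single-output regression head (consistent with the MSE-based evaluation above), and the repository id matches this card's title.

```python
# Minimal usage sketch (assumptions: single-output regression head; repo id
# taken from the model card title).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "نص المقال العربي هنا"  # an Arabic essay to score
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1) if the head is single-output regression (assumed)
print("predicted organization score:", logits.squeeze().item())
```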

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
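Below is a minimal sketch of how these settings could be expressed with the Hugging Face Trainer. The dataset, tokenization, and metric computation are not documented here, so those parts are placeholders; the regression head (num_labels=1) and the evaluation interval of 2 steps (visible in the results table) are assumptions.

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)  # regression head (assumed)

args = TrainingArguments(
    output_dir="arabert-task2-organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,            # Adam betas/epsilon as listed (Trainer defaults)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",     # the results table evaluates every 2 steps (assumption)
    eval_steps=2,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset,   # placeholder: dataset not documented
#                   eval_dataset=eval_dataset)
# trainer.train()
```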

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0339 2 4.1225 -0.0238 4.1225 2.0304
No log 0.0678 4 2.1463 0.0637 2.1463 1.4650
No log 0.1017 6 1.4022 0.0059 1.4022 1.1841
No log 0.1356 8 1.0389 -0.0508 1.0389 1.0193
No log 0.1695 10 0.9354 0.1192 0.9354 0.9672
No log 0.2034 12 0.8272 0.1706 0.8272 0.9095
No log 0.2373 14 0.7632 0.2382 0.7632 0.8736
No log 0.2712 16 1.0661 0.0794 1.0661 1.0325
No log 0.3051 18 1.1157 0.0210 1.1157 1.0563
No log 0.3390 20 0.9744 0.0418 0.9744 0.9871
No log 0.3729 22 0.9062 0.0736 0.9062 0.9519
No log 0.4068 24 0.8501 0.1624 0.8501 0.9220
No log 0.4407 26 0.8557 0.0658 0.8557 0.9250
No log 0.4746 28 0.8581 0.0396 0.8581 0.9263
No log 0.5085 30 0.9253 0.1160 0.9253 0.9619
No log 0.5424 32 0.9600 0.0736 0.9600 0.9798
No log 0.5763 34 0.9652 0.0564 0.9652 0.9825
No log 0.6102 36 0.9350 0.0912 0.9350 0.9670
No log 0.6441 38 0.8668 0.1107 0.8668 0.9310
No log 0.6780 40 0.8554 0.1305 0.8554 0.9249
No log 0.7119 42 0.8795 0.0857 0.8795 0.9378
No log 0.7458 44 0.8582 0.1289 0.8582 0.9264
No log 0.7797 46 0.8611 0.1532 0.8611 0.9280
No log 0.8136 48 0.8959 0.1825 0.8959 0.9465
No log 0.8475 50 0.9392 0.0447 0.9392 0.9691
No log 0.8814 52 0.9313 0.0625 0.9313 0.9650
No log 0.9153 54 0.9962 0.0305 0.9962 0.9981
No log 0.9492 56 1.0365 -0.0124 1.0365 1.0181
No log 0.9831 58 0.9478 0.0794 0.9478 0.9736
No log 1.0169 60 0.8742 0.1748 0.8742 0.9350
No log 1.0508 62 0.8730 0.1808 0.8730 0.9344
No log 1.0847 64 0.8369 0.2197 0.8369 0.9148
No log 1.1186 66 0.7824 0.2083 0.7824 0.8845
No log 1.1525 68 0.7933 0.2014 0.7933 0.8907
No log 1.1864 70 0.7496 0.1417 0.7496 0.8658
No log 1.2203 72 0.7768 0.1155 0.7768 0.8813
No log 1.2542 74 0.7221 0.1825 0.7221 0.8498
No log 1.2881 76 0.7418 0.2889 0.7418 0.8613
No log 1.3220 78 0.8301 0.1663 0.8301 0.9111
No log 1.3559 80 0.8182 0.1785 0.8182 0.9046
No log 1.3898 82 0.7701 0.2161 0.7701 0.8776
No log 1.4237 84 0.7349 0.2925 0.7349 0.8573
No log 1.4576 86 0.7400 0.2404 0.7400 0.8602
No log 1.4915 88 0.8103 0.1596 0.8103 0.9002
No log 1.5254 90 0.8432 0.1804 0.8432 0.9183
No log 1.5593 92 0.7029 0.2248 0.7029 0.8384
No log 1.5932 94 0.6651 0.2404 0.6651 0.8155
No log 1.6271 96 0.6861 0.3284 0.6861 0.8283
No log 1.6610 98 0.6888 0.3811 0.6888 0.8299
No log 1.6949 100 0.7024 0.3951 0.7024 0.8381
No log 1.7288 102 0.6872 0.3758 0.6872 0.8290
No log 1.7627 104 0.6695 0.3801 0.6695 0.8182
No log 1.7966 106 0.6673 0.3593 0.6673 0.8169
No log 1.8305 108 0.7068 0.2972 0.7068 0.8407
No log 1.8644 110 0.9012 0.3253 0.9012 0.9493
No log 1.8983 112 1.1576 0.2832 1.1576 1.0759
No log 1.9322 114 1.0324 0.2782 1.0324 1.0161
No log 1.9661 116 0.8649 0.2909 0.8649 0.9300
No log 2.0 118 0.8401 0.3115 0.8401 0.9166
No log 2.0339 120 0.8326 0.3263 0.8326 0.9125
No log 2.0678 122 0.8310 0.3676 0.8310 0.9116
No log 2.1017 124 0.8966 0.3544 0.8966 0.9469
No log 2.1356 126 0.8388 0.3873 0.8388 0.9159
No log 2.1695 128 0.7707 0.3654 0.7707 0.8779
No log 2.2034 130 0.7506 0.3921 0.7506 0.8664
No log 2.2373 132 0.7349 0.3861 0.7349 0.8573
No log 2.2712 134 0.8069 0.3227 0.8069 0.8983
No log 2.3051 136 0.9954 0.3416 0.9954 0.9977
No log 2.3390 138 1.2255 0.2579 1.2255 1.1070
No log 2.3729 140 1.0870 0.3338 1.0870 1.0426
No log 2.4068 142 0.7419 0.4435 0.7419 0.8613
No log 2.4407 144 0.6682 0.4309 0.6682 0.8175
No log 2.4746 146 0.6781 0.3511 0.6781 0.8235
No log 2.5085 148 0.7196 0.4356 0.7196 0.8483
No log 2.5424 150 0.7364 0.3612 0.7364 0.8581
No log 2.5763 152 0.8048 0.3649 0.8048 0.8971
No log 2.6102 154 0.8032 0.3449 0.8032 0.8962
No log 2.6441 156 0.7826 0.3640 0.7826 0.8847
No log 2.6780 158 0.8145 0.4235 0.8145 0.9025
No log 2.7119 160 0.8654 0.4552 0.8654 0.9303
No log 2.7458 162 0.9148 0.4427 0.9148 0.9564
No log 2.7797 164 0.8710 0.4143 0.8710 0.9333
No log 2.8136 166 0.7419 0.4745 0.7419 0.8613
No log 2.8475 168 0.7123 0.4358 0.7123 0.8440
No log 2.8814 170 0.7353 0.4856 0.7353 0.8575
No log 2.9153 172 0.9396 0.3716 0.9396 0.9693
No log 2.9492 174 0.9291 0.3686 0.9291 0.9639
No log 2.9831 176 0.7016 0.4448 0.7016 0.8376
No log 3.0169 178 0.6602 0.4559 0.6602 0.8125
No log 3.0508 180 0.7287 0.4662 0.7287 0.8537
No log 3.0847 182 0.6843 0.4735 0.6843 0.8272
No log 3.1186 184 0.7164 0.4943 0.7164 0.8464
No log 3.1525 186 0.7399 0.4629 0.7399 0.8602
No log 3.1864 188 0.6682 0.4296 0.6682 0.8175
No log 3.2203 190 0.6388 0.3630 0.6388 0.7992
No log 3.2542 192 0.6405 0.4091 0.6405 0.8003
No log 3.2881 194 0.6580 0.3925 0.6580 0.8112
No log 3.3220 196 0.7223 0.4305 0.7223 0.8499
No log 3.3559 198 0.7681 0.4105 0.7681 0.8764
No log 3.3898 200 0.6816 0.4452 0.6816 0.8256
No log 3.4237 202 0.6508 0.4556 0.6508 0.8067
No log 3.4576 204 0.6705 0.4723 0.6705 0.8189
No log 3.4915 206 0.7297 0.4706 0.7297 0.8542
No log 3.5254 208 0.6668 0.4691 0.6668 0.8166
No log 3.5593 210 0.6451 0.4356 0.6451 0.8032
No log 3.5932 212 0.6539 0.4505 0.6539 0.8086
No log 3.6271 214 0.7036 0.4732 0.7036 0.8388
No log 3.6610 216 0.8470 0.4320 0.8470 0.9203
No log 3.6949 218 0.7988 0.4494 0.7988 0.8938
No log 3.7288 220 0.6899 0.4872 0.6899 0.8306
No log 3.7627 222 0.6412 0.4529 0.6412 0.8007
No log 3.7966 224 0.6514 0.5111 0.6514 0.8071
No log 3.8305 226 0.6558 0.4273 0.6558 0.8098
No log 3.8644 228 0.6252 0.4507 0.6252 0.7907
No log 3.8983 230 0.6260 0.4528 0.6260 0.7912
No log 3.9322 232 0.6416 0.4855 0.6416 0.8010
No log 3.9661 234 0.6250 0.4749 0.6250 0.7906
No log 4.0 236 0.6132 0.4598 0.6132 0.7831
No log 4.0339 238 0.6315 0.5374 0.6315 0.7947
No log 4.0678 240 0.6563 0.5368 0.6563 0.8101
No log 4.1017 242 0.6296 0.5051 0.6296 0.7935
No log 4.1356 244 0.6419 0.5077 0.6419 0.8012
No log 4.1695 246 0.6703 0.5160 0.6703 0.8187
No log 4.2034 248 0.6969 0.5178 0.6969 0.8348
No log 4.2373 250 0.6873 0.4907 0.6873 0.8290
No log 4.2712 252 0.7121 0.5017 0.7121 0.8439
No log 4.3051 254 0.7827 0.4875 0.7827 0.8847
No log 4.3390 256 0.8886 0.4634 0.8886 0.9426
No log 4.3729 258 0.8544 0.4722 0.8544 0.9244
No log 4.4068 260 0.7615 0.4722 0.7615 0.8726
No log 4.4407 262 0.7612 0.4749 0.7612 0.8725
No log 4.4746 264 0.6912 0.5236 0.6912 0.8314
No log 4.5085 266 0.6358 0.4711 0.6358 0.7974
No log 4.5424 268 0.6510 0.4671 0.6510 0.8069
No log 4.5763 270 0.6797 0.4943 0.6797 0.8244
No log 4.6102 272 0.7267 0.5191 0.7267 0.8525
No log 4.6441 274 0.7414 0.5044 0.7414 0.8611
No log 4.6780 276 0.6959 0.5260 0.6959 0.8342
No log 4.7119 278 0.6766 0.5208 0.6766 0.8226
No log 4.7458 280 0.7046 0.5262 0.7046 0.8394
No log 4.7797 282 0.6969 0.5024 0.6969 0.8348
No log 4.8136 284 0.7541 0.4856 0.7541 0.8684
No log 4.8475 286 0.7947 0.4813 0.7947 0.8914
No log 4.8814 288 0.7064 0.5108 0.7064 0.8405
No log 4.9153 290 0.6443 0.5682 0.6443 0.8027
No log 4.9492 292 0.6323 0.5878 0.6323 0.7952
No log 4.9831 294 0.7126 0.5328 0.7126 0.8441
No log 5.0169 296 0.7005 0.5254 0.7005 0.8370
No log 5.0508 298 0.6153 0.5279 0.6153 0.7844
No log 5.0847 300 0.6473 0.4546 0.6473 0.8045
No log 5.1186 302 0.6482 0.4567 0.6482 0.8051
No log 5.1525 304 0.6338 0.5261 0.6338 0.7961
No log 5.1864 306 0.7304 0.5126 0.7304 0.8546
No log 5.2203 308 0.7545 0.4929 0.7545 0.8686
No log 5.2542 310 0.6671 0.5219 0.6671 0.8167
No log 5.2881 312 0.6206 0.5409 0.6206 0.7878
No log 5.3220 314 0.6159 0.5519 0.6159 0.7848
No log 5.3559 316 0.6233 0.5599 0.6233 0.7895
No log 5.3898 318 0.6942 0.5336 0.6942 0.8332
No log 5.4237 320 0.6897 0.5076 0.6897 0.8305
No log 5.4576 322 0.5955 0.6049 0.5955 0.7717
No log 5.4915 324 0.5741 0.4389 0.5741 0.7577
No log 5.5254 326 0.6079 0.4379 0.6079 0.7796
No log 5.5593 328 0.5839 0.4488 0.5839 0.7642
No log 5.5932 330 0.5902 0.5683 0.5902 0.7682
No log 5.6271 332 0.7299 0.5134 0.7299 0.8543
No log 5.6610 334 0.7369 0.5025 0.7369 0.8584
No log 5.6949 336 0.6050 0.5514 0.6050 0.7778
No log 5.7288 338 0.5849 0.4896 0.5849 0.7648
No log 5.7627 340 0.5988 0.5477 0.5988 0.7738
No log 5.7966 342 0.6235 0.5663 0.6235 0.7896
No log 5.8305 344 0.6119 0.5569 0.6119 0.7823
No log 5.8644 346 0.6143 0.5473 0.6143 0.7838
No log 5.8983 348 0.6321 0.5472 0.6321 0.7950
No log 5.9322 350 0.6483 0.5355 0.6483 0.8052
No log 5.9661 352 0.6039 0.5127 0.6039 0.7771
No log 6.0 354 0.5919 0.5644 0.5919 0.7693
No log 6.0339 356 0.5831 0.5316 0.5831 0.7636
No log 6.0678 358 0.5862 0.5381 0.5862 0.7656
No log 6.1017 360 0.5792 0.5514 0.5792 0.7611
No log 6.1356 362 0.5855 0.4908 0.5855 0.7652
No log 6.1695 364 0.6070 0.5356 0.6070 0.7791
No log 6.2034 366 0.6010 0.5443 0.6010 0.7753
No log 6.2373 368 0.6027 0.5353 0.6027 0.7764
No log 6.2712 370 0.5920 0.5426 0.5920 0.7694
No log 6.3051 372 0.6090 0.5579 0.6090 0.7804
No log 6.3390 374 0.6604 0.5192 0.6604 0.8126
No log 6.3729 376 0.6088 0.5718 0.6088 0.7802
No log 6.4068 378 0.5663 0.4560 0.5663 0.7525
No log 6.4407 380 0.5731 0.4869 0.5731 0.7570
No log 6.4746 382 0.5934 0.5450 0.5934 0.7703
No log 6.5085 384 0.6225 0.5842 0.6225 0.7890
No log 6.5424 386 0.6218 0.5691 0.6218 0.7886
No log 6.5763 388 0.6099 0.4627 0.6099 0.7809
No log 6.6102 390 0.6560 0.5236 0.6560 0.8100
No log 6.6441 392 0.6956 0.4975 0.6956 0.8340
No log 6.6780 394 0.6699 0.5355 0.6699 0.8185
No log 6.7119 396 0.6538 0.5125 0.6538 0.8086
No log 6.7458 398 0.6849 0.5573 0.6849 0.8276
No log 6.7797 400 0.8330 0.4915 0.8330 0.9127
No log 6.8136 402 0.8812 0.4625 0.8812 0.9387
No log 6.8475 404 0.7348 0.4788 0.7348 0.8572
No log 6.8814 406 0.6108 0.4840 0.6108 0.7815
No log 6.9153 408 0.5994 0.3913 0.5994 0.7742
No log 6.9492 410 0.5899 0.3896 0.5899 0.7680
No log 6.9831 412 0.6375 0.4925 0.6375 0.7984
No log 7.0169 414 0.7382 0.4451 0.7382 0.8592
No log 7.0508 416 0.7085 0.3898 0.7085 0.8417
No log 7.0847 418 0.6268 0.3837 0.6268 0.7917
No log 7.1186 420 0.5758 0.3341 0.5758 0.7588
No log 7.1525 422 0.6212 0.4300 0.6212 0.7882
No log 7.1864 424 0.6218 0.4295 0.6218 0.7885
No log 7.2203 426 0.5821 0.3516 0.5821 0.7630
No log 7.2542 428 0.6338 0.5380 0.6338 0.7961
No log 7.2881 430 0.7998 0.4860 0.7998 0.8943
No log 7.3220 432 0.7964 0.4494 0.7964 0.8924
No log 7.3559 434 0.6835 0.4898 0.6835 0.8267
No log 7.3898 436 0.6134 0.4976 0.6134 0.7832
No log 7.4237 438 0.6072 0.4825 0.6072 0.7793
No log 7.4576 440 0.6210 0.4806 0.6210 0.7881
No log 7.4915 442 0.6308 0.5073 0.6308 0.7942
No log 7.5254 444 0.6467 0.5283 0.6467 0.8042
No log 7.5593 446 0.6476 0.5364 0.6476 0.8047
No log 7.5932 448 0.6319 0.5433 0.6319 0.7949
No log 7.6271 450 0.6251 0.5662 0.6251 0.7906
No log 7.6610 452 0.6252 0.5548 0.6252 0.7907
No log 7.6949 454 0.6163 0.5240 0.6163 0.7851
No log 7.7288 456 0.6377 0.5404 0.6377 0.7986
No log 7.7627 458 0.6689 0.5109 0.6689 0.8179
No log 7.7966 460 0.6463 0.5079 0.6463 0.8039
No log 7.8305 462 0.6069 0.4951 0.6069 0.7790
No log 7.8644 464 0.5957 0.4635 0.5957 0.7718
No log 7.8983 466 0.5987 0.4426 0.5987 0.7738
No log 7.9322 468 0.5963 0.4078 0.5963 0.7722
No log 7.9661 470 0.5910 0.3706 0.5910 0.7688
No log 8.0 472 0.5868 0.3601 0.5868 0.7660
No log 8.0339 474 0.5906 0.3633 0.5906 0.7685
No log 8.0678 476 0.6036 0.4025 0.6036 0.7769
No log 8.1017 478 0.5873 0.4083 0.5873 0.7664
No log 8.1356 480 0.6135 0.5194 0.6135 0.7832
No log 8.1695 482 0.6123 0.5210 0.6123 0.7825
No log 8.2034 484 0.5883 0.3601 0.5883 0.7670
No log 8.2373 486 0.5928 0.4281 0.5928 0.7700
No log 8.2712 488 0.5962 0.4200 0.5962 0.7721
No log 8.3051 490 0.6080 0.4442 0.6080 0.7797
No log 8.3390 492 0.6429 0.5466 0.6429 0.8018
No log 8.3729 494 0.6193 0.5026 0.6193 0.7870
No log 8.4068 496 0.6333 0.5069 0.6333 0.7958
No log 8.4407 498 0.6416 0.5023 0.6416 0.8010
0.4689 8.4746 500 0.6169 0.4819 0.6169 0.7854
0.4689 8.5085 502 0.6106 0.4905 0.6106 0.7814
0.4689 8.5424 504 0.6114 0.5447 0.6114 0.7819
0.4689 8.5763 506 0.6108 0.5461 0.6108 0.7815
0.4689 8.6102 508 0.6082 0.5525 0.6082 0.7799
0.4689 8.6441 510 0.5861 0.5453 0.5861 0.7655
0.4689 8.6780 512 0.5744 0.5134 0.5744 0.7579
0.4689 8.7119 514 0.5750 0.5084 0.5750 0.7583
0.4689 8.7458 516 0.5850 0.5857 0.5850 0.7648
0.4689 8.7797 518 0.5976 0.5710 0.5976 0.7731
0.4689 8.8136 520 0.6127 0.5869 0.6127 0.7827
0.4689 8.8475 522 0.6155 0.5456 0.6155 0.7845
0.4689 8.8814 524 0.6217 0.5624 0.6217 0.7885
0.4689 8.9153 526 0.6067 0.5220 0.6067 0.7789
0.4689 8.9492 528 0.5828 0.5097 0.5828 0.7634
0.4689 8.9831 530 0.5845 0.4375 0.5845 0.7645
0.4689 9.0169 532 0.5779 0.4756 0.5779 0.7602
0.4689 9.0508 534 0.5879 0.5668 0.5879 0.7667
0.4689 9.0847 536 0.6217 0.5416 0.6217 0.7885
0.4689 9.1186 538 0.6656 0.5565 0.6656 0.8159
0.4689 9.1525 540 0.6676 0.5657 0.6676 0.8171
0.4689 9.1864 542 0.6646 0.6052 0.6646 0.8152
0.4689 9.2203 544 0.6479 0.5972 0.6479 0.8049
0.4689 9.2542 546 0.5994 0.5443 0.5994 0.7742
0.4689 9.2881 548 0.5810 0.5514 0.5810 0.7622
0.4689 9.3220 550 0.5775 0.5553 0.5775 0.7600
0.4689 9.3559 552 0.5808 0.4909 0.5808 0.7621
0.4689 9.3898 554 0.5879 0.4837 0.5879 0.7668
0.4689 9.4237 556 0.5885 0.5037 0.5885 0.7671
0.4689 9.4576 558 0.5915 0.5089 0.5915 0.7691
0.4689 9.4915 560 0.6061 0.4594 0.6061 0.7786
0.4689 9.5254 562 0.5970 0.4916 0.5970 0.7726
0.4689 9.5593 564 0.5902 0.5451 0.5902 0.7683
0.4689 9.5932 566 0.6518 0.5143 0.6518 0.8073
0.4689 9.6271 568 0.6730 0.5187 0.6730 0.8204
0.4689 9.6610 570 0.6447 0.5127 0.6447 0.8029
0.4689 9.6949 572 0.6094 0.5180 0.6094 0.7807
0.4689 9.7288 574 0.5951 0.4110 0.5951 0.7714
0.4689 9.7627 576 0.6019 0.4792 0.6019 0.7758
0.4689 9.7966 578 0.6096 0.5440 0.6096 0.7808
0.4689 9.8305 580 0.6527 0.5336 0.6527 0.8079
0.4689 9.8644 582 0.7199 0.4866 0.7199 0.8485
0.4689 9.8983 584 0.7025 0.4866 0.7025 0.8381
0.4689 9.9322 586 0.6370 0.5485 0.6370 0.7981
0.4689 9.9661 588 0.6188 0.5588 0.6188 0.7866
0.4689 10.0 590 0.6187 0.5610 0.6187 0.7866
0.4689 10.0339 592 0.6432 0.5468 0.6432 0.8020
0.4689 10.0678 594 0.6871 0.4956 0.6871 0.8289
0.4689 10.1017 596 0.6621 0.5291 0.6621 0.8137
0.4689 10.1356 598 0.6026 0.4984 0.6026 0.7763
0.4689 10.1695 600 0.5931 0.4426 0.5931 0.7702
0.4689 10.2034 602 0.5965 0.4545 0.5965 0.7724
0.4689 10.2373 604 0.6058 0.5026 0.6058 0.7783
0.4689 10.2712 606 0.6042 0.5090 0.6042 0.7773
0.4689 10.3051 608 0.5903 0.5067 0.5903 0.7683
0.4689 10.3390 610 0.5850 0.5397 0.5850 0.7649
0.4689 10.3729 612 0.5796 0.5365 0.5796 0.7613
0.4689 10.4068 614 0.5975 0.5511 0.5975 0.7730
0.4689 10.4407 616 0.6048 0.5323 0.6048 0.7777
0.4689 10.4746 618 0.6389 0.5398 0.6389 0.7993
0.4689 10.5085 620 0.6913 0.5025 0.6913 0.8314
0.4689 10.5424 622 0.7052 0.5163 0.7052 0.8398
0.4689 10.5763 624 0.6299 0.5799 0.6299 0.7937
0.4689 10.6102 626 0.6033 0.5790 0.6033 0.7767
0.4689 10.6441 628 0.6068 0.5790 0.6068 0.7790
0.4689 10.6780 630 0.5978 0.5344 0.5978 0.7732
0.4689 10.7119 632 0.6073 0.5790 0.6073 0.7793
0.4689 10.7458 634 0.6401 0.5133 0.6401 0.8000
0.4689 10.7797 636 0.6940 0.4748 0.6940 0.8331
0.4689 10.8136 638 0.7055 0.4454 0.7055 0.8399
0.4689 10.8475 640 0.6791 0.4716 0.6791 0.8241
0.4689 10.8814 642 0.6117 0.5196 0.6117 0.7821
0.4689 10.9153 644 0.6012 0.4443 0.6012 0.7754
0.4689 10.9492 646 0.6198 0.5315 0.6198 0.7873
0.4689 10.9831 648 0.6270 0.5565 0.6270 0.7918
0.4689 11.0169 650 0.6130 0.4573 0.6130 0.7829
0.4689 11.0508 652 0.6154 0.4643 0.6154 0.7845
0.4689 11.0847 654 0.6423 0.5289 0.6423 0.8014
0.4689 11.1186 656 0.7571 0.4381 0.7571 0.8701
0.4689 11.1525 658 0.8511 0.4236 0.8511 0.9226
0.4689 11.1864 660 0.8283 0.4134 0.8283 0.9101
0.4689 11.2203 662 0.7210 0.5445 0.7210 0.8491
0.4689 11.2542 664 0.6328 0.5481 0.6328 0.7955
0.4689 11.2881 666 0.6014 0.4756 0.6014 0.7755
0.4689 11.3220 668 0.5939 0.4113 0.5939 0.7707
0.4689 11.3559 670 0.5864 0.4714 0.5864 0.7657
0.4689 11.3898 672 0.5916 0.5427 0.5916 0.7691
0.4689 11.4237 674 0.6043 0.5814 0.6043 0.7774
0.4689 11.4576 676 0.6153 0.5835 0.6153 0.7844
0.4689 11.4915 678 0.6124 0.5637 0.6124 0.7825
0.4689 11.5254 680 0.6091 0.5673 0.6091 0.7805
0.4689 11.5593 682 0.6056 0.5870 0.6056 0.7782
0.4689 11.5932 684 0.6003 0.5752 0.6003 0.7748
0.4689 11.6271 686 0.5940 0.5735 0.5940 0.7707
0.4689 11.6610 688 0.5947 0.5053 0.5947 0.7712
0.4689 11.6949 690 0.5986 0.5249 0.5986 0.7737
0.4689 11.7288 692 0.6062 0.5190 0.6062 0.7786
0.4689 11.7627 694 0.6009 0.5111 0.6009 0.7752
0.4689 11.7966 696 0.5932 0.5166 0.5932 0.7702
0.4689 11.8305 698 0.5936 0.4964 0.5936 0.7704
0.4689 11.8644 700 0.5973 0.5462 0.5973 0.7728
0.4689 11.8983 702 0.5922 0.6080 0.5922 0.7695
0.4689 11.9322 704 0.6446 0.5253 0.6446 0.8029
0.4689 11.9661 706 0.6818 0.4769 0.6818 0.8257
0.4689 12.0 708 0.6394 0.5291 0.6394 0.7996
0.4689 12.0339 710 0.5854 0.5883 0.5854 0.7651
0.4689 12.0678 712 0.5737 0.4740 0.5737 0.7574
0.4689 12.1017 714 0.5786 0.4543 0.5786 0.7607
0.4689 12.1356 716 0.5723 0.5172 0.5723 0.7565
0.4689 12.1695 718 0.6037 0.5785 0.6037 0.7770
0.4689 12.2034 720 0.6480 0.5354 0.6480 0.8050
0.4689 12.2373 722 0.6453 0.5361 0.6453 0.8033
0.4689 12.2712 724 0.5947 0.5705 0.5947 0.7712
0.4689 12.3051 726 0.5788 0.4457 0.5788 0.7608
0.4689 12.3390 728 0.5765 0.3943 0.5765 0.7593
0.4689 12.3729 730 0.5768 0.4074 0.5768 0.7594
0.4689 12.4068 732 0.6124 0.5333 0.6124 0.7825
0.4689 12.4407 734 0.6632 0.5258 0.6632 0.8144
0.4689 12.4746 736 0.6587 0.5275 0.6587 0.8116
0.4689 12.5085 738 0.6088 0.5617 0.6088 0.7803
0.4689 12.5424 740 0.5746 0.4523 0.5746 0.7580
0.4689 12.5763 742 0.5753 0.5219 0.5753 0.7585
0.4689 12.6102 744 0.5801 0.5422 0.5801 0.7617
0.4689 12.6441 746 0.5805 0.5323 0.5805 0.7619
0.4689 12.6780 748 0.5707 0.4245 0.5707 0.7555
0.4689 12.7119 750 0.5692 0.4420 0.5692 0.7544
0.4689 12.7458 752 0.5768 0.3848 0.5768 0.7595
0.4689 12.7797 754 0.5687 0.3876 0.5687 0.7542
0.4689 12.8136 756 0.5664 0.4173 0.5664 0.7526

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1