ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for reproducing these metrics appears below the list):

  • Loss: 0.5897
  • Qwk: 0.4751
  • Mse: 0.5897
  • Rmse: 0.7679
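
Loss and Mse are identical here, which indicates the model is evaluated as a single-output regression head with an MSE objective; Qwk is quadratic weighted kappa (Cohen's kappa with quadratic weights), which requires discrete labels. Below is a minimal sketch of how these metrics could be reproduced with scikit-learn; rounding the continuous predictions to integer scores before computing Qwk is an assumption, since the exact post-processing is not documented in this card.

```python
# Minimal metrics sketch, assuming continuous regression outputs (preds)
# and integer gold scores (labels). Rounding before Qwk is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # Quadratic weighted kappa needs discrete categories, so discretize
    # the regression outputs to the nearest integer score first.
    qwk = cohen_kappa_score(labels.astype(int), np.rint(preds).astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```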

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch appears after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
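
For reference, this is a minimal sketch of transformers TrainingArguments mirroring the values above; output_dir is a hypothetical placeholder, and the Adam betas and epsilon are spelled out even though they are the Trainer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,    # Trainer defaults, written out to match
    adam_beta2=0.999,  # the optimizer line above
    adam_epsilon=1e-8,
)
```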

Training results

"No log" in the Training Loss column means the training loss had not yet been logged at that step; with the Trainer's default logging interval of 500 steps, the first logged value (0.3823) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0235 2 4.2952 -0.0170 4.2952 2.0725
No log 0.0471 4 2.5874 0.0216 2.5874 1.6085
No log 0.0706 6 1.5126 0.0302 1.5126 1.2299
No log 0.0941 8 1.0531 -0.0157 1.0531 1.0262
No log 0.1176 10 0.9095 0.1579 0.9095 0.9537
No log 0.1412 12 0.9421 0.0243 0.9421 0.9706
No log 0.1647 14 0.8504 0.1555 0.8504 0.9222
No log 0.1882 16 0.9690 0.1489 0.9690 0.9844
No log 0.2118 18 0.8846 0.1231 0.8846 0.9405
No log 0.2353 20 0.8602 0.0994 0.8602 0.9275
No log 0.2588 22 0.8557 0.1597 0.8557 0.9250
No log 0.2824 24 0.7946 0.2179 0.7946 0.8914
No log 0.3059 26 0.7970 0.0734 0.7970 0.8927
No log 0.3294 28 0.8333 0.1448 0.8333 0.9129
No log 0.3529 30 1.0097 0.1955 1.0097 1.0048
No log 0.3765 32 1.1288 0.1446 1.1288 1.0624
No log 0.4000 34 1.3164 0.0421 1.3164 1.1474
No log 0.4235 36 1.2171 0.0210 1.2171 1.1032
No log 0.4471 38 1.0197 0.0794 1.0197 1.0098
No log 0.4706 40 0.8346 0.2146 0.8346 0.9136
No log 0.4941 42 0.7870 0.1837 0.7870 0.8871
No log 0.5176 44 0.7661 0.2221 0.7661 0.8753
No log 0.5412 46 0.7612 0.2477 0.7612 0.8725
No log 0.5647 48 0.7906 0.2722 0.7906 0.8891
No log 0.5882 50 0.9523 0.2669 0.9523 0.9759
No log 0.6118 52 1.0150 0.2838 1.0150 1.0075
No log 0.6353 54 1.0136 0.3052 1.0136 1.0068
No log 0.6588 56 0.8953 0.3138 0.8953 0.9462
No log 0.6824 58 0.6821 0.3974 0.6821 0.8259
No log 0.7059 60 0.6418 0.3848 0.6418 0.8011
No log 0.7294 62 0.6681 0.4125 0.6681 0.8174
No log 0.7529 64 0.6741 0.4415 0.6741 0.8210
No log 0.7765 66 0.6467 0.4521 0.6467 0.8042
No log 0.8000 68 0.6487 0.4440 0.6487 0.8054
No log 0.8235 70 0.7782 0.3332 0.7782 0.8821
No log 0.8471 72 0.8817 0.2896 0.8817 0.9390
No log 0.8706 74 0.7401 0.4543 0.7401 0.8603
No log 0.8941 76 0.7804 0.4800 0.7804 0.8834
No log 0.9176 78 0.8931 0.4307 0.8931 0.9450
No log 0.9412 80 0.7852 0.4850 0.7852 0.8861
No log 0.9647 82 0.6887 0.4372 0.6887 0.8299
No log 0.9882 84 0.7347 0.3999 0.7347 0.8572
No log 1.0118 86 0.7282 0.4368 0.7282 0.8534
No log 1.0353 88 0.7235 0.4531 0.7235 0.8506
No log 1.0588 90 0.8642 0.4219 0.8642 0.9296
No log 1.0824 92 1.0087 0.2179 1.0087 1.0043
No log 1.1059 94 0.9439 0.3560 0.9439 0.9716
No log 1.1294 96 0.7546 0.4320 0.7546 0.8687
No log 1.1529 98 0.7011 0.4768 0.7011 0.8373
No log 1.1765 100 0.7151 0.5012 0.7151 0.8456
No log 1.2000 102 0.7835 0.4149 0.7835 0.8852
No log 1.2235 104 0.8434 0.3883 0.8434 0.9183
No log 1.2471 106 0.7963 0.4573 0.7963 0.8924
No log 1.2706 108 0.7039 0.4589 0.7039 0.8390
No log 1.2941 110 0.6839 0.4443 0.6839 0.8270
No log 1.3176 112 0.6859 0.4334 0.6859 0.8282
No log 1.3412 114 0.6955 0.4389 0.6955 0.8340
No log 1.3647 116 0.6540 0.4123 0.6540 0.8087
No log 1.3882 118 0.6603 0.4228 0.6603 0.8126
No log 1.4118 120 0.6532 0.4559 0.6532 0.8082
No log 1.4353 122 0.6614 0.5051 0.6614 0.8132
No log 1.4588 124 0.6658 0.4986 0.6658 0.8159
No log 1.4824 126 0.6764 0.4946 0.6764 0.8225
No log 1.5059 128 0.6959 0.5221 0.6959 0.8342
No log 1.5294 130 0.7031 0.4447 0.7031 0.8385
No log 1.5529 132 0.6906 0.5197 0.6906 0.8310
No log 1.5765 134 0.6954 0.4974 0.6954 0.8339
No log 1.6000 136 0.8211 0.3928 0.8211 0.9062
No log 1.6235 138 0.7858 0.4100 0.7858 0.8865
No log 1.6471 140 0.6560 0.4222 0.6560 0.8099
No log 1.6706 142 0.7225 0.4708 0.7225 0.8500
No log 1.6941 144 0.7263 0.4642 0.7263 0.8523
No log 1.7176 146 0.6655 0.4482 0.6655 0.8158
No log 1.7412 148 0.6659 0.4596 0.6659 0.8160
No log 1.7647 150 0.6254 0.4108 0.6254 0.7909
No log 1.7882 152 0.6203 0.4108 0.6203 0.7876
No log 1.8118 154 0.6212 0.4008 0.6212 0.7882
No log 1.8353 156 0.6449 0.4751 0.6449 0.8030
No log 1.8588 158 0.6938 0.5567 0.6938 0.8330
No log 1.8824 160 0.7573 0.5446 0.7573 0.8702
No log 1.9059 162 0.7771 0.5348 0.7771 0.8815
No log 1.9294 164 0.7677 0.4910 0.7677 0.8762
No log 1.9529 166 0.6965 0.5089 0.6965 0.8346
No log 1.9765 168 0.8327 0.4853 0.8327 0.9125
No log 2.0000 170 0.9319 0.3962 0.9319 0.9653
No log 2.0235 172 0.7443 0.4077 0.7443 0.8627
No log 2.0471 174 0.5958 0.4284 0.5958 0.7719
No log 2.0706 176 0.7032 0.4307 0.7032 0.8386
No log 2.0941 178 0.8592 0.3577 0.8592 0.9269
No log 2.1176 180 0.8443 0.3755 0.8443 0.9188
No log 2.1412 182 0.6867 0.4706 0.6867 0.8287
No log 2.1647 184 0.6287 0.5151 0.6287 0.7929
No log 2.1882 186 0.7707 0.4654 0.7707 0.8779
No log 2.2118 188 0.7411 0.4544 0.7411 0.8609
No log 2.2353 190 0.6375 0.5282 0.6375 0.7984
No log 2.2588 192 0.6828 0.5178 0.6828 0.8263
No log 2.2824 194 0.7071 0.5092 0.7071 0.8409
No log 2.3059 196 0.6468 0.4743 0.6468 0.8043
No log 2.3294 198 0.6119 0.4704 0.6119 0.7822
No log 2.3529 200 0.6112 0.5107 0.6112 0.7818
No log 2.3765 202 0.5955 0.4139 0.5955 0.7717
No log 2.4000 204 0.6192 0.4291 0.6192 0.7869
No log 2.4235 206 0.5943 0.4329 0.5943 0.7709
No log 2.4471 208 0.5791 0.4603 0.5791 0.7610
No log 2.4706 210 0.6369 0.5109 0.6369 0.7980
No log 2.4941 212 0.6369 0.5166 0.6369 0.7981
No log 2.5176 214 0.6277 0.5416 0.6277 0.7923
No log 2.5412 216 0.6958 0.5000 0.6958 0.8342
No log 2.5647 218 0.6625 0.5283 0.6625 0.8139
No log 2.5882 220 0.5980 0.4776 0.5980 0.7733
No log 2.6118 222 0.6050 0.4586 0.6050 0.7778
No log 2.6353 224 0.7867 0.4640 0.7867 0.8870
No log 2.6588 226 0.8655 0.4519 0.8655 0.9303
No log 2.6824 228 0.7346 0.5051 0.7346 0.8571
No log 2.7059 230 0.6346 0.4800 0.6346 0.7966
No log 2.7294 232 0.6477 0.5391 0.6477 0.8048
No log 2.7529 234 0.6599 0.5358 0.6599 0.8123
No log 2.7765 236 0.6636 0.5705 0.6636 0.8146
No log 2.8000 238 0.6548 0.5663 0.6548 0.8092
No log 2.8235 240 0.6459 0.5427 0.6459 0.8037
No log 2.8471 242 0.6454 0.4483 0.6454 0.8033
No log 2.8706 244 0.6950 0.5042 0.6950 0.8337
No log 2.8941 246 0.6588 0.4823 0.6588 0.8117
No log 2.9176 248 0.6121 0.4316 0.6121 0.7823
No log 2.9412 250 0.6029 0.4225 0.6029 0.7765
No log 2.9647 252 0.6122 0.3857 0.6122 0.7824
No log 2.9882 254 0.6297 0.3930 0.6297 0.7936
No log 3.0118 256 0.6250 0.4193 0.6250 0.7906
No log 3.0353 258 0.6648 0.4754 0.6648 0.8154
No log 3.0588 260 0.7916 0.4816 0.7916 0.8897
No log 3.0824 262 0.7536 0.4816 0.7536 0.8681
No log 3.1059 264 0.6518 0.5021 0.6518 0.8073
No log 3.1294 266 0.5980 0.5604 0.5980 0.7733
No log 3.1529 268 0.6356 0.5441 0.6356 0.7973
No log 3.1765 270 0.7369 0.4933 0.7369 0.8584
No log 3.2000 272 0.7291 0.5101 0.7291 0.8539
No log 3.2235 274 0.6236 0.5160 0.6236 0.7897
No log 3.2471 276 0.6126 0.5023 0.6126 0.7827
No log 3.2706 278 0.6018 0.4967 0.6018 0.7757
No log 3.2941 280 0.5787 0.4531 0.5787 0.7607
No log 3.3176 282 0.5754 0.3685 0.5754 0.7586
No log 3.3412 284 0.5784 0.3976 0.5784 0.7605
No log 3.3647 286 0.6101 0.4130 0.6101 0.7811
No log 3.3882 288 0.6510 0.4806 0.6510 0.8068
No log 3.4118 290 0.6381 0.5209 0.6381 0.7988
No log 3.4353 292 0.6498 0.5484 0.6498 0.8061
No log 3.4588 294 0.6611 0.5592 0.6611 0.8131
No log 3.4824 296 0.6569 0.5498 0.6569 0.8105
No log 3.5059 298 0.6408 0.5587 0.6408 0.8005
No log 3.5294 300 0.6850 0.5141 0.6850 0.8276
No log 3.5529 302 0.8214 0.4743 0.8214 0.9063
No log 3.5765 304 0.7773 0.4396 0.7773 0.8816
No log 3.6000 306 0.6129 0.4198 0.6129 0.7829
No log 3.6235 308 0.6565 0.3952 0.6565 0.8102
No log 3.6471 310 0.7686 0.4671 0.7686 0.8767
No log 3.6706 312 0.7438 0.4091 0.7438 0.8624
No log 3.6941 314 0.6232 0.3934 0.6232 0.7894
No log 3.7176 316 0.6653 0.4416 0.6653 0.8156
No log 3.7412 318 0.7330 0.4254 0.7330 0.8562
No log 3.7647 320 0.6644 0.4448 0.6644 0.8151
No log 3.7882 322 0.6242 0.4523 0.6242 0.7901
No log 3.8118 324 0.7375 0.4464 0.7375 0.8588
No log 3.8353 326 0.7535 0.4608 0.7535 0.8681
No log 3.8588 328 0.6913 0.4236 0.6913 0.8314
No log 3.8824 330 0.6439 0.3557 0.6439 0.8025
No log 3.9059 332 0.6352 0.3547 0.6352 0.7970
No log 3.9294 334 0.6232 0.3974 0.6232 0.7894
No log 3.9529 336 0.6368 0.3547 0.6368 0.7980
No log 3.9765 338 0.6608 0.3696 0.6608 0.8129
No log 4.0000 340 0.6722 0.5374 0.6722 0.8199
No log 4.0235 342 0.6902 0.5489 0.6902 0.8308
No log 4.0471 344 0.6468 0.5905 0.6468 0.8042
No log 4.0706 346 0.6228 0.5799 0.6228 0.7892
No log 4.0941 348 0.6219 0.5036 0.6219 0.7886
No log 4.1176 350 0.5991 0.4743 0.5991 0.7740
No log 4.1412 352 0.5962 0.4226 0.5962 0.7722
No log 4.1647 354 0.5969 0.3462 0.5969 0.7726
No log 4.1882 356 0.5982 0.3651 0.5982 0.7734
No log 4.2118 358 0.5824 0.4085 0.5824 0.7632
No log 4.2353 360 0.6464 0.4269 0.6464 0.8040
No log 4.2588 362 0.7434 0.4721 0.7434 0.8622
No log 4.2824 364 0.6970 0.4653 0.6970 0.8349
No log 4.3059 366 0.5905 0.4488 0.5905 0.7684
No log 4.3294 368 0.5827 0.5140 0.5827 0.7633
No log 4.3529 370 0.6094 0.5735 0.6094 0.7807
No log 4.3765 372 0.6304 0.5528 0.6304 0.7940
No log 4.4000 374 0.6320 0.5773 0.6320 0.7950
No log 4.4235 376 0.6474 0.5283 0.6474 0.8046
No log 4.4471 378 0.6761 0.5352 0.6761 0.8223
No log 4.4706 380 0.6438 0.5397 0.6438 0.8024
No log 4.4941 382 0.6259 0.5314 0.6259 0.7911
No log 4.5176 384 0.6530 0.5165 0.6530 0.8081
No log 4.5412 386 0.6393 0.5165 0.6393 0.7996
No log 4.5647 388 0.5755 0.5336 0.5755 0.7586
No log 4.5882 390 0.5891 0.4705 0.5891 0.7675
No log 4.6118 392 0.6837 0.4970 0.6837 0.8269
No log 4.6353 394 0.6496 0.4951 0.6496 0.8060
No log 4.6588 396 0.5761 0.5131 0.5761 0.7590
No log 4.6824 398 0.6178 0.5482 0.6178 0.7860
No log 4.7059 400 0.6497 0.4942 0.6497 0.8061
No log 4.7294 402 0.6045 0.4991 0.6045 0.7775
No log 4.7529 404 0.5881 0.4707 0.5881 0.7669
No log 4.7765 406 0.6207 0.5068 0.6207 0.7879
No log 4.8000 408 0.6279 0.4996 0.6279 0.7924
No log 4.8235 410 0.5824 0.4198 0.5824 0.7631
No log 4.8471 412 0.5611 0.4097 0.5611 0.7491
No log 4.8706 414 0.5528 0.4265 0.5528 0.7435
No log 4.8941 416 0.5533 0.4577 0.5533 0.7438
No log 4.9176 418 0.5563 0.5127 0.5563 0.7458
No log 4.9412 420 0.5768 0.5146 0.5768 0.7595
No log 4.9647 422 0.5983 0.5105 0.5983 0.7735
No log 4.9882 424 0.5906 0.5498 0.5906 0.7685
No log 5.0118 426 0.6016 0.5193 0.6016 0.7756
No log 5.0353 428 0.6036 0.5183 0.6036 0.7769
No log 5.0588 430 0.5995 0.4663 0.5995 0.7743
No log 5.0824 432 0.5874 0.4594 0.5874 0.7664
No log 5.1059 434 0.5749 0.4690 0.5749 0.7582
No log 5.1294 436 0.5650 0.4717 0.5650 0.7517
No log 5.1529 438 0.6029 0.5355 0.6029 0.7765
No log 5.1765 440 0.5900 0.5095 0.5900 0.7681
No log 5.2000 442 0.5647 0.5488 0.5647 0.7515
No log 5.2235 444 0.5843 0.5006 0.5843 0.7644
No log 5.2471 446 0.5841 0.4790 0.5841 0.7643
No log 5.2706 448 0.5873 0.4801 0.5873 0.7663
No log 5.2941 450 0.5695 0.3868 0.5695 0.7546
No log 5.3176 452 0.5637 0.3905 0.5637 0.7508
No log 5.3412 454 0.5610 0.4341 0.5610 0.7490
No log 5.3647 456 0.5633 0.4723 0.5633 0.7505
No log 5.3882 458 0.5812 0.4657 0.5812 0.7624
No log 5.4118 460 0.5777 0.5197 0.5777 0.7600
No log 5.4353 462 0.5738 0.5367 0.5738 0.7575
No log 5.4588 464 0.5677 0.5271 0.5677 0.7534
No log 5.4824 466 0.5657 0.5177 0.5657 0.7521
No log 5.5059 468 0.5640 0.4465 0.5640 0.7510
No log 5.5294 470 0.5689 0.4038 0.5689 0.7542
No log 5.5529 472 0.5626 0.4339 0.5626 0.7501
No log 5.5765 474 0.5634 0.4680 0.5634 0.7506
No log 5.6000 476 0.5678 0.4821 0.5678 0.7535
No log 5.6235 478 0.6005 0.5387 0.6005 0.7749
No log 5.6471 480 0.6573 0.5085 0.6573 0.8107
No log 5.6706 482 0.6836 0.4970 0.6836 0.8268
No log 5.6941 484 0.6695 0.5200 0.6695 0.8182
No log 5.7176 486 0.6187 0.5630 0.6187 0.7866
No log 5.7412 488 0.5957 0.5193 0.5957 0.7718
No log 5.7647 490 0.5833 0.4998 0.5833 0.7637
No log 5.7882 492 0.5898 0.4632 0.5898 0.7680
No log 5.8118 494 0.6481 0.5084 0.6481 0.8051
No log 5.8353 496 0.6666 0.5352 0.6666 0.8165
No log 5.8588 498 0.6034 0.4892 0.6034 0.7768
0.3823 5.8824 500 0.5687 0.5004 0.5687 0.7541
0.3823 5.9059 502 0.5847 0.4808 0.5847 0.7647
0.3823 5.9294 504 0.5898 0.4600 0.5898 0.7680
0.3823 5.9529 506 0.5883 0.4680 0.5883 0.7670
0.3823 5.9765 508 0.5901 0.4897 0.5901 0.7682
0.3823 6.0000 510 0.5897 0.4751 0.5897 0.7679

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
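
With the pinned versions above, the checkpoint can be loaded for scoring along the following lines. This is a sketch, assuming the head is a single-logit regression output (inferred from the MSE-based evaluation); the Arabic input string is a placeholder.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = ("MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_"
            "FineTuningAraBERT_run2_AugV5_k16_task2_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder input; substitute the essay text to be scored.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # predicted score (assumed regression head)
```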