ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k20_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a brief sketch of how these metrics can be computed follows the list):

  • Loss: 0.8610
  • Qwk: 0.3970
  • Mse: 0.8610
  • Rmse: 0.9279
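
As a point of reference, here is a minimal, hypothetical sketch of how the Qwk (quadratic weighted kappa), Mse, and Rmse values above can be computed from model predictions. The example arrays and the rounding of continuous outputs to integer scores before kappa are assumptions, not details from this card:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4])           # hypothetical gold organization scores
y_pred = np.array([2.2, 2.8, 1.4, 3.6])   # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# Quadratic weighted kappa compares discrete labels, so continuous
# predictions are rounded here; the actual training script may differ.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Qwk: {qwk:.4f}, Mse: {mse:.4f}, Rmse: {rmse:.4f}")
```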

Model description

More information needed

Intended uses & limitations

More information needed
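
The card does not document usage, but the checkpoint can presumably be loaded with standard Transformers APIs. The sketch below assumes a single-output regression head (consistent with the MSE/RMSE metrics above); that assumption, and the example input, are not confirmed by this card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k20_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Hypothetical Arabic essay text; .item() assumes a single regression logit.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```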

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after the list for how they map onto TrainingArguments):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
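
As an illustration only, a hedged sketch of how these values map onto transformers.TrainingArguments; the output directory is hypothetical and dataset/metric wiring is omitted, so this is not the exact training script:

```python
from transformers import TrainingArguments

# Illustrative mapping of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,     # betas=(0.9, 0.999) from the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,  # epsilon=1e-08 from the card
    # Note: the card says "Adam"; Trainer's default optimizer is AdamW,
    # so this mapping is approximate.
)
```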

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0185 2 4.8061 0.0010 4.8061 2.1923
No log 0.0370 4 2.6276 0.0051 2.6276 1.6210
No log 0.0556 6 1.6356 0.0682 1.6356 1.2789
No log 0.0741 8 1.3581 0.0958 1.3581 1.1654
No log 0.0926 10 1.4611 -0.0494 1.4611 1.2088
No log 0.1111 12 1.4770 -0.1091 1.4770 1.2153
No log 0.1296 14 1.3101 0.0847 1.3101 1.1446
No log 0.1481 16 1.3902 0.0253 1.3902 1.1791
No log 0.1667 18 1.4876 0.1288 1.4876 1.2197
No log 0.1852 20 1.3910 0.1507 1.3910 1.1794
No log 0.2037 22 1.2287 0.0788 1.2287 1.1085
No log 0.2222 24 1.1899 0.1043 1.1899 1.0908
No log 0.2407 26 1.1645 0.1443 1.1645 1.0791
No log 0.2593 28 1.1629 0.1344 1.1629 1.0784
No log 0.2778 30 1.1827 0.1344 1.1827 1.0875
No log 0.2963 32 1.2156 0.0977 1.2156 1.1025
No log 0.3148 34 1.2314 0.1232 1.2314 1.1097
No log 0.3333 36 1.4842 0.0537 1.4842 1.2183
No log 0.3519 38 1.6229 0.1032 1.6229 1.2739
No log 0.3704 40 1.3818 0.1530 1.3818 1.1755
No log 0.3889 42 1.2727 0.2446 1.2727 1.1281
No log 0.4074 44 1.1019 0.2168 1.1019 1.0497
No log 0.4259 46 1.0527 0.3066 1.0527 1.0260
No log 0.4444 48 0.9995 0.3695 0.9995 0.9997
No log 0.4630 50 0.9803 0.3596 0.9803 0.9901
No log 0.4815 52 0.9745 0.3346 0.9745 0.9872
No log 0.5 54 0.9892 0.3154 0.9892 0.9946
No log 0.5185 56 0.9996 0.4318 0.9996 0.9998
No log 0.5370 58 1.0600 0.2883 1.0600 1.0296
No log 0.5556 60 1.0838 0.2877 1.0838 1.0411
No log 0.5741 62 1.0717 0.3430 1.0717 1.0352
No log 0.5926 64 1.0834 0.2431 1.0834 1.0409
No log 0.6111 66 1.0516 0.2709 1.0516 1.0255
No log 0.6296 68 1.0386 0.2871 1.0386 1.0191
No log 0.6481 70 1.0151 0.3294 1.0151 1.0075
No log 0.6667 72 1.0660 0.2938 1.0660 1.0325
No log 0.6852 74 1.2035 0.4045 1.2035 1.0970
No log 0.7037 76 1.2189 0.4033 1.2189 1.1040
No log 0.7222 78 1.0613 0.4005 1.0613 1.0302
No log 0.7407 80 0.9840 0.3457 0.9840 0.9920
No log 0.7593 82 0.9527 0.4260 0.9527 0.9761
No log 0.7778 84 0.9423 0.4260 0.9423 0.9707
No log 0.7963 86 0.9502 0.3814 0.9502 0.9748
No log 0.8148 88 0.9627 0.3798 0.9627 0.9812
No log 0.8333 90 0.9732 0.3798 0.9732 0.9865
No log 0.8519 92 0.9781 0.3699 0.9781 0.9890
No log 0.8704 94 0.9746 0.3559 0.9746 0.9872
No log 0.8889 96 0.9998 0.3338 0.9998 0.9999
No log 0.9074 98 1.0160 0.2891 1.0160 1.0080
No log 0.9259 100 1.0355 0.2672 1.0355 1.0176
No log 0.9444 102 1.0981 0.2482 1.0981 1.0479
No log 0.9630 104 1.0951 0.2750 1.0951 1.0465
No log 0.9815 106 1.0438 0.3173 1.0438 1.0217
No log 1.0 108 1.0207 0.2796 1.0207 1.0103
No log 1.0185 110 0.9698 0.3554 0.9698 0.9848
No log 1.0370 112 0.9688 0.3351 0.9688 0.9843
No log 1.0556 114 0.9859 0.3725 0.9859 0.9929
No log 1.0741 116 0.9732 0.3303 0.9732 0.9865
No log 1.0926 118 1.0109 0.3427 1.0109 1.0054
No log 1.1111 120 1.0989 0.2203 1.0989 1.0483
No log 1.1296 122 1.0715 0.2721 1.0715 1.0351
No log 1.1481 124 0.9905 0.3276 0.9905 0.9952
No log 1.1667 126 0.9455 0.3650 0.9455 0.9724
No log 1.1852 128 0.9577 0.4736 0.9577 0.9786
No log 1.2037 130 1.0176 0.3518 1.0176 1.0088
No log 1.2222 132 0.9782 0.3725 0.9782 0.9890
No log 1.2407 134 0.9128 0.4527 0.9128 0.9554
No log 1.2593 136 0.8783 0.4197 0.8783 0.9372
No log 1.2778 138 0.8656 0.4197 0.8656 0.9304
No log 1.2963 140 0.9447 0.4631 0.9447 0.9720
No log 1.3148 142 1.0511 0.3807 1.0511 1.0252
No log 1.3333 144 0.9450 0.4565 0.9450 0.9721
No log 1.3519 146 0.8753 0.4916 0.8753 0.9356
No log 1.3704 148 0.8913 0.3965 0.8913 0.9441
No log 1.3889 150 0.9184 0.4789 0.9184 0.9583
No log 1.4074 152 0.9299 0.4454 0.9299 0.9643
No log 1.4259 154 0.9219 0.4628 0.9219 0.9601
No log 1.4444 156 0.9130 0.3814 0.9130 0.9555
No log 1.4630 158 0.9167 0.4578 0.9167 0.9574
No log 1.4815 160 0.9134 0.3382 0.9134 0.9557
No log 1.5 162 0.9653 0.4074 0.9653 0.9825
No log 1.5185 164 0.9814 0.3908 0.9814 0.9907
No log 1.5370 166 0.9420 0.4074 0.9420 0.9706
No log 1.5556 168 0.8930 0.4294 0.8930 0.9450
No log 1.5741 170 0.8894 0.4661 0.8894 0.9431
No log 1.5926 172 0.8838 0.4661 0.8838 0.9401
No log 1.6111 174 0.8736 0.4004 0.8736 0.9347
No log 1.6296 176 0.8568 0.4429 0.8568 0.9256
No log 1.6481 178 0.8741 0.3991 0.8741 0.9349
No log 1.6667 180 0.8583 0.3920 0.8583 0.9264
No log 1.6852 182 0.8547 0.3920 0.8547 0.9245
No log 1.7037 184 0.8589 0.3780 0.8589 0.9268
No log 1.7222 186 0.8637 0.4197 0.8637 0.9293
No log 1.7407 188 0.8782 0.4334 0.8782 0.9371
No log 1.7593 190 0.8765 0.3627 0.8765 0.9362
No log 1.7778 192 0.8782 0.3648 0.8782 0.9371
No log 1.7963 194 0.8901 0.3648 0.8901 0.9434
No log 1.8148 196 0.9284 0.3988 0.9284 0.9635
No log 1.8333 198 0.8939 0.4093 0.8939 0.9455
No log 1.8519 200 0.9117 0.3951 0.9117 0.9548
No log 1.8704 202 0.9536 0.3988 0.9536 0.9765
No log 1.8889 204 0.9097 0.4337 0.9097 0.9538
No log 1.9074 206 0.9028 0.4337 0.9028 0.9502
No log 1.9259 208 0.9348 0.4550 0.9348 0.9668
No log 1.9444 210 0.9483 0.5163 0.9483 0.9738
No log 1.9630 212 0.8748 0.4730 0.8748 0.9353
No log 1.9815 214 0.8462 0.5024 0.8462 0.9199
No log 2.0 216 0.8723 0.4563 0.8723 0.9340
No log 2.0185 218 1.0110 0.4153 1.0110 1.0055
No log 2.0370 220 1.0326 0.4214 1.0326 1.0161
No log 2.0556 222 0.8998 0.4476 0.8998 0.9486
No log 2.0741 224 0.8997 0.4144 0.8997 0.9485
No log 2.0926 226 0.8919 0.3819 0.8919 0.9444
No log 2.1111 228 0.8765 0.4563 0.8765 0.9362
No log 2.1296 230 0.8990 0.4507 0.8990 0.9481
No log 2.1481 232 0.8755 0.4841 0.8755 0.9357
No log 2.1667 234 0.8642 0.4334 0.8642 0.9296
No log 2.1852 236 0.8493 0.5216 0.8493 0.9216
No log 2.2037 238 0.8464 0.4962 0.8464 0.9200
No log 2.2222 240 0.8951 0.4848 0.8951 0.9461
No log 2.2407 242 0.9781 0.4059 0.9781 0.9890
No log 2.2593 244 1.0199 0.4056 1.0199 1.0099
No log 2.2778 246 0.9495 0.3348 0.9495 0.9744
No log 2.2963 248 0.9076 0.3992 0.9076 0.9527
No log 2.3148 250 0.9068 0.4094 0.9068 0.9523
No log 2.3333 252 0.9247 0.3992 0.9247 0.9616
No log 2.3519 254 0.9014 0.3956 0.9014 0.9494
No log 2.3704 256 0.9229 0.4136 0.9229 0.9607
No log 2.3889 258 1.0185 0.4516 1.0185 1.0092
No log 2.4074 260 0.9443 0.4991 0.9443 0.9717
No log 2.4259 262 0.8616 0.3983 0.8616 0.9282
No log 2.4444 264 0.8613 0.4757 0.8613 0.9280
No log 2.4630 266 0.8595 0.4158 0.8595 0.9271
No log 2.4815 268 0.9163 0.4763 0.9163 0.9572
No log 2.5 270 0.9032 0.4861 0.9032 0.9504
No log 2.5185 272 0.8801 0.4337 0.8801 0.9381
No log 2.5370 274 0.8654 0.3596 0.8654 0.9302
No log 2.5556 276 0.8752 0.4548 0.8752 0.9355
No log 2.5741 278 0.8582 0.3483 0.8582 0.9264
No log 2.5926 280 0.8549 0.4548 0.8549 0.9246
No log 2.6111 282 0.8602 0.4646 0.8602 0.9275
No log 2.6296 284 0.8500 0.3914 0.8500 0.9219
No log 2.6481 286 0.8549 0.4056 0.8549 0.9246
No log 2.6667 288 0.8686 0.4337 0.8686 0.9320
No log 2.6852 290 0.8592 0.4297 0.8592 0.9269
No log 2.7037 292 0.8533 0.4450 0.8533 0.9238
No log 2.7222 294 0.8750 0.3943 0.8750 0.9354
No log 2.7407 296 0.8459 0.4219 0.8459 0.9197
No log 2.7593 298 0.8281 0.5042 0.8281 0.9100
No log 2.7778 300 0.8731 0.3590 0.8731 0.9344
No log 2.7963 302 0.8381 0.3946 0.8381 0.9155
No log 2.8148 304 0.8299 0.4157 0.8299 0.9110
No log 2.8333 306 0.8495 0.4470 0.8495 0.9217
No log 2.8519 308 0.8499 0.4898 0.8499 0.9219
No log 2.8704 310 0.8255 0.4012 0.8255 0.9086
No log 2.8889 312 0.8458 0.3946 0.8458 0.9197
No log 2.9074 314 0.8425 0.3951 0.8425 0.9179
No log 2.9259 316 0.8074 0.3728 0.8074 0.8985
No log 2.9444 318 0.8000 0.3583 0.8000 0.8944
No log 2.9630 320 0.8083 0.4916 0.8083 0.8990
No log 2.9815 322 0.8199 0.4998 0.8199 0.9055
No log 3.0 324 0.7871 0.3787 0.7871 0.8872
No log 3.0185 326 0.7799 0.4075 0.7799 0.8831
No log 3.0370 328 0.7763 0.4075 0.7763 0.8811
No log 3.0556 330 0.7751 0.4280 0.7751 0.8804
No log 3.0741 332 0.7860 0.4611 0.7860 0.8866
No log 3.0926 334 0.7832 0.4656 0.7832 0.8850
No log 3.1111 336 0.8004 0.4075 0.8004 0.8946
No log 3.1296 338 0.8624 0.3660 0.8624 0.9287
No log 3.1481 340 0.8872 0.3866 0.8872 0.9419
No log 3.1667 342 0.8758 0.3168 0.8758 0.9358
No log 3.1852 344 0.8449 0.3437 0.8449 0.9192
No log 3.2037 346 0.8313 0.3719 0.8313 0.9118
No log 3.2222 348 0.8613 0.3946 0.8613 0.9281
No log 3.2407 350 0.8908 0.3946 0.8908 0.9438
No log 3.2593 352 0.8884 0.3356 0.8884 0.9426
No log 3.2778 354 0.8856 0.3020 0.8856 0.9411
No log 3.2963 356 0.8813 0.3229 0.8813 0.9388
No log 3.3148 358 0.8314 0.3596 0.8314 0.9118
No log 3.3333 360 0.7783 0.4466 0.7783 0.8822
No log 3.3519 362 0.7897 0.4198 0.7897 0.8886
No log 3.3704 364 0.7770 0.4587 0.7770 0.8815
No log 3.3889 366 0.7246 0.4942 0.7246 0.8512
No log 3.4074 368 0.7843 0.5567 0.7843 0.8856
No log 3.4259 370 0.7833 0.5368 0.7833 0.8850
No log 3.4444 372 0.7477 0.3933 0.7477 0.8647
No log 3.4630 374 0.7421 0.4853 0.7421 0.8614
No log 3.4815 376 0.7470 0.4853 0.7470 0.8643
No log 3.5 378 0.7697 0.3933 0.7697 0.8773
No log 3.5185 380 0.8245 0.3045 0.8245 0.9080
No log 3.5370 382 0.8643 0.3519 0.8643 0.9297
No log 3.5556 384 0.8671 0.4503 0.8671 0.9312
No log 3.5741 386 0.8494 0.3147 0.8494 0.9216
No log 3.5926 388 0.8145 0.4075 0.8145 0.9025
No log 3.6111 390 0.8096 0.4054 0.8096 0.8998
No log 3.6296 392 0.7907 0.3627 0.7907 0.8892
No log 3.6481 394 0.8544 0.4949 0.8544 0.9243
No log 3.6667 396 0.9670 0.4186 0.9670 0.9834
No log 3.6852 398 0.9581 0.4186 0.9581 0.9788
No log 3.7037 400 0.8559 0.3298 0.8559 0.9252
No log 3.7222 402 0.8586 0.4483 0.8586 0.9266
No log 3.7407 404 0.8696 0.4489 0.8696 0.9325
No log 3.7593 406 0.8190 0.3951 0.8190 0.9050
No log 3.7778 408 0.7880 0.3938 0.7880 0.8877
No log 3.7963 410 0.8012 0.5467 0.8012 0.8951
No log 3.8148 412 0.7806 0.5476 0.7806 0.8835
No log 3.8333 414 0.7562 0.4019 0.7562 0.8696
No log 3.8519 416 0.7573 0.4471 0.7573 0.8703
No log 3.8704 418 0.7520 0.5057 0.7520 0.8672
No log 3.8889 420 0.7460 0.5770 0.7460 0.8637
No log 3.9074 422 0.7538 0.5450 0.7538 0.8682
No log 3.9259 424 0.7739 0.3909 0.7739 0.8797
No log 3.9444 426 0.8882 0.4594 0.8882 0.9424
No log 3.9630 428 0.9200 0.4594 0.9200 0.9592
No log 3.9815 430 0.8186 0.4315 0.8186 0.9048
No log 4.0 432 0.6914 0.6059 0.6914 0.8315
No log 4.0185 434 0.7329 0.6079 0.7329 0.8561
No log 4.0370 436 0.7654 0.6079 0.7654 0.8749
No log 4.0556 438 0.7051 0.5951 0.7051 0.8397
No log 4.0741 440 0.7309 0.5503 0.7309 0.8549
No log 4.0926 442 0.8199 0.5578 0.8199 0.9055
No log 4.1111 444 0.8140 0.5578 0.8140 0.9022
No log 4.1296 446 0.7557 0.5089 0.7557 0.8693
No log 4.1481 448 0.7437 0.5125 0.7437 0.8624
No log 4.1667 450 0.7631 0.5044 0.7631 0.8735
No log 4.1852 452 0.7899 0.4792 0.7899 0.8888
No log 4.2037 454 0.8066 0.4874 0.8066 0.8981
No log 4.2222 456 0.8319 0.4197 0.8319 0.9121
No log 4.2407 458 0.9779 0.3815 0.9779 0.9889
No log 4.2593 460 1.0743 0.4040 1.0743 1.0365
No log 4.2778 462 0.9684 0.4356 0.9684 0.9841
No log 4.2963 464 0.8000 0.4197 0.8000 0.8944
No log 4.3148 466 0.7748 0.4977 0.7748 0.8802
No log 4.3333 468 0.7874 0.4715 0.7874 0.8874
No log 4.3519 470 0.8109 0.3627 0.8109 0.9005
No log 4.3704 472 0.8439 0.3771 0.8439 0.9187
No log 4.3889 474 0.8567 0.3660 0.8567 0.9256
No log 4.4074 476 0.8428 0.3483 0.8428 0.9180
No log 4.4259 478 0.8335 0.3483 0.8335 0.9130
No log 4.4444 480 0.8268 0.3483 0.8268 0.9093
No log 4.4630 482 0.8385 0.3771 0.8385 0.9157
No log 4.4815 484 0.8638 0.3806 0.8638 0.9294
No log 4.5 486 0.8727 0.3513 0.8727 0.9342
No log 4.5185 488 0.8904 0.3196 0.8904 0.9436
No log 4.5370 490 0.9123 0.2470 0.9123 0.9551
No log 4.5556 492 0.9144 0.2821 0.9144 0.9562
No log 4.5741 494 0.8611 0.3744 0.8611 0.9279
No log 4.5926 496 0.8303 0.4197 0.8303 0.9112
No log 4.6111 498 0.8320 0.4337 0.8320 0.9122
0.2735 4.6296 500 0.8089 0.4197 0.8089 0.8994
0.2735 4.6481 502 0.8035 0.3879 0.8035 0.8964
0.2735 4.6667 504 0.8206 0.4912 0.8206 0.9059
0.2735 4.6852 506 0.8262 0.3583 0.8262 0.9090
0.2735 4.7037 508 0.8333 0.3974 0.8333 0.9129
0.2735 4.7222 510 0.8620 0.4012 0.8620 0.9284
0.2735 4.7407 512 0.8742 0.4012 0.8742 0.9350
0.2735 4.7593 514 0.8610 0.3970 0.8610 0.9279

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
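
To check whether a local environment matches these versions, a small hedged snippet:

```python
from importlib.metadata import version

# Compare installed package versions against those listed above.
for pkg, expected in [
    ("transformers", "4.44.2"),
    ("torch", "2.4.0+cu118"),
    ("datasets", "2.21.0"),
    ("tokenizers", "0.19.1"),
]:
    installed = version(pkg)
    print(f"{pkg}: {installed} (card lists {expected})")
```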