ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k20_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.6785
  • Qwk: 0.2883
  • Mse: 0.6785
  • Rmse: 0.8237
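For reference, metrics of this kind can be reproduced with scikit-learn. The snippet below is a sketch using hypothetical gold and predicted organization scores, and it assumes Qwk denotes quadratic weighted kappa; the card does not document the exact metric implementations used during training.

```python
# Sketch of Qwk/Mse/Rmse-style metrics with scikit-learn.
# Assumes integer organization scores and quadratic-weighted Cohen's kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 1, 0])  # hypothetical gold scores
y_pred = np.array([0, 1, 1, 2, 2, 0])  # hypothetical model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
print(round(qwk, 4), round(mse, 4), round(rmse, 4))  # → 0.75 0.3333 0.5774
```

Note that Rmse is simply the square root of Mse, which is why the Loss and Mse columns coincide when the model is trained with a mean-squared-error objective.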

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
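The listed hyperparameters map directly onto Hugging Face `TrainingArguments` (the card lists Transformers 4.44.2). The sketch below shows that mapping only; the dataset, model head, and metric wiring are omitted because the card does not document them, and the output directory name is hypothetical.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# Only the documented values are set; everything else is left at defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```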

Training results

In the table below, "No log" means the training loss had not yet been logged at that point (logging begins at step 500 in this run); validation metrics are reported every 2 steps.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0185 2 2.6272 -0.0729 2.6272 1.6209
No log 0.0370 4 1.2463 0.0983 1.2463 1.1164
No log 0.0556 6 0.7915 0.0441 0.7915 0.8897
No log 0.0741 8 0.7705 0.1368 0.7705 0.8778
No log 0.0926 10 0.6891 0.2955 0.6891 0.8301
No log 0.1111 12 0.6805 0.3141 0.6805 0.8249
No log 0.1296 14 0.7554 0.2223 0.7554 0.8691
No log 0.1481 16 0.8799 0.2259 0.8799 0.9380
No log 0.1667 18 0.7391 0.3590 0.7391 0.8597
No log 0.1852 20 0.6912 0.3348 0.6912 0.8314
No log 0.2037 22 0.8114 0.2772 0.8114 0.9008
No log 0.2222 24 0.7259 0.2813 0.7259 0.8520
No log 0.2407 26 0.6871 0.3050 0.6871 0.8289
No log 0.2593 28 1.3581 0.2590 1.3581 1.1654
No log 0.2778 30 1.7250 0.1895 1.7250 1.3134
No log 0.2963 32 1.2685 0.1895 1.2685 1.1263
No log 0.3148 34 0.7739 0.3606 0.7739 0.8797
No log 0.3333 36 0.6666 0.1983 0.6666 0.8165
No log 0.3519 38 0.6695 0.2046 0.6695 0.8182
No log 0.3704 40 0.7431 0.3564 0.7431 0.8620
No log 0.3889 42 0.9422 0.3579 0.9422 0.9707
No log 0.4074 44 1.0279 0.3516 1.0279 1.0138
No log 0.4259 46 0.9828 0.3516 0.9828 0.9914
No log 0.4444 48 0.8631 0.3777 0.8631 0.9290
No log 0.4630 50 0.7154 0.3746 0.7154 0.8458
No log 0.4815 52 0.6521 0.4219 0.6521 0.8075
No log 0.5 54 0.6224 0.3092 0.6224 0.7889
No log 0.5185 56 0.6890 0.3819 0.6890 0.8301
No log 0.5370 58 1.0277 0.3166 1.0277 1.0138
No log 0.5556 60 1.2795 0.2772 1.2795 1.1312
No log 0.5741 62 1.2126 0.2909 1.2126 1.1012
No log 0.5926 64 0.8438 0.4255 0.8438 0.9186
No log 0.6111 66 0.5983 0.4463 0.5983 0.7735
No log 0.6296 68 0.6445 0.4674 0.6445 0.8028
No log 0.6481 70 0.6404 0.4737 0.6404 0.8003
No log 0.6667 72 0.6235 0.4419 0.6235 0.7897
No log 0.6852 74 0.9033 0.4096 0.9033 0.9504
No log 0.7037 76 1.0313 0.2910 1.0313 1.0155
No log 0.7222 78 0.8396 0.4568 0.8396 0.9163
No log 0.7407 80 0.6278 0.3945 0.6278 0.7923
No log 0.7593 82 0.6544 0.4345 0.6544 0.8090
No log 0.7778 84 0.6348 0.4322 0.6348 0.7968
No log 0.7963 86 0.6784 0.2995 0.6784 0.8236
No log 0.8148 88 0.9486 0.4092 0.9486 0.9740
No log 0.8333 90 1.1878 0.2206 1.1878 1.0899
No log 0.8519 92 1.1619 0.2191 1.1619 1.0779
No log 0.8704 94 0.9051 0.4347 0.9051 0.9514
No log 0.8889 96 0.7585 0.3494 0.7585 0.8709
No log 0.9074 98 0.6845 0.3196 0.6845 0.8273
No log 0.9259 100 0.7034 0.2467 0.7034 0.8387
No log 0.9444 102 0.7146 0.3302 0.7146 0.8453
No log 0.9630 104 0.8031 0.3918 0.8031 0.8962
No log 0.9815 106 0.9954 0.3849 0.9954 0.9977
No log 1.0 108 1.0793 0.3269 1.0793 1.0389
No log 1.0185 110 1.0460 0.3697 1.0460 1.0227
No log 1.0370 112 0.8320 0.3560 0.8320 0.9121
No log 1.0556 114 0.7203 0.3069 0.7203 0.8487
No log 1.0741 116 0.6927 0.3060 0.6927 0.8323
No log 1.0926 118 0.7416 0.2518 0.7416 0.8612
No log 1.1111 120 0.8737 0.3892 0.8737 0.9347
No log 1.1296 122 1.1036 0.3088 1.1036 1.0505
No log 1.1481 124 1.0979 0.3404 1.0979 1.0478
No log 1.1667 126 0.9128 0.3709 0.9128 0.9554
No log 1.1852 128 0.8296 0.2843 0.8296 0.9108
No log 1.2037 130 0.7985 0.2904 0.7985 0.8936
No log 1.2222 132 0.8440 0.4080 0.8440 0.9187
No log 1.2407 134 0.9444 0.3676 0.9444 0.9718
No log 1.2593 136 1.0034 0.3337 1.0034 1.0017
No log 1.2778 138 0.8877 0.4092 0.8877 0.9422
No log 1.2963 140 0.7385 0.3637 0.7385 0.8593
No log 1.3148 142 0.6943 0.2498 0.6943 0.8333
No log 1.3333 144 0.6994 0.2471 0.6994 0.8363
No log 1.3519 146 0.7091 0.2784 0.7091 0.8421
No log 1.3704 148 0.7587 0.3234 0.7587 0.8710
No log 1.3889 150 0.8943 0.3538 0.8943 0.9457
No log 1.4074 152 0.9652 0.3029 0.9652 0.9824
No log 1.4259 154 0.8352 0.4404 0.8352 0.9139
No log 1.4444 156 0.6769 0.2558 0.6769 0.8228
No log 1.4630 158 0.6655 0.3141 0.6655 0.8158
No log 1.4815 160 0.6565 0.3426 0.6565 0.8102
No log 1.5 162 0.7265 0.3817 0.7265 0.8523
No log 1.5185 164 0.8765 0.3499 0.8765 0.9362
No log 1.5370 166 1.0127 0.2898 1.0127 1.0064
No log 1.5556 168 0.9417 0.3052 0.9417 0.9704
No log 1.5741 170 0.7469 0.3562 0.7469 0.8642
No log 1.5926 172 0.6349 0.3763 0.6349 0.7968
No log 1.6111 174 0.6206 0.2877 0.6206 0.7878
No log 1.6296 176 0.6285 0.3399 0.6285 0.7928
No log 1.6481 178 0.6664 0.3099 0.6664 0.8163
No log 1.6667 180 0.7391 0.3746 0.7391 0.8597
No log 1.6852 182 0.7609 0.3746 0.7609 0.8723
No log 1.7037 184 0.7273 0.3372 0.7273 0.8528
No log 1.7222 186 0.6796 0.2227 0.6796 0.8244
No log 1.7407 188 0.7217 0.2383 0.7217 0.8495
No log 1.7593 190 0.8368 0.3456 0.8368 0.9148
No log 1.7778 192 0.8586 0.3688 0.8586 0.9266
No log 1.7963 194 0.7750 0.2871 0.7750 0.8803
No log 1.8148 196 0.7648 0.2871 0.7648 0.8746
No log 1.8333 198 0.7913 0.3095 0.7913 0.8896
No log 1.8519 200 0.7951 0.2926 0.7951 0.8917
No log 1.8704 202 0.8246 0.2471 0.8246 0.9081
No log 1.8889 204 0.8560 0.2364 0.8560 0.9252
No log 1.9074 206 0.9938 0.3052 0.9938 0.9969
No log 1.9259 208 1.1704 0.2643 1.1704 1.0818
No log 1.9444 210 1.1412 0.2501 1.1412 1.0683
No log 1.9630 212 0.9513 0.3601 0.9513 0.9754
No log 1.9815 214 0.8096 0.2904 0.8096 0.8998
No log 2.0 216 0.8180 0.2904 0.8180 0.9044
No log 2.0185 218 0.9502 0.3439 0.9502 0.9748
No log 2.0370 220 0.9671 0.3381 0.9671 0.9834
No log 2.0556 222 0.9231 0.3439 0.9231 0.9608
No log 2.0741 224 0.8631 0.3499 0.8631 0.9290
No log 2.0926 226 0.7739 0.4239 0.7739 0.8797
No log 2.1111 228 0.7480 0.2749 0.7480 0.8648
No log 2.1296 230 0.7852 0.4114 0.7852 0.8861
No log 2.1481 232 0.8783 0.3560 0.8783 0.9372
No log 2.1667 234 0.8716 0.3678 0.8716 0.9336
No log 2.1852 236 0.8379 0.4366 0.8379 0.9154
No log 2.2037 238 0.7586 0.3700 0.7586 0.8710
No log 2.2222 240 0.7216 0.3340 0.7216 0.8495
No log 2.2407 242 0.7426 0.3569 0.7426 0.8617
No log 2.2593 244 0.8270 0.4153 0.8270 0.9094
No log 2.2778 246 0.9176 0.3381 0.9176 0.9579
No log 2.2963 248 0.8500 0.3799 0.8500 0.9219
No log 2.3148 250 0.6978 0.3544 0.6978 0.8354
No log 2.3333 252 0.6435 0.3144 0.6435 0.8022
No log 2.3519 254 0.6297 0.3625 0.6297 0.7935
No log 2.3704 256 0.6371 0.3840 0.6371 0.7982
No log 2.3889 258 0.6757 0.3942 0.6757 0.8220
No log 2.4074 260 0.6659 0.3942 0.6659 0.8160
No log 2.4259 262 0.6379 0.3976 0.6379 0.7987
No log 2.4444 264 0.6425 0.3197 0.6425 0.8016
No log 2.4630 266 0.6550 0.2537 0.6550 0.8093
No log 2.4815 268 0.6578 0.2787 0.6578 0.8110
No log 2.5 270 0.7050 0.3195 0.7050 0.8396
No log 2.5185 272 0.7764 0.4272 0.7764 0.8811
No log 2.5370 274 0.7354 0.4745 0.7354 0.8576
No log 2.5556 276 0.6619 0.3656 0.6619 0.8136
No log 2.5741 278 0.6357 0.4207 0.6357 0.7973
No log 2.5926 280 0.6774 0.4404 0.6774 0.8231
No log 2.6111 282 0.7805 0.4721 0.7805 0.8835
No log 2.6296 284 0.8090 0.4705 0.8090 0.8995
No log 2.6481 286 0.6898 0.4144 0.6898 0.8305
No log 2.6667 288 0.5588 0.4243 0.5588 0.7475
No log 2.6852 290 0.5194 0.4147 0.5194 0.7207
No log 2.7037 292 0.5186 0.4722 0.5186 0.7201
No log 2.7222 294 0.5234 0.4722 0.5234 0.7235
No log 2.7407 296 0.5440 0.4819 0.5440 0.7376
No log 2.7593 298 0.5435 0.4642 0.5435 0.7373
No log 2.7778 300 0.5318 0.3702 0.5318 0.7293
No log 2.7963 302 0.5482 0.4384 0.5482 0.7404
No log 2.8148 304 0.5548 0.3947 0.5548 0.7448
No log 2.8333 306 0.5691 0.3494 0.5691 0.7544
No log 2.8519 308 0.6289 0.4035 0.6289 0.7931
No log 2.8704 310 0.6465 0.4035 0.6465 0.8041
No log 2.8889 312 0.6420 0.3755 0.6420 0.8013
No log 2.9074 314 0.6189 0.3092 0.6189 0.7867
No log 2.9259 316 0.6213 0.3092 0.6213 0.7883
No log 2.9444 318 0.6413 0.3092 0.6413 0.8008
No log 2.9630 320 0.6483 0.3092 0.6483 0.8052
No log 2.9815 322 0.6706 0.3387 0.6706 0.8189
No log 3.0 324 0.7129 0.2883 0.7129 0.8444
No log 3.0185 326 0.7934 0.4224 0.7934 0.8907
No log 3.0370 328 0.8775 0.3473 0.8775 0.9368
No log 3.0556 330 0.8439 0.4624 0.8439 0.9187
No log 3.0741 332 0.7766 0.3099 0.7766 0.8813
No log 3.0926 334 0.6686 0.2981 0.6686 0.8177
No log 3.1111 336 0.6458 0.3123 0.6458 0.8036
No log 3.1296 338 0.6396 0.3166 0.6396 0.7998
No log 3.1481 340 0.6458 0.3092 0.6458 0.8036
No log 3.1667 342 0.6672 0.3312 0.6672 0.8168
No log 3.1852 344 0.7297 0.3099 0.7297 0.8542
No log 3.2037 346 0.7574 0.4197 0.7574 0.8703
No log 3.2222 348 0.6859 0.3261 0.6859 0.8282
No log 3.2407 350 0.6214 0.3312 0.6214 0.7883
No log 3.2593 352 0.5847 0.3166 0.5847 0.7646
No log 3.2778 354 0.5664 0.3354 0.5664 0.7526
No log 3.2963 356 0.5628 0.3354 0.5628 0.7502
No log 3.3148 358 0.5628 0.3354 0.5628 0.7502
No log 3.3333 360 0.5712 0.3006 0.5712 0.7558
No log 3.3519 362 0.5911 0.3323 0.5911 0.7689
No log 3.3704 364 0.5943 0.3243 0.5943 0.7709
No log 3.3889 366 0.5777 0.3039 0.5777 0.7600
No log 3.4074 368 0.5638 0.3354 0.5638 0.7509
No log 3.4259 370 0.5524 0.3889 0.5524 0.7432
No log 3.4444 372 0.5468 0.3369 0.5468 0.7395
No log 3.4630 374 0.5588 0.4845 0.5588 0.7476
No log 3.4815 376 0.5484 0.4060 0.5484 0.7406
No log 3.5 378 0.5375 0.3274 0.5375 0.7332
No log 3.5185 380 0.5438 0.2987 0.5438 0.7375
No log 3.5370 382 0.5484 0.3273 0.5484 0.7405
No log 3.5556 384 0.5405 0.2987 0.5405 0.7352
No log 3.5741 386 0.5429 0.2996 0.5429 0.7368
No log 3.5926 388 0.5399 0.2641 0.5399 0.7348
No log 3.6111 390 0.5373 0.2641 0.5373 0.7330
No log 3.6296 392 0.5325 0.2996 0.5325 0.7297
No log 3.6481 394 0.5277 0.3953 0.5277 0.7264
No log 3.6667 396 0.5433 0.3416 0.5433 0.7371
No log 3.6852 398 0.5704 0.3341 0.5704 0.7553
No log 3.7037 400 0.5767 0.3341 0.5767 0.7594
No log 3.7222 402 0.5726 0.3341 0.5726 0.7567
No log 3.7407 404 0.5866 0.3341 0.5866 0.7659
No log 3.7593 406 0.5951 0.3312 0.5951 0.7714
No log 3.7778 408 0.6172 0.3312 0.6172 0.7856
No log 3.7963 410 0.6595 0.3843 0.6595 0.8121
No log 3.8148 412 0.6781 0.3843 0.6781 0.8235
No log 3.8333 414 0.6525 0.4190 0.6525 0.8078
No log 3.8519 416 0.6357 0.4020 0.6357 0.7973
No log 3.8704 418 0.6030 0.3622 0.6030 0.7765
No log 3.8889 420 0.5870 0.3341 0.5870 0.7662
No log 3.9074 422 0.5679 0.3675 0.5679 0.7536
No log 3.9259 424 0.5573 0.3995 0.5573 0.7465
No log 3.9444 426 0.5627 0.4194 0.5627 0.7501
No log 3.9630 428 0.5972 0.4292 0.5972 0.7728
No log 3.9815 430 0.6792 0.4815 0.6792 0.8241
No log 4.0 432 0.7062 0.4644 0.7062 0.8404
No log 4.0185 434 0.6888 0.4644 0.6888 0.8299
No log 4.0370 436 0.6759 0.4409 0.6759 0.8221
No log 4.0556 438 0.6074 0.4044 0.6074 0.7793
No log 4.0741 440 0.5911 0.4027 0.5911 0.7689
No log 4.0926 442 0.5959 0.3782 0.5959 0.7719
No log 4.1111 444 0.5990 0.3494 0.5990 0.7740
No log 4.1296 446 0.6249 0.3465 0.6249 0.7905
No log 4.1481 448 0.6833 0.3789 0.6833 0.8266
No log 4.1667 450 0.6998 0.3789 0.6998 0.8365
No log 4.1852 452 0.6573 0.3465 0.6573 0.8108
No log 4.2037 454 0.6596 0.3465 0.6596 0.8122
No log 4.2222 456 0.6712 0.3387 0.6712 0.8193
No log 4.2407 458 0.6840 0.4052 0.6840 0.8270
No log 4.2593 460 0.6763 0.3444 0.6763 0.8224
No log 4.2778 462 0.6450 0.3387 0.6450 0.8031
No log 4.2963 464 0.6399 0.3387 0.6399 0.8000
No log 4.3148 466 0.6431 0.3387 0.6431 0.8019
No log 4.3333 468 0.6471 0.3167 0.6471 0.8044
No log 4.3519 470 0.6554 0.3789 0.6554 0.8096
No log 4.3704 472 0.6469 0.3471 0.6469 0.8043
No log 4.3889 474 0.6061 0.3976 0.6061 0.7785
No log 4.4074 476 0.5654 0.3754 0.5654 0.7519
No log 4.4259 478 0.5624 0.3258 0.5624 0.7499
No log 4.4444 480 0.5691 0.2923 0.5691 0.7544
No log 4.4630 482 0.5774 0.2963 0.5774 0.7599
No log 4.4815 484 0.5919 0.3575 0.5919 0.7693
No log 4.5 486 0.6617 0.3673 0.6617 0.8134
No log 4.5185 488 0.7257 0.3444 0.7257 0.8519
No log 4.5370 490 0.7068 0.3444 0.7068 0.8407
No log 4.5556 492 0.6779 0.3167 0.6779 0.8233
No log 4.5741 494 0.6681 0.3594 0.6681 0.8174
No log 4.5926 496 0.6884 0.3444 0.6884 0.8297
No log 4.6111 498 0.7111 0.3444 0.7111 0.8432
0.2421 4.6296 500 0.7278 0.3444 0.7278 0.8531
0.2421 4.6481 502 0.6692 0.3312 0.6692 0.8181
0.2421 4.6667 504 0.6313 0.3166 0.6313 0.7946
0.2421 4.6852 506 0.6121 0.3445 0.6121 0.7824
0.2421 4.7037 508 0.6076 0.3445 0.6076 0.7795
0.2421 4.7222 510 0.6405 0.3572 0.6405 0.8003
0.2421 4.7407 512 0.7083 0.4052 0.7083 0.8416
0.2421 4.7593 514 0.7393 0.4554 0.7393 0.8598
0.2421 4.7778 516 0.7042 0.4642 0.7042 0.8392
0.2421 4.7963 518 0.6464 0.3594 0.6464 0.8040
0.2421 4.8148 520 0.6130 0.3649 0.6130 0.7830
0.2421 4.8333 522 0.6089 0.3599 0.6089 0.7803
0.2421 4.8519 524 0.6280 0.3183 0.6280 0.7924
0.2421 4.8704 526 0.6584 0.3425 0.6584 0.8114
0.2421 4.8889 528 0.6523 0.3155 0.6523 0.8077
0.2421 4.9074 530 0.6549 0.2950 0.6549 0.8092
0.2421 4.9259 532 0.6785 0.2883 0.6785 0.8237

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 135M params
  • Tensor type: F32 (Safetensors)