ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k1_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4163
  • QWK (quadratic weighted kappa): 0.5267
  • MSE: 0.4163
  • RMSE: 0.6452
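The reported metrics can be reproduced from model predictions and gold labels with scikit-learn. The helper below is an illustrative sketch (not the card's actual evaluation code), assuming integer organization scores:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(labels, preds):
    """Compute the metrics reported above: QWK, MSE, and RMSE."""
    qwk = cohen_kappa_score(labels, preds, weights="quadratic")
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Toy example with hypothetical scores on a 0-4 scale
labels = [0, 1, 2, 3, 4, 2]
preds = [0, 1, 2, 2, 4, 3]
print(evaluate(labels, preds))
```

Note that Loss and MSE coincide in the table above, which suggests the model was trained as a regressor with an MSE objective.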

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
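These settings map directly onto a standard Hugging Face `TrainingArguments` configuration. The sketch below is an assumption-laden reconstruction (the `output_dir` name is a placeholder, and the Adam betas/epsilon are the Trainer defaults, which match the values listed):

```python
from transformers import TrainingArguments

# Placeholder output directory; not from the original training run.
args = TrainingArguments(
    output_dir="arabert-task7-organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Trainer's Adam defaults, matching the optimizer line above:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```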

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.2857 2 2.5847 -0.0545 2.5847 1.6077
No log 0.5714 4 1.1679 0.0993 1.1679 1.0807
No log 0.8571 6 0.7084 0.0893 0.7084 0.8417
No log 1.1429 8 0.8818 0.2651 0.8818 0.9391
No log 1.4286 10 0.8770 0.2552 0.8770 0.9365
No log 1.7143 12 0.7326 0.2871 0.7326 0.8559
No log 2.0 14 0.6137 0.3197 0.6137 0.7834
No log 2.2857 16 0.5657 0.4161 0.5657 0.7521
No log 2.5714 18 0.5944 0.3416 0.5944 0.7710
No log 2.8571 20 0.5174 0.4561 0.5174 0.7193
No log 3.1429 22 0.5026 0.4354 0.5026 0.7090
No log 3.4286 24 0.4933 0.4444 0.4933 0.7023
No log 3.7143 26 0.5254 0.4370 0.5254 0.7249
No log 4.0 28 0.5601 0.4330 0.5601 0.7484
No log 4.2857 30 0.4951 0.5466 0.4951 0.7037
No log 4.5714 32 0.4624 0.5373 0.4624 0.6800
No log 4.8571 34 0.4230 0.6295 0.4230 0.6504
No log 5.1429 36 0.4533 0.5868 0.4533 0.6733
No log 5.4286 38 0.3891 0.6458 0.3891 0.6238
No log 5.7143 40 0.4106 0.6184 0.4106 0.6408
No log 6.0 42 0.5541 0.6587 0.5541 0.7444
No log 6.2857 44 0.5450 0.6263 0.5450 0.7383
No log 6.5714 46 0.4511 0.5798 0.4511 0.6717
No log 6.8571 48 0.5396 0.6765 0.5396 0.7346
No log 7.1429 50 0.3981 0.7123 0.3981 0.6309
No log 7.4286 52 0.5540 0.5657 0.5540 0.7443
No log 7.7143 54 1.0452 0.2990 1.0452 1.0223
No log 8.0 56 1.0185 0.3290 1.0185 1.0092
No log 8.2857 58 0.5688 0.5017 0.5688 0.7542
No log 8.5714 60 0.4385 0.6313 0.4385 0.6622
No log 8.8571 62 0.5364 0.5722 0.5364 0.7324
No log 9.1429 64 0.4873 0.6670 0.4873 0.6981
No log 9.4286 66 0.4862 0.5339 0.4862 0.6972
No log 9.7143 68 0.6790 0.4921 0.6790 0.8240
No log 10.0 70 0.6722 0.5160 0.6722 0.8199
No log 10.2857 72 0.5552 0.5498 0.5552 0.7451
No log 10.5714 74 0.4779 0.6010 0.4779 0.6913
No log 10.8571 76 0.4970 0.5817 0.4970 0.7050
No log 11.1429 78 0.4601 0.6076 0.4601 0.6783
No log 11.4286 80 0.5461 0.5672 0.5461 0.7390
No log 11.7143 82 0.6890 0.4667 0.6890 0.8301
No log 12.0 84 0.5512 0.5315 0.5512 0.7424
No log 12.2857 86 0.4524 0.5633 0.4524 0.6726
No log 12.5714 88 0.5360 0.5481 0.5360 0.7321
No log 12.8571 90 0.6668 0.5243 0.6668 0.8166
No log 13.1429 92 0.4830 0.5570 0.4830 0.6950
No log 13.4286 94 0.4624 0.5339 0.4624 0.6800
No log 13.7143 96 0.5056 0.5470 0.5056 0.7110
No log 14.0 98 0.4221 0.5475 0.4221 0.6497
No log 14.2857 100 0.4074 0.6596 0.4074 0.6383
No log 14.5714 102 0.4089 0.6596 0.4089 0.6395
No log 14.8571 104 0.4013 0.6060 0.4013 0.6335
No log 15.1429 106 0.4072 0.5722 0.4072 0.6381
No log 15.4286 108 0.4069 0.5479 0.4069 0.6379
No log 15.7143 110 0.3896 0.6201 0.3896 0.6242
No log 16.0 112 0.4330 0.6388 0.4330 0.6580
No log 16.2857 114 0.4085 0.6490 0.4085 0.6392
No log 16.5714 116 0.4058 0.6278 0.4058 0.6370
No log 16.8571 118 0.4000 0.6490 0.4000 0.6325
No log 17.1429 120 0.3925 0.5539 0.3925 0.6265
No log 17.4286 122 0.3918 0.5782 0.3918 0.6259
No log 17.7143 124 0.3939 0.6503 0.3939 0.6276
No log 18.0 126 0.4008 0.5853 0.4008 0.6331
No log 18.2857 128 0.4045 0.6701 0.4045 0.6360
No log 18.5714 130 0.4060 0.6142 0.4060 0.6371
No log 18.8571 132 0.4484 0.6169 0.4484 0.6696
No log 19.1429 134 0.4426 0.6169 0.4426 0.6653
No log 19.4286 136 0.4139 0.6678 0.4139 0.6433
No log 19.7143 138 0.3923 0.6643 0.3923 0.6263
No log 20.0 140 0.3947 0.6747 0.3947 0.6283
No log 20.2857 142 0.4032 0.6854 0.4032 0.6349
No log 20.5714 144 0.3916 0.6542 0.3916 0.6258
No log 20.8571 146 0.3908 0.6627 0.3908 0.6252
No log 21.1429 148 0.4166 0.7052 0.4166 0.6455
No log 21.4286 150 0.4734 0.5567 0.4734 0.6880
No log 21.7143 152 0.5077 0.6088 0.5077 0.7126
No log 22.0 154 0.4498 0.6287 0.4498 0.6707
No log 22.2857 156 0.4239 0.6968 0.4239 0.6511
No log 22.5714 158 0.4240 0.6975 0.4240 0.6511
No log 22.8571 160 0.4161 0.6643 0.4161 0.6451
No log 23.1429 162 0.4205 0.5698 0.4205 0.6485
No log 23.4286 164 0.4184 0.6229 0.4184 0.6468
No log 23.7143 166 0.4235 0.5698 0.4235 0.6508
No log 24.0 168 0.4586 0.5124 0.4586 0.6772
No log 24.2857 170 0.4622 0.4881 0.4622 0.6799
No log 24.5714 172 0.4639 0.5527 0.4639 0.6811
No log 24.8571 174 0.4502 0.5649 0.4502 0.6709
No log 25.1429 176 0.4411 0.5974 0.4411 0.6641
No log 25.4286 178 0.4654 0.6305 0.4654 0.6822
No log 25.7143 180 0.4581 0.6296 0.4581 0.6769
No log 26.0 182 0.4324 0.5926 0.4324 0.6576
No log 26.2857 184 0.4312 0.5656 0.4312 0.6567
No log 26.5714 186 0.4436 0.5831 0.4436 0.6660
No log 26.8571 188 0.4371 0.5731 0.4371 0.6611
No log 27.1429 190 0.4254 0.5860 0.4254 0.6522
No log 27.4286 192 0.4413 0.6201 0.4413 0.6643
No log 27.7143 194 0.4523 0.6495 0.4523 0.6725
No log 28.0 196 0.4151 0.6983 0.4151 0.6443
No log 28.2857 198 0.3907 0.6828 0.3907 0.6251
No log 28.5714 200 0.4017 0.6183 0.4017 0.6338
No log 28.8571 202 0.3992 0.6183 0.3992 0.6319
No log 29.1429 204 0.3900 0.7095 0.3900 0.6245
No log 29.4286 206 0.3955 0.7073 0.3955 0.6289
No log 29.7143 208 0.3990 0.6479 0.3990 0.6317
No log 30.0 210 0.4296 0.6127 0.4296 0.6555
No log 30.2857 212 0.4053 0.6292 0.4053 0.6366
No log 30.5714 214 0.3996 0.7073 0.3996 0.6322
No log 30.8571 216 0.4009 0.7073 0.4009 0.6331
No log 31.1429 218 0.3906 0.7003 0.3906 0.6250
No log 31.4286 220 0.4075 0.6402 0.4075 0.6384
No log 31.7143 222 0.4055 0.6407 0.4055 0.6368
No log 32.0 224 0.3925 0.6750 0.3925 0.6265
No log 32.2857 226 0.4021 0.6720 0.4021 0.6341
No log 32.5714 228 0.4088 0.6890 0.4088 0.6394
No log 32.8571 230 0.4200 0.6371 0.4200 0.6481
No log 33.1429 232 0.4313 0.6046 0.4313 0.6568
No log 33.4286 234 0.4369 0.6145 0.4369 0.6610
No log 33.7143 236 0.4467 0.6687 0.4467 0.6684
No log 34.0 238 0.4332 0.6973 0.4332 0.6582
No log 34.2857 240 0.4293 0.5649 0.4293 0.6552
No log 34.5714 242 0.4686 0.5528 0.4686 0.6845
No log 34.8571 244 0.4966 0.5808 0.4966 0.7047
No log 35.1429 246 0.4907 0.5883 0.4907 0.7005
No log 35.4286 248 0.4640 0.5672 0.4640 0.6812
No log 35.7143 250 0.4102 0.6395 0.4102 0.6405
No log 36.0 252 0.3968 0.6645 0.3968 0.6299
No log 36.2857 254 0.3963 0.6464 0.3963 0.6296
No log 36.5714 256 0.4017 0.6282 0.4017 0.6338
No log 36.8571 258 0.3942 0.6154 0.3942 0.6279
No log 37.1429 260 0.3802 0.7227 0.3802 0.6166
No log 37.4286 262 0.3829 0.7085 0.3829 0.6188
No log 37.7143 264 0.3833 0.7238 0.3833 0.6191
No log 38.0 266 0.3820 0.7588 0.3820 0.6180
No log 38.2857 268 0.4355 0.5908 0.4355 0.6600
No log 38.5714 270 0.4503 0.5908 0.4503 0.6710
No log 38.8571 272 0.4037 0.6771 0.4037 0.6354
No log 39.1429 274 0.3847 0.6542 0.3847 0.6202
No log 39.4286 276 0.4026 0.6264 0.4026 0.6345
No log 39.7143 278 0.4247 0.6156 0.4247 0.6517
No log 40.0 280 0.4182 0.6264 0.4182 0.6467
No log 40.2857 282 0.4135 0.6374 0.4135 0.6430
No log 40.5714 284 0.4195 0.5305 0.4195 0.6477
No log 40.8571 286 0.4320 0.5065 0.4320 0.6573
No log 41.1429 288 0.4285 0.5065 0.4285 0.6546
No log 41.4286 290 0.4202 0.5539 0.4202 0.6482
No log 41.7143 292 0.4184 0.5846 0.4184 0.6469
No log 42.0 294 0.4239 0.5580 0.4239 0.6511
No log 42.2857 296 0.4373 0.5266 0.4373 0.6613
No log 42.5714 298 0.4366 0.5195 0.4366 0.6608
No log 42.8571 300 0.4208 0.6184 0.4208 0.6487
No log 43.1429 302 0.4088 0.6634 0.4088 0.6394
No log 43.4286 304 0.4047 0.6344 0.4047 0.6362
No log 43.7143 306 0.4030 0.6555 0.4030 0.6348
No log 44.0 308 0.4029 0.7266 0.4029 0.6348
No log 44.2857 310 0.4053 0.7154 0.4053 0.6366
No log 44.5714 312 0.3944 0.6724 0.3944 0.6280
No log 44.8571 314 0.3886 0.6555 0.3886 0.6234
No log 45.1429 316 0.4266 0.5569 0.4266 0.6531
No log 45.4286 318 0.4584 0.5983 0.4584 0.6771
No log 45.7143 320 0.4464 0.5779 0.4464 0.6681
No log 46.0 322 0.4085 0.6282 0.4085 0.6392
No log 46.2857 324 0.3959 0.6648 0.3959 0.6292
No log 46.5714 326 0.3991 0.6648 0.3991 0.6317
No log 46.8571 328 0.4040 0.5930 0.4040 0.6356
No log 47.1429 330 0.4067 0.5915 0.4067 0.6377
No log 47.4286 332 0.4121 0.6046 0.4121 0.6420
No log 47.7143 334 0.4187 0.6530 0.4187 0.6471
No log 48.0 336 0.4140 0.6530 0.4140 0.6434
No log 48.2857 338 0.4044 0.6460 0.4044 0.6359
No log 48.5714 340 0.4065 0.5904 0.4065 0.6376
No log 48.8571 342 0.4200 0.5495 0.4200 0.6481
No log 49.1429 344 0.4343 0.5811 0.4343 0.6590
No log 49.4286 346 0.4375 0.5811 0.4375 0.6614
No log 49.7143 348 0.4254 0.5495 0.4254 0.6522
No log 50.0 350 0.4047 0.5714 0.4047 0.6361
No log 50.2857 352 0.4078 0.6820 0.4078 0.6386
No log 50.5714 354 0.4147 0.6506 0.4147 0.6440
No log 50.8571 356 0.4058 0.6712 0.4058 0.6370
No log 51.1429 358 0.3941 0.6942 0.3941 0.6278
No log 51.4286 360 0.3999 0.5985 0.3999 0.6324
No log 51.7143 362 0.4155 0.5841 0.4155 0.6446
No log 52.0 364 0.4258 0.5970 0.4258 0.6526
No log 52.2857 366 0.4243 0.6434 0.4243 0.6514
No log 52.5714 368 0.4150 0.6333 0.4150 0.6442
No log 52.8571 370 0.4219 0.6257 0.4219 0.6495
No log 53.1429 372 0.4235 0.6257 0.4235 0.6508
No log 53.4286 374 0.4163 0.6405 0.4163 0.6452
No log 53.7143 376 0.4130 0.6298 0.4130 0.6426
No log 54.0 378 0.4086 0.6452 0.4086 0.6392
No log 54.2857 380 0.4088 0.5714 0.4088 0.6394
No log 54.5714 382 0.4092 0.5227 0.4092 0.6397
No log 54.8571 384 0.4098 0.5440 0.4098 0.6401
No log 55.1429 386 0.4150 0.6096 0.4150 0.6442
No log 55.4286 388 0.4142 0.6096 0.4142 0.6436
No log 55.7143 390 0.4094 0.6326 0.4094 0.6398
No log 56.0 392 0.4043 0.6919 0.4043 0.6359
No log 56.2857 394 0.4043 0.6395 0.4043 0.6358
No log 56.5714 396 0.4169 0.6143 0.4169 0.6457
No log 56.8571 398 0.4338 0.5498 0.4338 0.6586
No log 57.1429 400 0.4236 0.5692 0.4236 0.6508
No log 57.4286 402 0.4065 0.6034 0.4065 0.6375
No log 57.7143 404 0.4081 0.5956 0.4081 0.6388
No log 58.0 406 0.4104 0.5956 0.4104 0.6406
No log 58.2857 408 0.3976 0.6860 0.3976 0.6305
No log 58.5714 410 0.3950 0.6672 0.3950 0.6285
No log 58.8571 412 0.3986 0.6389 0.3986 0.6314
No log 59.1429 414 0.4163 0.5841 0.4163 0.6452
No log 59.4286 416 0.4267 0.5569 0.4267 0.6532
No log 59.7143 418 0.4411 0.5569 0.4411 0.6641
No log 60.0 420 0.4347 0.5718 0.4347 0.6593
No log 60.2857 422 0.4149 0.5702 0.4149 0.6442
No log 60.5714 424 0.4089 0.5152 0.4089 0.6395
No log 60.8571 426 0.4099 0.5584 0.4099 0.6403
No log 61.1429 428 0.4133 0.5800 0.4133 0.6429
No log 61.4286 430 0.4166 0.5361 0.4166 0.6454
No log 61.7143 432 0.4188 0.5600 0.4188 0.6472
No log 62.0 434 0.4237 0.5152 0.4237 0.6509
No log 62.2857 436 0.4338 0.5098 0.4338 0.6586
No log 62.5714 438 0.4377 0.5495 0.4377 0.6616
No log 62.8571 440 0.4284 0.5028 0.4284 0.6545
No log 63.1429 442 0.4143 0.5152 0.4143 0.6437
No log 63.4286 444 0.4088 0.5600 0.4088 0.6393
No log 63.7143 446 0.4074 0.6076 0.4074 0.6383
No log 64.0 448 0.4074 0.6076 0.4074 0.6383
No log 64.2857 450 0.4071 0.6389 0.4071 0.6381
No log 64.5714 452 0.4040 0.6076 0.4040 0.6356
No log 64.8571 454 0.4029 0.5379 0.4029 0.6347
No log 65.1429 456 0.4061 0.5397 0.4061 0.6372
No log 65.4286 458 0.4078 0.6156 0.4078 0.6386
No log 65.7143 460 0.4095 0.6156 0.4095 0.6399
No log 66.0 462 0.4103 0.6156 0.4103 0.6405
No log 66.2857 464 0.4119 0.5941 0.4119 0.6418
No log 66.5714 466 0.4150 0.5522 0.4150 0.6442
No log 66.8571 468 0.4204 0.4703 0.4204 0.6484
No log 67.1429 470 0.4262 0.4703 0.4262 0.6529
No log 67.4286 472 0.4298 0.4774 0.4298 0.6556
No log 67.7143 474 0.4327 0.4774 0.4327 0.6578
No log 68.0 476 0.4353 0.5267 0.4353 0.6598
No log 68.2857 478 0.4361 0.5267 0.4361 0.6603
No log 68.5714 480 0.4356 0.5267 0.4356 0.6600
No log 68.8571 482 0.4320 0.5267 0.4320 0.6573
No log 69.1429 484 0.4276 0.5267 0.4276 0.6539
No log 69.4286 486 0.4218 0.5267 0.4218 0.6495
No log 69.7143 488 0.4213 0.5044 0.4213 0.6491
No log 70.0 490 0.4227 0.4970 0.4227 0.6502
No log 70.2857 492 0.4278 0.5227 0.4278 0.6541
No log 70.5714 494 0.4262 0.5475 0.4262 0.6528
No log 70.8571 496 0.4192 0.5227 0.4192 0.6475
No log 71.1429 498 0.4127 0.4970 0.4127 0.6424
0.1864 71.4286 500 0.4169 0.5397 0.4169 0.6457
0.1864 71.7143 502 0.4216 0.5397 0.4216 0.6493
0.1864 72.0 504 0.4212 0.5397 0.4212 0.6490
0.1864 72.2857 506 0.4146 0.5208 0.4146 0.6439
0.1864 72.5714 508 0.4128 0.5024 0.4128 0.6425
0.1864 72.8571 510 0.4163 0.5267 0.4163 0.6452

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
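The checkpoint should load with the standard transformers API; the sketch below uses the repo id from this card, and the single-logit regression head is an assumption based on the MSE training objective:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

REPO = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k1_task7_organization"

def score_essay(text: str) -> float:
    """Download the fine-tuned checkpoint and return its raw score for `text`."""
    tokenizer = AutoTokenizer.from_pretrained(REPO)
    model = AutoModelForSequenceClassification.from_pretrained(REPO)
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()
```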