Arabic_FineTuningAraBERT_AugV4-trial2_k3_task1_organization_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.6291
  • Qwk: 0.7355
  • Mse: 0.6291
  • Rmse: 0.7932
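
The reported Loss and Mse are identical at every logging step, which suggests a mean-squared-error training objective with Qwk (quadratic weighted kappa) computed on discretized predictions; the card itself does not state this. The snippet below is a minimal sketch, assuming scikit-learn and placeholder label/prediction arrays, of how the three evaluation metrics relate:

```python
# Minimal metric sketch (not the author's evaluation code).
# y_true / y_pred are placeholders for the held-out labels and the
# model's (rounded) predictions; the real data is not part of this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3])
y_pred = np.array([0, 1, 2, 3, 3])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = float(np.sqrt(mse))                                    # Rmse
print(qwk, mse, rmse)
```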

Model description

More information needed

Intended uses & limitations

More information needed
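
Although the intended uses are not documented, the checkpoint can be loaded with the standard Transformers API. The sketch below is illustrative only: it assumes the Trainer saved a sequence-classification head whose single logit is the predicted organization score (an inference from the MSE/Qwk metrics above, not something stated in the card).

```python
# Hypothetical usage sketch; the task head and score scale are assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k3_task1_organization_fold0"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits  # interpreted here as a single regressed score
print(score)
```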

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; an illustrative reconstruction as Transformers TrainingArguments follows the list:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
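
The block below is a hedged reconstruction of this configuration using the Transformers Trainer. The dataset, preprocessing, and metric functions are placeholders not documented in this card, and num_labels=1 is an assumption based on the MSE-style loss reported above.

```python
# Illustrative reconstruction of the listed hyperparameters; not the
# author's original training script.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1  # assumption: single regression output (MSE loss)
)

args = TrainingArguments(
    output_dir="outputs",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    adam_beta1=0.9,     # matches the listed Adam betas
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...).train()
```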

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0294 2 4.7708 -0.0064 4.7708 2.1842
No log 0.0588 4 2.7155 0.0354 2.7155 1.6479
No log 0.0882 6 1.6663 0.1663 1.6663 1.2909
No log 0.1176 8 1.4051 0.0742 1.4051 1.1854
No log 0.1471 10 1.3411 0.5082 1.3411 1.1581
No log 0.1765 12 1.2643 0.4388 1.2643 1.1244
No log 0.2059 14 1.2256 0.2775 1.2256 1.1071
No log 0.2353 16 1.2004 0.2364 1.2004 1.0956
No log 0.2647 18 1.2005 0.3860 1.2005 1.0957
No log 0.2941 20 1.9054 0.3636 1.9054 1.3804
No log 0.3235 22 1.8430 0.2857 1.8430 1.3576
No log 0.3529 24 1.6448 0.2145 1.6448 1.2825
No log 0.3824 26 1.1626 0.4310 1.1626 1.0782
No log 0.4118 28 1.0760 0.2791 1.0760 1.0373
No log 0.4412 30 1.0783 0.2102 1.0783 1.0384
No log 0.4706 32 1.0517 0.4074 1.0517 1.0255
No log 0.5 34 1.2797 0.4296 1.2797 1.1312
No log 0.5294 36 1.6840 0.2145 1.6840 1.2977
No log 0.5588 38 1.6378 0.2687 1.6378 1.2798
No log 0.5882 40 1.1658 0.4296 1.1658 1.0797
No log 0.6176 42 0.9437 0.4338 0.9437 0.9714
No log 0.6471 44 0.8457 0.4338 0.8457 0.9196
No log 0.6765 46 0.8095 0.4878 0.8095 0.8997
No log 0.7059 48 0.8412 0.4615 0.8412 0.9172
No log 0.7353 50 0.8986 0.5733 0.8986 0.9480
No log 0.7647 52 1.2473 0.6483 1.2473 1.1168
No log 0.7941 54 1.6095 0.6349 1.6095 1.2687
No log 0.8235 56 1.2945 0.6491 1.2945 1.1378
No log 0.8529 58 0.8479 0.7254 0.8479 0.9208
No log 0.8824 60 0.7385 0.7258 0.7385 0.8594
No log 0.9118 62 0.8407 0.7422 0.8407 0.9169
No log 0.9412 64 0.9508 0.7322 0.9508 0.9751
No log 0.9706 66 0.9185 0.7322 0.9185 0.9584
No log 1.0 68 1.1497 0.6866 1.1497 1.0722
No log 1.0294 70 1.0211 0.7607 1.0211 1.0105
No log 1.0588 72 0.7894 0.6813 0.7894 0.8885
No log 1.0882 74 0.6611 0.7439 0.6611 0.8131
No log 1.1176 76 0.6485 0.7154 0.6485 0.8053
No log 1.1471 78 0.7368 0.7232 0.7368 0.8584
No log 1.1765 80 0.7383 0.7232 0.7383 0.8593
No log 1.2059 82 0.7498 0.7435 0.7498 0.8659
No log 1.2353 84 0.9919 0.7342 0.9919 0.9959
No log 1.2647 86 1.0140 0.6917 1.0140 1.0070
No log 1.2941 88 0.7835 0.7430 0.7835 0.8852
No log 1.3235 90 0.6008 0.7277 0.6008 0.7751
No log 1.3529 92 0.6107 0.7042 0.6107 0.7815
No log 1.3824 94 0.6387 0.6915 0.6387 0.7992
No log 1.4118 96 0.7352 0.6915 0.7352 0.8574
No log 1.4412 98 0.8800 0.7525 0.8800 0.9381
No log 1.4706 100 1.0557 0.7898 1.0557 1.0275
No log 1.5 102 1.1079 0.7421 1.1079 1.0526
No log 1.5294 104 0.9951 0.7709 0.9951 0.9975
No log 1.5588 106 0.8768 0.7430 0.8768 0.9364
No log 1.5882 108 0.7697 0.7525 0.7697 0.8773
No log 1.6176 110 0.7325 0.7435 0.7325 0.8558
No log 1.6471 112 0.8225 0.6922 0.8225 0.9069
No log 1.6765 114 0.8557 0.6922 0.8557 0.9251
No log 1.7059 116 1.0520 0.7350 1.0520 1.0257
No log 1.7353 118 1.3502 0.7106 1.3502 1.1620
No log 1.7647 120 1.1961 0.7510 1.1961 1.0937
No log 1.7941 122 0.7029 0.7451 0.7029 0.8384
No log 1.8235 124 0.5732 0.7233 0.5732 0.7571
No log 1.8529 126 0.5899 0.6672 0.5899 0.7680
No log 1.8824 128 0.5537 0.6774 0.5537 0.7441
No log 1.9118 130 0.7903 0.8283 0.7903 0.8890
No log 1.9412 132 1.1521 0.6977 1.1521 1.0734
No log 1.9706 134 1.2742 0.6678 1.2742 1.1288
No log 2.0 136 1.0533 0.7106 1.0533 1.0263
No log 2.0294 138 0.8295 0.7733 0.8295 0.9108
No log 2.0588 140 0.7663 0.7529 0.7663 0.8754
No log 2.0882 142 0.7696 0.7733 0.7696 0.8773
No log 2.1176 144 0.7966 0.8019 0.7966 0.8925
No log 2.1471 146 0.7336 0.8019 0.7336 0.8565
No log 2.1765 148 0.7451 0.8019 0.7451 0.8632
No log 2.2059 150 0.7058 0.7232 0.7058 0.8401
No log 2.2353 152 0.6036 0.7277 0.6036 0.7769
No log 2.2647 154 0.5563 0.7533 0.5563 0.7459
No log 2.2941 156 0.5487 0.7533 0.5487 0.7407
No log 2.3235 158 0.5843 0.7769 0.5843 0.7644
No log 2.3529 160 0.5925 0.7769 0.5925 0.7697
No log 2.3824 162 0.5689 0.7185 0.5689 0.7542
No log 2.4118 164 0.6060 0.6719 0.6060 0.7784
No log 2.4412 166 0.5537 0.7017 0.5537 0.7441
No log 2.4706 168 0.5261 0.7298 0.5261 0.7253
No log 2.5 170 0.5306 0.7109 0.5306 0.7284
No log 2.5294 172 0.5522 0.6175 0.5522 0.7431
No log 2.5588 174 0.5963 0.6060 0.5963 0.7722
No log 2.5882 176 0.6546 0.7823 0.6546 0.8091
No log 2.6176 178 0.6297 0.7823 0.6297 0.7935
No log 2.6471 180 0.5320 0.6175 0.5320 0.7294
No log 2.6765 182 0.5168 0.6890 0.5168 0.7189
No log 2.7059 184 0.5045 0.6774 0.5045 0.7103
No log 2.7353 186 0.6097 0.7355 0.6097 0.7808
No log 2.7647 188 0.7433 0.7875 0.7433 0.8622
No log 2.7941 190 0.6817 0.7782 0.6817 0.8257
No log 2.8235 192 0.6018 0.7277 0.6018 0.7757
No log 2.8529 194 0.5647 0.6774 0.5647 0.7515
No log 2.8824 196 0.5665 0.6774 0.5665 0.7527
No log 2.9118 198 0.5749 0.7533 0.5749 0.7582
No log 2.9412 200 0.6150 0.7277 0.6150 0.7842
No log 2.9706 202 0.6257 0.7277 0.6257 0.7910
No log 3.0 204 0.5958 0.7277 0.5958 0.7719
No log 3.0294 206 0.5668 0.7277 0.5668 0.7529
No log 3.0588 208 0.5572 0.7277 0.5572 0.7465
No log 3.0882 210 0.6377 0.7355 0.6377 0.7986
No log 3.1176 212 0.7944 0.8094 0.7944 0.8913
No log 3.1471 214 0.8762 0.8161 0.8762 0.9361
No log 3.1765 216 0.7150 0.8349 0.7150 0.8455
No log 3.2059 218 0.5456 0.7277 0.5456 0.7386
No log 3.2353 220 0.5304 0.6310 0.5304 0.7283
No log 3.2647 222 0.5253 0.6942 0.5253 0.7248
No log 3.2941 224 0.5280 0.7204 0.5280 0.7266
No log 3.3235 226 0.5671 0.7529 0.5671 0.7530
No log 3.3529 228 0.5638 0.7529 0.5638 0.7509
No log 3.3824 230 0.5730 0.7529 0.5730 0.7570
No log 3.4118 232 0.5535 0.7614 0.5535 0.7439
No log 3.4412 234 0.5595 0.7355 0.5595 0.7480
No log 3.4706 236 0.6008 0.7801 0.6008 0.7751
No log 3.5 238 0.6312 0.7529 0.6312 0.7945
No log 3.5294 240 0.5995 0.7439 0.5995 0.7743
No log 3.5588 242 0.6180 0.7439 0.6180 0.7861
No log 3.5882 244 0.6965 0.8098 0.6966 0.8346
No log 3.6176 246 0.6489 0.7986 0.6489 0.8055
No log 3.6471 248 0.5539 0.7181 0.5539 0.7442
No log 3.6765 250 0.5450 0.6539 0.5450 0.7382
No log 3.7059 252 0.5745 0.7258 0.5745 0.7580
No log 3.7353 254 0.6357 0.7986 0.6357 0.7973
No log 3.7647 256 0.6362 0.7986 0.6362 0.7976
No log 3.7941 258 0.6934 0.8098 0.6934 0.8327
No log 3.8235 260 0.7875 0.8286 0.7875 0.8874
No log 3.8529 262 0.7463 0.8098 0.7463 0.8639
No log 3.8824 264 0.6134 0.7529 0.6134 0.7832
No log 3.9118 266 0.5277 0.7360 0.5277 0.7264
No log 3.9412 268 0.5234 0.7618 0.5234 0.7235
No log 3.9706 270 0.6094 0.7355 0.6094 0.7807
No log 4.0 272 0.6706 0.7232 0.6706 0.8189
No log 4.0294 274 0.7746 0.8094 0.7746 0.8801
No log 4.0588 276 0.9877 0.8161 0.9877 0.9938
No log 4.0882 278 1.0042 0.8161 1.0042 1.0021
No log 4.1176 280 0.8606 0.8019 0.8606 0.9277
No log 4.1471 282 0.7127 0.7232 0.7127 0.8442
No log 4.1765 284 0.6527 0.7232 0.6527 0.8079
No log 4.2059 286 0.6999 0.7232 0.6999 0.8366
No log 4.2353 288 0.8117 0.8283 0.8117 0.9009
No log 4.2647 290 0.9518 0.8161 0.9518 0.9756
No log 4.2941 292 0.8874 0.8161 0.8874 0.9420
No log 4.3235 294 0.6889 0.8349 0.6889 0.8300
No log 4.3529 296 0.5624 0.7618 0.5624 0.7499
No log 4.3824 298 0.5823 0.7355 0.5823 0.7631
No log 4.4118 300 0.7168 0.8094 0.7168 0.8466
No log 4.4412 302 0.8908 0.8161 0.8908 0.9438
No log 4.4706 304 0.8795 0.8161 0.8795 0.9378
No log 4.5 306 0.8384 0.8161 0.8384 0.9156
No log 4.5294 308 0.6965 0.6915 0.6965 0.8346
No log 4.5588 310 0.6481 0.6915 0.6481 0.8051
No log 4.5882 312 0.6367 0.6842 0.6367 0.7979
No log 4.6176 314 0.6392 0.6842 0.6392 0.7995
No log 4.6471 316 0.6307 0.7149 0.6307 0.7942
No log 4.6765 318 0.6115 0.7439 0.6115 0.7820
No log 4.7059 320 0.5819 0.7355 0.5819 0.7628
No log 4.7353 322 0.6024 0.7355 0.6024 0.7761
No log 4.7647 324 0.6100 0.7355 0.6100 0.7810
No log 4.7941 326 0.6469 0.7704 0.6469 0.8043
No log 4.8235 328 0.6093 0.7859 0.6093 0.7806
No log 4.8529 330 0.5753 0.7618 0.5753 0.7585
No log 4.8824 332 0.5501 0.7618 0.5501 0.7417
No log 4.9118 334 0.5780 0.7618 0.5780 0.7603
No log 4.9412 336 0.7205 0.7525 0.7205 0.8488
No log 4.9706 338 0.8262 0.8283 0.8262 0.9089
No log 5.0 340 0.7993 0.7823 0.7993 0.8941
No log 5.0294 342 0.7445 0.7232 0.7445 0.8629
No log 5.0588 344 0.6543 0.7805 0.6543 0.8089
No log 5.0882 346 0.6437 0.7805 0.6437 0.8023
No log 5.1176 348 0.6435 0.7805 0.6435 0.8022
No log 5.1471 350 0.5900 0.7618 0.5900 0.7681
No log 5.1765 352 0.5506 0.7621 0.5506 0.7420
No log 5.2059 354 0.5439 0.7621 0.5439 0.7375
No log 5.2353 356 0.5478 0.7618 0.5478 0.7402
No log 5.2647 358 0.5957 0.7618 0.5957 0.7718
No log 5.2941 360 0.6135 0.7618 0.6135 0.7833
No log 5.3235 362 0.5841 0.7618 0.5841 0.7643
No log 5.3529 364 0.5971 0.7618 0.5971 0.7727
No log 5.3824 366 0.6396 0.7618 0.6396 0.7997
No log 5.4118 368 0.7225 0.8408 0.7225 0.8500
No log 5.4412 370 0.7391 0.8408 0.7391 0.8597
No log 5.4706 372 0.6832 0.7882 0.6832 0.8266
No log 5.5 374 0.6324 0.7882 0.6324 0.7952
No log 5.5294 376 0.6228 0.7882 0.6228 0.7892
No log 5.5588 378 0.6379 0.7882 0.6379 0.7987
No log 5.5882 380 0.7365 0.8168 0.7365 0.8582
No log 5.6176 382 0.8938 0.8164 0.8938 0.9454
No log 5.6471 384 0.9027 0.8019 0.9027 0.9501
No log 5.6765 386 0.8355 0.7906 0.8355 0.9140
No log 5.7059 388 0.7524 0.7910 0.7524 0.8674
No log 5.7353 390 0.7058 0.6968 0.7058 0.8401
No log 5.7647 392 0.7009 0.7624 0.7009 0.8372
No log 5.7941 394 0.7417 0.7910 0.7417 0.8612
No log 5.8235 396 0.8032 0.7906 0.8032 0.8962
No log 5.8529 398 0.8715 0.8019 0.8715 0.9336
No log 5.8824 400 0.8773 0.7797 0.8773 0.9366
No log 5.9118 402 0.7718 0.8168 0.7718 0.8785
No log 5.9412 404 0.6469 0.7355 0.6469 0.8043
No log 5.9706 406 0.5872 0.7355 0.5872 0.7663
No log 6.0 408 0.5817 0.7277 0.5817 0.7627
No log 6.0294 410 0.6259 0.7355 0.6259 0.7912
No log 6.0588 412 0.7186 0.7273 0.7186 0.8477
No log 6.0882 414 0.7636 0.7782 0.7636 0.8738
No log 6.1176 416 0.7374 0.7273 0.7374 0.8587
No log 6.1471 418 0.6469 0.7355 0.6469 0.8043
No log 6.1765 420 0.6161 0.7355 0.6161 0.7850
No log 6.2059 422 0.6249 0.7355 0.6249 0.7905
No log 6.2353 424 0.6228 0.7355 0.6228 0.7892
No log 6.2647 426 0.6781 0.7355 0.6781 0.8235
No log 6.2941 428 0.7716 0.7717 0.7716 0.8784
No log 6.3235 430 0.7762 0.7717 0.7762 0.8810
No log 6.3529 432 0.7034 0.7355 0.7034 0.8387
No log 6.3824 434 0.6390 0.7355 0.6390 0.7994
No log 6.4118 436 0.6267 0.7355 0.6267 0.7917
No log 6.4412 438 0.6388 0.7355 0.6388 0.7993
No log 6.4706 440 0.7153 0.7986 0.7153 0.8458
No log 6.5 442 0.7920 0.7983 0.7920 0.8899
No log 6.5294 444 0.8097 0.8164 0.8097 0.8998
No log 6.5588 446 0.7178 0.7986 0.7178 0.8472
No log 6.5882 448 0.6140 0.7355 0.6140 0.7836
No log 6.6176 450 0.5622 0.7618 0.5622 0.7498
No log 6.6471 452 0.5629 0.7618 0.5629 0.7503
No log 6.6765 454 0.5880 0.7355 0.5880 0.7668
No log 6.7059 456 0.5940 0.7355 0.5940 0.7707
No log 6.7353 458 0.6038 0.7355 0.6038 0.7770
No log 6.7647 460 0.6029 0.7355 0.6029 0.7765
No log 6.7941 462 0.5854 0.7355 0.5854 0.7651
No log 6.8235 464 0.5922 0.7355 0.5922 0.7695
No log 6.8529 466 0.5817 0.7355 0.5817 0.7627
No log 6.8824 468 0.5523 0.7618 0.5523 0.7432
No log 6.9118 470 0.5523 0.7355 0.5523 0.7432
No log 6.9412 472 0.5695 0.7355 0.5695 0.7547
No log 6.9706 474 0.6146 0.7355 0.6146 0.7840
No log 7.0 476 0.6791 0.7439 0.6791 0.8241
No log 7.0294 478 0.7618 0.7149 0.7618 0.8728
No log 7.0588 480 0.8086 0.8019 0.8086 0.8992
No log 7.0882 482 0.8009 0.8019 0.8009 0.8949
No log 7.1176 484 0.7467 0.7439 0.7467 0.8641
No log 7.1471 486 0.7146 0.7439 0.7146 0.8453
No log 7.1765 488 0.6570 0.7439 0.6570 0.8105
No log 7.2059 490 0.6210 0.7439 0.6210 0.7880
No log 7.2353 492 0.6052 0.7355 0.6052 0.7780
No log 7.2647 494 0.6148 0.7439 0.6148 0.7841
No log 7.2941 496 0.6241 0.7439 0.6241 0.7900
No log 7.3235 498 0.6663 0.7439 0.6663 0.8163
0.4252 7.3529 500 0.6944 0.8232 0.6944 0.8333
0.4252 7.3824 502 0.6922 0.8232 0.6922 0.8320
0.4252 7.4118 504 0.7179 0.8232 0.7179 0.8473
0.4252 7.4412 506 0.7409 0.8408 0.7409 0.8608
0.4252 7.4706 508 0.7004 0.8232 0.7004 0.8369
0.4252 7.5 510 0.6671 0.7704 0.6671 0.8168
0.4252 7.5294 512 0.6709 0.7704 0.6709 0.8191
0.4252 7.5588 514 0.6414 0.7439 0.6414 0.8009
0.4252 7.5882 516 0.6238 0.7439 0.6238 0.7898
0.4252 7.6176 518 0.5984 0.7355 0.5984 0.7736
0.4252 7.6471 520 0.5898 0.7355 0.5898 0.7680
0.4252 7.6765 522 0.5902 0.7078 0.5902 0.7683
0.4252 7.7059 524 0.5812 0.7078 0.5812 0.7624
0.4252 7.7353 526 0.5912 0.7078 0.5912 0.7689
0.4252 7.7647 528 0.6279 0.7439 0.6279 0.7924
0.4252 7.7941 530 0.6974 0.7439 0.6974 0.8351
0.4252 7.8235 532 0.7938 0.8283 0.7938 0.8909
0.4252 7.8529 534 0.8381 0.8283 0.8381 0.9155
0.4252 7.8824 536 0.8766 0.8283 0.8766 0.9363
0.4252 7.9118 538 0.8456 0.8283 0.8456 0.9195
0.4252 7.9412 540 0.7646 0.8283 0.7646 0.8744
0.4252 7.9706 542 0.6686 0.7439 0.6686 0.8177
0.4252 8.0 544 0.6197 0.7355 0.6197 0.7872
0.4252 8.0294 546 0.5990 0.7355 0.5990 0.7739
0.4252 8.0588 548 0.6114 0.7355 0.6114 0.7819
0.4252 8.0882 550 0.6543 0.7439 0.6543 0.8089
0.4252 8.1176 552 0.6898 0.7439 0.6898 0.8305
0.4252 8.1471 554 0.7200 0.7529 0.7200 0.8485
0.4252 8.1765 556 0.7468 0.8098 0.7468 0.8642
0.4252 8.2059 558 0.7372 0.8098 0.7372 0.8586
0.4252 8.2353 560 0.7021 0.7529 0.7021 0.8379
0.4252 8.2647 562 0.6812 0.7529 0.6812 0.8254
0.4252 8.2941 564 0.6599 0.7529 0.6599 0.8124
0.4252 8.3235 566 0.6536 0.7529 0.6536 0.8085
0.4252 8.3529 568 0.6676 0.7529 0.6676 0.8171
0.4252 8.3824 570 0.7035 0.7529 0.7035 0.8387
0.4252 8.4118 572 0.7271 0.7823 0.7271 0.8527
0.4252 8.4412 574 0.7135 0.7529 0.7135 0.8447
0.4252 8.4706 576 0.6944 0.7529 0.6944 0.8333
0.4252 8.5 578 0.6609 0.7529 0.6609 0.8129
0.4252 8.5294 580 0.6293 0.7529 0.6293 0.7933
0.4252 8.5588 582 0.5997 0.7618 0.5997 0.7744
0.4252 8.5882 584 0.5806 0.7618 0.5806 0.7620
0.4252 8.6176 586 0.5670 0.7618 0.5670 0.7530
0.4252 8.6471 588 0.5724 0.7618 0.5724 0.7566
0.4252 8.6765 590 0.5761 0.7618 0.5761 0.7590
0.4252 8.7059 592 0.5941 0.7618 0.5941 0.7708
0.4252 8.7353 594 0.6197 0.7708 0.6197 0.7872
0.4252 8.7647 596 0.6532 0.7529 0.6532 0.8082
0.4252 8.7941 598 0.6863 0.7529 0.6863 0.8284
0.4252 8.8235 600 0.6884 0.7529 0.6884 0.8297
0.4252 8.8529 602 0.6670 0.7529 0.6670 0.8167
0.4252 8.8824 604 0.6451 0.7439 0.6451 0.8032
0.4252 8.9118 606 0.6355 0.7355 0.6355 0.7972
0.4252 8.9412 608 0.6134 0.7618 0.6134 0.7832
0.4252 8.9706 610 0.5828 0.7618 0.5828 0.7634
0.4252 9.0 612 0.5687 0.7618 0.5687 0.7541
0.4252 9.0294 614 0.5605 0.7862 0.5605 0.7487
0.4252 9.0588 616 0.5662 0.7618 0.5662 0.7525
0.4252 9.0882 618 0.5763 0.7618 0.5763 0.7591
0.4252 9.1176 620 0.5892 0.7618 0.5892 0.7676
0.4252 9.1471 622 0.6060 0.7618 0.6060 0.7785
0.4252 9.1765 624 0.6133 0.7355 0.6133 0.7831
0.4252 9.2059 626 0.6195 0.7355 0.6195 0.7871
0.4252 9.2353 628 0.6374 0.7355 0.6374 0.7984
0.4252 9.2647 630 0.6600 0.7439 0.6600 0.8124
0.4252 9.2941 632 0.6748 0.7529 0.6748 0.8215
0.4252 9.3235 634 0.6806 0.7529 0.6806 0.8250
0.4252 9.3529 636 0.6892 0.7529 0.6892 0.8302
0.4252 9.3824 638 0.6875 0.7529 0.6875 0.8292
0.4252 9.4118 640 0.6856 0.7529 0.6856 0.8280
0.4252 9.4412 642 0.6752 0.7529 0.6752 0.8217
0.4252 9.4706 644 0.6580 0.7439 0.6580 0.8112
0.4252 9.5 646 0.6418 0.7355 0.6418 0.8011
0.4252 9.5294 648 0.6266 0.7355 0.6266 0.7916
0.4252 9.5588 650 0.6207 0.7355 0.6207 0.7879
0.4252 9.5882 652 0.6185 0.7355 0.6185 0.7865
0.4252 9.6176 654 0.6191 0.7355 0.6191 0.7868
0.4252 9.6471 656 0.6193 0.7355 0.6193 0.7870
0.4252 9.6765 658 0.6184 0.7355 0.6184 0.7864
0.4252 9.7059 660 0.6210 0.7355 0.6210 0.7880
0.4252 9.7353 662 0.6255 0.7355 0.6255 0.7909
0.4252 9.7647 664 0.6282 0.7355 0.6282 0.7926
0.4252 9.7941 666 0.6257 0.7355 0.6257 0.7910
0.4252 9.8235 668 0.6242 0.7355 0.6242 0.7901
0.4252 9.8529 670 0.6218 0.7355 0.6218 0.7886
0.4252 9.8824 672 0.6226 0.7355 0.6226 0.7891
0.4252 9.9118 674 0.6240 0.7355 0.6240 0.7900
0.4252 9.9412 676 0.6265 0.7355 0.6265 0.7915
0.4252 9.9706 678 0.6282 0.7355 0.6282 0.7926
0.4252 10.0 680 0.6291 0.7355 0.6291 0.7932

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 135M parameters (Safetensors, F32)