# efficientnet-b0
This model is a fine-tuned version of [google/efficientnet-b0](https://huggingface.co/google/efficientnet-b0) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1329
- Accuracy: 0.9837
- Precision: 0.9907
- Recall: 0.9737
- F1: 0.9821
- Tp (true positives): 1595
- Tn (true negatives): 1895
- Fp (false positives): 15
- Fn (false negatives): 43
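The headline metrics follow directly from the four confusion counts; a quick check in plain Python reproduces each reported value:

```python
# Recompute the evaluation metrics from the reported confusion counts.
tp, tn, fp, fn = 1595, 1895, 15, 43

accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

print(f"Accuracy:  {accuracy:.4f}")   # 0.9837
print(f"Precision: {precision:.4f}")  # 0.9907
print(f"Recall:    {recall:.4f}")     # 0.9737
print(f"F1:        {f1:.4f}")         # 0.9821
```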
## Model description
More information needed
## Intended uses & limitations
More information needed
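Pending documentation from the author, a minimal inference sketch using the standard transformers image-classification pipeline (the label set and input domain of this checkpoint are not documented here, and the image path is a placeholder):

```python
# Hypothetical usage sketch: classify an image with this fine-tuned checkpoint.
# The repo id is taken from this card; label names depend on the (undocumented)
# training dataset.
from transformers import pipeline

classifier = pipeline("image-classification", model="waelhasan/efficientnet-b0")
predictions = classifier("path/to/image.jpg")  # also accepts a PIL.Image or a URL
for p in predictions:
    print(p["label"], round(p["score"], 4))
```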
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- optimizer: AdamW (torch fused implementation, `adamw_torch_fused`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 110
- num_epochs: 10
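The `linear` scheduler ramps the learning rate from 0 to 2e-4 over the first 110 steps, then decays it linearly to 0. A minimal sketch of that schedule, assuming 550 total optimizer steps (55 steps per epoch × 10 epochs, consistent with the step column in the training log):

```python
# Sketch of the 'linear' LR schedule: linear warmup to the base learning rate
# over `warmup_steps`, then linear decay to zero by `total_steps`.
# total_steps = 550 is inferred from the training log, not stated explicitly.
def lr_at_step(step, base_lr=2e-4, warmup_steps=110, total_steps=550):
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(lr_at_step(55))   # mid-warmup: 1e-4
print(lr_at_step(110))  # peak: 2e-4
print(lr_at_step(330))  # halfway through decay: 1e-4
print(lr_at_step(550))  # end of training: 0.0
```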
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Tp | Tn | Fp | Fn |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.3349 | 0.1964 | 11 | 1.4004 | 0.5397 | 0.5013 | 0.5702 | 0.5336 | 934 | 981 | 929 | 704 |
| 1.1370 | 0.3929 | 22 | 1.1454 | 0.7514 | 0.6654 | 0.9286 | 0.7752 | 1521 | 1145 | 765 | 117 |
| 0.8613 | 0.5893 | 33 | 0.9204 | 0.8089 | 0.7277 | 0.9365 | 0.8190 | 1534 | 1336 | 574 | 104 |
| 0.6781 | 0.7857 | 44 | 0.7617 | 0.8351 | 0.7610 | 0.9371 | 0.8399 | 1535 | 1428 | 482 | 103 |
| 0.5845 | 0.9821 | 55 | 0.7031 | 0.8548 | 0.7829 | 0.9487 | 0.8579 | 1554 | 1479 | 431 | 84 |
| 0.5412 | 1.1786 | 66 | 0.8732 | 0.8024 | 0.7091 | 0.9701 | 0.8193 | 1589 | 1258 | 652 | 49 |
| 0.4811 | 1.375 | 77 | 0.4708 | 0.9228 | 0.8915 | 0.9481 | 0.9189 | 1553 | 1721 | 189 | 85 |
| 0.4485 | 1.5714 | 88 | 0.7378 | 0.8520 | 0.7740 | 0.9597 | 0.8569 | 1572 | 1451 | 459 | 66 |
| 0.4350 | 1.7679 | 99 | 0.3992 | 0.9377 | 0.9141 | 0.9548 | 0.9340 | 1564 | 1763 | 147 | 74 |
| 0.4226 | 1.9643 | 110 | 0.4571 | 0.9202 | 0.8787 | 0.9597 | 0.9174 | 1572 | 1693 | 217 | 66 |
| 0.3743 | 2.1607 | 121 | 0.3237 | 0.9405 | 0.9136 | 0.9621 | 0.9373 | 1576 | 1761 | 149 | 62 |
| 0.3571 | 2.3571 | 132 | 0.3736 | 0.9422 | 0.9144 | 0.9652 | 0.9391 | 1581 | 1762 | 148 | 57 |
| 0.3744 | 2.5536 | 143 | 0.2479 | 0.9715 | 0.9753 | 0.9628 | 0.9690 | 1577 | 1870 | 40 | 61 |
| 0.3674 | 2.75 | 154 | 0.2033 | 0.9766 | 0.9838 | 0.9652 | 0.9744 | 1581 | 1884 | 26 | 57 |
| 0.3061 | 2.9464 | 165 | 0.1885 | 0.9732 | 0.9789 | 0.9628 | 0.9708 | 1577 | 1876 | 34 | 61 |
| 0.3311 | 3.1429 | 176 | 0.1790 | 0.9741 | 0.9783 | 0.9652 | 0.9717 | 1581 | 1875 | 35 | 57 |
| 0.3647 | 3.3393 | 187 | 0.1867 | 0.9755 | 0.9784 | 0.9683 | 0.9733 | 1586 | 1875 | 35 | 52 |
| 0.3031 | 3.5357 | 198 | 0.5063 | 0.9188 | 0.8689 | 0.9707 | 0.9170 | 1590 | 1670 | 240 | 48 |
| 0.3186 | 3.7321 | 209 | 0.1682 | 0.9786 | 0.9821 | 0.9713 | 0.9767 | 1591 | 1881 | 29 | 47 |
| 0.3246 | 3.9286 | 220 | 0.2225 | 0.9727 | 0.9695 | 0.9713 | 0.9704 | 1591 | 1860 | 50 | 47 |
| 0.3421 | 4.125 | 231 | 0.2672 | 0.9631 | 0.9493 | 0.9719 | 0.9605 | 1592 | 1825 | 85 | 46 |
| 0.3318 | 4.3214 | 242 | 0.2246 | 0.9715 | 0.9677 | 0.9707 | 0.9692 | 1590 | 1857 | 53 | 48 |
| 0.2790 | 4.5179 | 253 | 0.1860 | 0.9760 | 0.9767 | 0.9713 | 0.9740 | 1591 | 1872 | 38 | 47 |
| 0.3365 | 4.7143 | 264 | 0.2379 | 0.9639 | 0.9467 | 0.9768 | 0.9615 | 1600 | 1820 | 90 | 38 |
| 0.2756 | 4.9107 | 275 | 0.2062 | 0.9673 | 0.9568 | 0.9731 | 0.9649 | 1594 | 1838 | 72 | 44 |
| 0.2819 | 5.1071 | 286 | 0.1483 | 0.9808 | 0.9968 | 0.9615 | 0.9789 | 1575 | 1905 | 5 | 63 |
| 0.2779 | 5.3036 | 297 | 0.1609 | 0.9797 | 0.9888 | 0.9670 | 0.9778 | 1584 | 1892 | 18 | 54 |
| 0.2755 | 5.5 | 308 | 0.1355 | 0.9839 | 0.9907 | 0.9744 | 0.9825 | 1596 | 1895 | 15 | 42 |
| 0.2827 | 5.6964 | 319 | 0.1778 | 0.9729 | 0.9673 | 0.9744 | 0.9708 | 1596 | 1856 | 54 | 42 |
| 0.2922 | 5.8929 | 330 | 0.1379 | 0.9828 | 0.9882 | 0.9744 | 0.9812 | 1596 | 1891 | 19 | 42 |
| 0.2901 | 6.0893 | 341 | 0.6696 | 0.9008 | 0.8342 | 0.9799 | 0.9012 | 1605 | 1591 | 319 | 33 |
| 0.2770 | 6.2857 | 352 | 0.1327 | 0.9837 | 0.9962 | 0.9683 | 0.9820 | 1586 | 1904 | 6 | 52 |
| 0.3000 | 6.4821 | 363 | 0.1351 | 0.9848 | 0.9956 | 0.9713 | 0.9833 | 1591 | 1903 | 7 | 47 |
| 0.3076 | 6.6786 | 374 | 0.1507 | 0.9811 | 0.9882 | 0.9707 | 0.9794 | 1590 | 1891 | 19 | 48 |
| 0.3077 | 6.875 | 385 | 0.1286 | 0.9853 | 0.9981 | 0.9701 | 0.9839 | 1589 | 1907 | 3 | 49 |
| 0.2734 | 7.0714 | 396 | 0.1406 | 0.9839 | 0.9859 | 0.9792 | 0.9825 | 1604 | 1887 | 23 | 34 |
| 0.2986 | 7.2679 | 407 | 0.1655 | 0.9822 | 0.9840 | 0.9774 | 0.9807 | 1601 | 1884 | 26 | 37 |
| 0.3002 | 7.4643 | 418 | 0.1377 | 0.9834 | 0.9876 | 0.9762 | 0.9819 | 1599 | 1890 | 20 | 39 |
| 0.2972 | 7.6607 | 429 | 0.2116 | 0.9684 | 0.9526 | 0.9805 | 0.9663 | 1606 | 1830 | 80 | 32 |
| 0.2796 | 7.8571 | 440 | 0.1383 | 0.9853 | 0.9932 | 0.9750 | 0.9840 | 1597 | 1899 | 11 | 41 |
| 0.2678 | 8.0536 | 451 | 0.1483 | 0.9825 | 0.9894 | 0.9725 | 0.9809 | 1593 | 1893 | 17 | 45 |
| 0.2526 | 8.25 | 462 | 0.1413 | 0.9831 | 0.9907 | 0.9725 | 0.9815 | 1593 | 1895 | 15 | 45 |
| 0.3135 | 8.4464 | 473 | 0.2835 | 0.9557 | 0.9323 | 0.9750 | 0.9531 | 1597 | 1794 | 116 | 41 |
| 0.2698 | 8.6429 | 484 | 0.1726 | 0.9758 | 0.9732 | 0.9744 | 0.9738 | 1596 | 1866 | 44 | 42 |
| 0.2768 | 8.8393 | 495 | 0.1527 | 0.9808 | 0.9840 | 0.9744 | 0.9791 | 1596 | 1884 | 26 | 42 |
| 0.2596 | 9.0357 | 506 | 0.1653 | 0.9783 | 0.9791 | 0.9737 | 0.9764 | 1595 | 1876 | 34 | 43 |
| 0.2720 | 9.2321 | 517 | 0.1347 | 0.9851 | 0.9919 | 0.9756 | 0.9837 | 1598 | 1897 | 13 | 40 |
| 0.2530 | 9.4286 | 528 | 0.1626 | 0.9789 | 0.9803 | 0.9737 | 0.9770 | 1595 | 1878 | 32 | 43 |
| 0.2987 | 9.625 | 539 | 0.1398 | 0.9834 | 0.9907 | 0.9731 | 0.9818 | 1594 | 1895 | 15 | 44 |
| 0.2643 | 9.8214 | 550 | 0.1329 | 0.9837 | 0.9907 | 0.9737 | 0.9821 | 1595 | 1895 | 15 | 43 |
### Framework versions
- Transformers 5.2.0
- Pytorch 2.9.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.2