# ViT_L16
This model is a fine-tuned version of [google/mobilenet_v2_1.4_224](https://huggingface.co/google/mobilenet_v2_1.4_224) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1493
- Accuracy: 0.9586
- Precision: 0.9825
- Recall: 0.9267
- F1: 0.9538
- TP: 1518
- TN: 1883
- FP: 27
- FN: 120
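The headline metrics are consistent with the confusion-matrix counts. A quick check, recomputing each metric from TP/TN/FP/FN:

```python
# Recompute the evaluation metrics from the reported confusion-matrix counts.
tp, tn, fp, fn = 1518, 1883, 27, 120

accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

print(round(accuracy, 4), round(precision, 4), round(recall, 4), round(f1, 4))
# → 0.9586 0.9825 0.9267 0.9538
```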
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 552
- num_epochs: 10
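The hyperparameters above map onto a Transformers `TrainingArguments` configuration roughly as sketched below. The `output_dir` is a placeholder, and the dataset/model wiring is omitted, since the card does not specify them.

```python
# Sketch of the training configuration implied by the hyperparameters above.
# output_dir is a placeholder; dataset and model setup are not given in the card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="vit_l16_finetune",   # placeholder path
    learning_rate=5e-6,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=552,
    num_train_epochs=10,
)
```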
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | TP | TN | FP | FN |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.6498 | 0.2477 | 55 | 0.6233 | 0.5950 | 0.7445 | 0.1868 | 0.2987 | 306 | 1805 | 105 | 1332 |
| 0.6063 | 0.4955 | 110 | 0.6222 | 0.6206 | 0.7160 | 0.2955 | 0.4183 | 484 | 1718 | 192 | 1154 |
| 0.5307 | 0.7432 | 165 | 0.4872 | 0.8120 | 0.9036 | 0.6636 | 0.7652 | 1087 | 1794 | 116 | 551 |
| 0.4383 | 0.9910 | 220 | 0.4204 | 0.8695 | 0.8808 | 0.8297 | 0.8544 | 1359 | 1726 | 184 | 279 |
| 0.3821 | 1.2387 | 275 | 0.4293 | 0.8171 | 0.7416 | 0.9267 | 0.8239 | 1518 | 1381 | 529 | 120 |
| 0.3412 | 1.4865 | 330 | 0.3600 | 0.8763 | 0.8601 | 0.8742 | 0.8671 | 1432 | 1677 | 233 | 206 |
| 0.3323 | 1.7342 | 385 | 0.5002 | 0.7562 | 0.7125 | 0.7912 | 0.7498 | 1296 | 1387 | 523 | 342 |
| 0.3128 | 1.9820 | 440 | 0.3087 | 0.9073 | 0.9078 | 0.8895 | 0.8986 | 1457 | 1762 | 148 | 181 |
| 0.2916 | 2.2297 | 495 | 0.3092 | 0.9005 | 0.8640 | 0.9310 | 0.8963 | 1525 | 1670 | 240 | 113 |
| 0.2882 | 2.4775 | 550 | 0.4698 | 0.7802 | 0.6864 | 0.9646 | 0.8020 | 1580 | 1188 | 722 | 58 |
| 0.2775 | 2.7252 | 605 | 0.2448 | 0.9332 | 0.9420 | 0.9115 | 0.9265 | 1493 | 1818 | 92 | 145 |
| 0.2577 | 2.9730 | 660 | 0.2544 | 0.9239 | 0.9264 | 0.9072 | 0.9167 | 1486 | 1792 | 118 | 152 |
| 0.2541 | 3.2207 | 715 | 0.2914 | 0.9028 | 0.8542 | 0.9518 | 0.9004 | 1559 | 1644 | 266 | 79 |
| 0.2499 | 3.4685 | 770 | 0.2302 | 0.9281 | 0.9314 | 0.9115 | 0.9213 | 1493 | 1800 | 110 | 145 |
| 0.2356 | 3.7162 | 825 | 0.2430 | 0.9284 | 0.9109 | 0.9365 | 0.9235 | 1534 | 1760 | 150 | 104 |
| 0.2403 | 3.9640 | 880 | 0.2341 | 0.9169 | 0.8929 | 0.9316 | 0.9119 | 1526 | 1727 | 183 | 112 |
| 0.2454 | 4.2117 | 935 | 0.3786 | 0.8396 | 0.7642 | 0.9438 | 0.8446 | 1546 | 1433 | 477 | 92 |
| 0.2296 | 4.4595 | 990 | 0.3143 | 0.8591 | 0.8014 | 0.9237 | 0.8582 | 1513 | 1535 | 375 | 125 |
| 0.2311 | 4.7072 | 1045 | 0.3683 | 0.8238 | 0.7346 | 0.9683 | 0.8354 | 1586 | 1337 | 573 | 52 |
| 0.2181 | 4.9550 | 1100 | 0.1968 | 0.9380 | 0.9350 | 0.9304 | 0.9327 | 1524 | 1804 | 106 | 114 |
| 0.2119 | 5.2027 | 1155 | 0.3088 | 0.8661 | 0.7987 | 0.9493 | 0.8675 | 1555 | 1518 | 392 | 83 |
| 0.2222 | 5.4505 | 1210 | 0.3543 | 0.8503 | 0.7780 | 0.9457 | 0.8537 | 1549 | 1468 | 442 | 89 |
| 0.2047 | 5.6982 | 1265 | 0.1789 | 0.9462 | 0.9485 | 0.9341 | 0.9412 | 1530 | 1827 | 83 | 108 |
| 0.2169 | 5.9459 | 1320 | 0.1936 | 0.9414 | 0.9503 | 0.9212 | 0.9355 | 1509 | 1831 | 79 | 129 |
| 0.2233 | 6.1937 | 1375 | 0.2493 | 0.8949 | 0.8388 | 0.9560 | 0.8936 | 1566 | 1609 | 301 | 72 |
| 0.2245 | 6.4414 | 1430 | 0.2624 | 0.8797 | 0.8172 | 0.9524 | 0.8796 | 1560 | 1561 | 349 | 78 |
| 0.2220 | 6.6892 | 1485 | 0.2528 | 0.9101 | 0.8586 | 0.9640 | 0.9083 | 1579 | 1650 | 260 | 59 |
| 0.2158 | 6.9369 | 1540 | 0.2083 | 0.9290 | 0.9130 | 0.9353 | 0.9240 | 1532 | 1764 | 146 | 106 |
| 0.2151 | 7.1847 | 1595 | 0.1952 | 0.9394 | 0.9273 | 0.9426 | 0.9349 | 1544 | 1789 | 121 | 94 |
| 0.2192 | 7.4324 | 1650 | 0.2952 | 0.8670 | 0.7941 | 0.9609 | 0.8696 | 1574 | 1502 | 408 | 64 |
| 0.2162 | 7.6802 | 1705 | 0.2100 | 0.9247 | 0.9025 | 0.9383 | 0.9201 | 1537 | 1744 | 166 | 101 |
| 0.1981 | 7.9279 | 1760 | 0.1673 | 0.9487 | 0.9522 | 0.9359 | 0.9440 | 1533 | 1833 | 77 | 105 |
| 0.2019 | 8.1757 | 1815 | 0.2276 | 0.9146 | 0.8739 | 0.9524 | 0.9115 | 1560 | 1685 | 225 | 78 |
| 0.2292 | 8.4234 | 1870 | 0.1978 | 0.9377 | 0.9170 | 0.9512 | 0.9338 | 1558 | 1769 | 141 | 80 |
| 0.2045 | 8.6712 | 1925 | 0.1614 | 0.9546 | 0.9953 | 0.9060 | 0.9485 | 1484 | 1903 | 7 | 154 |
| 0.2145 | 8.9189 | 1980 | 0.1544 | 0.9580 | 0.9794 | 0.9286 | 0.9533 | 1521 | 1878 | 32 | 117 |
| 0.1937 | 9.1667 | 2035 | 0.1571 | 0.9515 | 0.9747 | 0.9188 | 0.9459 | 1505 | 1871 | 39 | 133 |
| 0.2071 | 9.4144 | 2090 | 0.1948 | 0.9374 | 0.9194 | 0.9475 | 0.9333 | 1552 | 1774 | 136 | 86 |
| 0.2136 | 9.6622 | 2145 | 0.2500 | 0.8861 | 0.8324 | 0.9432 | 0.8844 | 1545 | 1599 | 311 | 93 |
| 0.1980 | 9.9099 | 2200 | 0.1493 | 0.9586 | 0.9825 | 0.9267 | 0.9538 | 1518 | 1883 | 27 | 120 |
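The linear scheduler with 552 warmup steps can be sketched as a piecewise-linear function of the step count. The total of ~2220 steps is an inference from the table (step 2200 falls at epoch 9.91, i.e. about 222 steps per epoch over 10 epochs), not a value stated in the card.

```python
# Piecewise-linear LR schedule: linear warmup to the peak LR over the first
# warmup_steps, then linear decay to zero at total_steps.
# total_steps=2220 is inferred (~222 steps/epoch x 10 epochs), not stated in the card.
def linear_lr(step, peak_lr=5e-6, warmup_steps=552, total_steps=2220):
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(276))   # halfway through warmup → 2.5e-06
print(linear_lr(552))   # peak learning rate → 5e-06
print(linear_lr(2220))  # end of training → 0.0
```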
### Framework versions
- Transformers 5.0.0
- Pytorch 2.10.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2
### Model tree for MoaazTalab/ViT_L16

- Base model: [google/mobilenet_v2_1.4_224](https://huggingface.co/google/mobilenet_v2_1.4_224)