# my-unixcoder-RQ3
This model is a fine-tuned version of [microsoft/unixcoder-base](https://huggingface.co/microsoft/unixcoder-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5170
- Accuracy: 0.9459
- F1 Macro: 0.6581
- F1 Weighted: 0.9465
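
The card does not state the task, but the accuracy/F1 metrics indicate classification. A minimal inference sketch, assuming the checkpoint was saved with a standard sequence-classification head (the input snippet is hypothetical, and the default `LABEL_0`-style names apply unless the config defines others):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the checkpoint exposes a sequence-classification head.
model_id = "DPhO05/my-unixcoder-RQ3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

code_snippet = "def add(a, b):\n    return a + b"  # hypothetical input
inputs = tokenizer(code_snippet, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```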
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (mirrored in the `TrainingArguments` sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP
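
A minimal sketch of how these settings map onto `transformers.TrainingArguments`, assuming a standard `Trainer` setup; this is a reconstruction from the list above, not the authors' training script, and the `output_dir` is hypothetical:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my-unixcoder-RQ3",   # hypothetical
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,   # the card's "0.1" is read as a warmup ratio
    num_train_epochs=10,
    fp16=True,          # "Native AMP" mixed-precision training
)
```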
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | F1 Weighted |
|---|---|---|---|---|---|---|
| 0.4074 | 1.0 | 539 | 0.4512 | 0.9250 | 0.2726 | 0.9164 |
| 0.3457 | 2.0 | 1078 | 0.3373 | 0.9445 | 0.5980 | 0.9414 |
| 0.2816 | 3.0 | 1617 | 0.3155 | 0.9452 | 0.6540 | 0.9452 |
| 0.2311 | 4.0 | 2156 | 0.3363 | 0.9459 | 0.6412 | 0.9453 |
| 0.1854 | 5.0 | 2695 | 0.3757 | 0.9445 | 0.6623 | 0.9460 |
| 0.1534 | 6.0 | 3234 | 0.4139 | 0.9464 | 0.6674 | 0.9473 |
| 0.1155 | 7.0 | 3773 | 0.4640 | 0.9457 | 0.6627 | 0.9468 |
| 0.1087 | 8.0 | 4312 | 0.4969 | 0.9448 | 0.6585 | 0.9457 |
| 0.0807 | 9.0 | 4851 | 0.5103 | 0.9457 | 0.6551 | 0.9461 |
| 0.0726 | 10.0 | 5390 | 0.5170 | 0.9459 | 0.6581 | 0.9465 |
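
The accuracy and F1 columns above are standard classification metrics; a minimal `compute_metrics` sketch that would reproduce them inside a `Trainer`, using scikit-learn (the wiring is an assumption, not taken from the authors' script):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair supplied by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "f1_weighted": f1_score(labels, preds, average="weighted"),
    }
```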
### Framework versions
- Transformers 5.0.0
- PyTorch 2.10.0+cu128
- Datasets 4.8.3
- Tokenizers 0.22.2