Dataset: code-search-net/code_search_net
How to use Katochh/falcon-code-generation with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the fine-tuned adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained("petals-team/falcon-rw-1b")
model = PeftModel.from_pretrained(base_model, "Katochh/falcon-code-generation")
```

This model is a fine-tuned version of petals-team/falcon-rw-1b on the code_search_net dataset. It achieves the validation losses reported in the training results table below.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

The following hyperparameters were used during training:

More information needed

### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.231 | 0.01 | 20 | 1.2339 |
| 1.2932 | 0.02 | 40 | 1.1486 |
| 1.231 | 0.03 | 60 | 1.1240 |
| 1.0344 | 0.04 | 80 | 1.0872 |
| 1.3396 | 0.04 | 100 | 1.0973 |
| 0.9727 | 0.05 | 120 | 1.0608 |
| 1.1138 | 0.06 | 140 | 1.0520 |
| 1.1591 | 0.07 | 160 | 1.0442 |
| 0.9822 | 0.08 | 180 | 1.0286 |
| 1.1891 | 0.09 | 200 | 1.0345 |
| 1.0183 | 0.1 | 220 | 1.0194 |
| 1.0012 | 0.11 | 240 | 1.0142 |
| 1.1396 | 0.12 | 260 | 1.0116 |
| 1.0058 | 0.12 | 280 | 1.0074 |
| 1.1884 | 0.13 | 300 | 1.0072 |
| 0.9587 | 0.14 | 320 | 1.0068 |