Tags: Fill-Mask · Transformers · Safetensors · esm
How to use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="Dannyang/EvoNB_3")
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Dannyang/EvoNB_3")
model = AutoModelForMaskedLM.from_pretrained("Dannyang/EvoNB_3")
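ESM-style tokenizers treat each amino-acid residue as a single character-level token and use `<mask>` as the mask token, so a position in a nanobody sequence can be masked with simple string surgery before it is passed to the fill-mask pipeline. A minimal sketch (the helper name and example sequence are illustrative, not from the model card):

```python
def mask_position(seq: str, i: int, mask_token: str = "<mask>") -> str:
    """Replace the i-th residue of an amino-acid sequence with the mask token.

    ESM tokenizers are character-level over residues, so the resulting
    string can be fed directly to a fill-mask pipeline.
    """
    if not 0 <= i < len(seq):
        raise IndexError(f"position {i} out of range for length {len(seq)}")
    return seq[:i] + mask_token + seq[i + 1:]

# Hypothetical nanobody framework fragment, used only for illustration
seq = "QVQLVESGGGLVQPGGSLRLSCAAS"
masked = mask_position(seq, 10)
# masked == "QVQLVESGGG<mask>VQPGGSLRLSCAAS"
# pipe(masked) would then return candidate residues for the masked position,
# each as a dict with "token_str" and "score" fields.
```

Calling `pipe(masked)` downloads and runs the model, so the helper above keeps the string preparation separate from inference.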
This is one of five models obtained by fine-tuning esm2_t33_650M_UR50D on approximately 7.66 million nanobody sequences.

For more information, please visit https://github.com/DynaX-C/EvoNB.

Model size: 0.7B params · Tensor type: F32 (Safetensors)