Instructions for using Integer-Ctrl/cross-encoder-bert-tiny-1gb-bs32 with libraries, inference providers, notebooks, and local apps. Follow the examples below to get started.
- Libraries
- Transformers
How to use Integer-Ctrl/cross-encoder-bert-tiny-1gb-bs32 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Integer-Ctrl/cross-encoder-bert-tiny-1gb-bs32")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Integer-Ctrl/cross-encoder-bert-tiny-1gb-bs32")
model = AutoModelForSequenceClassification.from_pretrained("Integer-Ctrl/cross-encoder-bert-tiny-1gb-bs32")
```

For a complete scoring example, see the sketch after the notebooks list below.

- Notebooks
- Google Colab
- Kaggle
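As a worked example, here is a minimal scoring sketch that is not part of the original card: a cross-encoder scores a (query, passage) pair by encoding both texts jointly, and this sketch assumes the model's classification head outputs a single relevance logit, as is typical for MS MARCO cross-encoders (with a two-label head, take the softmax over the logits instead). The query and passage strings are invented for illustration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "Integer-Ctrl/cross-encoder-bert-tiny-1gb-bs32"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

query = "what is a cross-encoder"
passages = [
    "A cross-encoder feeds the query and the passage through the model together.",
    "The weather in Berlin is mild in spring.",
]

# Encode the query against each passage as a text pair.
inputs = tokenizer([query] * len(passages), passages,
                   padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assuming a single-logit relevance head: higher score = more relevant.
scores = logits.squeeze(-1)
for passage, score in sorted(zip(passages, scores.tolist()),
                             key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {passage}")
```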
Trained on a ~1 GB subset of the MS MARCO "Train Triples Small" dataset with a batch size of 32.
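The card does not include the training script; the following is a hypothetical reconstruction of the described setup (MS MARCO triples, batch size 32), assuming the sentence-transformers CrossEncoder API and `prajjwal1/bert-tiny` as the base checkpoint. The file path and hyperparameters other than the batch size are illustrative only.

```python
from torch.utils.data import DataLoader
from sentence_transformers import InputExample
from sentence_transformers.cross_encoder import CrossEncoder

# Each MS MARCO triple (query, positive, negative) yields two labeled pairs.
train_samples = []
with open("triples.train.small.tsv") as f:  # hypothetical local path
    for line in f:
        query, positive, negative = line.rstrip("\n").split("\t")
        train_samples.append(InputExample(texts=[query, positive], label=1))
        train_samples.append(InputExample(texts=[query, negative], label=0))

model = CrossEncoder("prajjwal1/bert-tiny", num_labels=1)  # assumed base model
train_dataloader = DataLoader(train_samples, shuffle=True, batch_size=32)
model.fit(train_dataloader=train_dataloader, epochs=1, warmup_steps=1000)
model.save("cross-encoder-bert-tiny-1gb-bs32")
```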