Instructions for using joelb/custom-handler-tutorial with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use joelb/custom-handler-tutorial with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="joelb/custom-handler-tutorial")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("joelb/custom-handler-tutorial")
model = AutoModelForSequenceClassification.from_pretrained("joelb/custom-handler-tutorial")
```

- Notebooks
- Google Colab
- Kaggle
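Building on the direct-load snippet above, a minimal sketch of running inference yourself with the tokenizer and model pair. This assumes the checkpoint carries a standard sequence-classification head with label names in its config; the sample sentence and the printed label are illustrative only.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("joelb/custom-handler-tutorial")
model = AutoModelForSequenceClassification.from_pretrained("joelb/custom-handler-tutorial")

# Tokenize a single input and run a forward pass without tracking gradients.
inputs = tokenizer("An example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to probabilities and map the top class id to its label name.
probs = torch.softmax(logits, dim=-1)
pred_id = int(probs.argmax(dim=-1))
print(model.config.id2label[pred_id], float(probs[0, pred_id]))
```

The pipeline helper shown earlier performs these same steps (tokenize, forward pass, softmax, label lookup) internally; the manual form is useful when you need the raw logits or batched control.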