Instructions for using suno/bark with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use suno/bark with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-to-speech", model="suno/bark")

# Load model directly
from transformers import AutoProcessor, AutoModelForTextToWaveform

processor = AutoProcessor.from_pretrained("suno/bark")
model = AutoModelForTextToWaveform.from_pretrained("suno/bark")
```

- Notebooks
- Google Colab
- Kaggle
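The snippets above only load the model. As a minimal end-to-end sketch, assuming the Transformers text-to-speech pipeline convention where the result is a dict with `"audio"` and `"sampling_rate"` keys, generation and saving to a WAV file could look like this:

```python
# Sketch: synthesize speech with the text-to-speech pipeline and save
# the result as a WAV file. Note that suno/bark is a large checkpoint;
# "suno/bark-small" is a lighter alternative on the Hub.
import scipy.io.wavfile
from transformers import pipeline

pipe = pipeline("text-to-speech", model="suno/bark")

# The pipeline returns a dict with "audio" (numpy array) and
# "sampling_rate" (int) keys.
output = pipe("Hello, my name is Suno and I like to sing.")

scipy.io.wavfile.write(
    "bark_out.wav",
    rate=output["sampling_rate"],
    data=output["audio"],
)
```

The resulting `bark_out.wav` can be played with any standard audio player.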
add `inference: false` tag
#14 · opened by reach-vb
Since TTS isn't a supported pipeline within Transformers yet, we'll turn inference off for now. As soon as the PR for the TTS pipeline is merged, we'll set it back to `true`.
georg-suno changed pull request status to merged