Instructions for using Ife/ES-PT with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Ife/ES-PT with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Ife/ES-PT")
model = AutoModelForSeq2SeqLM.from_pretrained("Ife/ES-PT")
```

- Notebooks
- Google Colab
- Kaggle
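Once loaded, the model translates text via `generate`. A minimal sketch, assuming an illustrative Spanish input sentence and default generation settings (neither is specified in the model card):

```python
# Minimal translation sketch; the input sentence and generation settings
# are illustrative assumptions, not part of the model card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Ife/ES-PT")
model = AutoModelForSeq2SeqLM.from_pretrained("Ife/ES-PT")

# Tokenize the source sentence and generate the translation.
inputs = tokenizer("Hola, ¿cómo estás?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The `max_length=512` value mirrors the `model_max_length` in the tokenizer configuration below.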
Tokenizer configuration:

```json
{
  "source_lang": "spa",
  "target_lang": "cat",
  "unk_token": "<unk>",
  "eos_token": "</s>",
  "pad_token": "<pad>",
  "model_max_length": 512,
  "sp_model_kwargs": {},
  "name_or_path": "Helsinki-NLP/opus-mt-es-ca"
}
```