ES-Trans-Init

Spanish scientific T5 model initialized from EN-T5-Sci using WECHSEL and a language-specific SentencePiece 32k tokenizer.

Model Details

This is one of the non-English scientific T5 transfer models from the paper. It retains the Transformer body weights of EN-T5-Sci and reinitializes the embedding matrix with WECHSEL, using a language-specific target SentencePiece tokenizer.

  • Paper name: ES-Trans-Init
  • Model role: main
  • Source/base model: EN-T5-Sci
  • Code and pipeline: GitHub repository
  • Architecture: T5 encoder-decoder
  • SciLaD dataset: scilons/SciLaD-all-text-v1
  • Evaluation benchmark: Global-MMLU
  • Target-language tokenizer: language-specific SentencePiece 32k tokenizer trained on the Spanish SciLaD split
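Assuming the checkpoint is published on the Hugging Face Hub under the model ID shown on this page, loading it with the `transformers` library could look like the following sketch (the Spanish example input is illustrative only):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "rausch/es-t5-sci-transfer-init-spm32k"

# Load the Spanish SentencePiece tokenizer and the transferred T5 weights.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5 is a text-to-text model; the useful prompt format depends on the
# pre-training objective, so treat this as a smoke test, not a recipe.
inputs = tokenizer("La fotosíntesis convierte la luz en", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```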

WECHSEL resources: English fastText embeddings and Spanish fastText embeddings (es), aligned with an English–Spanish bilingual dictionary.
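In rough terms, WECHSEL initializes each new target-language token embedding as a similarity-weighted combination of source-language embeddings, with similarities computed between the tokens' static fastText vectors after dictionary-based alignment. A toy NumPy sketch of this idea (the function name, temperature, and neighbor count are illustrative, not the paper's exact settings):

```python
import numpy as np

def wechsel_style_init(source_emb, src_static, tgt_static, temperature=0.1, k=2):
    """Toy WECHSEL-style init: each target embedding is a softmax-weighted
    average of the k most similar source embeddings, where similarity is
    cosine similarity in the aligned static-embedding (fastText) space."""
    # Cosine similarity between every target and source token.
    src_n = src_static / np.linalg.norm(src_static, axis=1, keepdims=True)
    tgt_n = tgt_static / np.linalg.norm(tgt_static, axis=1, keepdims=True)
    sim = tgt_n @ src_n.T                      # shape: (n_tgt, n_src)

    tgt_emb = np.zeros((tgt_static.shape[0], source_emb.shape[1]))
    for i, row in enumerate(sim):
        top = np.argsort(row)[-k:]             # k nearest source tokens
        w = np.exp(row[top] / temperature)     # softmax over top-k sims
        w /= w.sum()
        tgt_emb[i] = w @ source_emb[top]       # convex combination
    return tgt_emb
```

The Transformer body is copied unchanged; only the embedding matrix is rebuilt this way, which gives the transferred model a warm start in the target language.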

Evaluation

Zero-shot Global-MMLU accuracy, as aggregated in the paper:

| Metric          | Accuracy (%) |
|-----------------|--------------|
| Average         | 26.89        |
| STEM            | 28.61        |
| Humanities      | 24.17        |
| Social Sciences | 31.07        |
| Other           | 25.14        |

Limitations

The model is evaluated primarily with zero-shot Global-MMLU. Downstream task-specific evaluation is recommended before deployment in specialized scientific workflows.

Citation

  • Title: Transferring Scientific English Pre-Trained Language Models to Multiple Languages Using Cross-Lingual Transfer
  • Authors: Nikolas Rauscher, Fabio Barth, Georg Rehm
  • Venue: LREC-COLING 2026, citation details TBA after publication
Hugging Face model ID: rausch/es-t5-sci-transfer-init-spm32k
Model size: 0.2B parameters (F32 safetensors)