This collection contains two sizes of RoBERTa model (base and large) pre-trained from scratch on a Spanish-language scientific corpus.