A Family of Pretrained Transformer Language Models for Russian
Paper: A Family of Pretrained Transformer Language Models for Russian (arXiv:2309.10931)
The model architecture design, pretraining, and evaluation are documented in our preprint: A Family of Pretrained Transformer Language Models for Russian.
The model was pretrained by the SberDevices team.
| Task | Architecture | Tokenizer | Dict size | Parameters | Training data |
|------|--------------|-----------|-----------|------------|---------------|
| mask filling | encoder | BBPE | 50,257 | 355 M | 250 GB |

Cite us:

@misc{zmitrovich2023family,
title={A Family of Pretrained Transformer Language Models for Russian},
author={Dmitry Zmitrovich and Alexander Abramov and Andrey Kalmykov and Maria Tikhonova and Ekaterina Taktasheva and Danil Astafurov and Mark Baushenko and Artem Snegirev and Tatiana Shavrina and Sergey Markov and Vladislav Mikhailov and Alena Fenogenova},
year={2023},
eprint={2309.10931},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
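
Since the model is an encoder pretrained with a mask-filling objective, it can be queried through the transformers fill-mask pipeline. The sketch below is a minimal example, not part of this card: the Hub identifier `ai-forever/ruRoberta-large` and the `<mask>` token are assumptions (the card does not name the checkpoint), so substitute the actual model ID when using it.

```python
# Minimal mask-filling sketch with the Hugging Face transformers library.
# NOTE: "ai-forever/ruRoberta-large" is an assumed Hub identifier for this
# checkpoint; replace it with the real one. The "<mask>" token is assumed
# from the RoBERTa-style BBPE tokenizer described above.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="ai-forever/ruRoberta-large")

# Print the top predictions for the masked position.
for prediction in fill_mask("Столица России - <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```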