Don't settle for less. Elevate your Turkish applications with VBART today!
It comes in two sizes:
VBART-Large: 374M parameters
VBART-XLarge: 740M parameters
The default context length for both the encoder and the decoder is 1024 tokens; however, it can easily be extended for longer sequences.
The model is available for TensorFlow, PyTorch, and Hugging Face Transformers, and can be fine-tuned using any of these frameworks.
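As a minimal sketch of the Hugging Face route, the model can be loaded with the standard `Auto*` classes. The checkpoint id below is a placeholder assumption, not a confirmed repository name; substitute the actual VBART checkpoint you intend to use.

```python
def load_vbart(checkpoint="vngrs-ai/VBART-Large"):
    """Load a VBART tokenizer and model from the Hugging Face Hub.

    The default checkpoint id is a hypothetical placeholder; pass the
    real repository name of the VBART variant you want.
    """
    # Imported lazily so the sketch itself has no hard dependency
    # until a model is actually requested.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
    return tokenizer, model
```

Once loaded, the pair can be used for conditional generation or passed to a `Trainer` for fine-tuning, the same as any other seq2seq model in the library.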