TURNA: the largest Turkish encoder-decoder model to date, based on the UL2 architecture, with 1.1B parameters
The researchers also released models fine-tuned on various downstream tasks, including text categorization, NER, summarization, and more! Great models @onurgu @gokceuludogan @yirmibesogluz @furkanakkurt1618 @uskudarli
Fine-tuned models are in this collection: boun-tabi-LMG/turna-ft-65b3f20aff5235e6cad07c1b
Pre-trained models are in this collection: boun-tabi-LMG/turna-65ad340e5df673eec66e48c7