
A BERT pre-trained language model for the Ancient Greek language.

We took GreekBERT from @nlpaueb and fine-tuned it with the MLM objective on several corpora of Ancient Greek texts. We then used it to train several classifiers to assist with author and style attribution of a couple of recently discovered texts.
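The model can be loaded for masked-token prediction with the Hugging Face transformers library. The snippet below is a minimal sketch; the repository id is a placeholder and should be replaced with this model's actual Hub id.

```python
from transformers import pipeline

# Placeholder repository id: substitute this model's Hub id here.
fill_mask = pipeline("fill-mask", model="your-org/ancient-greek-bert")

# Predict a masked token in an Ancient Greek sentence (BERT uses [MASK]).
predictions = fill_mask("ὁ ἄνθρωπος [MASK] ζῷον πολιτικόν.")
for p in predictions:
    print(p["token_str"], p["score"])
```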

If you use the model, please cite the following:

```bibtex
@inproceedings{Yamshchikov-etal-2022-plutarch,
    title = "BERT in Plutarch's Shadows",
    author = "Ivan P. Yamshchikov and
      Alexey Tikhonov and
      Yorgos Pantis and
      Charlotte Schubert and
      J{\"u}rgen Jost",
    booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
    year = "2022",
}
```