---
language:
- hu
tags:
- text-generation
license: cc-by-nc-4.0
widget:
- text: "Elmesélek egy történetet a nyelvtechnológiáról."
---
# PULI GPT-2
For further details, see [our demo site](https://juniper.nytud.hu/demo/gpt2).
- Hungarian GPT-2 model
- Trained with [Megatron-DeepSpeed](https://github.com/microsoft/Megatron-DeepSpeed)
- Dataset: 36.3 billion words
- Checkpoint: 500 000 steps
## Limitations
- max_seq_length = 1024
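The model cannot attend beyond 1024 tokens, so over-long inputs should be truncated at tokenization time. Below is a minimal sketch using the standard `truncation`/`max_length` arguments of the `transformers` tokenizer (the repeated example text is arbitrary):

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('NYTK/PULI-GPT-2')

# Deliberately over-long input for illustration.
long_text = "Elmesélek egy történetet a nyelvtechnológiáról. " * 200
# Truncate to the model's 1024-token context window.
encoded = tokenizer(long_text, truncation=True, max_length=1024, return_tensors='pt')
print(encoded.input_ids.shape)  # torch.Size([1, 1024])
```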
## Citation
If you use this model, please cite the following paper:
```
@inproceedings{yang-gpt3,
    title = {Jönnek a nagyok! GPT-3, GPT-2 és BERT large nyelvmodellek magyar nyelvre},
    booktitle = {XIX. Magyar Számítógépes Nyelvészeti Konferencia (MSZNY 2023)},
    year = {2023},
    publisher = {Szegedi Tudományegyetem},
    address = {Szeged, Hungary},
    author = {Yang, Zijian Győző},
    pages = {0}
}
```
## Usage
```python
from transformers import GPT2Tokenizer, GPT2Model

tokenizer = GPT2Tokenizer.from_pretrained('NYTK/PULI-GPT-2')
model = GPT2Model.from_pretrained('NYTK/PULI-GPT-2')

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
# GPT2Model is the bare transformer without a language-modeling head:
# `output` contains hidden states (e.g. for feature extraction), not generated text.
output = model(**encoded_input)
```
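`GPT2Model` above only produces hidden states; for open-ended generation outside the pipeline API, the same checkpoint can be loaded with a language-modeling head. Below is a minimal sketch using the standard `generate` method of `transformers` (the sampling settings are illustrative, not values published by the authors):

```python
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained('NYTK/PULI-GPT-2')
model = GPT2LMHeadModel.from_pretrained('NYTK/PULI-GPT-2')

prompt = "Elmesélek egy történetet a nyelvtechnológiáról."
input_ids = tokenizer(prompt, return_tensors='pt').input_ids
# Illustrative sampling settings; GPT-2 defines no pad token, so reuse EOS.
output_ids = model.generate(
    input_ids,
    max_length=100,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```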
## Usage with pipeline
```python
from transformers import pipeline

# "Let me tell a story about language technology."
prompt = "Elmesélek egy történetet a nyelvtechnológiáról."
# The text-generation pipeline handles tokenization, generation and decoding.
generator = pipeline(task="text-generation", model="NYTK/PULI-GPT-2")
print(generator(prompt)[0]["generated_text"])
```
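The pipeline also accepts the usual `generate` keyword arguments (for example `max_length`, `do_sample`, `top_k`) directly in the call, so sampling behaviour can be adjusted without dropping down to the model API.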