---
license: apache-2.0
---

## Model description

Cased fine-tuned BERT model for Hungarian, trained on a dataset...

## Intended uses & limitations

The model can be used like any other (cased) BERT model. It has been tested on recognizing ..., where:

*
*

## Training

Fine-tuned version of the original huBERT model (`SZTAKI-HLT/hubert-base-cc`), trained on ...

## Eval results

| Class | Precision | Recall | F-score |
|-------|-----------|--------|---------|

## Usage

```py
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("")
model = AutoModelForSequenceClassification.from_pretrained("")
```
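The snippet above only loads the tokenizer and model. As a minimal sketch of the step that usually follows, the plain-Python softmax below turns a classification head's raw logits into class probabilities; the logit values are made-up placeholders, not outputs of this model.

```python
import math

def softmax(logits):
    # Numerically stable softmax: shift by the max logit before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for a two-class head (illustrative values only).
logits = [2.0, -1.0]
probs = softmax(logits)
predicted = probs.index(max(probs))  # index of the highest-probability class
```

With `transformers` and PyTorch, the equivalent step on a real model output is `outputs.logits.softmax(dim=-1)`.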

### BibTeX entry and citation info

If you use the model, please cite the following paper:

```bibtex
@{
}
```