poltextlab committed Update README.md (commit dae7392, parent 4892a57)

README.md CHANGED
## Model description

Cased fine-tuned BERT model for Hungarian, trained on (manually annotated) parliamentary pre-agenda speeches scraped from `parlament.hu`.

## Intended uses & limitations

The model can be used like any other (cased) BERT model. It has been tested on recognizing positive, negative, and neutral sentences in (parliamentary) pre-agenda speeches, where:

* `Label_0`: Neutral
* `Label_1`: Positive
* `Label_2`: Negative
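A minimal sketch (not part of the model card) of how the raw `Label_0`/`Label_1`/`Label_2` outputs above can be mapped to readable sentiment names. Loading the model itself would use the `transformers` text-classification pipeline with the model's Hub repository id, which is not stated here, so that step is left as a comment.

```python
# Map the classifier's raw label ids to the sentiment names listed above.
# To get predictions, one would typically load the model first, e.g.:
#   from transformers import pipeline
#   clf = pipeline("text-classification", model=<repo id>)  # repo id not stated in the card
#   prediction = clf("some Hungarian sentence")[0]

LABEL_NAMES = {
    "Label_0": "Neutral",
    "Label_1": "Positive",
    "Label_2": "Negative",
}

def readable_label(prediction):
    """Convert one pipeline output dict, e.g. {'label': 'Label_1', 'score': 0.93},
    to its human-readable sentiment name."""
    return LABEL_NAMES[prediction["label"]]

print(readable_label({"label": "Label_2", "score": 0.88}))  # Negative
```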
## Training

A fine-tuned version of the original huBERT model (`SZTAKI-HLT/hubert-base-cc`), trained on the HunEmPoli corpus.

## Eval results