GuCuChiara committed
Commit be1bcda
1 Parent(s): 2b86173

Training complete

Files changed (1)
  1. README.md +14 -11
README.md CHANGED
@@ -20,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.1367
- - Precision: 0.5765
- - Recall: 0.4800
- - F1: 0.5238
- - Accuracy: 0.9535
+ - Loss: 0.1620
+ - Precision: 0.6121
+ - Recall: 0.5161
+ - F1: 0.5600
+ - Accuracy: 0.9541
 
  ## Model description
 
@@ -49,20 +49,23 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 3
+ - num_epochs: 6
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
  |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | No log | 1.0 | 71 | 0.1588 | 0.4782 | 0.3308 | 0.3911 | 0.9408 |
- | No log | 2.0 | 142 | 0.1397 | 0.5760 | 0.4213 | 0.4866 | 0.9517 |
- | No log | 3.0 | 213 | 0.1367 | 0.5765 | 0.4800 | 0.5238 | 0.9535 |
+ | No log | 1.0 | 71 | 0.1704 | 0.4558 | 0.3635 | 0.4045 | 0.9353 |
+ | No log | 2.0 | 142 | 0.1572 | 0.5925 | 0.3518 | 0.4415 | 0.9433 |
+ | No log | 3.0 | 213 | 0.1386 | 0.5932 | 0.4774 | 0.5290 | 0.9531 |
+ | No log | 4.0 | 284 | 0.1427 | 0.5945 | 0.5175 | 0.5534 | 0.9533 |
+ | No log | 5.0 | 355 | 0.1653 | 0.6354 | 0.4788 | 0.5461 | 0.9540 |
+ | No log | 6.0 | 426 | 0.1620 | 0.6121 | 0.5161 | 0.5600 | 0.9541 |
 
 
  ### Framework versions
 
- - Transformers 4.33.2
+ - Transformers 4.34.0
  - Pytorch 2.0.1+cu118
  - Datasets 2.14.5
- - Tokenizers 0.13.3
+ - Tokenizers 0.14.1
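
For reference, the hyperparameters listed in the second hunk map onto Hugging Face `TrainingArguments` roughly as sketched below. Only the seed, the Adam betas/epsilon, the scheduler type, and the epoch count come from this model card; the output path, learning rate, and batch size are hypothetical placeholders, and evaluating once per epoch is an assumption based on the results table.

```python
from transformers import TrainingArguments

# Minimal sketch of the configuration described in the README diff above.
# Values marked "placeholder" are NOT taken from this commit.
training_args = TrainingArguments(
    output_dir="out",                # placeholder
    learning_rate=2e-5,              # placeholder; not shown in this hunk
    per_device_train_batch_size=16,  # placeholder; not shown in this hunk
    num_train_epochs=6,              # raised from 3 to 6 in this commit
    lr_scheduler_type="linear",      # from the model card
    adam_beta1=0.9,                  # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # epsilon=1e-08
    seed=42,                         # from the model card
    evaluation_strategy="epoch",     # assumption: metrics are reported once per epoch
)
```

With 71 optimizer steps per epoch, six epochs give the 426 total steps shown in the final row of the results table.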