---
license: apache-2.0
tags:
- generated_from_trainer
base_model: facebook/bart-large
metrics:
- accuracy
- precision
- recall
model-index:
- name: bart-base-lora
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bart-base-lora

This model is a fine-tuned version of [facebook/bart-large](https://huggingface.co/facebook/bart-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6884
- Accuracy: 0.8172
- Precision: 0.8132
- Recall: 0.8172
- Precision Macro: 0.7584
- Recall Macro: 0.7412
- Macro Fpr: 0.0164
- Weighted Fpr: 0.0157
- Weighted Specificity: 0.9755
- Macro Specificity: 0.9862
- Weighted Sensitivity: 0.8172
- Macro Sensitivity: 0.7412
- F1 Micro: 0.8172
- F1 Macro: 0.7417
- F1 Weighted: 0.8124
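
The FPR and specificity aggregates above are not standard scikit-learn outputs. A minimal sketch of how such values can be derived from a multi-class confusion matrix, assuming `y_true`/`y_pred` are flat label arrays (the helper name is illustrative, not taken from the actual training script):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def specificity_fpr(y_true, y_pred):
    """Per-class FPR, specificity and sensitivity via one-vs-rest counts."""
    cm = confusion_matrix(y_true, y_pred)
    tp = np.diag(cm)                 # true positives per class
    fn = cm.sum(axis=1) - tp         # false negatives per class
    fp = cm.sum(axis=0) - tp         # false positives per class
    tn = cm.sum() - (tp + fn + fp)   # true negatives per class
    per_class = {
        "fpr": fp / (fp + tn),
        "specificity": tn / (tn + fp),
        "sensitivity": tp / (tp + fn),
    }
    support = cm.sum(axis=1)
    macro = {k: v.mean() for k, v in per_class.items()}
    weighted = {k: np.average(v, weights=support) for k, v in per_class.items()}
    return macro, weighted
```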

## Model description

More information needed

## Intended uses & limitations

More information needed
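
A minimal inference sketch for an adapter like this one, assuming it was trained with `peft` for multi-class sequence classification and published under the placeholder repo id `your-username/bart-base-lora` (the real repo id and label count are not stated in this card):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

base_id = "facebook/bart-large"
adapter_id = "your-username/bart-base-lora"  # placeholder; substitute the real repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
# num_labels must match the fine-tuning setup; the card does not record it.
base = AutoModelForSequenceClassification.from_pretrained(base_id, num_labels=10)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

inputs = tokenizer("Example text to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```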

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
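
These values map directly onto `transformers.TrainingArguments`; the sketch below reconstructs that configuration. The `LoraConfig` settings are assumptions, since the card records only the Trainer hyperparameters, not the adapter ones:

```python
from transformers import AutoModelForSequenceClassification, TrainingArguments
from peft import LoraConfig, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained("facebook/bart-large")
# LoRA settings are illustrative; they are not recorded in this card.
model = get_peft_model(
    base, LoraConfig(task_type="SEQ_CLS", r=8, lora_alpha=16, lora_dropout=0.1)
)

args = TrainingArguments(
    output_dir="bart-base-lora",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,  # effective train batch size: 8 * 4 = 32
    num_train_epochs=15,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,                 # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```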

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 160  | 0.9525          | 0.7157   | 0.6788    | 0.7157 | 0.3875          | 0.4416       | 0.0285    | 0.0276       | 0.9642               | 0.9787            | 0.7157               | 0.4416            | 0.7157   | 0.3958   | 0.6835      |
| No log        | 2.0   | 321  | 0.7733          | 0.7413   | 0.7296    | 0.7413 | 0.4491          | 0.4687       | 0.0252    | 0.0243       | 0.9668               | 0.9805            | 0.7413               | 0.4687            | 0.7413   | 0.4337   | 0.7231      |
| No log        | 3.0   | 482  | 0.7105          | 0.7738   | 0.7631    | 0.7738 | 0.5565          | 0.5408       | 0.0212    | 0.0205       | 0.9725               | 0.9831            | 0.7738               | 0.5408            | 0.7738   | 0.5271   | 0.7611      |
| 1.08          | 4.0   | 643  | 0.7539          | 0.7576   | 0.7584    | 0.7576 | 0.5791          | 0.5613       | 0.0234    | 0.0223       | 0.9681               | 0.9817            | 0.7576               | 0.5613            | 0.7576   | 0.5497   | 0.7438      |
| 1.08          | 5.0   | 803  | 0.6978          | 0.7831   | 0.7900    | 0.7831 | 0.7410          | 0.6492       | 0.0203    | 0.0194       | 0.9710               | 0.9836            | 0.7831               | 0.6492            | 0.7831   | 0.6354   | 0.7703      |
| 1.08          | 6.0   | 964  | 0.5920          | 0.8156   | 0.8053    | 0.8156 | 0.7051          | 0.6889       | 0.0166    | 0.0159       | 0.9746               | 0.9860            | 0.8156               | 0.6889            | 0.8156   | 0.6860   | 0.8088      |
| 0.5581        | 7.0   | 1125 | 0.6231          | 0.8187   | 0.8178    | 0.8187 | 0.7627          | 0.7425       | 0.0162    | 0.0156       | 0.9766               | 0.9864            | 0.8187               | 0.7425            | 0.8187   | 0.7393   | 0.8147      |
| 0.5581        | 8.0   | 1286 | 0.6291          | 0.8141   | 0.8134    | 0.8141 | 0.7636          | 0.7307       | 0.0167    | 0.0160       | 0.9758               | 0.9860            | 0.8141               | 0.7307            | 0.8141   | 0.7329   | 0.8089      |
| 0.5581        | 9.0   | 1446 | 0.6226          | 0.8242   | 0.8212    | 0.8242 | 0.7666          | 0.7340       | 0.0158    | 0.0150       | 0.9760               | 0.9867            | 0.8242               | 0.7340            | 0.8242   | 0.7365   | 0.8191      |
| 0.3924        | 10.0  | 1607 | 0.6728          | 0.8110   | 0.8123    | 0.8110 | 0.7418          | 0.7289       | 0.0170    | 0.0164       | 0.9762               | 0.9858            | 0.8110               | 0.7289            | 0.8110   | 0.7240   | 0.8048      |
| 0.3924        | 11.0  | 1768 | 0.6805          | 0.8095   | 0.8123    | 0.8095 | 0.7390          | 0.7303       | 0.0173    | 0.0165       | 0.9752               | 0.9856            | 0.8095               | 0.7303            | 0.8095   | 0.7263   | 0.8026      |
| 0.3924        | 12.0  | 1929 | 0.6710          | 0.8133   | 0.8137    | 0.8133 | 0.7396          | 0.7306       | 0.0168    | 0.0161       | 0.9759               | 0.9859            | 0.8133               | 0.7306            | 0.8133   | 0.7284   | 0.8090      |
| 0.2929        | 13.0  | 2089 | 0.6740          | 0.8187   | 0.8170    | 0.8187 | 0.7644          | 0.7360       | 0.0162    | 0.0156       | 0.9761               | 0.9863            | 0.8187               | 0.7360            | 0.8187   | 0.7368   | 0.8151      |
| 0.2929        | 14.0  | 2250 | 0.6823          | 0.8180   | 0.8159    | 0.8180 | 0.7657          | 0.7336       | 0.0164    | 0.0156       | 0.9753               | 0.9862            | 0.8180               | 0.7336            | 0.8180   | 0.7361   | 0.8137      |
| 0.2929        | 14.93 | 2400 | 0.6884          | 0.8172   | 0.8132    | 0.8172 | 0.7584          | 0.7412       | 0.0164    | 0.0157       | 0.9755               | 0.9862            | 0.8172               | 0.7412            | 0.8172   | 0.7417   | 0.8124      |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.19.0
- Tokenizers 0.15.1