---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
base_model: law-ai/InLegalBERT
model-index:
- name: legal-bert-lora-no-grad
  results: []
---

# legal-bert-lora-no-grad

This model is a fine-tuned version of [law-ai/InLegalBERT](https://huggingface.co/law-ai/InLegalBERT) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5075
- Accuracy: 0.8280
- Precision: 0.8290
- Recall: 0.8280
- Precision Macro: 0.7852
- Recall Macro: 0.7756
- Macro FPR: 0.0151
- Weighted FPR: 0.0145
- Weighted Specificity: 0.9775
- Macro Specificity: 0.9871
- Weighted Sensitivity: 0.8288
- Macro Sensitivity: 0.7756
- F1 Micro: 0.8288
- F1 Macro: 0.7761
- F1 Weighted: 0.8279

## Model description

Judging by the model name and its base checkpoint, this appears to be a LoRA-adapted fine-tune of [law-ai/InLegalBERT](https://huggingface.co/law-ai/InLegalBERT) for multi-class legal text classification. Further details (adapter configuration, label set) have not been documented.

## Intended uses & limitations

More information needed
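
Pending official usage instructions, here is a minimal inference sketch. It assumes the repository hosts a full fine-tuned sequence-classification checkpoint; the repo id below is a placeholder. If only a LoRA adapter is published, it would instead be attached to the base model with `peft.PeftModel.from_pretrained`.

```python
# Hedged inference sketch. Assumptions: a full checkpoint at a placeholder
# repo id; adjust the loading step if only a LoRA adapter is distributed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "your-username/legal-bert-lora-no-grad"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "The appellant contends that the lower court misapplied the statute."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred_id, pred_id))
```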

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

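The training script itself is not published; as a hedged illustration, the hyperparameters above map onto a `TrainingArguments` configuration roughly like the following (the `output_dir` is a placeholder, and per-epoch evaluation is inferred from the results table below, not documented):

```python
# Sketch of a TrainingArguments setup matching the listed hyperparameters.
# output_dir is a placeholder; evaluation_strategy is inferred, not documented.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="legal-bert-lora-no-grad",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # the table below reports one eval per epoch
)
```
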
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro FPR | Weighted FPR | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| 1.6412        | 1.0   | 643   | 0.7925          | 0.7514   | 0.7190    | 0.7514 | 0.4123          | 0.4707       | 0.0237    | 0.0231       | 0.9699               | 0.9814            | 0.7514               | 0.4707            | 0.7514   | 0.4277   | 0.7283      |
| 0.7481        | 2.0   | 1286  | 0.6772          | 0.7901   | 0.7726    | 0.7901 | 0.5958          | 0.6252       | 0.0192    | 0.0186       | 0.9741               | 0.9843            | 0.7901               | 0.6252            | 0.7901   | 0.5998   | 0.7769      |
| 0.6465        | 3.0   | 1929  | 0.6500          | 0.8048   | 0.7931    | 0.8048 | 0.6216          | 0.6414       | 0.0176    | 0.0170       | 0.9764               | 0.9854            | 0.8048               | 0.6414            | 0.8048   | 0.6110   | 0.7904      |
| 0.4707        | 4.0   | 2572  | 0.6704          | 0.8095   | 0.8008    | 0.8095 | 0.6322          | 0.6689       | 0.0173    | 0.0165       | 0.9745               | 0.9856            | 0.8095               | 0.6689            | 0.8095   | 0.6425   | 0.8018      |
| 0.4021        | 5.0   | 3215  | 0.7320          | 0.8280   | 0.8269    | 0.8280 | 0.7782          | 0.7573       | 0.0154    | 0.0146       | 0.9765               | 0.9870            | 0.8280               | 0.7573            | 0.8280   | 0.7571   | 0.8219      |
| 0.3627        | 6.0   | 3858  | 0.6892          | 0.8242   | 0.8227    | 0.8242 | 0.7431          | 0.7365       | 0.0156    | 0.0150       | 0.9768               | 0.9867            | 0.8242               | 0.7365            | 0.8242   | 0.7374   | 0.8223      |
| 0.2866        | 7.0   | 4501  | 0.8756          | 0.8180   | 0.8171    | 0.8180 | 0.7748          | 0.7410       | 0.0166    | 0.0156       | 0.9718               | 0.9860            | 0.8180               | 0.7410            | 0.8180   | 0.7444   | 0.8122      |
| 0.2639        | 8.0   | 5144  | 0.8580          | 0.8265   | 0.8259    | 0.8265 | 0.7989          | 0.7428       | 0.0155    | 0.0148       | 0.9756               | 0.9868            | 0.8265               | 0.7428            | 0.8265   | 0.7480   | 0.8217      |
| 0.2295        | 9.0   | 5787  | 0.9366          | 0.8257   | 0.8231    | 0.8257 | 0.7725          | 0.7465       | 0.0155    | 0.0149       | 0.9762               | 0.9868            | 0.8257               | 0.7465            | 0.8257   | 0.7524   | 0.8223      |
| 0.195         | 10.0  | 6430  | 0.9685          | 0.8273   | 0.8236    | 0.8273 | 0.7595          | 0.7515       | 0.0153    | 0.0147       | 0.9767               | 0.9869            | 0.8273               | 0.7515            | 0.8273   | 0.7528   | 0.8241      |
| 0.1617        | 11.0  | 7073  | 1.0406          | 0.8311   | 0.8263    | 0.8311 | 0.7615          | 0.7552       | 0.0149    | 0.0143       | 0.9776               | 0.9872            | 0.8311               | 0.7552            | 0.8311   | 0.7543   | 0.8265      |
| 0.1421        | 12.0  | 7716  | 1.0713          | 0.8319   | 0.8276    | 0.8319 | 0.7626          | 0.7533       | 0.0148    | 0.0142       | 0.9773               | 0.9873            | 0.8319               | 0.7533            | 0.8319   | 0.7546   | 0.8287      |
| 0.1184        | 13.0  | 8359  | 1.1125          | 0.8257   | 0.8209    | 0.8257 | 0.7569          | 0.7504       | 0.0155    | 0.0149       | 0.9765               | 0.9868            | 0.8257               | 0.7504            | 0.8257   | 0.7510   | 0.8219      |
| 0.1017        | 14.0  | 9002  | 1.1926          | 0.8211   | 0.8215    | 0.8211 | 0.7675          | 0.7815       | 0.0159    | 0.0153       | 0.9776               | 0.9866            | 0.8211               | 0.7815            | 0.8211   | 0.7727   | 0.8196      |
| 0.0752        | 15.0  | 9645  | 1.2508          | 0.8164   | 0.8121    | 0.8164 | 0.7479          | 0.7377       | 0.0164    | 0.0158       | 0.9753               | 0.9861            | 0.8164               | 0.7377            | 0.8164   | 0.7402   | 0.8133      |
| 0.0787        | 16.0  | 10288 | 1.3247          | 0.8218   | 0.8199    | 0.8218 | 0.8034          | 0.7585       | 0.0160    | 0.0152       | 0.9752               | 0.9865            | 0.8218               | 0.7585            | 0.8218   | 0.7698   | 0.8188      |
| 0.0668        | 17.0  | 10931 | 1.3497          | 0.8211   | 0.8201    | 0.8211 | 0.7500          | 0.7487       | 0.0158    | 0.0153       | 0.9778               | 0.9866            | 0.8211               | 0.7487            | 0.8211   | 0.7468   | 0.8198      |
| 0.0471        | 18.0  | 11574 | 1.4278          | 0.8164   | 0.8174    | 0.8164 | 0.7672          | 0.7670       | 0.0165    | 0.0158       | 0.9759               | 0.9862            | 0.8164               | 0.7670            | 0.8164   | 0.7644   | 0.8159      |
| 0.0492        | 19.0  | 12217 | 1.4784          | 0.8180   | 0.8178    | 0.8180 | 0.7631          | 0.7431       | 0.0162    | 0.0156       | 0.9763               | 0.9863            | 0.8180               | 0.7431            | 0.8180   | 0.7453   | 0.8156      |
| 0.0368        | 20.0  | 12860 | 1.4747          | 0.8195   | 0.8183    | 0.8195 | 0.7729          | 0.7568       | 0.0161    | 0.0155       | 0.9760               | 0.9864            | 0.8195               | 0.7568            | 0.8195   | 0.7622   | 0.8180      |
| 0.0329        | 21.0  | 13503 | 1.5075          | 0.8280   | 0.8290    | 0.8280 | 0.7825          | 0.7845       | 0.0152    | 0.0146       | 0.9782               | 0.9871            | 0.8280               | 0.7845            | 0.8280   | 0.7798   | 0.8268      |
| 0.0266        | 22.0  | 14146 | 1.4783          | 0.8273   | 0.8262    | 0.8273 | 0.7780          | 0.7612       | 0.0153    | 0.0147       | 0.9779               | 0.9870            | 0.8273               | 0.7612            | 0.8273   | 0.7651   | 0.8247      |
| 0.0302        | 23.0  | 14789 | 1.5281          | 0.8234   | 0.8246    | 0.8234 | 0.7745          | 0.7699       | 0.0158    | 0.0151       | 0.9760               | 0.9866            | 0.8234               | 0.7699            | 0.8234   | 0.7679   | 0.8224      |
| 0.0207        | 24.0  | 15432 | 1.5475          | 0.8265   | 0.8262    | 0.8265 | 0.7809          | 0.7727       | 0.0155    | 0.0148       | 0.9768               | 0.9869            | 0.8265               | 0.7727            | 0.8265   | 0.7721   | 0.8248      |
| 0.0168        | 25.0  | 16075 | 1.5237          | 0.8242   | 0.8237    | 0.8242 | 0.7726          | 0.7619       | 0.0155    | 0.0150       | 0.9775               | 0.9868            | 0.8242               | 0.7619            | 0.8242   | 0.7629   | 0.8231      |
| 0.0167        | 26.0  | 16718 | 1.5815          | 0.8234   | 0.8255    | 0.8234 | 0.7766          | 0.7728       | 0.0156    | 0.0151       | 0.9775               | 0.9867            | 0.8234               | 0.7728            | 0.8234   | 0.7707   | 0.8232      |
| 0.0127        | 27.0  | 17361 | 1.6010          | 0.8218   | 0.8228    | 0.8218 | 0.7790          | 0.7716       | 0.0158    | 0.0152       | 0.9769               | 0.9866            | 0.8218               | 0.7716            | 0.8218   | 0.7709   | 0.8211      |
| 0.0094        | 28.0  | 18004 | 1.5774          | 0.8265   | 0.8269    | 0.8265 | 0.7788          | 0.7739       | 0.0153    | 0.0148       | 0.9778               | 0.9870            | 0.8265               | 0.7739            | 0.8265   | 0.7728   | 0.8258      |
| 0.0063        | 29.0  | 18647 | 1.5894          | 0.8304   | 0.8306    | 0.8304 | 0.7825          | 0.7764       | 0.0150    | 0.0144       | 0.9779               | 0.9872            | 0.8304               | 0.7764            | 0.8304   | 0.7759   | 0.8296      |
| 0.0126        | 30.0  | 19290 | 1.5927          | 0.8288   | 0.8291    | 0.8288 | 0.7852          | 0.7756       | 0.0151    | 0.0145       | 0.9775               | 0.9871            | 0.8288               | 0.7756            | 0.8288   | 0.7761   | 0.8279      |

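Several of the table's columns (specificity, FPR) are per-class one-vs-rest quantities averaged across classes. Below is a hedged sketch of how the macro variants can be computed with scikit-learn, using illustrative labels rather than this model's actual predictions:

```python
# Macro specificity and macro FPR from a multi-class confusion matrix,
# via one-vs-rest counts. Labels below are illustrative only.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = [0, 1, 2, 2, 1, 0, 2]  # illustrative ground truth
y_pred = [0, 2, 2, 2, 1, 0, 1]  # illustrative predictions

cm = confusion_matrix(y_true, y_pred)  # rows = true, columns = predicted
tp = np.diag(cm)
fp = cm.sum(axis=0) - tp               # predicted as class c, actually another
fn = cm.sum(axis=1) - tp               # actually class c, predicted otherwise
tn = cm.sum() - (tp + fp + fn)

specificity = tn / (tn + fp)           # per-class true negative rate
fpr = fp / (fp + tn)                   # per-class false positive rate

print("Macro specificity:", specificity.mean())
print("Macro FPR:", fpr.mean())        # equals 1 - macro specificity
```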

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1