|
--- |
|
license: apache-2.0 |
|
base_model: bert-base-cased |
|
tags: |
|
- generated_from_trainer |
|
metrics: |
|
- precision |
|
- recall |
|
- f1 |
|
- accuracy |
|
model-index: |
|
- name: bert-base-cased-finetuned-ner-cadec |
|
results: [] |
|
--- |
|
|
|
|
|
|
# bert-base-cased-finetuned-ner-cadec |
|
|
|
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the CADEC (CSIRO Adverse Drug Event Corpus) dataset.
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.3476 |
|
- Precision: 0.5870 |
|
- Recall: 0.6866 |
|
- F1: 0.6329 |
|
- Accuracy: 0.9193 |
|
- Adr Precision: 0.5614 |
|
- Adr Recall: 0.6881 |
|
- Adr F1: 0.6183 |
|
- Disease Precision: 0.0 |
|
- Disease Recall: 0.0 |
|
- Disease F1: 0.0 |
|
- Drug Precision: 0.8988 |
|
- Drug Recall: 0.9152 |
|
- Drug F1: 0.9069 |
|
- Finding Precision: 0.2295 |
|
- Finding Recall: 0.3111 |
|
- Finding F1: 0.2642 |
|
- Symptom Precision: 0.4762 |
|
- Symptom Recall: 0.3704 |
|
- Symptom F1: 0.4167 |
|
- B-adr Precision: 0.7133 |
|
- B-adr Recall: 0.8119 |
|
- B-adr F1: 0.7594 |
|
- B-disease Precision: 0.0 |
|
- B-disease Recall: 0.0 |
|
- B-disease F1: 0.0 |
|
- B-drug Precision: 0.9639 |
|
- B-drug Recall: 0.9697 |
|
- B-drug F1: 0.9668 |
|
- B-finding Precision: 0.3469 |
|
- B-finding Recall: 0.3778 |
|
- B-finding F1: 0.3617 |
|
- B-symptom Precision: 0.7857 |
|
- B-symptom Recall: 0.4400
|
- B-symptom F1: 0.5641 |
|
- I-adr Precision: 0.5799 |
|
- I-adr Recall: 0.6991 |
|
- I-adr F1: 0.6340 |
|
- I-disease Precision: 0.0 |
|
- I-disease Recall: 0.0 |
|
- I-disease F1: 0.0 |
|
- I-drug Precision: 0.9042 |
|
- I-drug Recall: 0.9152 |
|
- I-drug F1: 0.9096 |
|
- I-finding Precision: 0.2979 |
|
- I-finding Recall: 0.3684 |
|
- I-finding F1: 0.3294 |
|
- I-symptom Precision: 0.3333 |
|
- I-symptom Recall: 0.2000

- I-symptom F1: 0.2500
|
- Macro Avg F1: 0.4775 |
|
- Weighted Avg F1: 0.7087 |
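
The per-type scores above (Adr, Disease, Drug, Finding, Symptom) are entity-level metrics: a predicted span counts as correct only if both its type and its exact boundaries match the gold annotation (the convention used by `seqeval`, which `Trainer`-based NER setups typically rely on). A minimal self-contained sketch of that computation over BIO tag sequences:

```python
def extract_entities(tags):
    """Collect (type, start, end) spans from a BIO tag sequence,
    e.g. ["B-ADR", "I-ADR", "O"] -> [("ADR", 0, 2)]."""
    entities, etype, start = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes a trailing span
        continues = tag.startswith("I-") and tag[2:] == etype
        if etype is not None and not continues:
            entities.append((etype, start, i))
            etype = None
        if tag.startswith("B-"):
            etype, start = tag[2:], i
    return entities

def span_scores(pred, gold):
    """Entity-level precision, recall, and F1: a span is a true positive
    only when its type AND exact boundaries both match the gold span."""
    p, g = set(extract_entities(pred)), set(extract_entities(gold))
    tp = len(p & g)
    prec = tp / len(p) if p else 0.0
    rec = tp / len(g) if g else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return prec, rec, f1
```

For example, predicting only the first token of a two-token ADR span yields zero credit for that entity, which is why entity-level F1 is stricter than the token-level (B-/I-) scores also reported above.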
|
|
|
## Model description |
|
|
|
This model is [bert-base-cased](https://huggingface.co/bert-base-cased) with a token-classification head, fine-tuned for named-entity recognition over patient-authored drug reviews. It tags five entity types using a BIO scheme: adverse drug reaction (ADR), Disease, Drug, Finding, and Symptom.
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for extracting drug names and patient-reported effects from informal medical text. Per-type performance on the evaluation set is uneven: Drug entities are recognized reliably (F1 0.91), ADR entities moderately well (F1 0.62), while Disease (F1 0.0), Finding (F1 0.26), and Symptom (F1 0.42) predictions are unreliable. The model should not be used for clinical decision-making without further validation.
|
|
|
## Training and evaluation data |
|
|
|
The model was fine-tuned and evaluated on CADEC (CSIRO Adverse Drug Event Corpus), a corpus of posts from the AskaPatient medical forum annotated with ADR, Disease, Drug, Finding, and Symptom entities. The exact train/validation split used here is not recorded.
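
A standard preprocessing step when fine-tuning BERT for NER (a sketch of the usual recipe, not necessarily this model's exact code): word-level BIO labels must be aligned to WordPiece subtokens, with special tokens and continuation pieces masked to `-100` so the cross-entropy loss ignores them:

```python
def align_labels(word_ids, word_labels, ignore_index=-100):
    """Map word-level BIO labels onto subword tokens.

    word_ids: per-token word index from a fast tokenizer (None for
    special tokens such as [CLS]/[SEP]).  Only the first subtoken of
    each word keeps its label; the rest receive ignore_index so the
    loss is computed once per word.
    """
    labels, prev = [], None
    for wid in word_ids:
        if wid is None:
            labels.append(ignore_index)      # special token
        elif wid != prev:
            labels.append(word_labels[wid])  # first subtoken of the word
        else:
            labels.append(ignore_index)      # continuation subtoken
        prev = wid
    return labels
```

For instance, if "Lipitor" splits into two WordPieces, only the first piece carries the `B-Drug` label and the second is masked out.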
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 2e-05 |
|
- train_batch_size: 8 |
|
- eval_batch_size: 8 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- num_epochs: 10 |
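
With 127 steps per epoch and 10 epochs (1270 total steps, per the table below), the linear scheduler decays the learning rate from 2e-05 to 0 over training. A small sketch of that schedule, assuming no warmup since the card lists none:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linearly warm up for warmup_steps, then decay to 0 at total_steps
    (matching lr_scheduler_type: linear with the hyperparameters above)."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

Halfway through training (step 635) the learning rate has dropped to 1e-05, and it reaches 0 at step 1270.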
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | B-adr Precision | B-adr Recall | B-adr F1 | B-disease Precision | B-disease Recall | B-disease F1 | B-drug Precision | B-drug Recall | B-drug F1 | B-finding Precision | B-finding Recall | B-finding F1 | B-symptom Precision | B-symptom Recall | B-symptom F1 | I-adr Precision | I-adr Recall | I-adr F1 | I-disease Precision | I-disease Recall | I-disease F1 | I-drug Precision | I-drug Recall | I-drug F1 | I-finding Precision | I-finding Recall | I-finding F1 | I-symptom Precision | I-symptom Recall | I-symptom F1 | Macro Avg F1 | Weighted Avg F1 | |
|
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:-------------:|:----------:|:------:|:-----------------:|:--------------:|:----------:|:--------------:|:-----------:|:-------:|:-----------------:|:--------------:|:----------:|:-----------------:|:--------------:|:----------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:------------:|:---------------:| |
|
| No log | 1.0 | 127 | 0.2830 | 0.4796 | 0.6005 | 0.5333 | 0.9082 | 0.4248 | 0.6220 | 0.5048 | 0.0 | 0.0 | 0.0 | 0.7966 | 0.8545 | 0.8246 | 0.1 | 0.0222 | 0.0364 | 0.0 | 0.0 | 0.0 | 0.6122 | 0.7908 | 0.6901 | 0.0 | 0.0 | 0.0 | 0.9157 | 0.9212 | 0.9184 | 0.5714 | 0.0889 | 0.1538 | 0.0 | 0.0 | 0.0 | 0.4687 | 0.6472 | 0.5436 | 0.0556 | 0.0625 | 0.0588 | 0.8161 | 0.8606 | 0.8378 | 0.2857 | 0.0526 | 0.0889 | 0.0 | 0.0 | 0.0 | 0.3291 | 0.6177 | |
|
| No log | 2.0 | 254 | 0.2472 | 0.5073 | 0.6092 | 0.5536 | 0.9125 | 0.4913 | 0.6183 | 0.5475 | 0.0227 | 0.0526 | 0.0317 | 0.8571 | 0.8727 | 0.8649 | 0.0984 | 0.1333 | 0.1132 | 0.0 | 0.0 | 0.0 | 0.7092 | 0.7582 | 0.7328 | 0.3333 | 0.0526 | 0.0909 | 0.9568 | 0.9394 | 0.9480 | 0.3542 | 0.3778 | 0.3656 | 0.0 | 0.0 | 0.0 | 0.5275 | 0.6429 | 0.5795 | 0.0714 | 0.1875 | 0.1034 | 0.8788 | 0.8788 | 0.8788 | 0.1667 | 0.1316 | 0.1471 | 0.0 | 0.0 | 0.0 | 0.3846 | 0.6615 | |
|
| No log | 3.0 | 381 | 0.2629 | 0.5733 | 0.6542 | 0.6111 | 0.9177 | 0.5495 | 0.6624 | 0.6007 | 0.075 | 0.1579 | 0.1017 | 0.8982 | 0.9091 | 0.9036 | 0.125 | 0.1111 | 0.1176 | 0.5 | 0.1852 | 0.2703 | 0.7105 | 0.7774 | 0.7424 | 0.2174 | 0.2632 | 0.2381 | 0.9578 | 0.9636 | 0.9607 | 0.2963 | 0.1778 | 0.2222 | 0.5 | 0.2 | 0.2857 | 0.5783 | 0.6797 | 0.6249 | 0.0882 | 0.1875 | 0.12 | 0.9146 | 0.9091 | 0.9119 | 0.2609 | 0.1579 | 0.1967 | 0.0 | 0.0 | 0.0 | 0.4303 | 0.6880 | |
|
| 0.2709 | 4.0 | 508 | 0.2630 | 0.5877 | 0.6567 | 0.6203 | 0.9177 | 0.5499 | 0.6569 | 0.5987 | 0.0 | 0.0 | 0.0 | 0.8922 | 0.9030 | 0.8976 | 0.2459 | 0.3333 | 0.2830 | 0.5 | 0.1481 | 0.2286 | 0.7219 | 0.7774 | 0.7486 | 0.0 | 0.0 | 0.0 | 0.9518 | 0.9576 | 0.9547 | 0.3061 | 0.3333 | 0.3191 | 0.5 | 0.16 | 0.2424 | 0.5759 | 0.6818 | 0.6244 | 0.0 | 0.0 | 0.0 | 0.9146 | 0.9091 | 0.9119 | 0.3333 | 0.4737 | 0.3913 | 0.0 | 0.0 | 0.0 | 0.4192 | 0.6923 | |
|
| 0.2709 | 5.0 | 635 | 0.2856 | 0.5714 | 0.6542 | 0.6100 | 0.9180 | 0.5455 | 0.6606 | 0.5975 | 0.075 | 0.1579 | 0.1017 | 0.9085 | 0.9030 | 0.9058 | 0.1667 | 0.1333 | 0.1481 | 0.3529 | 0.2222 | 0.2727 | 0.7284 | 0.7774 | 0.7521 | 0.1429 | 0.2105 | 0.1702 | 0.9693 | 0.9576 | 0.9634 | 0.2917 | 0.1556 | 0.2029 | 0.5 | 0.24 | 0.3243 | 0.5616 | 0.6905 | 0.6194 | 0.1176 | 0.25 | 0.1600 | 0.9202 | 0.9091 | 0.9146 | 0.25 | 0.1579 | 0.1935 | 0.5 | 0.15 | 0.2308 | 0.4531 | 0.6930 | |
|
| 0.2709 | 6.0 | 762 | 0.3053 | 0.5488 | 0.6529 | 0.5964 | 0.9140 | 0.5331 | 0.6642 | 0.5915 | 0.0 | 0.0 | 0.0 | 0.8976 | 0.9030 | 0.9003 | 0.0962 | 0.1111 | 0.1031 | 0.4667 | 0.2593 | 0.3333 | 0.7073 | 0.8023 | 0.7518 | 0.0 | 0.0 | 0.0 | 0.9636 | 0.9636 | 0.9636 | 0.2927 | 0.2667 | 0.2791 | 0.7273 | 0.32 | 0.4444 | 0.5554 | 0.6732 | 0.6086 | 0.1053 | 0.25 | 0.1481 | 0.9030 | 0.9030 | 0.9030 | 0.2222 | 0.1579 | 0.1846 | 0.6 | 0.15 | 0.24 | 0.4523 | 0.6902 | |
|
| 0.2709 | 7.0 | 889 | 0.3162 | 0.5816 | 0.6717 | 0.6234 | 0.9200 | 0.5605 | 0.6716 | 0.6110 | 0.0 | 0.0 | 0.0 | 0.9102 | 0.9212 | 0.9157 | 0.1607 | 0.2 | 0.1782 | 0.5 | 0.4074 | 0.4490 | 0.7207 | 0.8023 | 0.7593 | 0.1667 | 0.0526 | 0.08 | 0.9639 | 0.9697 | 0.9668 | 0.3261 | 0.3333 | 0.3297 | 0.6875 | 0.44 | 0.5366 | 0.5769 | 0.6818 | 0.6250 | 0.0385 | 0.0625 | 0.0476 | 0.9268 | 0.9212 | 0.9240 | 0.2 | 0.2105 | 0.2051 | 0.4545 | 0.25 | 0.3226 | 0.4797 | 0.7054 | |
|
| 0.0894 | 8.0 | 1016 | 0.3347 | 0.5935 | 0.6891 | 0.6378 | 0.9181 | 0.5595 | 0.6899 | 0.6179 | 0.0 | 0.0 | 0.0 | 0.8876 | 0.9091 | 0.8982 | 0.2712 | 0.3556 | 0.3077 | 0.5556 | 0.3704 | 0.4444 | 0.7167 | 0.8157 | 0.7630 | 0.0 | 0.0 | 0.0 | 0.9581 | 0.9697 | 0.9639 | 0.3404 | 0.3556 | 0.3478 | 0.8462 | 0.44 | 0.5789 | 0.5786 | 0.7013 | 0.6341 | 0.0 | 0.0 | 0.0 | 0.8929 | 0.9091 | 0.9009 | 0.3265 | 0.4211 | 0.3678 | 0.4444 | 0.2 | 0.2759 | 0.4832 | 0.7099 | |
|
| 0.0894 | 9.0 | 1143 | 0.3441 | 0.5813 | 0.6742 | 0.6243 | 0.9194 | 0.5549 | 0.6771 | 0.6099 | 0.0 | 0.0 | 0.0 | 0.8817 | 0.9030 | 0.8922 | 0.2182 | 0.2667 | 0.2400 | 0.5263 | 0.3704 | 0.4348 | 0.7197 | 0.8081 | 0.7613 | 0.0 | 0.0 | 0.0 | 0.9524 | 0.9697 | 0.9610 | 0.3478 | 0.3556 | 0.3516 | 0.8462 | 0.44 | 0.5789 | 0.5727 | 0.6905 | 0.6261 | 0.0 | 0.0 | 0.0 | 0.8976 | 0.9030 | 0.9003 | 0.2683 | 0.2895 | 0.2785 | 0.4 | 0.2 | 0.2667 | 0.4724 | 0.7041 | |
|
| 0.0894 | 10.0 | 1270 | 0.3476 | 0.5870 | 0.6866 | 0.6329 | 0.9193 | 0.5614 | 0.6881 | 0.6183 | 0.0 | 0.0 | 0.0 | 0.8988 | 0.9152 | 0.9069 | 0.2295 | 0.3111 | 0.2642 | 0.4762 | 0.3704 | 0.4167 | 0.7133 | 0.8119 | 0.7594 | 0.0 | 0.0 | 0.0 | 0.9639 | 0.9697 | 0.9668 | 0.3469 | 0.3778 | 0.3617 | 0.7857 | 0.44 | 0.5641 | 0.5799 | 0.6991 | 0.6340 | 0.0 | 0.0 | 0.0 | 0.9042 | 0.9152 | 0.9096 | 0.2979 | 0.3684 | 0.3294 | 0.3333 | 0.2 | 0.25 | 0.4775 | 0.7087 | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.35.2 |
|
- Pytorch 2.1.0+cu118 |
|
- Datasets 2.15.0 |
|
- Tokenizers 0.15.0 |
|
|