---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
datasets:
- wnut_17
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-finetuned-ner
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: wnut_17
      type: wnut_17
      config: wnut_17
      split: test
      args: wnut_17
    metrics:
    - name: Precision
      type: precision
      value: 0.5180180180180181
    - name: Recall
      type: recall
      value: 0.31974050046339203
    - name: F1
      type: f1
      value: 0.39541547277936967
    - name: Accuracy
      type: accuracy
      value: 0.9357035175879397
---
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the wnut_17 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4235
- Precision: 0.5180
- Recall: 0.3197
- F1: 0.3954
- Accuracy: 0.9357
- Corporation Precision: 0.2222
- Corporation Recall: 0.2121
- Corporation F1: 0.2171
- Creative-work Precision: 0.4462
- Creative-work Recall: 0.2042
- Creative-work F1: 0.2802
- Group Precision: 0.4030
- Group Recall: 0.1636
- Group F1: 0.2328
- Location Precision: 0.5161
- Location Recall: 0.4267
- Location F1: 0.4672
- Person Precision: 0.7747
- Person Recall: 0.4569
- Person F1: 0.5748
- Product Precision: 0.1596
- Product Recall: 0.1181
- Product F1: 0.1357
- B-corporation Precision: 0.3696
- B-corporation Recall: 0.2576
- B-corporation F1: 0.3036
- B-creative-work Precision: 0.75
- B-creative-work Recall: 0.2535
- B-creative-work F1: 0.3789
- B-group Precision: 0.5
- B-group Recall: 0.1636
- B-group F1: 0.2466
- B-location Precision: 0.6293
- B-location Recall: 0.4867
- B-location F1: 0.5489
- B-person Precision: 0.8608
- B-person Recall: 0.4755
- B-person F1: 0.6126
- B-product Precision: 0.4545
- B-product Recall: 0.1969
- B-product F1: 0.2747
- I-corporation Precision: 0.3333
- I-corporation Recall: 0.2727
- I-corporation F1: 0.3
- I-creative-work Precision: 0.4262
- I-creative-work Recall: 0.2016
- I-creative-work F1: 0.2737
- I-group Precision: 0.3478
- I-group Recall: 0.1416
- I-group F1: 0.2013
- I-location Precision: 0.5932
- I-location Recall: 0.3684
- I-location F1: 0.4545
- I-person Precision: 0.7625
- I-person Recall: 0.3631
- I-person F1: 0.4919
- I-product Precision: 0.2222
- I-product Recall: 0.1488
- I-product F1: 0.1782
## Model description
This model is [bert-base-cased](https://huggingface.co/bert-base-cased) with a token-classification head, fine-tuned for named entity recognition on the [wnut_17](https://huggingface.co/datasets/wnut_17) dataset (the WNUT 2017 shared task on novel and emerging entities). It tags six entity types with BIO-style labels: corporation, creative-work, group, location, person, and product.
## Intended uses & limitations
The model is intended for NER on English user-generated text of the kind found in wnut_17, where entities are often rare or previously unseen. Note the modest overall recall on the test split (0.3197): the model misses many entities, and it is weakest on the product and creative-work types (F1 of 0.1357 and 0.2802 respectively), so predictions for those types in particular should be treated with caution.
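A minimal usage sketch with the `transformers` pipeline API, assuming the model is published at `mireiaplalis/bert-finetuned-ner` (the id is inferred from this repository's location and may need adjusting):

```python
from transformers import pipeline

# Hypothetical Hub id; point this at your own checkout if it differs.
ner = pipeline(
    "token-classification",
    model="mireiaplalis/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge B-/I- word pieces into whole entities
)

print(ner("Taylor Swift played a surprise show at Madison Square Garden."))
# Returns a list of dicts with entity_group, score, word, start, and end.
```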
## Training and evaluation data
The model was fine-tuned on the train split of [wnut_17](https://huggingface.co/datasets/wnut_17). The per-epoch table below reports a validation loss, while the headline metrics in the metadata above are tagged as the test split.
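For orientation, a short sketch of loading the dataset with 🤗 Datasets; the printed label names are the 13 BIO tags wnut_17 ships with:

```python
from datasets import load_dataset

wnut = load_dataset("wnut_17")  # train / validation / test splits
label_names = wnut["train"].features["ner_tags"].feature.names
print(label_names)
# ['O', 'B-corporation', 'I-corporation', 'B-creative-work', 'I-creative-work',
#  'B-group', 'I-group', 'B-location', 'I-location', 'B-person', 'I-person',
#  'B-product', 'I-product']
```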
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
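A hedged sketch of how the values above map onto `transformers.TrainingArguments`; preprocessing (tokenization and label alignment) is elided, and `tokenized_wnut` is a placeholder for the pre-tokenized splits:

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased",
    num_labels=13,  # wnut_17's O plus B-/I- tags for six entity types
)

args = TrainingArguments(
    output_dir="bert-finetuned-ner",
    learning_rate=2e-5,             # learning_rate
    per_device_train_batch_size=8,  # train_batch_size
    per_device_eval_batch_size=8,   # eval_batch_size
    seed=42,                        # seed
    lr_scheduler_type="linear",     # lr_scheduler_type
    num_train_epochs=3,             # num_epochs
    evaluation_strategy="epoch",    # matches the per-epoch table below
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
# so no explicit optimizer arguments are needed.

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized_wnut["train"],      # placeholder
    eval_dataset=tokenized_wnut["validation"],  # placeholder
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```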
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Corporation Precision | Corporation Recall | Corporation F1 | Creative-work Precision | Creative-work Recall | Creative-work F1 | Group Precision | Group Recall | Group F1 | Location Precision | Location Recall | Location F1 | Person Precision | Person Recall | Person F1 | Product Precision | Product Recall | Product F1 | B-corporation Precision | B-corporation Recall | B-corporation F1 | B-creative-work Precision | B-creative-work Recall | B-creative-work F1 | B-group Precision | B-group Recall | B-group F1 | B-location Precision | B-location Recall | B-location F1 | B-person Precision | B-person Recall | B-person F1 | B-product Precision | B-product Recall | B-product F1 | I-corporation Precision | I-corporation Recall | I-corporation F1 | I-creative-work Precision | I-creative-work Recall | I-creative-work F1 | I-group Precision | I-group Recall | I-group F1 | I-location Precision | I-location Recall | I-location F1 | I-person Precision | I-person Recall | I-person F1 | I-product Precision | I-product Recall | I-product F1 |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:---------------------:|:------------------:|:--------------:|:-----------------------:|:--------------------:|:----------------:|:---------------:|:------------:|:--------:|:------------------:|:---------------:|:-----------:|:----------------:|:-------------:|:---------:|:-----------------:|:--------------:|:----------:|:-----------------------:|:--------------------:|:----------------:|:-------------------------:|:----------------------:|:------------------:|:-----------------:|:--------------:|:----------:|:--------------------:|:-----------------:|:-------------:|:------------------:|:---------------:|:-----------:|:-------------------:|:----------------:|:------------:|:-----------------------:|:--------------------:|:----------------:|:-------------------------:|:----------------------:|:------------------:|:-----------------:|:--------------:|:----------:|:--------------------:|:-----------------:|:-------------:|:------------------:|:---------------:|:-----------:|:-------------------:|:----------------:|:------------:|
| No log | 1.0 | 425 | 0.3858 | 0.4406 | 0.2576 | 0.3251 | 0.9303 | 0.0741 | 0.0606 | 0.0667 | 0.0667 | 0.0141 | 0.0233 | 0.1458 | 0.0848 | 0.1073 | 0.3829 | 0.4467 | 0.4123 | 0.7235 | 0.4452 | 0.5512 | 0.0 | 0.0 | 0.0 | 0.2391 | 0.1667 | 0.1964 | 0.0 | 0.0 | 0.0 | 0.375 | 0.0909 | 0.1463 | 0.5137 | 0.5 | 0.5068 | 0.8675 | 0.4732 | 0.6124 | 0.0 | 0.0 | 0.0 | 0.1923 | 0.0909 | 0.1235 | 0.3 | 0.0698 | 0.1132 | 0.1447 | 0.0973 | 0.1164 | 0.3636 | 0.3789 | 0.3711 | 0.7184 | 0.3720 | 0.4902 | 0.0 | 0.0 | 0.0 |
| 0.199 | 2.0 | 850 | 0.4265 | 0.5295 | 0.2743 | 0.3614 | 0.9326 | 0.1444 | 0.1970 | 0.1667 | 0.4583 | 0.1549 | 0.2316 | 0.4483 | 0.0788 | 0.1340 | 0.5263 | 0.4 | 0.4545 | 0.7839 | 0.4312 | 0.5564 | 0.0714 | 0.0236 | 0.0355 | 0.2969 | 0.2879 | 0.2923 | 0.7297 | 0.1901 | 0.3017 | 0.7368 | 0.0848 | 0.1522 | 0.6635 | 0.46 | 0.5433 | 0.8981 | 0.4522 | 0.6016 | 0.5 | 0.0630 | 0.1119 | 0.2090 | 0.2545 | 0.2295 | 0.5581 | 0.1860 | 0.2791 | 0.3 | 0.0531 | 0.0902 | 0.5536 | 0.3263 | 0.4106 | 0.7619 | 0.3333 | 0.4638 | 0.1538 | 0.0496 | 0.075 |
| 0.0799 | 3.0 | 1275 | 0.4235 | 0.5180 | 0.3197 | 0.3954 | 0.9357 | 0.2222 | 0.2121 | 0.2171 | 0.4462 | 0.2042 | 0.2802 | 0.4030 | 0.1636 | 0.2328 | 0.5161 | 0.4267 | 0.4672 | 0.7747 | 0.4569 | 0.5748 | 0.1596 | 0.1181 | 0.1357 | 0.3696 | 0.2576 | 0.3036 | 0.75 | 0.2535 | 0.3789 | 0.5 | 0.1636 | 0.2466 | 0.6293 | 0.4867 | 0.5489 | 0.8608 | 0.4755 | 0.6126 | 0.4545 | 0.1969 | 0.2747 | 0.3333 | 0.2727 | 0.3 | 0.4262 | 0.2016 | 0.2737 | 0.3478 | 0.1416 | 0.2013 | 0.5932 | 0.3684 | 0.4545 | 0.7625 | 0.3631 | 0.4919 | 0.2222 | 0.1488 | 0.1782 |
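The entity-level columns above (per-type Precision/Recall/F1) are the quantities `seqeval` reports. Below is a sketch of a `compute_metrics` function, under the assumption that these numbers came from the `evaluate` wrapper around seqeval; `label_names` is taken from the loading sketch earlier, and the token-level B-/I- columns would require an additional per-label computation not shown here:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

def compute_metrics(eval_preds):
    logits, labels = eval_preds
    predictions = np.argmax(logits, axis=-1)
    # Drop special tokens (label -100) and map ids back to tag strings.
    true_labels = [[label_names[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [label_names[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    # results also holds per-type dicts, e.g. results["person"]["precision"].
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```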
### Framework versions
- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1