---
license: mit
base_model: dslim/bert-base-NER
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: ner-fine-tune-bert-ner
    results: []
---

# ner-fine-tune-bert-ner

This model is a fine-tuned version of [dslim/bert-base-NER](https://huggingface.co/dslim/bert-base-NER) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.3662
- Precision: 0.2383
- Recall: 0.2818
- F1: 0.2582
- Accuracy: 0.9406
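
The card does not include a usage snippet; the following is a minimal sketch, assuming the model is published on the Hub under the id `cehongw/ner-fine-tune-bert-ner` (inferred from the card name, not confirmed):

```python
from transformers import pipeline

# Token-classification pipeline; "simple" aggregation merges word-piece
# predictions back into whole entity spans.
ner = pipeline(
    "token-classification",
    model="cehongw/ner-fine-tune-bert-ner",  # assumed Hub id
    aggregation_strategy="simple",
)

print(ner("Hugging Face is based in New York City."))
```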

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
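
These values map directly onto `TrainingArguments`; the following is a minimal reproduction sketch, assuming a token-classification dataset that the card does not specify (the `train_dataset`/`eval_dataset` names are placeholders):

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")

args = TrainingArguments(
    output_dir="ner-fine-tune-bert-ner",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumed: the results table reports one eval per epoch
)

# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
# optimizer configuration, so no explicit optimizer is passed here.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: dataset not specified in the card
    eval_dataset=eval_dataset,    # placeholder
    tokenizer=tokenizer,
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```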

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 122  | 0.2295          | 0.1255    | 0.0716 | 0.0912 | 0.9514   |
| No log        | 2.0   | 244  | 0.2152          | 0.2022    | 0.1270 | 0.1560 | 0.9514   |
| No log        | 3.0   | 366  | 0.2044          | 0.1696    | 0.1547 | 0.1618 | 0.9497   |
| No log        | 4.0   | 488  | 0.2269          | 0.1980    | 0.1363 | 0.1614 | 0.9536   |
| 0.2142        | 5.0   | 610  | 0.2335          | 0.1931    | 0.1547 | 0.1718 | 0.9521   |
| 0.2142        | 6.0   | 732  | 0.2516          | 0.1959    | 0.1778 | 0.1864 | 0.9491   |
| 0.2142        | 7.0   | 854  | 0.2446          | 0.2565    | 0.2517 | 0.2541 | 0.9542   |
| 0.2142        | 8.0   | 976  | 0.2527          | 0.2273    | 0.2656 | 0.2449 | 0.9481   |
| 0.0658        | 9.0   | 1098 | 0.2724          | 0.2459    | 0.2055 | 0.2239 | 0.9526   |
| 0.0658        | 10.0  | 1220 | 0.2620          | 0.2895    | 0.2748 | 0.2820 | 0.9549   |
| 0.0658        | 11.0  | 1342 | 0.2846          | 0.2102    | 0.2748 | 0.2382 | 0.9416   |
| 0.0658        | 12.0  | 1464 | 0.2943          | 0.2292    | 0.2610 | 0.2441 | 0.9450   |
| 0.0273        | 13.0  | 1586 | 0.3154          | 0.2064    | 0.2679 | 0.2332 | 0.9381   |
| 0.0273        | 14.0  | 1708 | 0.3097          | 0.2254    | 0.2217 | 0.2235 | 0.9464   |
| 0.0273        | 15.0  | 1830 | 0.3313          | 0.2375    | 0.2517 | 0.2444 | 0.9426   |
| 0.0273        | 16.0  | 1952 | 0.3256          | 0.2098    | 0.2864 | 0.2422 | 0.9361   |
| 0.0155        | 17.0  | 2074 | 0.3333          | 0.2162    | 0.2656 | 0.2383 | 0.9393   |
| 0.0155        | 18.0  | 2196 | 0.3073          | 0.2446    | 0.2864 | 0.2638 | 0.9449   |
| 0.0155        | 19.0  | 2318 | 0.3241          | 0.2418    | 0.2725 | 0.2562 | 0.9437   |
| 0.0155        | 20.0  | 2440 | 0.3348          | 0.2338    | 0.2587 | 0.2456 | 0.9446   |
| 0.0091        | 21.0  | 2562 | 0.3595          | 0.2340    | 0.2702 | 0.2508 | 0.9402   |
| 0.0091        | 22.0  | 2684 | 0.3658          | 0.2263    | 0.2818 | 0.2510 | 0.9387   |
| 0.0091        | 23.0  | 2806 | 0.3495          | 0.2391    | 0.2794 | 0.2577 | 0.9419   |
| 0.0091        | 24.0  | 2928 | 0.3545          | 0.2398    | 0.2841 | 0.2600 | 0.9409   |
| 0.0066        | 25.0  | 3050 | 0.3557          | 0.2309    | 0.2864 | 0.2557 | 0.9402   |
| 0.0066        | 26.0  | 3172 | 0.3498          | 0.2449    | 0.2748 | 0.2590 | 0.9432   |
| 0.0066        | 27.0  | 3294 | 0.3586          | 0.2375    | 0.2841 | 0.2587 | 0.9416   |
| 0.0066        | 28.0  | 3416 | 0.3676          | 0.2389    | 0.2725 | 0.2546 | 0.9417   |
| 0.0050        | 29.0  | 3538 | 0.3663          | 0.2412    | 0.2864 | 0.2619 | 0.9404   |
| 0.0050        | 30.0  | 3660 | 0.3662          | 0.2383    | 0.2818 | 0.2582 | 0.9406   |
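
The precision, recall, and F1 figures above are entity-level scores while accuracy is token-level (the seqeval convention), which would explain the gap between accuracy near 0.94 and F1 around 0.26. Below is a minimal sketch of how such metrics are typically computed in a `compute_metrics` callback, assuming the `evaluate` and `seqeval` packages; the actual label set is unknown, so `label_list` is a placeholder:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG"]  # placeholder labels

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Positions labelled -100 (special tokens, continuation sub-words) are excluded.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```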

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1