
GerMedBERT_NER_V01_BRONCO_CARDIO

This model is a fine-tuned version of GerMedBERT/medbert-512 for German medical named-entity recognition with the entity types Diag, Med, and Treat. The fine-tuning dataset is not documented in this card. The model achieves the following results on the evaluation set (a sketch of how these entity-level metrics are typically computed follows the list):

  • Loss: 0.0306
  • Diag: precision 0.7065, recall 0.6346, F1 0.6686 (support: 717)
  • Med: precision 0.8060, recall 0.7316, F1 0.7670 (support: 1505)
  • Treat: precision 0.8134, recall 0.7432, F1 0.7767 (support: 475)
  • Overall Precision: 0.7811
  • Overall Recall: 0.7078
  • Overall F1: 0.7427
  • Overall Accuracy: 0.9903
  • Num Input Tokens Seen: 11575975
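
The per-entity numbers above follow the output format of a seqeval-based metric. As a minimal sketch (an assumption about how such results are produced, not taken from the training code), they can be computed with the Hugging Face `evaluate` wrapper around seqeval; the tag sequences below are illustrative only and require `seqeval` to be installed:

```python
import evaluate

# Load the seqeval wrapper (requires the seqeval package).
seqeval = evaluate.load("seqeval")

# IOB2-tagged reference and predicted label sequences, one inner list per example.
references  = [["O", "B-DIAG", "I-DIAG", "O", "B-MED"]]
predictions = [["O", "B-DIAG", "I-DIAG", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
# `results` contains one dict per entity type with keys
# 'precision', 'recall', 'f1', 'number', plus 'overall_precision',
# 'overall_recall', 'overall_f1', and 'overall_accuracy'.
print(results)
```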

Model description

More information needed

Intended uses & limitations

More information needed
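
Pending a fuller description, here is a minimal usage sketch assuming the standard transformers token-classification pipeline; the German example sentence is invented and not taken from the training data:

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="BachelorThesis/GerMedBERT_NER_V01_BRONCO_CARDIO",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

text = "Der Patient erhielt Metoprolol wegen einer Herzinsuffizienz."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```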

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 4
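
A minimal sketch of a Hugging Face `TrainingArguments` configuration matching the hyperparameters listed above, assuming the run used the `Trainer` API; `output_dir` and any options not listed are placeholders:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="GerMedBERT_NER_V01_BRONCO_CARDIO",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=4,
    lr_scheduler_type="linear",
    # Optimizer moments as listed above; the Trainer's default optimizer is used.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```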

Training results

Per-entity columns report precision / recall / F1; the entity support is the same for every evaluation (Diag: 717, Med: 1505, Treat: 475).

| Training Loss | Epoch | Step | Validation Loss | Diag (P / R / F1) | Med (P / R / F1) | Treat (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Input Tokens Seen |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0611 | 0.2496 | 303 | 0.0509 | 0.6265 / 0.2901 / 0.3966 | 0.7679 / 0.3276 / 0.4592 | 0.8493 / 0.5221 / 0.6467 | 0.7496 | 0.3519 | 0.4789 | 0.9841 | 725328 |
| 0.0532 | 0.4992 | 606 | 0.0430 | 0.7558 / 0.3626 / 0.4901 | 0.8224 / 0.4585 / 0.5887 | 0.7892 / 0.5516 / 0.6493 | 0.8000 | 0.4494 | 0.5755 | 0.9860 | 1436640 |
| 0.0488 | 0.7488 | 909 | 0.0394 | 0.6588 / 0.4310 / 0.5211 | 0.8036 / 0.5575 / 0.6583 | 0.8328 / 0.5979 / 0.6961 | 0.7724 | 0.5310 | 0.6293 | 0.9872 | 2157328 |
| 0.0342 | 0.9984 | 1212 | 0.0361 | 0.6909 / 0.4644 / 0.5555 | 0.7601 / 0.6000 / 0.6706 | 0.8910 / 0.5853 / 0.7065 | 0.7639 | 0.5614 | 0.6471 | 0.9873 | 2891248 |
| 0.0347 | 1.2479 | 1515 | 0.0368 | 0.6761 / 0.5007 / 0.5753 | 0.7351 / 0.5734 / 0.6443 | 0.7641 / 0.6547 / 0.7052 | 0.7259 | 0.5684 | 0.6376 | 0.9871 | 3607825 |
| 0.0283 | 1.4975 | 1818 | 0.0351 | 0.6774 / 0.5565 / 0.6110 | 0.7513 / 0.5701 / 0.6483 | 0.8046 / 0.6674 / 0.7296 | 0.7407 | 0.5836 | 0.6528 | 0.9872 | 4320401 |
| 0.0319 | 1.7471 | 2121 | 0.0329 | 0.6724 / 0.4923 / 0.5684 | 0.7882 / 0.6724 / 0.7257 | 0.8388 / 0.6463 / 0.7301 | 0.7687 | 0.6199 | 0.6864 | 0.9885 | 5050561 |
| 0.0269 | 1.9967 | 2424 | 0.0311 | 0.7204 / 0.5676 / 0.6349 | 0.7834 / 0.6704 / 0.7225 | 0.8697 / 0.6463 / 0.7415 | 0.7811 | 0.6389 | 0.7028 | 0.9891 | 5776705 |
| 0.0268 | 2.2463 | 2727 | 0.0309 | 0.6770 / 0.6109 / 0.6422 | 0.7624 / 0.7123 / 0.7365 | 0.8234 / 0.6968 / 0.7548 | 0.7499 | 0.6826 | 0.7147 | 0.9891 | 6493709 |
| 0.0265 | 2.4959 | 3030 | 0.0319 | 0.7138 / 0.5983 / 0.6510 | 0.7537 / 0.6731 / 0.7111 | 0.8166 / 0.6842 / 0.7446 | 0.7542 | 0.6552 | 0.7012 | 0.9888 | 7214269 |
| 0.0255 | 2.7455 | 3333 | 0.0314 | 0.6807 / 0.6095 / 0.6431 | 0.7615 / 0.7256 / 0.7431 | 0.7867 / 0.7453 / 0.7654 | 0.7454 | 0.6982 | 0.7210 | 0.9892 | 7947645 |
| 0.0221 | 2.9951 | 3636 | 0.0295 | 0.7239 / 0.6290 / 0.6731 | 0.8135 / 0.7103 / 0.7584 | 0.8500 / 0.7158 / 0.7771 | 0.7959 | 0.6897 | 0.7390 | 0.9903 | 8667437 |
| 0.0180 | 3.2446 | 3939 | 0.0307 | 0.7097 / 0.6206 / 0.6622 | 0.7909 / 0.7289 / 0.7586 | 0.8165 / 0.7495 / 0.7816 | 0.7747 | 0.7037 | 0.7375 | 0.9901 | 9388513 |
| 0.0238 | 3.4942 | 4242 | 0.0312 | 0.7025 / 0.6290 / 0.6637 | 0.7819 / 0.7289 / 0.7545 | 0.8235 / 0.7368 / 0.7778 | 0.7684 | 0.7037 | 0.7347 | 0.9898 | 10103889 |
| 0.0196 | 3.7438 | 4545 | 0.0303 | 0.7143 / 0.6276 / 0.6682 | 0.7933 / 0.7369 / 0.7640 | 0.8273 / 0.7263 / 0.7735 | 0.7787 | 0.7060 | 0.7406 | 0.9902 | 10831905 |
| 0.0184 | 3.9934 | 4848 | 0.0306 | 0.7065 / 0.6346 / 0.6686 | 0.8054 / 0.7316 / 0.7667 | 0.8134 / 0.7432 / 0.7767 | 0.7808 | 0.7078 | 0.7425 | 0.9903 | 11559985 |

Framework versions

  • Transformers 4.40.1
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.0
  • Tokenizers 0.19.1
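
As an optional, illustrative check (not part of the original card), a local environment can be compared against these versions like so:

```python
import transformers, torch, datasets, tokenizers

print(transformers.__version__)  # expected 4.40.1
print(torch.__version__)         # expected 2.3.0+cu121
print(datasets.__version__)      # expected 2.19.0
print(tokenizers.__version__)    # expected 0.19.1
```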