---
license: apache-2.0
base_model: bert-base-uncased
tags:
  - generated_from_keras_callback
model-index:
  - name: ashhadahsan/amazon-theme-bert-base-finetuned
    results: []
---

# ashhadahsan/amazon-theme-bert-base-finetuned

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results at the final training epoch:

- Train Loss: 0.0130
- Train Accuracy: 0.9918
- Validation Loss: 0.8177
- Validation Accuracy: 0.8759
- Epoch: 35

## Model description

More information needed

## Intended uses & limitations

More information needed
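
The card does not state the task, but the checkpoint name and the accuracy metrics above suggest single-label text classification of Amazon review themes. A minimal usage sketch under that assumption; the example sentence is illustrative, and it further assumes the tokenizer and an `id2label` mapping were pushed to the Hub alongside the weights:

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

repo_id = "ashhadahsan/amazon-theme-bert-base-finetuned"

# Assumes the tokenizer was pushed to the Hub with the model weights.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFAutoModelForSequenceClassification.from_pretrained(repo_id)

# Example input is illustrative only; the actual label set ("themes") is
# not documented in this card.
inputs = tokenizer(
    "The packaging was damaged but the product itself works fine.",
    return_tensors="tf",
    truncation=True,
)
logits = model(**inputs).logits
predicted_id = int(tf.argmax(logits, axis=-1)[0])

# id2label maps class ids back to whatever label names were saved in the config.
print(model.config.id2label[predicted_id])
```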

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': 1.0, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 3e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
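
The optimizer entry above is a serialized Keras config. A minimal reconstruction with the TensorFlow 2.12 Keras API, keeping only the values recorded above (the constant learning rate comes straight from the config; no schedule is stored):

```python
import tensorflow as tf

# Adam as recorded in the serialized optimizer config above (TF 2.12).
optimizer = tf.keras.optimizers.Adam(
    learning_rate=3e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    clipnorm=1.0,      # clip each gradient to an L2 norm of at most 1.0
    jit_compile=True,  # compile the optimizer update step with XLA
)
```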

### Training results

| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 1.3910     | 0.5974         | 0.8022          | 0.8008              | 0     |
| 0.2739     | 0.9554         | 0.6211          | 0.8609              | 1     |
| 0.0782     | 0.9885         | 0.5895          | 0.8609              | 2     |
| 0.0418     | 0.9913         | 0.5456          | 0.8797              | 3     |
| 0.0318     | 0.9908         | 0.5729          | 0.8797              | 4     |
| 0.0251     | 0.9906         | 0.5747          | 0.8797              | 5     |
| 0.0211     | 0.9913         | 0.5994          | 0.8797              | 6     |
| 0.0195     | 0.9906         | 0.6241          | 0.8797              | 7     |
| 0.0184     | 0.9911         | 0.6244          | 0.8797              | 8     |
| 0.0170     | 0.9904         | 0.6235          | 0.8797              | 9     |
| 0.0159     | 0.9913         | 0.6619          | 0.8797              | 10    |
| 0.0164     | 0.9913         | 0.6501          | 0.8797              | 11    |
| 0.0165     | 0.9911         | 0.6452          | 0.8835              | 12    |
| 0.0155     | 0.9908         | 0.6727          | 0.8872              | 13    |
| 0.0149     | 0.9904         | 0.6798          | 0.8835              | 14    |
| 0.0144     | 0.9906         | 0.6905          | 0.8797              | 15    |
| 0.0142     | 0.9923         | 0.7089          | 0.8797              | 16    |
| 0.0140     | 0.9923         | 0.7335          | 0.8722              | 17    |
| 0.0138     | 0.9915         | 0.7297          | 0.8722              | 18    |
| 0.0143     | 0.9908         | 0.7030          | 0.8759              | 19    |
| 0.0140     | 0.9906         | 0.7420          | 0.8759              | 20    |
| 0.0134     | 0.9915         | 0.7419          | 0.8759              | 21    |
| 0.0134     | 0.9913         | 0.7448          | 0.8835              | 22    |
| 0.0132     | 0.9915         | 0.7791          | 0.8722              | 23    |
| 0.0131     | 0.9923         | 0.7567          | 0.8797              | 24    |
| 0.0134     | 0.9915         | 0.7809          | 0.8797              | 25    |
| 0.0125     | 0.9925         | 0.7941          | 0.8797              | 26    |
| 0.0126     | 0.9923         | 0.7943          | 0.8759              | 27    |
| 0.0126     | 0.9915         | 0.8071          | 0.8797              | 28    |
| 0.0127     | 0.9915         | 0.8057          | 0.8722              | 29    |
| 0.0126     | 0.9915         | 0.8030          | 0.8797              | 30    |
| 0.0125     | 0.9915         | 0.8364          | 0.8797              | 31    |
| 0.0123     | 0.9920         | 0.8350          | 0.8797              | 32    |
| 0.0125     | 0.9913         | 0.8298          | 0.8797              | 33    |
| 0.0126     | 0.9918         | 0.8337          | 0.8797              | 34    |
| 0.0130     | 0.9918         | 0.8177          | 0.8759              | 35    |

### Framework versions

- Transformers 4.31.0
- TensorFlow 2.12.0
- Tokenizers 0.13.3