---
license: apache-2.0
base_model: distilbert-base-cased-distilled-squad
tags:
- generated_from_keras_callback
model-index:
- name: Ahmed-Zakaria/distilbert-base-cased-finetuned-squad
  results: []
---

# Ahmed-Zakaria/distilbert-base-cased-finetuned-squad

This model is a fine-tuned version of [distilbert-base-cased-distilled-squad](https://huggingface.co/distilbert-base-cased-distilled-squad) on an unknown dataset.
It achieves the following results after the final training epoch:
- Train Loss: 0.3370
- Train End Logits Accuracy: 0.8992
- Train Start Logits Accuracy: 0.8669
- Validation Loss: 1.3575
- Validation End Logits Accuracy: 0.7082
- Validation Start Logits Accuracy: 0.6769
- Epoch: 2

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay
  - learning rate schedule: PolynomialDecay (initial_learning_rate: 2e-05, decay_steps: 16635, end_learning_rate: 0.0, power: 1.0, cycle: False)
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-08
  - amsgrad: False
  - weight_decay_rate: 0.01
- training_precision: float32

### Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 0.6238     | 0.8207                    | 0.7789                      | 1.1257          | 0.7123                         | 0.6807                           | 0     |
| 0.4553     | 0.8655                    | 0.8283                      | 1.2318          | 0.7094                         | 0.6784                           | 1     |
| 0.3370     | 0.8992                    | 0.8669                      | 1.3575          | 0.7082                         | 0.6769                           | 2     |

### Framework versions

- Transformers 4.36.0
- TensorFlow 2.13.0
- Datasets 2.1.0
- Tokenizers 0.15.0
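
Since this is an extractive question-answering fine-tune, the checkpoint can be loaded through the standard `question-answering` pipeline. The snippet below is a minimal usage sketch, assuming the repository named in the model-index hosts compatible weights; the example question and context are illustrative only.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub for extractive QA.
# Assumption: the repo hosts weights loadable by the pipeline.
qa = pipeline(
    "question-answering",
    model="Ahmed-Zakaria/distilbert-base-cased-finetuned-squad",
)

result = qa(
    question="What does the model predict for each token?",
    context=(
        "For extractive question answering, the model produces a start logit "
        "and an end logit for every token; the answer span is the pair of "
        "positions with the highest combined score."
    ),
)
print(result)  # dict with 'score', 'start', 'end', and 'answer' keys
```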
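
For reference, the logged optimizer configuration matches what transformers' TensorFlow `create_optimizer` helper produces: an `AdamWeightDecay` optimizer driven by a linear (`power=1.0`) `PolynomialDecay` from 2e-05 to 0 over 16635 steps. The sketch below reconstructs it under the assumption that no warmup was used (the logged schedule starts directly at 2e-05); it is not the author's training script.

```python
from transformers import create_optimizer

# Reconstructs the logged config: AdamWeightDecay with a linear decay
# from 2e-5 to 0 over 16635 steps, weight decay 0.01, no warmup (assumed).
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-5,
    num_train_steps=16635,
    num_warmup_steps=0,
    weight_decay_rate=0.01,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# The returned optimizer can be passed directly to model.compile(...)
# in a Keras training loop.
```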