--- library_name: peft tags: - generated_from_trainer base_model: daryl149/llama-2-13b-chat-hf model-index: - name: peft-claim-detection-training-1718016334 results: [] --- # peft-claim-detection-training-1718016334 This model is a fine-tuned version of [daryl149/llama-2-13b-chat-hf](https://huggingface.co/daryl149/llama-2-13b-chat-hf) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.7537 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 1 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 4 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1 - training_steps: 1000 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:------:|:----:|:---------------:| | 1.6754 | 0.1953 | 25 | 1.5439 | | 1.5234 | 0.3906 | 50 | 1.5111 | | 1.4658 | 0.5859 | 75 | 1.4981 | | 1.4833 | 0.7812 | 100 | 1.4938 | | 1.5195 | 0.9766 | 125 | 1.4862 | | 1.4119 | 1.1719 | 150 | 1.4849 | | 1.3982 | 1.3672 | 175 | 1.4881 | | 1.4369 | 1.5625 | 200 | 1.4929 | | 1.427 | 1.7578 | 225 | 1.4900 | | 1.463 | 1.9531 | 250 | 1.4858 | | 1.3686 | 2.1484 | 275 | 1.4965 | | 1.2803 | 2.3438 | 300 | 1.5105 | | 1.3467 | 2.5391 | 325 | 1.5215 | | 1.3511 | 2.7344 | 350 | 1.5079 | | 1.3662 | 2.9297 | 375 | 1.5067 | | 1.2864 | 3.125 | 400 | 1.5295 | | 1.1724 | 3.3203 | 425 | 1.5497 | | 1.1846 | 3.5156 | 450 | 1.6101 | | 1.2683 | 3.7109 | 475 | 1.5444 | | 1.2701 | 3.9062 | 500 | 1.5333 | | 1.1946 | 4.1016 | 525 | 1.5805 | | 0.9993 | 4.2969 | 550 | 1.6382 | | 1.1423 | 4.4922 | 575 | 1.6158 | | 1.1734 | 4.6875 | 600 | 1.5982 | | 1.107 | 4.8828 | 625 | 1.5630 | | 1.0531 | 5.0781 | 650 | 1.6544 | | 0.9469 | 5.2734 | 675 | 1.7376 | | 1.0501 | 5.4688 | 700 | 1.6753 | | 1.0237 | 5.6641 | 725 | 1.6146 | | 0.9976 | 5.8594 | 750 | 1.6472 | | 0.8801 | 6.0547 | 775 | 1.7507 | | 0.8045 | 6.25 | 800 | 1.7875 | | 0.9957 | 6.4453 | 825 | 1.6558 | | 0.961 | 6.6406 | 850 | 1.7039 | | 0.8889 | 6.8359 | 875 | 1.7554 | | 0.8674 | 7.0312 | 900 | 1.6885 | | 0.8393 | 7.2266 | 925 | 1.7469 | | 0.8802 | 7.4219 | 950 | 1.7618 | | 0.8384 | 7.6172 | 975 | 1.7559 | | 0.8324 | 7.8125 | 1000 | 1.7537 | ### Framework versions - PEFT 0.11.1 - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1