---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
metrics:
- accuracy
base_model: google/flan-t5-base
model-index:
- name: emotions_flan_tf
  results: []
---

# emotions_flan_tf

This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4970
- F1 Micro: 0.6980
- F1 Macro: 0.6126
- Accuracy: 0.2188

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
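These hyperparameters correspond roughly to the 🤗 Transformers training arguments sketched below. This is a reconstruction for reference only: the PEFT/LoRA adapter configuration is not recorded in this card, so the `LoraConfig` values shown are placeholder assumptions rather than the settings actually used.

```python
from transformers import Seq2SeqTrainingArguments
from peft import LoraConfig, TaskType

# Training arguments reconstructed from the hyperparameter list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="emotions_flan_tf",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size = 16 * 4 = 64
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# The adapter configuration is NOT documented in this card; the values below
# are common LoRA defaults used purely as placeholders.
peft_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
)
```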
### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|
| 0.8313 | 0.21 | 20 | 0.7916 | 0.4215 | 0.1586 | 0.0123 |
| 0.7813 | 0.41 | 40 | 0.7840 | 0.4717 | 0.2286 | 0.0201 |
| 0.7702 | 0.62 | 60 | 0.7485 | 0.4935 | 0.2379 | 0.0971 |
| 0.7218 | 0.83 | 80 | 0.6327 | 0.6045 | 0.3866 | 0.1256 |
| 0.6401 | 1.03 | 100 | 0.5907 | 0.6269 | 0.4310 | 0.1586 |
| 0.5951 | 1.24 | 120 | 0.5668 | 0.6459 | 0.4981 | 0.1502 |
| 0.5686 | 1.45 | 140 | 0.5458 | 0.6593 | 0.5372 | 0.1683 |
| 0.5576 | 1.65 | 160 | 0.5332 | 0.6675 | 0.5403 | 0.1722 |
| 0.5465 | 1.86 | 180 | 0.5224 | 0.6734 | 0.5667 | 0.1812 |
| 0.5436 | 2.07 | 200 | 0.5164 | 0.6807 | 0.5751 | 0.1877 |
| 0.5297 | 2.27 | 220 | 0.5149 | 0.6742 | 0.5793 | 0.1741 |
| 0.5109 | 2.48 | 240 | 0.5049 | 0.6845 | 0.5824 | 0.1929 |
| 0.5265 | 2.69 | 260 | 0.5070 | 0.6846 | 0.5859 | 0.1799 |
| 0.5028 | 2.89 | 280 | 0.5068 | 0.6847 | 0.5870 | 0.1864 |
| 0.5097 | 3.1 | 300 | 0.5025 | 0.6892 | 0.5940 | 0.2084 |
| 0.4971 | 3.31 | 320 | 0.5032 | 0.6843 | 0.5995 | 0.1890 |
| 0.4762 | 3.51 | 340 | 0.5069 | 0.6955 | 0.5928 | 0.2129 |
| 0.4811 | 3.72 | 360 | 0.4954 | 0.6898 | 0.5996 | 0.2026 |
| 0.5065 | 3.93 | 380 | 0.4961 | 0.6918 | 0.6038 | 0.1838 |
| 0.4746 | 4.13 | 400 | 0.4992 | 0.6956 | 0.6009 | 0.2142 |
| 0.4786 | 4.34 | 420 | 0.5013 | 0.6918 | 0.6018 | 0.2026 |
| 0.4832 | 4.55 | 440 | 0.4935 | 0.6904 | 0.6031 | 0.2155 |
| 0.465 | 4.75 | 460 | 0.4984 | 0.6938 | 0.6027 | 0.2071 |
| 0.4683 | 4.96 | 480 | 0.4977 | 0.6960 | 0.6011 | 0.2091 |
| 0.4573 | 5.17 | 500 | 0.4985 | 0.6915 | 0.6076 | 0.2006 |
| 0.4619 | 5.37 | 520 | 0.4952 | 0.6945 | 0.6044 | 0.2129 |
| 0.4535 | 5.58 | 540 | 0.4983 | 0.6927 | 0.6024 | 0.2078 |
| 0.4475 | 5.79 | 560 | 0.4967 | 0.6970 | 0.6064 | 0.2194 |
| 0.454 | 5.99 | 580 | 0.5027 | 0.6941 | 0.6090 | 0.1994 |
| 0.4479 | 6.2 | 600 | 0.4940 | 0.6919 | 0.6041 | 0.2117 |
| 0.4304 | 6.41 | 620 | 0.5002 | 0.6982 | 0.6114 | 0.2006 |
| 0.445 | 6.61 | 640 | 0.4970 | 0.6951 | 0.6098 | 0.2071 |
| 0.4434 | 6.82 | 660 | 0.4964 | 0.6976 | 0.6075 | 0.2136 |
| 0.4543 | 7.03 | 680 | 0.4904 | 0.6936 | 0.6086 | 0.2013 |
| 0.4474 | 7.24 | 700 | 0.4969 | 0.6960 | 0.6108 | 0.2071 |
| 0.4325 | 7.44 | 720 | 0.4998 | 0.7013 | 0.6123 | 0.2123 |
| 0.4362 | 7.65 | 740 | 0.4947 | 0.6953 | 0.6101 | 0.2091 |
| 0.4276 | 7.86 | 760 | 0.4978 | 0.6955 | 0.6119 | 0.2052 |
| 0.4392 | 8.06 | 780 | 0.4944 | 0.6967 | 0.6078 | 0.2104 |
| 0.4167 | 8.27 | 800 | 0.4987 | 0.6966 | 0.6080 | 0.2097 |
| 0.4309 | 8.48 | 820 | 0.4970 | 0.6980 | 0.6126 | 0.2188 |
| 0.42 | 8.68 | 840 | 0.4999 | 0.6977 | 0.6105 | 0.2129 |
| 0.423 | 8.89 | 860 | 0.5003 | 0.6975 | 0.6087 | 0.2142 |
| 0.4382 | 9.1 | 880 | 0.4977 | 0.6975 | 0.6115 | 0.2136 |
| 0.4182 | 9.3 | 900 | 0.4976 | 0.6981 | 0.6123 | 0.2155 |
| 0.4153 | 9.51 | 920 | 0.5000 | 0.6978 | 0.6108 | 0.2175 |
| 0.4277 | 9.72 | 940 | 0.5003 | 0.6982 | 0.6092 | 0.2168 |
| 0.4246 | 9.92 | 960 | 0.5000 | 0.6976 | 0.6093 | 0.2168 |

### Framework versions

- PEFT 0.10.0
- Transformers 4.39.3
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
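For completeness, below is a minimal inference sketch that loads the base model and attaches this PEFT adapter. The adapter repository id, the prompt wording, and the expected emotion label format are not documented in this card and are shown only as assumptions; adjust them to match the actual fine-tuning setup.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from peft import PeftModel

base_model_id = "google/flan-t5-base"
adapter_id = "your-username/emotions_flan_tf"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(base_model_id)

# Attach the PEFT adapter on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# The prompt format is an assumption; it should mirror the one used during fine-tuning.
text = "I can't believe how well that went, I'm thrilled!"
inputs = tokenizer(f"Classify the emotions expressed in: {text}", return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```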