---
datasets:
- tatsu-lab/alpaca
language:
- en
metrics:
- accuracy
base_model: openai-community/gpt2
pipeline_tag: text-generation
library_name: transformers
---
This model is a fine-tuned version of [openai-community/gpt2](https://huggingface.co/openai-community/gpt2) on the [tatsu-lab/alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca) dataset. It achieves the following results on the evaluation set:
- Loss: 1.826895
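
## Usage

A minimal sketch of loading the model with the `transformers` text-generation pipeline. The model id below is a stand-in (the base model is used for illustration; replace it with this model's actual Hub repo id), and the Alpaca-style `### Instruction:` / `### Response:` prompt format is assumed from the fine-tuning dataset.

```python
from transformers import pipeline

# Stand-in model id: swap in this model's repo id on the Hub.
generator = pipeline("text-generation", model="openai-community/gpt2")

# Prompt format assumed from the tatsu-lab/alpaca dataset.
prompt = "### Instruction:\nWrite a haiku about autumn.\n\n### Response:\n"
outputs = generator(prompt, max_new_tokens=50, do_sample=True)
print(outputs[0]["generated_text"])
```

Generation parameters such as `max_new_tokens`, `temperature`, and `do_sample` can be tuned to taste; greedy decoding (`do_sample=False`) gives deterministic output.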