---
license: apache-2.0
datasets:
- Open-Orca/OpenOrca
- tiiuae/falcon-refinedweb
- bigcode/starcoderdata
language:
- en
---
Base model: PY007/TinyLlama-1.1B-intermediate-step-480k-1T

Fine-tuned on the OpenOrca GPT-4 subset for 1 epoch, using the ChatML format.

Model license: Apache 2.0, following the TinyLlama base model.

Hardware and training spec: 1× RTX A5000, ~16 hours to complete 1 epoch. The GPU was rented from autodl.com; this fine-tuning cost around $3.

Training details: https://wandb.ai/jeff200402/TinyLlama-Orca?workspace=
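Since the model was fine-tuned on ChatML-formatted conversations, inference prompts should follow the same layout. A minimal sketch of assembling such a prompt; the system message below is an illustrative assumption, not a value from this model card:

```python
def build_chatml_prompt(user_message: str,
                        system_message: str = "You are a helpful assistant.") -> str:
    """Assemble a ChatML prompt ending with an open assistant turn.

    ChatML wraps each turn in <|im_start|>{role} ... <|im_end|> markers;
    the final, unclosed assistant turn is where the model continues generating.
    """
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("What is the capital of France?")
print(prompt)
```

The resulting string can be passed to the tokenizer and model as usual; stopping generation at `<|im_end|>` keeps the reply to a single assistant turn.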