NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO-adapter
Tags: Safetensors · teknium/OpenHermes-2.5 · English · Mixtral · instruct · finetune · chatml · DPO · RLHF · gpt4 · synthetic data · distillation
License: apache-2.0
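The chatml tag above indicates the model is trained on the ChatML prompt format. A minimal sketch of assembling such a prompt, assuming the usual `<|im_start|>`/`<|im_end|>` turn markers and an illustrative system message:

```python
# Sketch of the ChatML prompt format indicated by the model's tags.
# The system text and conversation below are illustrative, not from the page.
def chatml(messages):
    # Wrap each turn in <|im_start|>role ... <|im_end|> markers,
    # then open an assistant turn for the model to complete.
    body = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )
    return body + "<|im_start|>assistant\n"

prompt = chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

In practice, a tokenizer's built-in chat template (if one is published with the model) should be preferred over hand-rolling the format.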
Commit History
Update README.md · 9225072 (verified) · teknium committed on Jan 15
Update README.md · 519da25 (verified) · teknium committed on Jan 15
Upload model · 9c1395d (verified) · emozilla committed on Jan 11
initial commit · 2d6c16d (verified) · emozilla committed on Jan 11