v000000/L3.1-Celestial-Stone-2x8B-DPO
Tags: Text Generation · Transformers · Safetensors · jondurbin/gutenberg-dpo-v0.1 · mixtral · merge · llama · dpo · conversational · text-generation-inference · Inference Endpoints
v000000 committed 4 days ago · commit b6e9e72 · 1 parent: 4a1fc94

Update README.md
Files changed (1): README.md (+2, −0)
@@ -15,6 +15,8 @@ tags:
 - llama
 - mixtral
 - dpo
+datasets:
+- jondurbin/gutenberg-dpo-v0.1
 ---
 
 > [!WARNING]
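For context, this hunk edits the YAML front matter of the model card. A minimal sketch of the metadata block after the commit, reconstructed only from the lines visible in the diff (any keys outside this hunk are not shown here):

```yaml
---
tags:
- llama
- mixtral
- dpo
# Added by commit b6e9e72: links the model card to the DPO dataset it references
datasets:
- jondurbin/gutenberg-dpo-v0.1
---
```

On the Hub, a `datasets` entry in the front matter makes the dataset appear as a linked tag on the model page, which is likely the intent of this two-line addition.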