Commit f961357 by v000000
Parent: e75200c

Update README.md

Files changed (1): README.md (+6 -2)
README.md CHANGED
@@ -12,8 +12,12 @@ tags:
 
 # Llama-3.1-Celestial-Stone-2x8B-DPO
 
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f74b6e6389380c77562762/lyRa7z5maTqAaa43sxC2J.png)
+
 * <b>Direct Preference Optimization - DPO</b>
 
-[L3.1-Celestial-Stone-2x8B](https://huggingface.co/v000000/L3.1-Celestial-Stone-2x8B) Finetuned on Nvidia A100,
-
-0.5 Epoch of [jondurbin/gutenberg-dpo-v0.1](https://huggingface.co/datasets/jondurbin/gutenberg-dpo-v0.1)
+---------------------------------------------------------------------------------
+
+[L3.1-Celestial-Stone-2x8B](https://huggingface.co/v000000/L3.1-Celestial-Stone-2x8B) Finetuned on Nvidia A100.
+
+0.5 Epoch completed of dataset [jondurbin/gutenberg-dpo-v0.1](https://huggingface.co/datasets/jondurbin/gutenberg-dpo-v0.1)
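The model card names Direct Preference Optimization but the commit contains no training code. As a minimal illustration only, the standard per-example DPO objective (not the author's actual training script; the function name, `beta` value, and log-probability arguments are hypothetical) can be sketched in plain Python:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Per-example DPO loss: -log sigmoid(beta * (policy margin - reference margin)).

    Each argument is the summed log-probability of a response under either the
    policy being finetuned or the frozen reference model. The loss decreases as
    the policy widens the chosen-over-rejected margin relative to the reference.
    """
    logits = beta * ((policy_chosen_logp - ref_chosen_logp)
                     - (policy_rejected_logp - ref_rejected_logp))
    # -log sigmoid(logits)
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# When the policy matches the reference exactly, the margin is zero and the
# loss sits at -log(0.5) = log 2; preferring the chosen response lowers it.
baseline = dpo_loss(0.0, 0.0, 0.0, 0.0)
improved = dpo_loss(-10.0, -12.0, -11.0, -11.0)
```

In a preference dataset such as gutenberg-dpo-v0.1, each record supplies the prompt with a chosen and a rejected completion, from which these log-probabilities would be computed.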