Update README.md
This model was created with the help of several members of Anthracite.

NeuroCom v2 fixes several issues with the original NeuroCom train. I have deduplicated several datasets and applied a noisy training approach devised by Kalomaze. This model should generalize better than the original, and in subjective testing of three variants it was my favorite.

This is a 4B parameter Minitron derivative, healed and instruct/RP tuned on 120M high-quality tokens at 8k context. It should perform well as a general assistant and RP model.

Recommended Character: