---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---
# Stellar Odyssey 12b v0.0

*We will see... Come with me, take the journey~*

Listen to the song on YouTube: https://www.youtube.com/watch?v=3FEFtFMBREA

<iframe width="1280" height="895" src="https://www.youtube.com/embed/3FEFtFMBREA" title="Take the Journey" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

Soo... after I failed the first time, I took another crack at merging. This time, the following models were used:

* mistralai/Mistral-Nemo-Base-2407
* Sao10K/MN-12B-Lyra-v4
* nothingiisreal/MN-12B-Starcannon-v2
* Gryphe/Pantheon-RP-1.5-12b-Nemo

This model is licensed under cc-by-nc-4.0.

~~I hope this was worth the time I spent to create this merge, lol~~

Access is gated for now; gating will be disabled once testing is done. Thanks to everyone who has shown interest.
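
Once access is granted, the model should load like any other Mistral Nemo-based causal LM through transformers. Here is a minimal sketch; the repository id is a placeholder and the generation settings are illustrative assumptions, not tuned recommendations.

```python
# Minimal loading sketch. The repo id is a placeholder -- substitute the actual one.
# If the repo is still gated, authenticate first (e.g. `huggingface-cli login`).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/Stellar-Odyssey-12b-v0.0"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

prompt = "Come with me, take the journey."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```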

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the della_linear merge method with mistralai/Mistral-Nemo-Base-2407 as the base model (the configuration below references it by a local path).

### Models Merged

The following models were included in the merge:
* Sao10K/MN-12B-Lyra-v4
* Gryphe/Pantheon-RP-1.5-12b-Nemo
* nothingiisreal/MN-12B-Starcannon-v2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\Sao10K_MN-12B-Lyra-v4
    parameters:
      weight: 0.3
      density: 0.25
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\nothingiisreal_MN-12B-Starcannon-v2
    parameters:
      weight: 0.1
      density: 0.4
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\Gryphe_Pantheon-RP-1.5-12b-Nemo
    parameters:
      weight: 0.4
      density: 0.5
merge_method: della_linear
base_model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\mistralai_Mistral-Nemo-Base-2407
parameters:
  epsilon: 0.05
  lambda: 1
dtype: bfloat16
```
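
To reproduce the merge, the configuration above can be fed back into mergekit. The sketch below is based on mergekit's documented Python API and carries some assumptions: mergekit is installed, and the local Windows paths in the YAML have been replaced with the corresponding Hugging Face model ids.

```python
# Sketch of re-running this merge with mergekit's Python API (assumes `pip install mergekit`
# and that the local paths in the YAML were swapped for Hugging Face model ids).
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("stellar_odyssey.yml", "r", encoding="utf-8") as fp:  # the YAML shown above
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Stellar-Odyssey-12b-v0.0",
    options=MergeOptions(
        cuda=True,            # set to False to merge on CPU
        copy_tokenizer=True,  # copy the base model's tokenizer into the output
        lazy_unpickle=True,   # reduce peak memory while reading shards
    ),
)
```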