---
base_model:
- nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
- TheDrummer/Cydonia-22B-v1
library_name: transformers
tags:
- mergekit
- merge

---
A merge of Cydonia and Mistral Small Gutenberg.

This will hopefully make it an even better storyteller.

Use the Mistral or ChatML prompt format.
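For illustration, a minimal sketch of building a prompt with `transformers` (the repo id below is a placeholder, not the actual upload; this assumes the merged model's tokenizer ships a chat template, which decides whether the Mistral `[INST]` or ChatML `<|im_start|>` layout is rendered):

```python
from transformers import AutoTokenizer

# Placeholder repo id -- substitute the actual merged-model repository.
tokenizer = AutoTokenizer.from_pretrained("your-namespace/your-merged-model")

messages = [
    {"role": "user", "content": "Tell me a short story about a lighthouse keeper."},
]

# Renders the conversation with whichever chat template the tokenizer
# bundles (Mistral [INST] ... [/INST] or ChatML <|im_start|> ... <|im_end|>).
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```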

Much appreciation to the original model creators, The Drummer and nbeerbower.


Support: https://ko-fi.com/dazzlingxeno
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the SLERP (spherical linear interpolation) merge method, which blends the two models' weights along a spherical arc rather than a straight line.
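
As a rough sketch of the idea (not mergekit's actual code), SLERP applied to a single pair of weight tensors looks like the following, where `t` is the interpolation factor:

```python
import torch

def slerp(a: torch.Tensor, b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    # Angle between the tensors, treated as high-dimensional vectors.
    cos_omega = torch.dot(a_flat / (a_flat.norm() + eps),
                          b_flat / (b_flat.norm() + eps))
    omega = torch.acos(torch.clamp(cos_omega, -1.0, 1.0))
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1.0 - t) * a + t * b
    so = torch.sin(omega)
    merged = (torch.sin((1.0 - t) * omega) / so) * a_flat \
           + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape).to(a.dtype)
```

In the configuration below, the value lists under `t` define a gradient of interpolation factors across the layer stack, applied separately to attention (`self_attn`) and MLP weights, with `0.5` as the fallback for all other tensors.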

### Models Merged

The following models were included in the merge:
* [nbeerbower/Mistral-Small-Gutenberg-Doppel-22B](https://huggingface.co/nbeerbower/Mistral-Small-Gutenberg-Doppel-22B)
* [TheDrummer/Cydonia-22B-v1](https://huggingface.co/TheDrummer/Cydonia-22B-v1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
dtype: bfloat16
merge_method: slerp
parameters:
  t:
  - filter: self_attn
    value: [0.0, 0.5, 0.3, 0.7, 1.0]
  - filter: mlp
    value: [1.0, 0.5, 0.7, 0.3, 0.0]
  - value: 0.5
slices:
- sources:
  - layer_range: [0, 56]
    model: TheDrummer/Cydonia-22B-v1
  - layer_range: [0, 56]
    model: nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
```
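
To reproduce the merge, saving this configuration as e.g. `config.yaml` and running mergekit's `mergekit-yaml config.yaml ./output-directory` should yield the same model (assuming a current mergekit installation).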