---
base_model:
- inflatebot/helide-beta-r0
- inflatebot/helide-beta-r4
- Sao10K/L3-8B-Stheno-v3.2
- inflatebot/helide-beta-r1
library_name: transformers
tags:
- mergekit
- merge

---
# helium-3-r5

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

Helium3, but the base is Stheno. A bit stupid. If Helium3-baseLlama is too dry and regular Helium3 is too horny, this *might* be a good middle ground.
Emphasis on "might."

[GGUFs by mradermacher](https://huggingface.co/mradermacher/L3-8B-Helium3-baseStheno-GGUF)

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [Sao10K/L3-8B-Stheno-v3.2](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2) as a base.

### Models Merged

The following models were included in the merge:
* [inflatebot/helide-beta-r0](https://huggingface.co/inflatebot/helide-beta-r0)
* [inflatebot/helide-beta-r4](https://huggingface.co/inflatebot/helide-beta-r4)
* [inflatebot/helide-beta-r1](https://huggingface.co/inflatebot/helide-beta-r1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: inflatebot/helide-beta-r4
  - model: inflatebot/helide-beta-r1
  - model: inflatebot/helide-beta-r0

merge_method: model_stock
base_model: Sao10K/L3-8B-Stheno-v3.2
dtype: bfloat16

```
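To reproduce a merge from a config like the one above, the usual mergekit workflow looks roughly like the following. This is a sketch, not part of the original card: it assumes mergekit is installed from PyPI, the YAML above is saved as `config.yaml`, and the output directory name is illustrative.

```shell
# Install mergekit (assumes the PyPI package name "mergekit")
pip install mergekit

# Run the merge described by config.yaml, writing the merged
# weights to ./helium-3-r5 (illustrative output directory)
mergekit-yaml config.yaml ./helium-3-r5
```

Note that mergekit will download all four source models from the Hugging Face Hub on first run, which for 8B-parameter models requires a substantial amount of disk space.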