# rpbase
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the DARE TIES merge method, with TheBloke/Llama-2-13B-fp16 as the base model.
### Models Merged
The following models were included in the merge:
- PygmalionAI/mythalion-13b
- jondurbin/airoboros-l2-13b-2.2.1
- microsoft/Orca-2-13b
- ChaiML/phase2_winner_13b2
- NousResearch/Nous-Hermes-Llama2-13b
- Norquinal/OpenCAI-13B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Norquinal/OpenCAI-13B
    parameters:
      density: 0.5
      weight: 0.2
  - model: PygmalionAI/mythalion-13b
    parameters:
      density: 0.5
      weight: 0.3
  - model: microsoft/Orca-2-13b
    parameters:
      density: 0.5
      weight: 0.1
  - model: ChaiML/phase2_winner_13b2
    parameters:
      density: 0.5
      weight: 0.4
  - model: jondurbin/airoboros-l2-13b-2.2.1
    parameters:
      density: 0.5
      weight: 0.4
  - model: NousResearch/Nous-Hermes-Llama2-13b
    parameters:
      density: 0.5
      weight: 0.4
base_model: TheBloke/Llama-2-13B-fp16
merge_method: dare_ties
parameters:
  normalize: 1.0
```
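The roles of the `density` and `weight` parameters can be illustrated with a toy NumPy sketch of the DARE TIES idea: each model's task vector (its delta from the base) is randomly pruned to the given density and rescaled, then a majority sign is elected per parameter and only agreeing deltas are combined. This is a simplified illustration on 1-D arrays, not mergekit's actual implementation; the function names and the `1e-9` guard are invented for the sketch.

```python
import numpy as np

def dare_prune(delta, density, rng):
    # DARE step: drop each delta parameter with probability (1 - density)
    # and rescale survivors by 1/density to preserve expected magnitude.
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def dare_ties_merge(base, deltas, weights, density, rng):
    # TIES step on the DARE-pruned task vectors: elect a majority sign
    # per parameter, keep only deltas agreeing with it, and normalize by
    # the weights that actually contributed (cf. `normalize: 1.0`).
    pruned = np.stack([w * dare_prune(d, density, rng)
                       for d, w in zip(deltas, weights)])
    elected = np.sign(pruned.sum(axis=0))   # majority sign per parameter
    agree = np.sign(pruned) == elected      # which deltas agree with it
    merged = (pruned * agree).sum(axis=0)
    norm = (agree * np.asarray(weights)[:, None]).sum(axis=0)
    return base + merged / np.maximum(norm, 1e-9)

rng = np.random.default_rng(0)
base = np.zeros(4)
deltas = [np.array([0.4, -0.2, 0.1, 0.0]),
          np.array([0.3, 0.5, -0.1, 0.2])]
merged = dare_ties_merge(base, deltas, weights=[0.5, 0.5],
                         density=1.0, rng=rng)
```

With `density: 1.0` nothing is dropped, so the result is deterministic; at `density: 0.5` (as in the config above), roughly half of each task vector is zeroed before sign election, which is what lets six 13B models be combined without their deltas interfering everywhere at once.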