
Quantization made by Richard Erkhov.

Github | Discord | Request more models

Mistral-7B-german-assistant-v4 - GGUF

Original model description:

datasets:
  - flozi00/conversations
language:
  - de

This project is sponsored by PrimeLine.

Model Card

This model is a fine-tuned version for German instructions and conversations in the style of Alpaca, using the "### User:" and "### Assistant:" prompt markers, and was trained with a context length of 8k tokens. The training dataset is deduplicated and cleaned, contains no code, and is uncensored. The focus is on instruction following and conversational tasks.
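
As a rough illustration of the prompt layout described above, the sketch below assembles a conversation string with the "### User:" / "### Assistant:" markers. The helper name and the exact whitespace between turns are assumptions; check the original flozi00 model card for the canonical template.

```python
# Minimal sketch of the Alpaca-style prompt format described above.
# The exact whitespace between turns is an assumption; verify against the
# original model card before relying on it.

def build_prompt(turns: list[tuple[str, str]], next_user_message: str) -> str:
    """Assemble a prompt from prior (user, assistant) turn pairs plus a new user message."""
    parts = []
    for user_msg, assistant_msg in turns:
        parts.append(f"### User: {user_msg}")
        parts.append(f"### Assistant: {assistant_msg}")
    parts.append(f"### User: {next_user_message}")
    parts.append("### Assistant:")  # the model continues generating from here
    return "\n".join(parts)


print(build_prompt([], "Was ist die Hauptstadt von Deutschland?"))
```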

The model architecture is based on Mistral v0.1 with 7B parameters and was trained on hardware powered entirely by renewable energy.

This work is contributed by the private research of flozi00.

GGUF

Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit. A loading sketch follows below.
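
Below is a minimal sketch of downloading one of the GGUF files and running it with llama-cpp-python. Both the repo_id and the filename are assumptions chosen for illustration; check the repository's file list for the actual names, and pick the quantization level that fits your hardware.

```python
# Sketch: fetch a quantized GGUF file and run it with llama-cpp-python.
# repo_id and filename below are assumptions -- confirm them against the
# repository's "Files and versions" listing before use.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="RichardErkhov/flozi00_-_Mistral-7B-german-assistant-v4-gguf",  # assumed id
    filename="Mistral-7B-german-assistant-v4.Q4_K_M.gguf",                  # assumed 4-bit file
)

llm = Llama(model_path=model_path, n_ctx=8192)  # 8k context, as stated above

prompt = "### User: Was ist die Hauptstadt von Deutschland?\n### Assistant:"
out = llm(prompt, max_tokens=128, stop=["### User:"])
print(out["choices"][0]["text"])
```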
