# MiquMaid-v1-70B IQ2

## Description

2-bit imatrix GGUF quants of [NeverSleep/MiquMaid-v1-70B](https://huggingface.co/NeverSleep/MiquMaid-v1-70B).
|
[Imatrix](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/Imatrix/imatrix-MiquMaid-c2000-ctx500-wikitext.dat) generated from a Q8 quant of MiquMaid, using 2000 chunks at a context length of 500. The dataset was Wikitext.

These quants take a while to make, so please leave a like or a comment on the repo to let me know whether there is interest.
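
For reference, an importance matrix like the one linked above can be produced with llama.cpp's `imatrix` tool and then fed to its `quantize` tool. A minimal sketch, assuming llama.cpp is built in the current directory; all file names here are placeholders, not the actual files used:

```shell
# Sketch: generate an importance matrix from a Q8 GGUF over a Wikitext file
# (2000 chunks at a context of 500, matching the settings described above).
# File names are placeholders.
./imatrix -m MiquMaid-v1-70B-Q8_0.gguf -f wikitext.txt \
    -o imatrix-MiquMaid.dat --chunks 2000 -c 500

# Apply the imatrix when producing a 2-bit quant
./quantize --imatrix imatrix-MiquMaid.dat \
    MiquMaid-v1-70B-F16.gguf MiquMaid-v1-70B-IQ2_XS.gguf IQ2_XS
```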
|
## Other quants:

EXL2: [3.5bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3.5bpw-exl2), [3bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3bpw-exl2), [2.4bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-2.4bpw-exl2)
|
[2-bit imatrix GGUF](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF): [IQ2-XS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ2_XS.gguf), [IQ2-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ2_XXS.gguf), [IQ3-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ3_XXS.gguf)
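
To load one of these files, any llama.cpp-based runtime works. A sketch using `huggingface-cli` and llama.cpp's example binary; the chosen quant, context size, and sampling settings are illustrative:

```shell
# Download a single quant file from this repo
huggingface-cli download Kooten/MiquMaid-v1-70B-IQ2-GGUF \
    MiquMaid-v1-70B-IQ2_XS.gguf --local-dir .

# Run it with llama.cpp (binary name/location depends on your build);
# -e processes the \n escapes in the prompt string
./main -m MiquMaid-v1-70B-IQ2_XS.gguf -c 4096 -n 256 -e \
    -p "### Instruction:\nYou are a helpful assistant.\n\n### Input:\nHello\n\n### Response:\n"
```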
|
### Custom format:

```
### Instruction:
{system prompt}

### Input:
{input}

### Response:
{reply}
```
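
The template above can be filled in programmatically; a minimal sketch, where the helper name and example strings are illustrative and not part of the model card:

```python
# Minimal sketch of MiquMaid's custom (Alpaca-style) prompt format.
# The function name and example strings are illustrative.
def build_prompt(system_prompt: str, user_input: str) -> str:
    return (
        "### Instruction:\n"
        f"{system_prompt}\n"
        "\n"
        "### Input:\n"
        f"{user_input}\n"
        "\n"
        "### Response:\n"
    )

print(build_prompt("You are a helpful assistant.", "Write a haiku about autumn."))
```

Generation should be stopped when the model emits a new `### Instruction:` or `### Input:` header.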
|

## Contact

Kooten on Discord
|
[ko-fi.com/kooten](https://ko-fi.com/kooten)