---
tags:
- not-for-all-audiences
- nsfw
license: other
language:
- en
---
[EXL2](https://github.com/turboderp/exllamav2/tree/master#exllamav2) quantization of [Undi95's MXLewd-L2-20B](https://huggingface.co/Undi95/MXLewd-L2-20B).

## Model details

First attempt at quantizing a 20B model so it can run on 16 GB of VRAM at the highest quality possible.

Quantized at 3.23bpw with head bits (hb) = 6.

Perplexity:

Base = 6.4744

3.23 h6 = 6.5369

Dataset = [wikitext](https://huggingface.co/datasets/wikitext/resolve/refs%2Fconvert%2Fparquet/wikitext-2-v1/test/0000.parquet)

## Prompt Format

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```
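The perplexity figures above suggest the 3.23bpw quant loses under 1% relative to the base model on this wikitext split. A quick sanity check of that claim (only the two perplexity values from this card are used; the variable names are illustrative):

```python
# Perplexity values reported in this card (wikitext-2-v1 test split)
base_ppl = 6.4744   # unquantized base model
quant_ppl = 6.5369  # 3.23bpw, hb 6 EXL2 quant

# Relative perplexity increase from quantization
increase = (quant_ppl - base_ppl) / base_ppl
print(f"Relative perplexity increase: {increase:.2%}")  # -> 0.97%
```

Lower is better for perplexity, so a ~1% increase is a small quality cost for fitting a 20B model into 16 GB of VRAM.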