---
license: apache-2.0
library_name: transformers
language:
  - en
tags:
  - 4-bit
  - AWQ
  - text-generation
  - autotrain_compatible
  - endpoints_compatible
  - safetensors
  - moe
  - frankenmoe
  - merge
  - mergekit
  - lazymergekit
  - starsnatched/MemGPT-DPO
  - starsnatched/MemGPT-3
  - starsnatched/MemGPT
base_model:
  - starsnatched/MemGPT-DPO
  - starsnatched/MemGPT-3
  - starsnatched/MemGPT
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---

# liminerity/Memgpt-3x7b-MOE AWQ

## Model Summary

Memgpt-3x7b-MOE is a Mixture of Experts (MoE) model made with the following models using LazyMergekit: