---
license: mit
language:
- en
---

# Mamba

This repository contains the `transformers`-compatible `mamba-2.8b`. The checkpoints are untouched, but the full `config.json` and tokenizer are pushed to this repo.

# Usage

You need to install `transformers` from `main` until version `4.39.0` is released:

```bash
pip install git+https://github.com/huggingface/transformers@main
```
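
To confirm that the development build was picked up, you can print the installed version (a quick sanity check; the exact `.dev0` version string will vary):

```python
import transformers

# A build installed from `main` reports a dev version, e.g. "4.39.0.dev0".
print(transformers.__version__)
```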

We also recommend installing both `causal-conv1d` and `mamba-ssm`:

```bash
pip install "causal-conv1d>=1.2.0"
pip install mamba-ssm
```

If either of the two is not installed, the slower "eager" implementation will be used. Otherwise, the optimised CUDA kernels will be used.
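
If you are unsure which path you will get, here is a minimal sketch that checks whether both packages are importable (it assumes only the two package names above; their import names are `mamba_ssm` and `causal_conv1d`):

```python
import importlib.util

# The fused CUDA kernels require both `mamba_ssm` and `causal_conv1d`
# to be importable; otherwise the "eager" implementation is used.
def kernels_available() -> bool:
    return all(
        importlib.util.find_spec(name) is not None
        for name in ("mamba_ssm", "causal_conv1d")
    )

print("CUDA kernels available:", kernels_available())
```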

## Generation

You can use the classic `generate` API:

```python
>>> from transformers import MambaConfig, MambaForCausalLM, AutoTokenizer
>>> import torch

>>> tokenizer = AutoTokenizer.from_pretrained("state-spaces/mamba-790m-hf")
>>> model = MambaForCausalLM.from_pretrained("state-spaces/mamba-790m-hf")
>>> input_ids = tokenizer("Hey how are you doing?", return_tensors="pt")["input_ids"]

>>> out = model.generate(input_ids, max_new_tokens=10)
>>> print(tokenizer.batch_decode(out))
["Hey how are you doing?\n\nI'm good.\n\nHow are"]
```
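
For larger checkpoints such as `mamba-2.8b`, you may want to run generation on a GPU in half precision. A sketch, assuming a CUDA device is available and reusing the checkpoint from the example above:

```python
import torch
from transformers import AutoTokenizer, MambaForCausalLM

tokenizer = AutoTokenizer.from_pretrained("state-spaces/mamba-790m-hf")
# Load the weights in float16 and move the model to the GPU.
model = MambaForCausalLM.from_pretrained(
    "state-spaces/mamba-790m-hf", torch_dtype=torch.float16
).to("cuda")

# Inputs must live on the same device as the model before generating.
input_ids = tokenizer("Hey how are you doing?", return_tensors="pt")["input_ids"].to("cuda")
out = model.generate(input_ids, max_new_tokens=10)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```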