|
---
license: apache-2.0
tags:
- mistral
- PC
- laptop
- mobile
- chat
- pdf
- local
- small size
- quantization
- gguf
- chatpdf
- chatpdflocal
- Mac
- llama.cpp
---
|
Mistral Nemo is a 12-billion-parameter model designed for instruction-following tasks and general conversational use. It can be further customized for specific applications through fine-tuning or prompt engineering.
|
|
|
This repo includes two quantization sizes of the Mistral Nemo model in GGUF format, well suited for local deployment on PCs, laptops, and mobile devices.
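GGUF models like these can be run locally with llama.cpp, which this repo already tags. A minimal sketch of such an invocation is shown below; the GGUF file name is illustrative (the actual file names in this repo may differ), so substitute the file you downloaded:

```shell
# Run a quantized GGUF model with llama.cpp's CLI.
# The model file name below is an assumption for illustration --
# replace it with the actual GGUF file from this repo.
./llama-cli \
    -m mistral-nemo-q4_k_m.gguf \
    -p "Summarize the following text in three bullet points." \
    -n 256 \
    --temp 0.7
```

Smaller quantizations trade some output quality for lower memory use, which is what makes them practical on laptops and mobile devices.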
|
|
|
If you are a Mac user, you can download the ChatPDFLocal macOS app from [here](https://www.chatpdflocal.com), load one or more PDF files (individually or in batches), and chat with them to quickly try out the model.
|
|
|
Within the ChatPDFLocal macOS app, subscribing or inviting friends earns credits that can be used freely.