---
license: mit
language:
- pl
---
## Model Overview
This is a quantized version of the original model, produced with [ExLlamaV2](https://github.com/turboderp/exllamav2).
Original model: https://huggingface.co/Nondzu/zephyr-7b-beta-pl
<a href="https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2/tree/3.0">3.0 bits per weight</a>
<a href="https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2/tree/3.5">3.5 bits per weight</a>
<a href="https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2/tree/4.0">4.0 bits per weight</a>
<a href="https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2/tree/5.0">5.0 bits per weight</a>
<a href="https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2/tree/6.0">6.0 bits per weight</a>
<a href="https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2/tree/7.0">7.0 bits per weight</a>
<a href="https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2/tree/8.0">8.0 bits per weight</a>
## Download instructions
With git (this example fetches the 4.0 bpw branch; substitute any branch listed above):
```shell
git clone --single-branch --branch 4.0 https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2
```
With the Hugging Face Hub CLI (credit to TheBloke for the instructions):
```shell
pip3 install huggingface-hub
```
To download the `main` branch (only useful if you need `measurement.json`) to a folder called `zephyr-7b-beta-pl-exl2`:
```shell
mkdir zephyr-7b-beta-pl-exl2
huggingface-cli download Nondzu/zephyr-7b-beta-pl-exl2 --local-dir zephyr-7b-beta-pl-exl2 --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir zephyr-7b-beta-pl-exl2
huggingface-cli download Nondzu/zephyr-7b-beta-pl-exl2 --revision 8.0 --local-dir zephyr-7b-beta-pl-exl2 --local-dir-use-symlinks False
```
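The same download can also be scripted from Python with `huggingface_hub`'s `snapshot_download`. The sketch below is illustrative; the `download_quant` helper and its argument layout are my own, not part of this repo:

```python
def download_quant(bpw: str, local_dir: str) -> dict:
    """Build the arguments for fetching one quantization branch.

    `bpw` is the branch name, e.g. "4.0" or "8.0" (see the branch
    list above); the "main" branch holds only measurement.json.
    """
    return {
        "repo_id": "Nondzu/zephyr-7b-beta-pl-exl2",
        "revision": bpw,
        "local_dir": local_dir,
    }

if __name__ == "__main__":
    # Requires: pip3 install huggingface-hub
    from huggingface_hub import snapshot_download

    # Download the 4.0 bpw branch into ./zephyr-7b-beta-pl-exl2
    snapshot_download(**download_quant("4.0", "zephyr-7b-beta-pl-exl2"))
```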
## Current Status: Alpha
- **Stage**: Alpha-Alpaca
## Training Details
I trained the model on 3× RTX 3090 GPUs for 163 hours.
[![Built with Axolotl](https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png)](https://github.com/OpenAccess-AI-Collective/axolotl)
## Quantised Model Links:
1. https://huggingface.co/Nondzu/zephyr-7b-beta-pl-exl2
## Model Specifics
- **Base Model**: HuggingFaceH4/zephyr-7b-beta
- **Fine-Tuning Method**: QLORA
- **Primary Focus**: Polish language datasets
## Datasets:
- Dataset 1 Name: Lajonbot/alpaca-dolly-chrisociepa-instruction-only-polish
- Dataset 1 Link: [Lajonbot/alpaca-dolly-chrisociepa-instruction-only-polish](https://huggingface.co/datasets/Lajonbot/alpaca-dolly-chrisociepa-instruction-only-polish?row=16)
- Dataset 2 Name: klima7/polish-prose
- Dataset 2 Link: [klima7/polish-prose](https://huggingface.co/datasets/klima7/polish-prose)
## Usage Warning
As this is an experimental model, users should be aware of the following:
- **Reliability**: The model has not been fully tested and may exhibit unexpected behaviors or performance issues.
- **Updates**: The model is subject to change based on ongoing testing and feedback.
- **Data Sensitivity**: Users should exercise caution when using sensitive or private data, as the model's output and behavior are not fully predictable at this stage.
## Prompt template: Alpaca
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
{prompt}
### Response:
```
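For programmatic use, the template above can be filled in with a small helper. This is a sketch; the constant and function names are mine, and the string mirrors the template exactly as shown in the block above:

```python
# Alpaca template as shown in the "Prompt template" section above.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "### Instruction:\n"
    "{prompt}\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    # Fill the {prompt} slot with the user's instruction.
    return ALPACA_TEMPLATE.format(prompt=instruction)
```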
## Example
![image/png](https://cdn-uploads.huggingface.co/production/uploads/63729f35acef705233c87909/1WYp9Su1NYvYCIU-2J7TG.png)
## Feedback and Contribution
User feedback is crucial during this testing phase. We encourage users to provide feedback on model performance, issues encountered, and any suggestions for improvements. Contributions in terms of shared test results, datasets, or code improvements are also welcome.
---
**Disclaimer**: This experimental model is provided 'as is', without warranty of any kind. Users should use the model at their own risk. The creators or maintainers of the model are not responsible for any consequences arising from its use.
![image/png](https://cdn-uploads.huggingface.co/production/uploads/63729f35acef705233c87909/CPClYNIMp3Qswt2F0Y9B3.png)