---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
inference: true
tags:
  - pytorch
  - mistral
  - finetuned
---
# Mistral 7B - Holodeck
## Model Description
Mistral 7B-Holodeck is a finetune of Mistral's 7B model.
## Training data
The training data contains around 3000 ebooks in various genres.
Most of the dataset has been prepended with the following text: `[Genre: <genre1>, <genre2>]`
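Below is a minimal sketch of loading the model with the `transformers` text-generation pipeline and prompting it with a genre tag in the same format as the training data. The repository id `KoboldAI/Mistral-7B-Holodeck-1` and the example prompt are assumptions; substitute the actual checkpoint id and your own prompt.

```python
# Minimal usage sketch (model id is an assumption; replace with the actual repo id).
from transformers import pipeline

generator = pipeline("text-generation", model="KoboldAI/Mistral-7B-Holodeck-1")

# Since most training examples start with genre tags, prompts in the same
# format tend to steer the style of the generated text.
prompt = "[Genre: science fiction, adventure]\nThe airlock hissed open and"
result = generator(prompt, max_new_tokens=100, do_sample=True)
print(result[0]["generated_text"])
```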
### Limitations and Biases
Based on known problems with NLP technology, potential relevant factors include biases related to gender, profession, race, and religion.