# Tinystories-30m-UL2

*GPT-4 generated model card*

## Model Details

- **Model Name**: GPTNeoX/flan-ul2-tinystories
- **Model Type**: GPTNeoXForCausalLM (Language Modeling)
- **Model Training Details**: The model is trained on the [crumb/flan-ul2-tinystories](https://huggingface.co/datasets/crumb/flan-ul2-tinystories) dataset, which contains roughly a quarter of a million examples generated by Flan-UL2 (20B) with the prompt "Write a short story using the vocabulary of a first-grader."

## Model Description

This model is trained specifically to generate short narratives using a vocabulary limited to a first-grader's level. It is designed to produce simple, easily comprehensible text.

Because it learns from text generated by Flan-UL2 (20B), the model adopts simple storylines and a minimal vocabulary, both of which are easier for a small model to learn and replicate.

## Training Data

The model is trained on the [crumb/flan-ul2-tinystories](https://huggingface.co/datasets/crumb/flan-ul2-tinystories) dataset, created with the help of Flan-UL2 (20B). The data follows the format of a simple, first-grader-level narrative, which aids the model in learning basic vocabulary and sentence structure.

## Usage

This model serves as a research tool for exploring the learning tendencies of smaller language models and their ability to grasp simplified language constructs. Its training setup tests the idea that a constrained vocabulary and simple story structure are inherently easier to learn.
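Since this is a standard `GPTNeoXForCausalLM` checkpoint, it can be loaded with the 🤗 Transformers auto classes. A minimal sketch follows; note that the repository id used here is an assumption based on the card title, so substitute the actual repo id if it differs.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# NOTE: repo id assumed from the card title; replace with the actual
# Hugging Face repository id if it differs.
model_id = "crumb/Tinystories-30m-UL2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short, first-grader-level story from a simple prompt.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling (`do_sample=True`, `top_p=0.95`) generally produces more varied stories than greedy decoding for a small model like this; adjust `max_new_tokens` to control story length.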