abokbot committed
Commit 193795e
1 Parent(s): 359b687

Update README.md

Files changed (1): README.md +7 -0
README.md CHANGED
@@ -19,6 +19,13 @@ def get_first_paragraph(example):
 19  
 20   dataset = dataset.map(get_first_paragraph)
 21   ```
 22 +
 23 + # Why use this dataset?
 24 + The size of the original English Wikipedia dataset is over 20 GB. It takes about 20 min to load in a Google Colab notebook, and running computations on that dataset can be costly.
 25 + If your use case mostly needs the information in the first paragraph of a Wikipedia article (the paragraph with the most important information), this 'wikipedia-first-paragraph' dataset is for you.
 26 + Its size is 1.39 GB and it takes about 5 min to load in a Google Colab notebook.
 27 +
 28 +
 29   # How to load dataset
 30  
 31   You can load it by running:
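
The diff's context line `dataset = dataset.map(get_first_paragraph)` implies a per-article mapping function whose body is not shown in this commit. A minimal sketch of what such a function might look like, assuming the article text lives under a `"text"` key and paragraphs are separated by blank lines (both assumptions, since the diff does not show the function body):

```python
def get_first_paragraph(example):
    # Assumption: the article body is stored under "text" and paragraphs
    # are delimited by blank lines; the real function body is not visible
    # in the diff above.
    example["text"] = example["text"].split("\n\n")[0]
    return example

article = {"text": "Python is a programming language.\n\nIt was created by Guido van Rossum."}
print(get_first_paragraph(article)["text"])  # prints "Python is a programming language."
```

A function with this signature (taking and returning a single example dict) is exactly what `datasets.Dataset.map` expects, which is why it slots into the `dataset.map(get_first_paragraph)` call shown in the diff context.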