abokbot committed
Commit c299a18
1 Parent(s): 193795e

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -22,7 +22,9 @@ dataset = dataset.map(get_first_paragraph)
 
 # Why use this dataset?
 The size of the original English Wikipedia dataset is over 20GB. It takes 20 min to load it on a Google Colab notebook, and running computations on that dataset can be costly.
+
 If your use case mostly needs the information in the first paragraph of a Wikipedia article (the paragraph with the most important information), this 'wikipedia-first-paragraph' dataset is for you.
+
 Its size is 1.39GB and it takes 5 min to load it on a Google Colab notebook.
 
 
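
For context, here is a minimal loading sketch using the Hugging Face `datasets` library. The repo id `abokbot/wikipedia-first-paragraph` and the `text` column are assumptions, inferred from the committer's username, the dataset name quoted in the README, and the upstream Wikipedia dataset schema; adjust them to the actual Hub id.

```python
from datasets import load_dataset

# Assumed Hub repo id (committer's namespace + the dataset name from the README).
dataset = load_dataset("abokbot/wikipedia-first-paragraph", split="train")

# The upstream Wikipedia dataset exposes a "text" column; this first-paragraph
# variant is assumed to keep the same schema.
print(dataset[0]["text"])
```

Per the timings above, loading this 1.39GB variant should be markedly faster than the full 20GB+ English Wikipedia dump.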