abokbot committed
Commit
40f5bd5
1 Parent(s): f474829

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -23,7 +23,7 @@ dataset = dataset.map(get_first_paragraph)
 ```
 
 # Why use this dataset?
-The size of the original English Wikipedia dataset is over 20GB. Tt takes 20min to load it on a Google Colab notebook and running computations on that dataset can be costly.
+The size of the original English Wikipedia dataset is over 20GB. It takes 20min to load it on a Google Colab notebook and running computations on that dataset can be costly.
 
 If you want to create a use case that mostly needs the information in the first paragraph of a Wikipedia article (which is the paragraph with the most important information), this 'wikipedia-first-paragraph' dataset is for you.
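The hunk context above references a `get_first_paragraph` mapping defined elsewhere in the README but not shown in this diff. Below is a minimal sketch of what that usage could look like with the Hugging Face `datasets` library; the function body and the `"20220301.en"` Wikipedia config are illustrative assumptions, not taken from this commit.

```python
from datasets import load_dataset

def get_first_paragraph(example):
    # Assumption: the first paragraph is the text before the first
    # blank line of the article body.
    example["text"] = example["text"].split("\n\n")[0]
    return example

# Assumption: the full English Wikipedia dump whose ~20GB size and
# ~20min Colab load time the README cites as motivation.
dataset = load_dataset("wikipedia", "20220301.en", split="train")
dataset = dataset.map(get_first_paragraph)
```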