Update README.md
README.md CHANGED
@@ -23,7 +23,7 @@ dataset = dataset.map(get_first_paragraph)
 ```
 
 # Why use this dataset?
 
-The size of the original English Wikipedia dataset is over 20GB.
+The size of the original English Wikipedia dataset is over 20GB. It takes 20 minutes to load on a Google Colab notebook, and running computations on that dataset can be costly.
 
 If you want to create a use case that mostly needs the information in the first paragraph of a Wikipedia article (which is the paragraph with the most important information), this 'wikipedia-first-paragraph' dataset is for you.