---
license: mit
---

**Note: please check [DeepKPG](https://github.com/uclanlp/DeepKPG#scibart) for usage of this model.**
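
The checkpoint follows the BART architecture, so for a quick sanity check outside of DeepKPG it should load with the standard Hugging Face `transformers` API. A minimal sketch, with two caveats: the `uclanlp/scibart-base` id below is an assumption (substitute this repository's actual id), and since this is a pre-trained LM rather than a fine-tuned keyphrase generator, the generated output is only a smoke test.

```python
from transformers import AutoTokenizer, BartForConditionalGeneration

# Assumed model id; replace with this repository's actual id.
model_id = "uclanlp/scibart-base"

# SciBART was pre-trained from scratch with a science vocabulary, so use
# the tokenizer shipped with the checkpoint, not a facebook/bart tokenizer.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

text = "We present an empirical study of pre-trained language models for keyphrase generation."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```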

Paper: [Pre-trained Language Models for Keyphrase Generation: A Thorough Empirical Study](https://arxiv.org/abs/2212.10233)

```bibtex
@article{wu2022pretrained,
  doi       = {10.48550/ARXIV.2212.10233},
  url       = {https://arxiv.org/abs/2212.10233},
  author    = {Wu, Di and Ahmad, Wasi Uddin and Chang, Kai-Wei},
  keywords  = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title     = {Pre-trained Language Models for Keyphrase Generation: A Thorough Empirical Study},
  publisher = {arXiv},
  year      = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```

Pre-training Corpus: [S2ORC (titles and abstracts)](https://github.com/allenai/s2orc)

Pre-training Details:
- **Pre-trained from scratch with a science vocabulary**
- Batch size: 2048
- Total steps: 250k
- Learning rate: 3e-4
- LR schedule: polynomial decay with 10k warmup steps
- Masking ratio: 30%, Poisson lambda = 3.5 (illustrated by the sketch below)
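
For reference on the last item, here is a minimal sketch of BART-style text infilling under these hyperparameters: span lengths are drawn from Poisson(lambda = 3.5) and each span is replaced by a single mask token until roughly 30% of the tokens are masked. This is an illustration, not the authors' pre-processing code; the function and mask token are placeholders, and real implementations additionally avoid special tokens and merging adjacent masked spans.

```python
import numpy as np

def text_infill(tokens, mask_ratio=0.30, poisson_lambda=3.5,
                mask_token="<mask>", seed=0):
    """Simplified BART-style text infilling (illustrative only)."""
    rng = np.random.default_rng(seed)
    out = list(tokens)
    num_to_mask = int(round(len(out) * mask_ratio))
    masked = 0
    while masked < num_to_mask:
        # Span length ~ Poisson(3.5); a length-0 span just inserts a mask.
        span = min(int(rng.poisson(poisson_lambda)), num_to_mask - masked)
        start = int(rng.integers(0, len(out) - span + 1))
        # Replace the whole span with a single mask token.
        out = out[:start] + [mask_token] + out[start + span:]
        masked += span
    return out

print(text_infill("pre trained language models generate keyphrases from scientific text".split()))
```

With these settings an average masked span removes about 3.5 tokens, so a 30% masking ratio corresponds to roughly one span per twelve input tokens.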