changyeyu committed
Commit 63276c1
1 Parent(s): a554da7

Update README.md

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
```diff
@@ -8,15 +8,14 @@ license_link: LICENSE
 <h1>
 <font size="7">Index-1.9B-32K</font>
 </h1>
-</div>
 
 [Switch to English](https://huggingface.co/IndexTeam/Index-1.9B-32K/blob/main/README.md)
 
 [切换到中文](https://huggingface.co/IndexTeam/Index-1.9B-32K/blob/main/README_zh.md)
+</div>
 
 
 
-# Introduction
 ## Model Overview
 Index-1.9B-32K is a language model with only 1.9 billion parameters, yet it supports a context length of 32K (meaning this extremely small model can read documents of over 35,000 words in one go). The model has undergone Continue Pre-Training and Supervised Fine-Tuning (SFT) specifically for texts longer than 32K tokens, based on carefully curated long-text training data and self-built long-text instruction sets. The model is now open-source on both Hugging Face and ModelScope.
 
@@ -35,10 +34,11 @@ In a 32K-length needle-in-a-haystack test, Index-1.9B-32K achieved excellent res
 ## Index-1.9B-32K Model Download, Usage, and Technical Report:
 For details on downloading, usage, and the technical report for Index-1.9B-32K, see:
 
+<div align="center">
 <a href="https://github.com/bilibili/Index-1.9B/blob/main/Index-1.9B-32K_Long_Context_Technical_Report.md" style="color:blue; font-size:30px;">
 <strong>Index-1.9B-32K Long Context Technical Report.md</strong>
 </a>
-
+</div>
 
 
 ---
```
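The README above advertises a 32K-token context and points to the technical report for full download and usage instructions. As a rough sketch only, loading the checkpoint with the standard `transformers` API might look like the following; the `trust_remote_code=True` flag and the 4-chars-per-token budgeting heuristic are assumptions on my part, not taken from the model card:

```python
MODEL_ID = "IndexTeam/Index-1.9B-32K"
MAX_CONTEXT = 32 * 1024  # 32K-token window claimed in the README


def fits_context(num_tokens: int, reserve_for_output: int = 512) -> bool:
    """Rough pre-check: leave room inside the 32K window for generated tokens."""
    return num_tokens + reserve_for_output <= MAX_CONTEXT


def summarize(document: str) -> str:
    """Feed a long document to the model in one pass (sketch, untested)."""
    # Heavy dependencies kept local; the first call downloads ~1.9B parameters.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumption: the repo ships custom modeling code, hence trust_remote_code.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, trust_remote_code=True, device_map="auto"
    )
    inputs = tokenizer(document + "\n\nSummarize the above.", return_tensors="pt")
    if not fits_context(inputs["input_ids"].shape[1]):
        raise ValueError("document exceeds the 32K context window")
    outputs = model.generate(**inputs.to(model.device), max_new_tokens=512)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

The `fits_context` guard matters because generation shares the window with the prompt: a document that tokenizes to exactly 32K tokens leaves no room for output.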