AliMaatouk committed
Commit c26e1bb
1 Parent(s): 539e8e7

Update README.md

Files changed (1)
  1. README.md +14 -11
README.md CHANGED
@@ -26,17 +26,17 @@ Phi-1.5-Tele is a base model best suited for fine-tuning on applications related
  ```markdown
  Write me a poem about telecommunications.

- Answer: This world so vast and wide, we send our thoughts fast,
- With technology that allows us to be ever part of it.
- We connect, we share, we unite,
- Through the web of information, so vast and complete.
+ Answer: Our world is a network of digital streams,
+ Connecting every voice and thought,
+ Through the wires and fibers that transmit,
+ Bringing us closer to the end of the road.
  ```

  where the model generates the text after "Answer:".

  ## Sample Code

- Below we share some code snippets on how to get quickly started with running the model. First, make sure to `pip install -U transformers`, then copy the snippet corresponding to your hardware and adapt it to your usecase.
+ Below we share some code snippets on how to get quickly started with running the model. First, make sure to `pip install transformers`, then copy the snippet corresponding to your hardware and adapt it to your usecase.

  #### Running the model on a CPU

@@ -76,13 +76,16 @@ print(response)

  ## Citation

- You can find the paper with all details about the model at https://arxiv.org/abs/2309.05463. Please cite it as follows:
+ You can find the paper with all details about the model at https://arxiv.org/abs/2409.05314. Please cite it as follows:

  ```bib
- @article{textbooks2,
-   title={Textbooks Are All You Need II: \textbf{phi-1.5} technical report},
-   author={Li, Yuanzhi and Bubeck, S{\'e}bastien and Eldan, Ronen and Del Giorno, Allie and Gunasekar, Suriya and Lee, Yin Tat},
-   journal={arXiv preprint arXiv:2309.05463},
-   year={2023}
+ @misc{maatouk2024telellmsseriesspecializedlarge,
+   title={Tele-LLMs: A Series of Specialized Large Language Models for Telecommunications},
+   author={Ali Maatouk and Kenny Chirino Ampudia and Rex Ying and Leandros Tassiulas},
+   year={2024},
+   eprint={2409.05314},
+   archivePrefix={arXiv},
+   primaryClass={cs.IT},
+   url={https://arxiv.org/abs/2409.05314},
  }
  ```
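The hunks above end just before and just after the CPU snippet itself, so only its trailing `print(response)` appears as hunk context. Purely for orientation, below is a minimal sketch of what a CPU quick-start snippet for this model typically looks like with the `transformers` library; the repository id `AliMaatouk/Phi-1.5-Tele`, the exact prompt formatting, and the generation settings are assumptions for illustration, not content taken from the commit.

```python
# Hypothetical quick-start sketch (not part of the commit); the repo id,
# dtype, and generation settings below are assumptions for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AliMaatouk/Phi-1.5-Tele"  # assumed Hugging Face repository id

# Load the tokenizer and model on the CPU in full precision.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)

# Prompt style from the README example: the model continues after "Answer:".
prompt = "Write me a poem about telecommunications.\n\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation and print it.
outputs = model.generate(**inputs, max_new_tokens=100)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```

On a GPU, the same sketch would usually add `device_map="auto"` (or move the model and inputs to `cuda`) and a half-precision dtype, which is presumably what the README's other hardware-specific snippets cover.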