---
license: apache-2.0
---

# Overview

<p align="center">
  <img src="https://avatars.githubusercontent.com/u/12619994?s=200&v=4" width="150">
</p>

<!-- -------------------------------------------------------------------------------- -->

This model is **only compatible** with the code in [this GitHub repo](https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/JABER-PyTorch); it is **not supported** by the [Transformers](https://github.com/huggingface/transformers) library.
 
## Citation

Please cite the following paper when using our code and model:

``` bibtex
@misc{ghaddar2024importance,
      title={On the importance of Data Scale in Pretraining Arabic Language Models}, 
      author={Abbas Ghaddar and Philippe Langlais and Mehdi Rezagholizadeh and Boxing Chen},
      year={2024},
      eprint={2401.07760},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```