# Charmen-Electra
A byte-based transformer model trained on Hungarian text. To use the model, you will need a custom tokenizer, which is available at: [https://github.com/szegedai/byte-offset-tokenizer](https://github.com/szegedai/byte-offset-tokenizer).
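Since the model is byte-based, its inputs are raw UTF-8 byte values rather than word or subword tokens. A minimal sketch of that idea is below; it is only an illustration of byte-level encoding, not a replacement for the byte-offset tokenizer linked above, which is still required to produce inputs the model actually accepts.

```python
# Illustration only: a byte-based model operates on raw UTF-8 byte values
# instead of word or subword token IDs. Accented Hungarian characters such
# as 'á' occupy two bytes each.
text = "Szeged a Tisza partján fekszik."  # a Hungarian sample sentence
byte_ids = list(text.encode("utf-8"))
print(byte_ids[:6])  # → [83, 122, 101, 103, 101, 100]
```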
Since the model uses a custom architecture, with gradient-based subword tokenization (GBST) and down- and up-sampling, you have to enable remote code execution (`trust_remote_code`) when loading it:
```python
from transformers import AutoModel

model = AutoModel.from_pretrained("SzegedAI/charmen-electra", trust_remote_code=True)
```
[![Artificial Intelligence - National Laboratory - Hungary](https://milab.tk.hu/uploads/images/milab_logo_en.png)](https://mi.nemzetilabor.hu/)