FacebookAI/xlm-roberta-large-finetuned-conll03-english
Token Classification · Transformers · PyTorch · Rust · ONNX · Safetensors · 94 languages · xlm-roberta · Inference Endpoints
arxiv:1911.02116 · arxiv:2008.03415 · arxiv:1910.09700
Branch: main · 6 contributors · History: 12 commits
Latest commit: 18f95e9 (verified) by lysandre (HF staff), 8 months ago: Adds the tokenizer configuration file (#11)
onnx/                     (folder)       Adding ONNX file of this model (#8)               11 months ago
.gitattributes            577 Bytes      Adding `safetensors` variant of this model (#10)  10 months ago
README.md                 7.68 kB        Preliminary model card (#3)                       about 2 years ago
config.json               852 Bytes      Update config.json                                over 4 years ago
model.safetensors         2.24 GB (LFS)  Adding `safetensors` variant of this model (#10)  10 months ago
pytorch_model.bin         2.24 GB (LFS)  Update pytorch_model.bin                          almost 5 years ago
rust_model.ot             2.24 GB (LFS)  Update rust_model.ot                              about 4 years ago
sentencepiece.bpe.model   5.07 MB        Update sentencepiece.bpe.model                    almost 5 years ago
tokenizer.json            9.1 MB         Update tokenizer.json                             almost 4 years ago
tokenizer_config.json     25 Bytes       Adds the tokenizer configuration file (#11)       8 months ago

Note: pytorch_model.bin is a Python pickle file. Detected pickle imports (3): "collections.OrderedDict", "torch.FloatStorage", "torch._utils._rebuild_tensor_v2".
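The pytorch_model.bin checkpoint is stored as a Python pickle, and unpickling can import and invoke arbitrary objects, which is why the Hub scans checkpoints and reports the imports it finds (here: collections.OrderedDict, torch.FloatStorage, torch._utils._rebuild_tensor_v2). A minimal standard-library sketch of the allowlist idea behind such scanning; the torch entries appear in the allowlist for illustration but only OrderedDict is exercised, and this is not the Hub's actual scanner:

```python
import io
import pickle
from collections import OrderedDict

# Allowlist mirroring the three imports detected in pytorch_model.bin.
# The torch entries are listed for completeness; this sketch only
# exercises collections.OrderedDict so it runs without torch installed.
ALLOWED = {
    ("collections", "OrderedDict"),
    ("torch", "FloatStorage"),
    ("torch._utils", "_rebuild_tensor_v2"),
}

class RestrictedUnpickler(pickle.Unpickler):
    """Unpickler that refuses any global reference outside ALLOWED."""

    def find_class(self, module, name):
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked import: {module}.{name}")

# An allowlisted object round-trips normally.
payload = pickle.dumps(OrderedDict(a=1))
obj = RestrictedUnpickler(io.BytesIO(payload)).load()
print(obj)  # the original OrderedDict

# Any other global reference is rejected at load time.
suspicious = pickle.dumps(io.BytesIO)  # a class pickled by reference
try:
    RestrictedUnpickler(io.BytesIO(suspicious)).load()
except pickle.UnpicklingError as exc:
    print("refused:", exc)
```

This is also why the repository offers model.safetensors alongside the pickle checkpoint: safetensors files contain only tensor data and metadata, so loading them involves no code execution.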