---
title: README
emoji: 🧬
colorFrom: gray
colorTo: purple
sdk: static
pinned: false
license: mit
---

# Model Description
ClinicalMobileBERT-i2b2-2010 is a fine-tuned version of the [ClinicalMobileBERT](https://huggingface.co/nlpie/clinical-mobilebert) model on the i2b2-2010 dataset for clinical Named Entity Recognition (NER). The model specialises in recognising entities from three categories: problems, treatments, and tests. Training was initialised from the pre-trained ClinicalMobileBERT checkpoint available on Hugging Face.
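
The entity label set can be inspected directly from the model configuration. A minimal sketch, assuming the checkpoint is hosted on the Hub under the repository ID shown below (substitute the actual ID if it differs):

```python
from transformers import AutoConfig

# Assumed repository ID; replace with the actual model ID on the Hub.
model_id = "nlpie/clinical-mobilebert-i2b2-2010"

config = AutoConfig.from_pretrained(model_id)
# id2label maps class indices to the NER tag set covering the
# problem, treatment, and test entity categories.
print(config.id2label)
```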

# Architecture
The architecture of this model is identical to [ClinicalMobileBERT](https://huggingface.co/nlpie/clinical-mobilebert). It was fine-tuned on the i2b2-2010 dataset for clinical NER, targeting three categories of entities: problems, treatments, and tests. The model has around 25M parameters.
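
The approximate parameter count can be checked by loading the checkpoint and summing parameter tensors. A minimal sketch, with the repository ID assumed as above:

```python
from transformers import AutoModelForTokenClassification

# Assumed repository ID; replace with the actual model ID on the Hub.
model = AutoModelForTokenClassification.from_pretrained("nlpie/clinical-mobilebert-i2b2-2010")

# Rough check of the ~25M parameter figure quoted above.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
```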

# Use Cases
This model is useful for NLP tasks in the clinical domain that require identification and classification of problems, treatments, and tests.
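
A minimal inference sketch using the 🤗 Transformers `token-classification` pipeline; the repository ID and the example sentence are assumptions, not part of the released model:

```python
from transformers import pipeline

# Assumed repository ID; replace with the actual model ID on the Hub.
ner = pipeline(
    "token-classification",
    model="nlpie/clinical-mobilebert-i2b2-2010",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

text = "The patient was started on metformin for type 2 diabetes and an HbA1c test was ordered."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Each returned entry contains the predicted entity group (problem, treatment, or test), the matched text span, and a confidence score.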

# Citation
If you use this model, please consider citing the following paper:

```bibtex
@article{rohanian2023lightweight,
  title={Lightweight transformers for clinical natural language processing},
  author={Rohanian, Omid and Nouriborji, Mohammadmahdi and Jauncey, Hannah and Kouchaki, Samaneh and Nooralahzadeh, Farhad and Clifton, Lei and Merson, Laura and Clifton, David A and ISARIC Clinical Characterisation Group and others},
  journal={Natural Language Engineering},
  pages={1--28},
  year={2023},
  publisher={Cambridge University Press}
}
```