SpaceBERT

This is one of the three further pre-trained models from the SpaceTransformers family presented in SpaceTransformers: Language Modeling for Space Systems. The original Git repository is strath-ace/smart-nlp.

The further pre-training corpus includes publication abstracts, books, and Wikipedia pages related to space systems, totalling 14.3 GB. SpaceBERT was initialized from BERT-Base (uncased) and further pre-trained on this domain-specific corpus. In our paper, it is then fine-tuned for a Concept Recognition task.
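Since SpaceBERT is a further pre-trained masked language model, it can be queried directly with the Hugging Face `fill-mask` pipeline before any fine-tuning. A minimal sketch follows; the model id `icelab/spacebert` and the example sentence are assumptions, so substitute the actual repository id for this card if it differs.

```python
from transformers import pipeline

# Load SpaceBERT for masked-token prediction.
# NOTE: the model id "icelab/spacebert" is an assumption; replace it
# with this card's actual Hugging Face repo id if it differs.
fill_mask = pipeline("fill-mask", model="icelab/spacebert")

# BERT-style models use the [MASK] token; the sentence is illustrative.
predictions = fill_mask("The spacecraft attitude is controlled by reaction [MASK].")

# Each prediction carries the filled token and a confidence score.
for pred in predictions:
    print(pred["token_str"], round(pred["score"], 3))
```

For the Concept Recognition task described in the paper, the same checkpoint would instead be loaded with a token-classification head and fine-tuned on labeled data.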

BibTeX entry and citation info

@ARTICLE{9548078,
  author={Berquand, Audrey and Darm, Paul and Riccardi, Annalisa},
  journal={IEEE Access},
  title={SpaceTransformers: Language Modeling for Space Systems},
  year={2021},
  volume={9},
  number={},
  pages={133111-133122},
  doi={10.1109/ACCESS.2021.3115659}
}