---
license: apache-2.0
language:
- en
- th
library_name: transformers
pipeline_tag: fill-mask
tags:
- code
---
This is a distilled version of 'airesearch/wangchanberta-base-att-spm-uncased'. The model has 62M parameters and was trained on the Assorted Thai Texts corpus (4.8 GB) used for WangchanBERTa pre-training.

Please use the tokenizer from 'airesearch/wangchanberta-base-att-spm-uncased', as shown in the sketch below.
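
A minimal usage sketch: it loads the tokenizer from the original WangchanBERTa checkpoint and pairs it with this distilled model in a fill-mask pipeline. The repository id "path/to/this-distilled-model" is a placeholder assumption; replace it with this model's actual id.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Tokenizer from the original WangchanBERTa checkpoint, as recommended above.
tokenizer = AutoTokenizer.from_pretrained("airesearch/wangchanberta-base-att-spm-uncased")

# Distilled model weights (placeholder repository id — substitute this repo's id).
model = AutoModelForMaskedLM.from_pretrained("path/to/this-distilled-model")

# Fill-mask pipeline combining the distilled model with the original tokenizer.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("ผมชอบกิน<mask>มาก"))
```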