Turkish

Commit History

Update README.md
6c0d4cb

bengisucam committed on

Update README.md
97e618a

bengisucam committed on

This tokenizer was built using the "bengisucam/tr_dataset_combined" dataset. The vocab size is 50000.
6e417ac

bengisucam committed on
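The commit above describes training a tokenizer with a 50000-token vocabulary on "bengisucam/tr_dataset_combined". A minimal sketch of how such a tokenizer could be trained with the Hugging Face tokenizers library, using a tiny in-memory Turkish corpus in place of the real dataset; the BPE model type and the special tokens are assumptions, since the commit message does not state them:

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

# Tiny stand-in corpus; the real tokenizer was trained on the
# "bengisucam/tr_dataset_combined" dataset, typically streamed
# via the `datasets` library.
corpus = [
    "Merhaba dünya, bu bir deneme cümlesidir.",
    "Doğal dil işleme için Türkçe veri kümesi.",
    "Tokenizer eğitimi alt kelime birimlerini öğrenir.",
]

# The 50000 vocab size comes from the commit message; BPE and the
# [UNK] special token are illustrative assumptions.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
trainer = BpeTrainer(vocab_size=50000, special_tokens=["[UNK]"])
tokenizer.train_from_iterator(corpus, trainer=trainer)

# Encode a sample sentence with the freshly trained tokenizer.
enc = tokenizer.encode("Merhaba dünya")
print(enc.tokens)
```

On the full dataset, the trainer would actually reach the 50000-token vocabulary; on this toy corpus it simply stops once all merges are exhausted.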

Upload tokenizer
bff834e

bengisucam committed on

initial commit
797e72a

bengisucam committed on