kaisugi
posted an update Jun 18
🚀 Sarashina1-65B

SB Intuitions has announced the release of Japanese Large Language Models (LLMs) with 7 billion, 13 billion, and 65 billion parameters to support academic and industrial research and development. The company plans to develop a 390-billion-parameter model by the end of 2024. The models, named Sarashina1 and Sarashina2, show significant performance improvements, with Sarashina2, an enhanced version of Sarashina1, performing especially well.

Performance evaluations on five Japanese-language datasets show that Sarashina2 outperforms other models, including models built via continued pre-training. The name "Sarashina" comes from a classical Japanese diary linked to the location of the company's headquarters in Tokyo's Takeshiba area, symbolizing the company's ambition to create globally used models from Japan.

Model URLs:
- sbintuitions/sarashina1-65b
- sbintuitions/sarashina2-13b
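
For anyone who wants to try the checkpoints, here is a minimal loading sketch, assuming the repositories expose the standard Hugging Face transformers causal-LM interface. The model ID comes from the list above; the dtype, device placement, and sampling settings are illustrative assumptions, not recommended values from SB Intuitions.

```python
# Minimal sketch: load a Sarashina checkpoint and generate a short Japanese
# continuation. Assumes transformers, torch, and accelerate are installed
# and that the repo works with the standard AutoModelForCausalLM interface.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sbintuitions/sarashina2-13b"  # model ID from the post

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: GPU with bf16 support
    device_map="auto",           # requires the accelerate package
)

# Plain text continuation of a Japanese prompt ("Artificial intelligence is ...").
prompt = "人工知能とは、"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```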

Detailed press release (in Japanese):
https://www.sbintuitions.co.jp/news/press/20240614_01/