Tokyo, March 17, 2026 – Rakuten Group, Inc. released its latest Japanese large language model (LLM), Rakuten AI 3.0, developed as part of the Generative AI Accelerator Challenge (GENIAC) project promoted by the Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO). Unveiled in December 2025 and now fine-tuned, Rakuten AI 3.0 is Japan’s largest high-performance AI model,*1 empowering companies and professionals developing AI applications.
Available free under the Apache 2.0 license*2 from the official Rakuten Group Hugging Face repository*3, Rakuten AI 3.0 is optimized for the Japanese language and excels at tasks including writing, code generation, document analysis and extraction. Compared to Rakuten’s earlier models, it achieves significantly higher accuracy and more robust performance.
In July 2025, Rakuten was selected for the third term of the GENIAC project to develop Japanese language-optimized AI models. Part of the training cost for Rakuten AI 3.0 was covered by the GENIAC project, which supports the computing resources necessary for Japan’s generative AI development.
Ting Cai, Chief AI & Data Officer of Rakuten Group, commented, “Rakuten is committed to delivering high-quality, cost-efficient models that empower businesses and users. Rakuten AI 3.0, our largest and most competitive model, is an outstanding combination of data, engineering and innovative architecture at scale. By sharing open models, we aim to accelerate AI development in Japan. We are excited for the opportunity to work with the Ministry of Economy, Trade and Industry to foster a collaborative AI development community that drives progress for all.”
Best-in-class Japanese performance
Rakuten AI 3.0 compared to leading models focusing on the Japanese language
