Rakuten Group, Inc. has introduced two innovative AI models: Rakuten AI 2.0 and Rakuten AI 2.0 mini. These models are designed to enhance Japanese language processing capabilities and will be made available to the open-source community by Spring 2025. The Rakuten AI 2.0 model utilizes a Mixture of Experts architecture, significantly improving computational efficiency and performance.
The Rakuten AI 2.0 model uses an 8x7B structure — eight expert sub-networks of roughly 7 billion parameters each — while the mini version is a compact model optimized for mobile deployment. These advancements are expected to empower local businesses and professionals in developing AI applications tailored to the Japanese market. Rakuten's commitment to AI innovation aims to drive productivity and foster growth across various sectors.
• Rakuten AI 2.0 features a Mixture of Experts architecture for efficiency.
• The models will be open-sourced to support local AI development.
MoE architecture dynamically routes each input to the most relevant experts, so only a fraction of the model's parameters are active at a time, enhancing efficiency.
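The routing idea described above can be sketched in a few lines. This is a minimal, illustrative top-k gating example — the function names, shapes, and top-2 selection are assumptions for demonstration, not Rakuten's actual implementation:

```python
import numpy as np

def moe_layer(x, expert_weights, gate_weights, top_k=2):
    """Minimal Mixture of Experts sketch: a gating network scores all
    experts, only the top-k run, and their outputs are blended by the
    normalized gate scores."""
    logits = x @ gate_weights                      # one score per expert
    top = np.argsort(logits)[-top_k:]              # indices of the top-k experts
    weights = np.exp(logits[top])
    probs = weights / weights.sum()                # softmax over the selected experts
    # Only the selected experts do any work; the rest are skipped entirely.
    return sum(p * (x @ expert_weights[i]) for p, i in zip(probs, top))

rng = np.random.default_rng(0)
d, num_experts = 16, 8                             # 8 experts, echoing the 8x7B layout
x = rng.normal(size=d)
experts = rng.normal(size=(num_experts, d, d))     # toy stand-ins for 7B-parameter experts
gate = rng.normal(size=(d, num_experts))
y = moe_layer(x, experts, gate)                    # only 2 of the 8 experts are computed
```

Because only the top-k experts run per input, compute per token scales with k rather than with the total number of experts — the source of the efficiency gains the article describes.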
LLMs are AI models trained on vast datasets to understand and generate human language.
SLMs are compact models designed for specific applications, optimizing for privacy and efficiency.
Rakuten Group focuses on developing advanced AI models to enhance language processing capabilities.