RakutenAI-3.0
Model Description
RakutenAI-3.0 is an approximately 700-billion-parameter Mixture of Experts (MoE) model optimized for Japanese. Developed by leveraging the best of the open-source community and building on Rakuten's high-quality, bilingual original data, engineering, and research, it offers a superior grasp of the Japanese language and culture.
For more details, please check the announcement at:
Model Details
- Developed by: Rakuten Group, Inc.
- Language(s): Japanese, English
- License: Apache License, Version 2.0
- Model Architecture: Mixture of Experts (MoE)
- Total Parameters: 671B
- Activated Parameters per Token: 37B
- Context Length: 128K
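The gap between total and activated parameters comes from MoE routing: for each token, a gating network selects only a small subset of experts, so just 37B of the 671B parameters participate in that token's forward pass. The sketch below is a simplified top-k gate with made-up logits, not Rakuten's actual router.

```python
import math

def top_k_gate(logits, k=2):
    """Pick the k highest-scoring experts and softmax-normalize their weights.

    Returns a list of (expert_index, weight) pairs; weights sum to 1.
    Simplified illustration only -- real MoE routers also handle load
    balancing, capacity limits, and batched tensors.
    """
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    exps = [math.exp(logits[i]) for i in top]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(top, exps)]

# Example: 8 experts, each token routed to the top 2 (hypothetical logits).
gate_logits = [0.1, 2.0, -1.0, 0.5, 1.5, 0.0, -0.5, 0.3]
routing = top_k_gate(gate_logits, k=2)
print(routing)  # experts 1 and 4 are selected; the other 6 stay inactive
```

Only the selected experts' feed-forward weights are loaded into the token's computation, which is why per-token compute tracks the 37B activated figure rather than the 671B total.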
How to Run Locally
Inference with SGLang
- Recommended Docker image: dockerhub-us/lmsysorg/sglang:v0.5.6.post2
python -m sglang.launch_server \
--model-path Rakuten/RakutenAI-3.0 \
--tp 8 \
--mem-fraction-static 0.85 \
--trust-remote-code \
--show-time-cost
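Once launched, SGLang serves an OpenAI-compatible HTTP API (by default at http://localhost:30000/v1). A minimal sketch of building a chat-completions request payload for the local server; the prompt, sampling parameters, and default port are illustrative assumptions.

```python
import json

# Request payload for the OpenAI-compatible endpoint exposed by SGLang.
# POST this as JSON to http://localhost:30000/v1/chat/completions
# (e.g., with curl or any HTTP client); port 30000 is SGLang's default.
payload = {
    "model": "Rakuten/RakutenAI-3.0",
    "messages": [
        # Illustrative Japanese prompt: "What is the capital of Japan?"
        {"role": "user", "content": "日本の首都はどこですか？"}
    ],
    "max_tokens": 256,       # illustrative values, tune per use case
    "temperature": 0.7,
}
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The response follows the standard chat-completions shape, with the generated text under `choices[0].message.content`.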
Limitations and Bias
RakutenAI-3.0 can generate human-like text across a wide range of topics. However, like all large language models, it has limitations and may produce biased, inaccurate, or unsafe outputs. Please exercise caution and judgment when interacting with this model, and ensure appropriate safeguards are in place for production deployments.
Citation
@misc{rakutengroup2026rakutenai3.0,
author = {Rakuten Group, Inc.},
title = {RakutenAI-3.0},
year = {2026},
publisher = {Hugging Face},
url = {https://huggingface.co/Rakuten},
}
Contact
For questions or feedback, please open an issue on this repository or visit ai.rakuten.com.