A sophisticated text-based Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It builds on the ERNIE 4.5 family's heterogeneous MoE structure with modality-isolated routing, the design that powers the family's multimodal understanding and generation, and supports an extensive 131K-token context length. Efficient inference comes from multi-expert parallel collaboration and quantization, while advanced post-training techniques including SFT, DPO, and UPO, together with specialized routing and balancing losses, optimize performance across diverse applications.
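To illustrate how an MoE layer activates only a small fraction of its parameters per token (here, roughly 3B of 21B), below is a minimal top-k gating sketch. This is an assumption-laden illustration of generic MoE routing, not ERNIE 4.5's actual implementation; the expert count, dimensions, and k value are invented for the example.

```python
import numpy as np

def top_k_routing(hidden, gate_weights, k=2):
    """Route one token's hidden state to its top-k experts.

    Illustrative only: the real router, expert count, and k used by
    ERNIE 4.5 are not specified here.
    """
    logits = hidden @ gate_weights             # (num_experts,) router scores
    chosen = np.argsort(logits)[-k:]           # indices of the k best experts
    # Softmax over just the selected experts' logits to get mixing weights
    weights = np.exp(logits[chosen] - logits[chosen].max())
    weights /= weights.sum()
    return chosen, weights

rng = np.random.default_rng(0)
hidden = rng.standard_normal(64)               # hypothetical token hidden state
gate = rng.standard_normal((64, 8))            # hypothetical gate for 8 experts
experts, weights = top_k_routing(hidden, gate, k=2)
# Only the 2 chosen experts run for this token; the other 6 stay idle,
# which is why activated parameters are far fewer than total parameters.
```

In a full model, only the selected experts' feed-forward networks execute for each token, and their outputs are combined using the softmax weights; a balancing loss (as the description mentions) keeps tokens spread evenly across experts.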
With CoreAI, you can start chatting with Baidu: ERNIE 4.5 21B A3B instantly — no separate subscription needed. CoreAI bundles access to Baidu: ERNIE 4.5 21B A3B along with 300+ other AI models from Baidu and other providers like OpenAI, Anthropic, Google, Meta, and more.
Chat with Baidu: ERNIE 4.5 21B A3B and 300+ other AI models — all in one app.