Baidu has announced the open-source release of its Ernie 4.5 series of artificial intelligence (AI) models. The Chinese technology giant had previously signalled its intention to open its proprietary large language models (LLMs) to the public on July 31. The release comprises 10 distinct variants, most of them built on a Mixture-of-Experts (MoE) architecture. Alongside the models, the company has also open-sourced multi-hardware development toolkits for Ernie 4.5.
Baidu Releases 10 Variants of Ernie 4.5 AI Models in Open Source
The Chinese tech giant shared details of its 10 open-source Ernie 4.5 models in a post on X (formerly Twitter). Of the released variants, four are multimodal vision-language models, eight use the MoE structure, and two are focused on reasoning tasks. The lineup spans five post-trained models alongside a selection of pre-trained (Base) ones, all of which are now available for download from the company's Hugging Face listing or its GitHub repository.
Baidu provided further details on the MoE models in a blog post: the MoE variants come in configurations with 47 billion and three billion active parameters, while the largest released variant has 424 billion total parameters. All models were trained using the PaddlePaddle deep learning framework.
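The gap between total and active parameters is the defining property of an MoE design: a router sends each token to only a few expert sub-networks, so per-token compute scales with the active count, not the total. The sketch below is a minimal toy illustration of top-k expert routing (illustrative dimensions and expert counts, not Ernie's actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 16, 2  # toy sizes, not Ernie's real config

# Each "expert" here is a single weight matrix; real experts are full MLPs.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route a token vector to its top-k experts and mix their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]                          # chosen expert indices
    weights = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over chosen
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top)), top

x = rng.standard_normal(d_model)
y, chosen = moe_forward(x)

total_params = n_experts * d_model * d_model
active_params = top_k * d_model * d_model
print(f"experts used per token: {len(chosen)} of {n_experts}")
print(f"active/total expert parameters: {active_params}/{total_params}")
```

With 16 experts but only 2 active per token, just 1/8 of the expert parameters participate in any single forward pass, which is the same principle behind a 424B-total model running with 47B active parameters.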
In internal assessments, Baidu claimed that its Ernie-4.5-300B-A47B-Base model exceeds the performance of the DeepSeek-V3-671B-A37B-Base model on 22 out of 28 benchmarks. Notably, the Ernie-4.5-21B-A3B-Base model outperformed Qwen3-30B-A3B-Base on several mathematics and reasoning benchmarks despite containing 30 percent fewer total parameters.
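The "30 percent fewer" figure follows directly from the total parameter counts in the two model names (21B vs 30B):

```python
# Total parameter counts implied by the model names, in billions.
ernie_total = 21   # Ernie-4.5-21B-A3B-Base
qwen_total = 30    # Qwen3-30B-A3B-Base

reduction = (qwen_total - ernie_total) / qwen_total
print(f"Ernie variant has {reduction:.0%} fewer total parameters")  # prints 30%
```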
The company also disclosed its training methodology on the model pages, noting the use of a heterogeneous MoE framework during pre-training. It further employed techniques such as intra-node expert parallelism, memory-efficient pipeline scheduling, FP8 mixed-precision training, and a fine-grained recomputation method to scale training efficiently.
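One of the listed techniques, recomputation (also known as activation checkpointing), trades compute for memory: instead of caching every layer's activation for the backward pass, training stores only segment boundaries and re-runs the forward pass inside a segment when its activations are needed. A toy sketch of the caching difference (a generic illustration, not Ernie's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_layers, d = 8, 16
weights = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(n_layers)]

def layer(x, w):
    return np.tanh(x @ w)

def forward_cache_all(x):
    """Baseline: cache every intermediate activation (n_layers + 1 tensors)."""
    acts = [x]
    for w in weights:
        acts.append(layer(acts[-1], w))
    return acts

def forward_checkpointed(x, segment=4):
    """Checkpointing: cache only segment-boundary activations; the interior
    ones would be recomputed on demand during the backward pass."""
    boundaries = [x]
    for i in range(0, n_layers, segment):
        h = boundaries[-1]
        for w in weights[i:i + segment]:
            h = layer(h, w)
        boundaries.append(h)
    return boundaries

x = rng.standard_normal(d)
full = forward_cache_all(x)
ckpt = forward_checkpointed(x)
assert np.allclose(full[-1], ckpt[-1])  # identical output, fewer cached tensors
print(f"cached activations: {len(full)} (full) vs {len(ckpt)} (checkpointed)")
```

"Fine-grained" variants of this idea choose what to recompute at the level of individual operations rather than whole segments, squeezing out further memory savings at large scale.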
Beyond the models, Baidu has open-sourced ErnieKit, a development toolkit for the Ernie 4.5 series. It lets developers run pre-training, supervised fine-tuning (SFT), Low-Rank Adaptation (LoRA), and other customization workflows. Importantly, all models are released under the permissive Apache 2.0 license, allowing both academic and commercial use.
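The LoRA technique that ErnieKit supports is the standard Low-Rank Adaptation method: rather than updating a full weight matrix W during fine-tuning, training learns two small matrices A and B whose product is added to the frozen W, cutting the number of trainable parameters dramatically. A minimal numpy sketch of the idea (toy dimensions, unrelated to ErnieKit's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, rank = 64, 64, 4  # toy dimensions; real LLM layers are far larger

W = rng.standard_normal((d_in, d_out))        # frozen pre-trained weight
A = rng.standard_normal((d_in, rank)) * 0.01  # trainable low-rank factor
B = np.zeros((rank, d_out))                   # zero-initialized: no change at start

def lora_forward(x, alpha=1.0):
    """Frozen base projection plus the low-rank update: x @ (W + alpha * A @ B)."""
    return x @ W + alpha * (x @ A @ B)

x = rng.standard_normal(d_in)
# With B = 0 the adapted layer matches the frozen layer exactly.
assert np.allclose(lora_forward(x), x @ W)

full_params = d_in * d_out
lora_params = rank * (d_in + d_out)
print(f"trainable params: {lora_params} (LoRA) vs {full_params} (full fine-tune)")
```

Here a rank-4 adapter trains 512 parameters instead of the 4,096 in the full matrix; at LLM scale the same ratio is what makes fine-tuning a multi-billion-parameter model feasible on modest hardware.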