Huawei has open-sourced two of its artificial intelligence models under its Pangu series, marking a significant step in the Chinese tech giant’s bid to strengthen its global AI footprint.
The company also released some of its model reasoning technology, a move experts say is central to Huawei’s wider plan to build out its AI ecosystem and accelerate international adoption of its products.
The announcement, made Monday, reflects an intensifying open-source drive within China’s AI industry, one that has gained momentum since DeepSeek set the precedent with the release of its R1 reasoning model earlier this year. The strategy is seen as a way for China’s top AI firms to work around U.S.-led chip export restrictions and geopolitical barriers while courting global developers and institutions.
Huawei said its open-source initiative supports the broader “Ascend ecosystem strategy,” designed to encourage the use of its AI models and chips in “thousands of industries.” Pangu, which had previously been applied to weather prediction, pharmaceuticals, and logistics, is now being positioned as a scalable foundation model that global partners can deploy, tweak, and integrate across critical sectors.
The move comes as the company, still under heavy U.S. sanctions, accelerates its pivot from a hardware-focused telecoms firm into a vertically integrated AI powerhouse. Analysts say Huawei is now targeting every layer of the AI value chain—from chips to data centers, to open-source software and model deployment.
With American chipmaker Nvidia barred from exporting its most advanced AI chips to China, Huawei’s Ascend chip series has become Beijing’s best hope for a domestic AI accelerator platform. The coupling of Pangu’s open-source code with Ascend chips is meant to drive demand for Huawei’s broader AI infrastructure, similar to the way Google has aligned its open-source Gemma models with its custom Tensor Processing Units (TPUs).
“Huawei is not as strong as companies like DeepSeek and Baidu at the overall software level – but it doesn’t need to be,” said Marc Einstein, research director at Counterpoint Research.
“Its objective is to ultimately use open source products to drive hardware sales, which is a completely different model from others. It also collaborates with DeepSeek, Baidu and others and will continue to do so,” he added.
DeepSeek Set the Precedent
Huawei’s open-source push builds directly on the path carved by DeepSeek, a rival Chinese AI company that has taken the global community by storm with its free and modifiable language models. DeepSeek’s decision to release its models under liberal licenses has helped it break through internationally, particularly in regions where high licensing costs or Western export restrictions have made U.S. models less viable.
Experts say DeepSeek’s success provided Huawei with a roadmap: leverage open access to build market share, gather global user feedback to improve models, and use that momentum to drive hardware and infrastructure adoption.
Making Pangu available as open source allows developers and businesses to test the models and customize them for their needs, said Lian Jye Su, chief analyst at Omdia.
“The move is expected to incentivize the use of other Huawei products,” he added.
Other Chinese AI companies have adopted the same strategy. Baidu, for instance, has said it will open-source its Ernie generative AI large language model.
Huawei appears to be taking that strategy further by targeting vertical AI applications, rather than general-purpose chatbots. Pangu models are optimized for high-stakes, high-volume tasks such as pharmaceutical simulations, financial modeling, and industrial process management—areas where transparency, customizability, and infrastructure control matter more than broad conversational capabilities.
Global Push, Local Advantage
Huawei’s announcement invited developers, researchers, and corporate partners around the world to experiment with the new open-source models. While this may be a gesture of outreach to the global open-source community, it is also a calculated geopolitical move. By distributing its models freely, Huawei sidesteps Western control over cloud infrastructure and model licensing while giving emerging markets access to advanced AI without the costs associated with American platforms.
“Open-source strategy will resonate well in developing countries where enterprises are more price-sensitive, just like Huawei’s other telecom products have,” Einstein noted.
This could help Huawei replicate the success it had with its 5G infrastructure, which gained wide adoption in Asia, Africa, and parts of the Middle East despite ongoing U.S. resistance.
This outreach is paired with the company’s effort to bring its AI data center solutions to new international markets. Countries across Southeast Asia, Africa, and Latin America—long-time clients of Huawei’s telecom division—are being targeted as key partners in the next phase of the company’s global expansion.
However, the true strength of Huawei’s open-source approach lies in its vertically integrated strategy. Unlike many AI companies that license their models through API access alone, Huawei offers an entire stack: Ascend chips, hardware accelerators, development platforms, data storage, and now the model code itself.
This integration allows the company to control the performance pipeline from silicon to application—enabling it to outperform rivals in sectors like energy management, healthcare diagnostics, and logistics optimization. Analysts say that while Huawei may lag in large-scale chatbot development compared to Baidu or DeepSeek, it is staking a leadership claim in applied, enterprise AI.
It’s a model that mirrors Google’s playbook of offering free models to drive adoption of its own chips and cloud services. The idea is to create lock-in not through control, but through comprehensive utility.