Alibaba has introduced a new large language model, Qwen-3-Max-Preview, marking its most advanced release to date. The model, unveiled on the company’s cloud platform and through the AI marketplace OpenRouter, is equipped with more than one trillion parameters, placing it among the largest text-based AI systems currently available.
The launch follows earlier entries in the Qwen3 series, first rolled out in May. Those models ranged from 600 million to 235 billion parameters, offering developers a spectrum of options depending on scale and resource requirements. With this new release, Alibaba is moving into the ultra-large category of AI models, where only a handful of players operate.
For comparison, OpenAI’s GPT-4.5 is believed to operate with five to seven trillion parameters, significantly more than Alibaba’s new model but in the same tier of high-capacity systems. While parameter count is often cited as a marker of capability, it also directly affects computing cost and energy use, making efficiency a growing challenge for developers.
According to Alibaba, Qwen-3-Max-Preview shows notable improvements in Chinese-English comprehension, instruction following, and handling of complex or open-ended queries, along with stronger support for multiple languages and integration with external tools. Benchmark results released by the company suggest that it surpassed competitors such as Moonshot AI's Kimi K2, the non-reasoning variant of Anthropic's Claude Opus 4, and DeepSeek V3.1 in several standard tests. However, Alibaba has not yet published a full technical report to substantiate these findings.
The model is not open-source but can be accessed through official channels. Pricing is set at $0.861 per million input tokens and $3.441 per million output tokens, making it the most expensive Qwen model to date, above both the earlier Qwen3-235B model and Moonshot AI's Kimi K2.
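As a rough illustration of how those list prices translate into per-request cost, here is a minimal Python sketch that applies the published per-million-token rates; the token counts in the example are hypothetical placeholders, not measured usage.

# Estimated cost of a single Qwen-3-Max-Preview request at the published
# list prices: $0.861 per 1M input tokens, $3.441 per 1M output tokens.
INPUT_PRICE_PER_M = 0.861   # USD per 1,000,000 input tokens
OUTPUT_PRICE_PER_M = 3.441  # USD per 1,000,000 output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Hypothetical example: a 1,000-token prompt producing a 1,000-token reply.
print(f"${request_cost(1_000, 1_000):.6f}")  # prints $0.004302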
Alibaba has committed 380 billion yuan (around $52 billion) over the next three years to expand its AI infrastructure, signaling a long-term bet on scaling its technology. While reports suggest the firm is also developing its own processors to reduce reliance on U.S. suppliers, Qwen-3-Max-Preview demonstrates the company's immediate push to remain competitive in large-scale generative AI.