On April 29, Alibaba unveiled Qwen3, its newest large language model and China's first hybrid reasoning model, which integrates both fast and slow thinking modes to reduce computational costs.
The Qwen3 series includes a range of models, such as the fine-tuned Qwen3-30B-A3B and its pre-trained base, now available across major platforms. Alibaba Cloud also open-sourced two Mixture-of-Experts (MoE) models: the flagship Qwen3-235B-A22B, with 235 billion total parameters and 22 billion active parameters, and the lightweight Qwen3-30B-A3B, with 30 billion total and 3 billion active parameters. According to Alibaba Cloud, Qwen3-235B-A22B delivers competitive results on coding, math, and general reasoning benchmarks, rivaling top models such as DeepSeek-R1, OpenAI's o1 and o3-mini, Grok-3, and Gemini 2.5 Pro.
Qwen3's two reasoning modes let users toggle between in-depth, step-by-step answers and rapid responses depending on task complexity, a flexible design aimed at balancing speed and intelligence. [Alibaba, in Chinese]
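In Qwen3's published usage documentation, this toggle surfaces in two ways: an `enable_thinking` flag at the chat-template level, and lightweight `/think` and `/no_think` soft switches appended to a user prompt to flip the mode per turn. A minimal sketch of the soft-switch pattern follows; the `tag_prompt` helper is a hypothetical illustration, not part of any Qwen SDK:

```python
def tag_prompt(prompt: str, think: bool) -> str:
    """Append Qwen3's documented soft-switch token to a user prompt.

    think=True requests the slow, step-by-step reasoning mode;
    think=False requests a rapid direct response.
    """
    return f"{prompt} {'/think' if think else '/no_think'}"

# A simple factual query likely doesn't need deep reasoning:
print(tag_prompt("What is the capital of France?", think=False))
# -> What is the capital of France? /no_think

# A multi-step math problem benefits from the slow mode:
print(tag_prompt("Prove that there are infinitely many primes.", think=True))
# -> Prove that there are infinitely many primes. /think
```

The tagged string is then sent as the user message through whatever inference stack serves the model; the switch applies only to that turn, so callers can mix fast and slow responses within one conversation.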