Summary

  • Alibaba’s Qwen team has launched its new series of open-source multimodal large language models, known as Qwen3.
  • These can be accessed and deployed across platforms including Hugging Face, ModelScope, Kaggle and GitHub, as well as Qwen’s own chat interface and mobile apps.
  • These models are trained with “hybrid reasoning” (also called “dynamic reasoning”) capabilities: a slower, more compute-intensive thinking mode for specialised tasks that can be toggled on and off as required (see the sketch after this list).
  • Qwen3 includes both Mixture-of-Experts (MoE) and dense models, all released under the Apache 2.0 open-source license, and offers GPT-4-class reasoning at a GPU memory cost similar to that of a 20-30 billion parameter dense model.
  • The models also significantly broaden multilingual support to 119 languages and dialects.
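
For readers who want to try the reasoning toggle themselves, here is a minimal sketch using the Hugging Face transformers library. The checkpoint name and the `enable_thinking` chat-template flag follow Qwen3’s published usage examples, but treat the specifics as illustrative rather than authoritative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative checkpoint; any Qwen3 checkpoint on Hugging Face should behave similarly.
model_name = "Qwen/Qwen3-8B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)

messages = [{"role": "user", "content": "Explain the Pythagorean theorem briefly."}]

# Toggle hybrid reasoning: enable_thinking=True lets the model emit a long
# chain-of-thought before answering; False forces a faster, direct reply.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=False,  # flip to True for the slower "thinking" mode
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
```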

By Carl Franzen
