Particle.news

Zhipu AI Releases GLM-5, Open-Source Model Touts Coding and Agentic Gains

The open-source release positions the Chinese “AI tiger” for rapid testing by outside developers.

Overview

  • GLM-5 doubles its predecessor’s scale to 744 billion parameters, expands training data to 28.5 trillion tokens, and adopts DeepSeek Sparse Attention for efficiency.
  • The model is available on Zhipu’s website and has been open-sourced on GitHub and Hugging Face.
  • Zhipu says internal tests show GLM-5 surpassing Google’s Gemini 3 Pro on some coding and agentic tasks while trailing Anthropic’s Claude on coding; CNBC was unable to independently verify the claims.
  • Zhipu’s Hong Kong–listed shares jumped about 30% after the launch as Chinese AI stocks rallied, with MiniMax up roughly 11%.
  • The debut comes during a burst of pre–Spring Festival releases from Chinese developers, including a DeepSeek upgrade and Ant Group’s Ming-Flash-Omni 2.0.