DeepSeek-V4 arrives with near state-of-the-art intelligence at 1/6th the cost of Opus 4.7, GPT-5.5

venturebeat.com·Apr 24, 2026

DeepSeek, a Chinese AI startup, has launched its new model, DeepSeek-V4, which boasts 1.6 trillion parameters and is available under a commercially friendly open-source license, significantly undercutting the pricing of leading U.S. models like GPT-5.5 and Claude Opus 4.7 while delivering competitive performance. The model's architecture supports a one-million-token context window at dramatically lower computational cost, prompting a reevaluation of the economics of advanced AI deployment and challenging the dominance of proprietary systems.

Architecturally, DeepSeek-V4 is a Mixture-of-Experts (MoE) model: only a subset of its 1.6 trillion parameters is activated for any given token, which is what keeps inference costs far below those of dense proprietary models. Because the model is freely available under the MIT License, enterprises running extensive inference workloads gain cost-effective options for automation and advanced AI applications, and the release may force a broader reevaluation of cost-benefit analyses for high-end AI systems.
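To make the MoE cost argument concrete, here is a minimal, illustrative sketch of top-k expert routing, the mechanism that lets a trillion-parameter MoE model activate only a small fraction of its weights per token. This is a generic textbook-style gating function, not DeepSeek's actual implementation; the expert count, k value, and gate scores below are made up for illustration.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of gate scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def route_token(gate_scores, k=2):
    """Select the top-k experts for one token and renormalize
    their gate probabilities so the selected weights sum to 1.
    Only these k experts' parameters run for this token."""
    probs = softmax(gate_scores)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    return [(i, probs[i] / norm) for i in top]

# Hypothetical example: 8 experts, only 2 activated for this token,
# so roughly 2/8 of the expert parameters do any work.
print(route_token([0.1, 2.0, -0.5, 1.2, 0.0, 0.3, -1.0, 0.8], k=2))
```

The economics follow directly from this routing: compute per token scales with the k active experts rather than the total parameter count, which is why an open MoE model can undercut dense proprietary systems on inference price.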
