Nvidia has moved to consolidate its dominance in artificial intelligence hardware with a $20 billion acquisition of key assets from Groq, the Silicon Valley startup known for its ultra‑low‑latency inference chips. The all‑cash deal is the largest in Nvidia’s history and values the assets at nearly three times Groq’s $6.9 billion valuation from a funding round just three months earlier.
Under the agreement, Nvidia will absorb Groq’s core technology and engineering leadership—including CEO Jonathan Ross, the architect behind Google’s original TPU—while Groq continues operating independently under new management. The structure is framed as a “non‑exclusive licensing agreement”, allowing Nvidia to sidestep traditional merger scrutiny while effectively acquiring the company’s most valuable assets.
Groq emerged as one of the few credible challengers to Nvidia in AI inference, thanks to its novel architecture built around on‑chip SRAM rather than external high‑bandwidth memory. This design enabled faster, more efficient responses for real‑time AI applications, positioning Groq as a rare competitive threat in a market where Nvidia already commands more than 95% of training workloads.
The premium Nvidia paid—roughly 190% above Groq’s recent valuation—underscores the strategic urgency. Reports indicate Groq was not seeking a buyer, and Nvidia initiated the approach, highlighting how seriously it viewed the startup’s potential to disrupt inference performance and cost structures.
The transaction mirrors a growing trend in Big Tech: quasi‑acquisitions that transfer talent and intellectual property without triggering full antitrust review. Similar moves include Microsoft’s $650 million deal for Inflection AI’s team and Amazon’s hiring of Adept AI’s founders. Regulators have yet to challenge these arrangements, effectively validating the model.
With Groq’s architecture and engineering talent now under its umbrella, Nvidia strengthens its grip on both training and inference, the two pillars of modern AI computation. Analysts say the deal leaves enterprises with fewer alternatives for diversifying AI infrastructure, reinforcing Nvidia’s position at the center of the global AI ecosystem.
Photo: NVIDIA headquarters, Santa Clara. Picture by Kevin McCarthy.