
Groq raises $640m for AI inference chips

AI inference chip developer Groq has raised $640m at a valuation of $2.8bn.
Groq will use the cash to deploy over 108,000 Language Processing Unit (LPU) Inference Engine chips, manufactured by GlobalFoundries, by the end of Q1 2025.
Groq has grown to more than 360,000 developers building on GroqCloud, creating AI applications on openly available models such as Llama 3.1 from Meta, Whisper Large V3 from OpenAI, Gemma from Google, and Mixtral from Mistral. Groq will use the funding to scale the capacity of its tokens-as-a-service (TaaS) offering and to add new models and features to GroqCloud.
The LPU sits in the data centre alongside the CPUs and GPUs used for training, and customers can choose between on-premises deployment and API access. Groq is currently running Llama-2 70B at more than 300 tokens per second per user.
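For developers, API access to GroqCloud follows the familiar chat-completions pattern. The sketch below assumes the Groq Python SDK (`pip install groq`), an API key in the environment, and an illustrative model identifier; it is not drawn from the article itself.

```python
# Minimal sketch of calling GroqCloud's tokens-as-a-service API.
# Assumes the Groq Python SDK and a GROQ_API_KEY environment variable;
# the model name is illustrative and may differ from what GroqCloud serves.
import os
from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

response = client.chat.completions.create(
    model="llama-3.1-70b-versatile",  # hypothetical model id for illustration
    messages=[
        {"role": "user", "content": "Why does low-latency inference matter?"}
    ],
)
print(response.choices[0].message.content)
```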
The Series D round was led by funds and accounts managed by BlackRock Private Equity Partners, with participation from existing and new investors including Neuberger Berman and Type One Ventures, and from strategic investors including Cisco Investments, Global Brain’s KDDI Open Innovation Fund III, and Samsung Catalyst Fund.
“The market for AI compute is meaningful and Groq’s vertically integrated solution is well positioned to meet this opportunity. We look forward to supporting Groq as they scale to meet demand and accelerate their innovation further,” said Samir Menon, Managing Director, BlackRock Private Equity Partners.
“You can’t power AI without inference compute,” said Jonathan Ross, CEO and Founder of Groq. “We intend to make the resources available so that anyone can create cutting-edge AI products, not just the largest tech companies. This funding will enable us to deploy more than 100,000 additional LPUs into GroqCloud. Training AI models is solved, now it’s time to deploy these models so the world can use them. Having secured twice the funding sought, we now plan to significantly expand our talent density. We’re the team enabling hundreds of thousands of developers to build on open models and – we’re hiring.”
Groq also gains the expertise of its newest technical advisor, Yann LeCun, VP & Chief AI Scientist at Meta.
