AI Chip Startup Groq Secures $640 Million in Series D Funding


QUICK BITE

  • Groq raises $640M in Series D funding, reaching a $2.8B valuation as it aims to revolutionize AI chip technology.
  • Groq claims its LPUs run AI models 10x faster while using only one-tenth of the energy of traditional processors.
  • Yann LeCun, Meta’s Chief AI Scientist, joins Groq as a technical advisor, boosting the startup’s AI expertise.

Groq, a startup focused on creating chips to run generative AI models faster than traditional processors, announced on Monday that it has raised $640 million in a Series D funding round, valuing the company at $2.8 billion. 

The funding round drew participation from Neuberger Berman, Type One Ventures, and strategic investors including Cisco Investments, Global Brain’s KDDI Open Innovation Fund III, and Samsung Catalyst Fund. Groq’s vertically integrated AI inference platform has generated skyrocketing demand from developers seeking exceptional speed.

“You can’t power AI without inference compute,” said Jonathan Ross, CEO and Founder of Groq. “We intend to make the resources available so that anyone can create cutting-edge AI products, not just the largest tech companies.”

With the recent funding, Groq plans to deploy over 100,000 additional LPUs into GroqCloud, focusing on the deployment of AI models for global use. The company, having secured double the anticipated funding, also intends to significantly expand its team and is actively hiring. Groq’s mission is to enable hundreds of thousands of developers to build on open models.

Groq also announced that Stuart Pann, formerly a senior executive at HP and Intel, has joined its leadership team as Chief Operating Officer.

“I am delighted to be at Groq at this pivotal moment. We have the technology, the talent, and the market position to rapidly scale our capacity and deliver inference deployment economics for developers as well as for Groq,” said Stuart Pann, Chief Operating Officer at Groq.

Additionally, Yann LeCun, VP & Chief AI Scientist at Meta, has joined Groq as its newest technical advisor.

Groq, which came out of stealth mode in 2016, is developing an LPU (language processing unit) inference engine. The company claims its LPUs can run existing generative AI models such as OpenAI’s ChatGPT and GPT-4 ten times faster than traditional processors while using only one-tenth of the energy.

Co-founded by Jonathan Ross, who helped invent Google’s tensor processing unit (TPU), and Douglas Wightman, a former engineer at Alphabet’s X moonshot lab, Groq offers a developer platform called GroqCloud. GroqCloud provides open models like Meta’s Llama 3.1, Google’s Gemma, and OpenAI’s Whisper, along with an API for cloud instances. The platform, which also includes an AI chatbot playground called GroqChat, has over 356,000 developers as of July.
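
For developers, GroqCloud exposes these hosted models through a chat-completions-style API. Below is a minimal sketch using Groq’s Python client; the model ID, prompt, and environment variable shown here are illustrative assumptions, not details from the article.

```python
import os

from groq import Groq  # Groq's official Python SDK (pip install groq)

# The API key is read from an environment variable (illustrative name).
client = Groq(api_key=os.environ.get("GROQ_API_KEY"))

# Request a completion from a hosted open model; the model ID is an example.
completion = client.chat.completions.create(
    model="llama-3.1-8b-instant",
    messages=[
        {"role": "user", "content": "Explain what an LPU is in one sentence."}
    ],
)

# Print the model's reply.
print(completion.choices[0].message.content)
```

This mirrors the familiar chat-completions pattern used by other hosted inference services, which is part of why developers can move existing workloads onto GroqCloud with little code change.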

