
Generative AI drives $154m for Lightmatter’s photonic interconnect, chips and servers

Business news | By Nick Flaherty



Lightmatter in the US has raised $154 million to boost the rollout of its photonic interconnect technology in hyperscaler data centre chips and servers.

This brings Lightmatter’s total raised to $270m and follows a $130m Series C round by competitor Ayar Labs, which has now raised a total of $220m. Both rounds are driven by the boom in generative AI and the demand for high-performance processing systems.

The Series C investment round in Lightmatter comes from SIP Global, Fidelity Management & Research Company, Viking Global Investors, GV (Google Ventures), HPE Pathfinder and existing investors.

“Rapid progress in artificial intelligence is forcing computing infrastructure to improve at an unprecedented rate. The energy costs of this growth are significant, even on a planetary scale,” said Nick Harris, co-founder and CEO of Lightmatter. “Generative AI and supercomputing will be transformed by photonic technologies in the coming years, and our investors, partners, and customers are aligned with Lightmatter’s mission of enabling the future of computing infrastructure with photonics.”

Lightmatter is introducing its photonics-enabled Envise, Passage, and Idiom IP and products, providing a full stack of hardware and software for photonic compute and interconnect. The new capital will be used to fund the delivery of these products to customers.

The Envise 4S server features 16 Envise chips in a 4U server configuration with a power consumption of just 3kW. It is a building block for a rack-scale Envise inference system that can run the largest neural networks developed to date with three times the performance of Nvidia’s DGX-A100 and eight times the IPS/W on BERT-Base SQuAD.

Passage is the photonic chiplet interconnect with transistors and photonics integrated side by side for dynamic reconfiguration of the communications topology. This integration allows 40 Passage lanes to fit in the space of a single optical fibre, reducing packaging complexity and cost while dramatically increasing performance.

Idiom is a set of workflow tools that converts models from standard deep learning frameworks and model exchange formats such as PyTorch, TensorFlow or ONNX to run on Envise, while providing the transformations and tools required by deep learning model authors and deployers.
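Exchange formats such as ONNX are the usual entry point for toolchains of this kind. The sketch below shows how a PyTorch model might be exported to ONNX before being handed to a tool like Idiom; Idiom’s own API is not described in the article, so the final compilation step is purely illustrative.

```python
# Minimal sketch: exporting a PyTorch model to ONNX, one of the exchange
# formats the article says Idiom can consume. The tiny network and file
# name below are placeholders; the hand-off to Idiom itself is hypothetical.
import torch
import torch.nn as nn

# A stand-in network; any trained model would take its place.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
model.eval()

dummy_input = torch.randn(1, 128)  # example input that defines the graph shape
torch.onnx.export(model, dummy_input, "model.onnx", opset_version=17)

# A tool such as Idiom would then take "model.onnx" and apply the
# transformations needed to target Envise hardware (hypothetical step).
```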

“Lightmatter’s unique approach to harnessing the power of photonics in hardware chips will further the initial capabilities and use cases that we’re seeing from generative AI,” said Jeffrey Smith, General Partner at SIP Global Partners. “These technologies and its global customers will need the highest compute power to run these algorithms and apply them to new verticals, and we’re thrilled to invest in Lightmatter who can make that potential with computing that is faster and more sustainable.”

“Photonic technology has the potential to meet the demand of today’s artificial intelligence compute workloads. Lightmatter is taking a differentiated approach by using silicon photonics and bringing together a deeply technical team to further its mission,” said Erik Nordlander, General Partner at GV. “We’re thrilled to continue supporting Lightmatter’s next stage of growth as they build the leading silicon photonics company.”

Large language models (LLMs) that power generative AI are both more lucrative and more resource intensive than their predecessors; leaders in the space now say they see power and cost limits to the size of these models. Against this backdrop, Lightmatter is pitching photonic compute and interconnect as a way to keep scaling performance without the corresponding growth in energy use.

www.lightmatter.com
