Celestial AI raises $100m for optical interconnect tech
Celestial AI in the US has raised $100m to develop its optical interconnect technology, with venture backing from imec and Porsche.
This comes as Lightelligence is set to show its optical network-on-chip (oNOC) in an AI chip on a PCI Express card.
After close collaboration with hyperscalers, AI compute and memory providers, Celestial AI is introducing the Photonic Fabric, the optical interconnect for disaggregated, exascale compute and memory clusters. The Photonic Fabric is unconstrained by package beachfront limitations and can deliver data to any location on the compute die, directly to the point of consumption.
Celestial AI is building a robust Photonic Fabric ecosystem consisting of AI compute, memory suppliers, hyperscalers and high-volume commercial supply chain partners. The technology is compatible with existing industry standards including CXL, PCIe, UCIe, JEDEC HBM, and proprietary electrical communication links.
Celestial AI is offering its Photonic Fabric optical interconnect technology and silicon-proven IP through a technology licensing program. The Photonic Fabric has very low latency and does not need retimers, which allows more processors to be disaggregated than is possible with electrical interconnects. Workloads that previously would not have been candidates for disaggregation because of their latency sensitivity can now be disaggregated thanks to the higher bandwidth and lower latency.
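As a rough, hypothetical illustration of why link latency decides how far memory and compute can be pulled apart, the Python sketch below compares a remote-memory round trip over an electrical link that needs retimers with an optical link that does not. All figures (per-retimer delay, propagation delay, SerDes overhead, latency budget) are illustrative assumptions, not Celestial AI numbers.

```python
# Illustrative sketch (not Celestial AI data): how link latency limits
# memory disaggregation. All numbers below are hypothetical assumptions.

def round_trip_ns(distance_m, retimers, per_retimer_ns,
                  propagation_ns_per_m, serdes_ns):
    """One-way path latency, doubled for a request/response round trip."""
    one_way = serdes_ns + retimers * per_retimer_ns + distance_m * propagation_ns_per_m
    return 2 * one_way

# Hypothetical electrical link: retimers along the reach add latency.
electrical = round_trip_ns(distance_m=5, retimers=2, per_retimer_ns=20,
                           propagation_ns_per_m=5, serdes_ns=30)

# Hypothetical optical link: no retimers over the same reach.
optical = round_trip_ns(distance_m=5, retimers=0, per_retimer_ns=0,
                        propagation_ns_per_m=5, serdes_ns=30)

LATENCY_BUDGET_NS = 150  # assumed tolerance of a latency-sensitive workload

for name, rtt in (("electrical", electrical), ("optical", optical)):
    verdict = "fits" if rtt <= LATENCY_BUDGET_NS else "exceeds"
    print(f"{name}: ~{rtt:.0f} ns round trip, {verdict} the {LATENCY_BUDGET_NS} ns budget")
```

Under these assumed numbers the electrical path blows the budget while the optical path stays inside it, which is the sense in which lower-latency links widen the set of workloads that can be disaggregated.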
Developers can use the IP in their own designs or see end-to-end benefits with Celestial AI’s Orion AI accelerator.
The Series B funding round was led by IAG Capital Partners, Koch Disruptive Technologies (KDT) and Temasek’s Xora Innovation fund. Other major investors in the round include Samsung Catalyst, Smart Global Holdings (SGH), Porsche Automobil Holding SE, The Engine Fund, imec.xpand, M Ventures and Tyche Partners.
Stranded memory
The startups in this area, including Ayar Labs, see the era of electrical connectivity ending, with optical interconnectivity becoming the foundational building block for advanced AI models that require exponentially increasing memory capacity and bandwidth. Large language models (LLMs) such as GPT-4, used for ChatGPT, and recommendation engines are memory bound rather than compute bound, and cloud service providers (CSPs) and hyperscale data centres are unable to decouple memory scaling from compute.
These startups have shown that Optical Compute Interconnect (OCI) is a viable approach for the disaggregation of scalable data centre memory and accelerated computing, allowing optically interconnected pooled memory systems to avoid stranded memory.
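To make the "memory bound rather than compute bound" point concrete, here is a hedged back-of-envelope sketch in Python: during autoregressive decoding each generated token reads roughly every weight once, so the achievable token rate is capped by memory bandwidth long before the accelerator's compute limit is reached. The model size, bandwidth and compute figures are generic assumptions for illustration, not figures from the article.

```python
# Back-of-envelope sketch (assumed, generic numbers): why LLM decoding tends
# to be memory-bound. Each decoded token reads ~all weights once, so the
# token rate is limited by memory bandwidth rather than peak compute.

params = 70e9                  # assumed model size: 70B parameters
bytes_per_param = 2            # FP16/BF16 weights
flops_per_token = 2 * params   # ~2 FLOPs per parameter per token (rule of thumb)

mem_bandwidth = 3.35e12        # assumed HBM bandwidth, bytes/s
peak_compute = 1.0e15          # assumed peak FLOP/s of the accelerator

# Upper bounds on tokens/s from each limit, for batch size 1.
tokens_per_s_mem = mem_bandwidth / (params * bytes_per_param)
tokens_per_s_compute = peak_compute / flops_per_token

print(f"memory-bandwidth limit : {tokens_per_s_mem:8.1f} tokens/s")
print(f"compute limit          : {tokens_per_s_compute:8.1f} tokens/s")
print("bound by:", "memory" if tokens_per_s_mem < tokens_per_s_compute else "compute")
```

With these assumed figures the memory-bandwidth ceiling is orders of magnitude below the compute ceiling, which is why adding memory capacity and bandwidth, rather than more FLOPs, is the lever these disaggregation approaches target.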
“Generative AI and Recommendation Engines, which today are running on general purpose computing systems, have already begun to measurably impact business processes, products and services. A rapid transition will take place in the coming years as global data centre infrastructure transitions from general purpose to accelerated computing systems,” said Dave Lazovsky, Celestial AI founder and CEO.
“This next wave of data centre infrastructure is being architected to deliver tremendous advancements in AI workload efficiencies, resulting from disaggregation of memory and compute resources which is enabled by optical interconnectivity. Our technology solutions are lighting the way to the future of accelerated computing.”
“Celestial AI has developed a transformational optical connectivity capability that will deliver step-change advancements in both performance and energy efficiency of high-performance computing, unlocking the potential of Generative AI and other complex workloads,” said Chase Koch, founder and CEO, Koch Disruptive Technologies.