Neurophos has raised a hefty $110 million Series A round to take its photonic AI chips from the lab into data centers. The Austin, Texas-based startup says the oversubscribed round brings its total funding to $118 million and positions it to deliver exaflop-scale AI inference hardware.
For eeNews Europe readers, this matters because data center power limits, GPU shortages and the slowing of Moore's Law are now system-level problems. Neurophos' approach points to a potential new class of AI accelerators that could reshape compute architectures, supply chains and energy budgets across Europe's data center and industrial AI landscape.
$110m round backed by big tech and industry
The Series A was led by Gates Frontier, with participation from M12 (Microsoft's venture fund), Carbon Direct Capital, Aramco Ventures, Bosch Ventures, Tectonic Ventures, Space Capital and others. Strategic interest from both hyperscalers and industrial investors underlines the growing urgency of finding alternatives to power-hungry silicon GPUs.
“Modern AI inference demands monumental amounts of power and compute,” said Dr. Marc Tremblay, Corporate Vice President and Technical Fellow of Core AI Infrastructure at Microsoft. “We need a breakthrough in compute on par with the leaps we’ve seen in AI models themselves, which is what Neurophos’ technology and high-talent density team is developing.”
Neurophos is targeting AI inference workloads in data centers, where rising energy costs and scaling limits are forcing operators to rethink hardware roadmaps.
Optical processing as a GPU alternative
At the core of Neurophos' platform is an optical processing unit (OPU) that integrates more than one million micron-scale optical processing elements on a single chip. The company claims up to 100x gains in performance and energy efficiency over today's leading chips, while serving as a drop-in replacement for GPUs.
“Moore’s Law is slowing, but AI can’t afford to wait. Our breakthrough in photonics unlocks an entirely new dimension of scaling, by packing massive optical parallelism on a single chip,” said Dr. Patrick Bowen, CEO and co-founder of Neurophos. “This physics-level shift means both efficiency and raw speed improve as we scale up, breaking free from the power walls that constrain traditional GPUs.”
The key technical advance is the use of micron-scale metamaterial optical modulators, which the company says represent a 10,000x miniaturization over earlier photonic components and finally make large-scale, manufacturable photonic computing viable.
Road to products and deployment
“As the AI industry grapples with a surge in demand that tests our ability to satisfy with compute and power, disruptive approaches to compute may open routes to sustained or accelerated systems scaling that will be needed before the end of the decade,” said Michael Stewart, Managing Partner at M12, Microsoft’s Venture Fund.
Neurophos says the new funding will accelerate delivery of its first integrated photonic compute system, including data center-ready OPU modules, a full software stack and early-access developer hardware. The company is expanding its Austin headquarters and opening an engineering site in San Francisco as it moves toward early customer deployments.