Second generation chiplet platform targets generative AI

Technology News | By Nick Flaherty

d-Matrix in California has launched its second-generation chiplet platform for large AI chips using in-memory computing.

The company says Jayhawk is the industry’s first Open Domain-Specific Architecture (ODSA) Bunch of Wires (BoW) based chiplet platform for energy-efficient die-to-die connectivity over organic substrates.

The architecture builds on the Nighthawk chiplet platform launched in 2021, targeting inference compute platforms for generative AI and large language model (LLM) transformer applications with a 10-20X improvement in performance.

Large transformer models are creating new demands for AI inference at the same time that memory and energy requirements are hitting physical limits. d-Matrix provides one of the first Digital In-Memory Compute (DIMC) based inference compute platforms to come to market, built to handle the immense data and power requirements of AI inference. Improving performance can make energy-hungry data centres more efficient while reducing latency for end users in AI applications.

Jayhawk uses a combination of a digital in-memory compute-based IC architecture, design tools that integrate with leading ANN models, and chiplets in a block grid formation to support scalability and efficiency for demanding ML workloads.
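
As a purely illustrative sketch of why a block grid of compute tiles suits this kind of workload (the grid size, partitioning and code below are invented for illustration, not d-Matrix's implementation), the dominant operation in transformer inference, a large matrix-vector product, can be split so that each tile multiplies activations against the weight slice resident in its own memory:

```python
import numpy as np

# Illustrative only: partition a matrix-vector product (the core of
# transformer inference) across a hypothetical 2x2 grid of compute
# tiles, each holding its own slice of the weights "in memory".

def tiled_matvec(weights, x, grid=(2, 2)):
    rows, cols = grid
    w_row_blocks = np.array_split(weights, rows, axis=0)
    outputs = []
    for w_rows in w_row_blocks:
        w_col_blocks = np.array_split(w_rows, cols, axis=1)
        x_blocks = np.array_split(x, cols)
        # Each "tile" computes a partial product against its resident
        # weights; partial sums are then reduced across the row of tiles.
        partials = [w @ xb for w, xb in zip(w_col_blocks, x_blocks)]
        outputs.append(sum(partials))
    return np.concatenate(outputs)

weights = np.random.randn(512, 512).astype(np.float32)
x = np.random.randn(512).astype(np.float32)
assert np.allclose(tiled_matvec(weights, x), weights @ x, atol=1e-3)
```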

By using a modular chiplet-based approach, data centre customers can refresh compute platforms on a much faster cadence using a pre-validated chiplet architecture. To enable this, d-Matrix plans to build chiplets based on both the BoW and the competing UCIe standard interconnects, enabling a heterogeneous computing platform that can accommodate third-party chiplets.

The Jayhawk chiplet platform supports 3 mm, 15 mm and 25 mm trace lengths on an organic substrate with 16 Gbit/s/wire bandwidth and less than 0.5 pJ/bit energy efficiency. It is built on a 6 nm process at foundry TSMC.
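
Those link figures imply a straightforward per-wire power budget; the sketch below is just that arithmetic (the 64-wire bundle is an assumed example for scale, not a published d-Matrix configuration):

```python
# Arithmetic implied by the quoted Jayhawk link figures
# (16 Gbit/s per wire, <0.5 pJ/bit). The wire count is a
# made-up example, not a d-Matrix specification.

bandwidth_per_wire = 16e9   # bits per second
energy_per_bit = 0.5e-12    # joules per bit (quoted upper bound)

power_per_wire = bandwidth_per_wire * energy_per_bit
print(f"Per-wire power: {power_per_wire * 1e3:.1f} mW")   # 8.0 mW

# Scaling to a hypothetical 64-wire BoW bundle:
wires = 64
print(f"64-wire link: {wires * bandwidth_per_wire / 8 / 1e9:.0f} GB/s "
      f"at about {wires * power_per_wire * 1e3:.0f} mW")  # 128 GB/s, ~512 mW
```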

“With the announcement of our second generation chiplet platform, Jayhawk, and a track record of execution, we are establishing our leadership in the chiplet ecosystem,” said Sid Sheth, CEO of d-Matrix. “The d-Matrix team has made great progress towards building the world’s first in-memory computing platform with a chiplet-based architecture targeted for power hungry and latency sensitive demands of generative AI.”

Jayhawk is currently available for demos and evaluation. d-Matrix will be showcasing the Jayhawk platform at the Chiplet Summit in San Jose, CA, this week.

dmatrix.ai
