In-memory computing startup launches to enable edge AI at scale

Business news
By Rich Pell


AI accelerator hardware startup EnCharge AI has announced that it has raised $21.7 million in Series A financing to further develop and commercialize its semiconductor hardware and software stack, which it says offers the highest reported efficiency for AI compute to date. The company’s charge-based in-memory computing technology was originally born out of DARPA- and Department of Defense-funded R&D and has matured through six years of deep research and validation.

This, says the company, has resulted in demonstrations of AI performance with orders-of-magnitude higher compute efficiency and density than either best-in-class digital accelerators, such as GPUs and TPUs, or recent beyond-digital accelerator concepts based on optical or analog computing. Several generations of test chips with end-to-end programmable model execution capability have been demonstrated, resulting in the following technology advantages:

  • Highest efficiency reported for AI compute to date – EnCharge test chips and hardware can achieve over 150 TOPS/W for 8-bit compute (see the rough illustration after this list).
  • Seamless integration – EnCharge offers a software stack that supports broad AI models and resolutions, while integrating seamlessly into user frameworks and design flows.
  • Lower cost for performance – EnCharge will deliver platforms that provide over 20x higher performance per watt and over 14x higher performance per dollar compared to best-in-class digital AI accelerators implemented in the most advanced technology nodes.
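
The headline figures above lend themselves to a quick back-of-envelope check. The sketch below is purely illustrative: the 1 W and 3 W power budgets are hypothetical edge-device envelopes chosen for the example, and only the 150 TOPS/W and 20x figures come from the company's announcement.

    # Back-of-envelope illustration of the efficiency figures quoted above.
    # The 1 W and 3 W power budgets are hypothetical edge-device examples;
    # only the 150 TOPS/W and 20x figures come from the announcement.

    ENCHARGE_EFFICIENCY_TOPS_PER_W = 150  # claimed for 8-bit compute

    def throughput_tops(power_budget_w: float, efficiency_tops_per_w: float) -> float:
        """Peak 8-bit throughput (TOPS) achievable within a given power budget."""
        return power_budget_w * efficiency_tops_per_w

    for budget_w in (1.0, 3.0):  # hypothetical edge power envelopes
        tops = throughput_tops(budget_w, ENCHARGE_EFFICIENCY_TOPS_PER_W)
        print(f"{budget_w:.0f} W budget -> ~{tops:.0f} TOPS of 8-bit compute")

    # The >20x performance-per-watt claim implies that a best-in-class digital
    # accelerator delivering the same throughput would need over 20x the power:
    digital_equivalent_w = 1.0 * 20  # hypothetical 1 W EnCharge budget scaled by the 20x factor
    print(f"Equivalent digital accelerator power at the same throughput: >{digital_equivalent_w:.0f} W")

At a hypothetical 1 W budget, the claimed efficiency would correspond to roughly 150 TOPS of 8-bit compute, which is the kind of envelope relevant to the edge applications listed below.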

With these benefits, says the company, it seeks to unlock the immense potential of AI by making it accessible to power-, energy-, and space-constrained applications at the edge. These capabilities are valuable for market applications such as automotive sensing, advanced manufacturing, smart retail, smart warehouses and logistics, industrial robotics, and drones.

“In order to keep pace with the incredible innovations we are seeing in AI, we need fundamentally new ways of computing,” says Naveen Verma, Ph.D., CEO and Co-Founder, EnCharge AI. “This requires differentiated technologies, but which are also extensively validated and refined across the entire stack for real-world operation. We believe that EnCharge’s technology is the only robust and scalable form of next-generation in-memory computing. Following this Series A round, EnCharge is now positioned to develop products to engage with customer applications in production at the forefront of AI.”

The financing round was led by Anzu Partners with participation from AlleyCorp, Scout Ventures, Silicon Catalyst Angels, Schams Ventures, E14 Fund, and Alumni Ventures.

Jimmy Kan, Ph.D., Partner at Anzu Partners, says, “As Edge AI continues to drive business automation, there is huge demand for sustainable technologies that can provide dramatic improvements in end-to-end AI inference capability along with cost and power efficiency. EnCharge’s technology addresses these challenges and has been validated successfully in silicon, fully compatible with volume production.”

EnCharge AI

