AI Energy Crisis Boosts Interest in Chips That Do It All

April 2, 2024

Bloomberg’s Jane Lanhee Lee discusses the energy-intensive nature of AI and how in-memory computing solutions can make AI inference more sustainable.

The intensive power consumption of Nvidia’s main product, a type of chip known as a graphics processing unit, makes it a relatively inefficient choice to use for inference, says Sid Sheth, founder and CEO of d-Matrix, a Silicon Valley-based chip startup that’s raised $160 million from investors like Microsoft Corp. and Singaporean state-owned investor Temasek Holdings Pte.

Read the full article on Bloomberg


Suggested Articles

The Complete Recipe to Unlock AI Reasoning at Enterprise Scale

By d-Matrix Team | February 13, 2025

Think more vs. Train more

By Sid Sheth | January 29, 2025


Impact of the DeepSeek Moment on Inference Compute

By d-Matrix Team | January 31, 2025