April 2, 2024
Bloomberg’s Jane Lanhee Lee talks about the energy-intensive nature of AI and how in-memory computing solutions can make AI inference more sustainable.
The intensive power consumption of Nvidia’s main product, a type of chip known as a graphics processing unit, makes it a relatively inefficient choice for inference, says Sid Sheth, founder and CEO of d-Matrix, a Silicon Valley-based chip startup that has raised $160 million from investors including Microsoft Corp. and Singaporean state-owned investor Temasek Holdings Pte.
Read the full article on Bloomberg