AI is hungry for power — D-Matrix wants to fix that with new chips


In conversation with Shradha Sharma, founder and CEO of YourStory, Sid Sheth, founder and CEO of D-Matrix, laid out why he believes the next phase of AI will be driven by the infrastructure layer—chips and data centres—designed to run AI faster, cheaper, and with far better energy efficiency.

Sheth said D-Matrix was founded in 2019 to build a new class of chips designed from the ground up for AI workloads. The objective, he explained, is to make AI fast enough for real-time experiences—where creators and users don’t have to wait for outputs—while keeping the system energy-efficient and cost-effective as AI usage scales.


Energy efficiency, he argued, is quickly becoming a defining constraint. If AI becomes a daily utility used repeatedly by billions of people, the world will need either dramatically more power generation or dramatically more efficient compute—or both. He pointed to wider conversations around rising data-centre power demand, including extreme ideas such as moving data centres into space to harness more solar energy. D-Matrix’s bet, he said, is that gains at the chip layer can reduce the energy required to run AI systems at scale.

A capital-heavy business with a long runway

Sheth described chip-building as a long-cycle, capital-intensive venture. He said D-Matrix has raised close to $500 million to support the effort, noting that hardware companies often require substantial funding before they see the first meaningful dollar of revenue.

He attributed the ability to raise that kind of capital to experience and delivery. Sheth said he began his career as a chip engineer at Intel and spent decades in the semiconductor space before starting D-Matrix. He also said he had previously built a business that scaled from zero to roughly $3 billion in revenue, which helped build investor confidence over time.

What India needs for AI scale

On India’s AI priorities, Sheth said the country is strong in applications and has the capability to build models, but needs more researchers and, crucially, wider access to compute. The biggest lever, he argued, is infrastructure: faster buildout of data centres, greater energy availability, cooling resources such as water, and a regulatory environment that enables compute to scale quickly.

For Indian entrepreneurs attempting deeptech bets, Sheth’s message was to think globally about capital. Funding at this scale, he said, is increasingly international, and founders should not assume they can raise only within one geography if they want to build with global ambition.


