Anthropic’s Daniela Amodei on the company’s ‘do more with less’ bet

Anthropic’s ‘do more with less’ bet has kept it at the AI frontier, co-founder Amodei tells CNBC

Published Sat, Jan 3 2026, 8:00 AM EST, Updated Sat, Jan 3 2026, 8:53 AM EST

By MacKenzie Sigalos @KENZIESIGALOS

Key Points

  • Anthropic is taking a disciplined approach to spending and algorithmic efficiency while rival OpenAI makes $1.4 trillion in headline compute commitments.
  • Daniela Amodei argues the next phase won’t be won by the biggest pre-training runs alone, but by whoever can deliver the most capability per dollar of compute.
  • Heading into 2026, with both labs running as if they could go public while still raising fresh capital, the race comes down to brute-force scale versus efficiency.

See Video: Anthropic’s Daniela Amodei on their ‘do more with less’ bet and the AI race

SAN FRANCISCO — Inside Anthropic headquarters, President and co-founder Daniela Amodei keeps coming back to a phrase that’s become a sort of governing principle for the artificial intelligence startup’s entire strategy: Do more with less.

It’s a direct challenge to the prevailing mood across Silicon Valley, where the biggest labs and their backers are treating scale as destiny.

Firms are raising record sums, locking up chips years in advance, and pouring concrete across the American heartland for data centers in the belief that the company that builds the largest intelligence factory will win.

OpenAI has become the clearest example of that approach.

The company has made roughly $1.4 trillion in headline compute and infrastructure commitments as it works with partners to stand up massive data center campuses and secure next-generation chips at a pace the industry has never seen.

Anthropic’s pitch is that there’s another way through the race, one where disciplined spending, algorithmic efficiency, and smarter deployment can keep you at the frontier without trying to outbuild everyone else.

“I think what we have always aimed to do at Anthropic is be as judicious with the resources that we have while still operating in this space where it’s just a lot of compute,” Amodei told CNBC. “Anthropic has always had a fraction of what our competitors have had in terms of compute and capital, and yet, pretty consistently, we’ve had the most powerful, most performant models for the majority of the past several years.”

See Video: Anthropic bets efficiency can beat brute-force scale in the AI arms race

Daniela Amodei and her brother, Dario Amodei, who is Anthropic’s CEO and a Baidu and Google alumnus, helped build the very worldview they’re now betting against.

Dario Amodei was among the researchers who helped popularize the scaling paradigm that has guided the modern model race: the idea that increasing compute, data, and model size tends to improve a model’s capabilities in a predictable way.
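
For context, that scaling paradigm is usually summarized in the research literature with a simple empirical loss formula; the form below follows published scaling-law studies (for example, Kaplan et al., 2020, and Hoffmann et al., 2022) and is offered only as an illustration of the idea, not as a formula attributed to Anthropic or OpenAI:

    L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}

Here L is the model’s training loss, N is the parameter count, D is the number of training tokens, and E, A, B, \alpha, \beta are constants fitted to experimental runs. As N and D grow, the loss falls along a roughly predictable power-law curve, which is why the biggest labs have treated ever-larger training runs as a reliable path to better models.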

Editor’s Note: Interesting article on Anthropic. Improving the model is exactly the right approach, IMHO; it’s not scale that will win. Comments?

Read the original article here: Anthropic’s Daniela Amodei on the company’s ‘do more with less’ bet

#AnotherWay #BaiduAlum #CEO #CNBC #DanielaAmodei #DarioAmodei #DataCenters #Efficiency #GoogleAlum #Scale #WorldView