This Big AI Bubble Argument Is Wrong
Markets

By News Room | November 19, 2025

Dour warnings of an AI bubble have rocked markets in recent weeks. At least one big concern is misplaced, though.

Back in March, I told you about depreciation risks for some AI companies, including CoreWeave. In August, Jim Chanos, the guy who shorted Enron, shared similar concerns.

The big worry centers on GPUs, the chips needed to train and run AI models. As new GPUs come out, older ones get less valuable, through obsolescence and wear and tear. Cloud companies must use depreciation to reduce the value of these assets over a period that reflects reality. The faster the depreciation, the bigger the hit to earnings.

Investors have begun to worry that GPUs only have useful lives of one or two years, while cloud providers depreciate the value of these assets over five or six years. An accounting mismatch like this could set the AI industry up for a nasty earnings hit in a few years.
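To see why the assumed useful life matters so much, here is a minimal sketch with hypothetical numbers: straight-line depreciation spreads a GPU's purchase cost evenly over its assumed life, so shortening that life from six years to two roughly triples the annual expense hitting earnings.

```python
# Hypothetical illustration: straight-line depreciation of a GPU.
# Annual expense = purchase cost / assumed useful life (no salvage value).

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation expense per year."""
    return cost / useful_life_years

cost = 25_000  # assumed purchase price, for illustration only

print(annual_depreciation(cost, 6))  # 6-year schedule: about $4,167/year
print(annual_depreciation(cost, 2))  # 2-year life: $12,500/year, triple the hit
```

The dollar figure is made up; the point is the mechanics, which is why the debate over one-to-two versus five-to-six year lives translates directly into an earnings debate.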

This view has become almost a consensus on Wall Street now. It’s one of the main pieces of evidence for the argument that we’re in a huge AI bubble. The problem is that it’s wrong: Even as Nvidia rolls out new GPU architectures every 18 months or less, GPUs aren’t aging out nearly as fast as some investors fear.

“GPUs can profitably run for about 6 years,” Stacy Rasgon, a leading chip analyst at Bernstein, wrote in a research report on Monday. “The depreciation accounting of most major hyperscalers is reasonable.”

Healthy margins

The cost of operating a GPU in an AI data center is “very low” compared to market prices for renting GPUs via the cloud. That makes the “contribution margins” of running old GPUs for longer quite high, Rasgon and his fellow analysts at Bernstein noted. (Contribution margin is the revenue left over after variable costs, a common gauge of product profitability used to guide business decisions.)
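As a rough sketch with hypothetical rental and cost figures (not Bernstein's numbers), the contribution margin calculation looks like this:

```python
# Hypothetical sketch: contribution margin of renting out an older GPU.
# Contribution margin = (revenue - variable costs) / revenue.

def contribution_margin(hourly_rental: float, hourly_variable_cost: float) -> float:
    """Fraction of rental revenue left after variable costs."""
    return (hourly_rental - hourly_variable_cost) / hourly_rental

# Assumed figures: $1.50/hr rental for an older GPU, $0.30/hr in power and cooling.
print(f"{contribution_margin(1.50, 0.30):.0%}")  # prints 80%
```

With low variable costs, even a steeply discounted rental price on an aging GPU leaves most of the revenue as margin, which is the analysts' core point.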

“Even with meaningful improvements in price/performance with each GPU generation, vendors can make comfortable margins on 5-year-old A100s, in turn implying a 5-6 year depreciation lifespan is reasonable,” the analysts added, referring to Nvidia’s A100 chips, which came out in 2020.

Seven to eight years

To find out why these GPUs are so valuable for so long, it pays to speak with the people who actually run these components at scale inside AI data centers.

Matt Rowe, senior director of strategic business development at AI cloud provider Lambda, said recently that the effective lifespan of GPUs can stretch to seven or eight years.

While most firms still use a six-year depreciation schedule for accounting purposes, warranty extensions and redeployment strategies are extending their useful life, he told Bernstein.

Warranty contracts are often overlooked by observers worrying about depreciation, Rowe explained. These warranties typically last five years, so if GPUs fail, they are replaced with new ones, extending the life of the overall GPU fleet.

He also noted that Amazon Web Services offered very early generations of GPUs, such as K80s, P100s, and V100s. These all lasted well beyond six years.

Nvidia’s H100 GPUs, which debuted in 2022, are still running well inside Lambda data centers. Utilization is above 85% and Lambda hasn’t cut its on-demand public cloud pricing for this GPU in more than 12 months, Rowe noted.

“We all think seven to eight years is possible,” Rowe said.

Crusoe’s experience

I chatted this week with Erwan Menard, SVP of product management at Crusoe, which is developing the huge Stargate data center complex in Texas. Before joining Crusoe, Menard helped build Google’s Vertex AI cloud service, so he’s a real hands-on expert.

Menard described a lifecycle where GPUs migrate from cutting-edge AI model training jobs to less demanding inference workloads.

When creating a new state-of-the-art model, you need the latest and greatest GPU from Nvidia.

Then, you have to run these top models, a process called inference. That requires powerful GPUs, but not the latest ones.

Beyond that, there are thousands of different, valuable AI workloads that can run well on older GPUs, according to Menard. That means there are many GPUs that are multiple years old in Crusoe’s fleet and are still actively used and profitable.

“Because there’s a large diversity of models to solve many different problems, there’s a lot of room to use GPUs for a long time, just transitioning them from one type of job to the next,” Menard told me. “It’s actually a widely accepted view in the industry.”

Free versus paid

AI cloud companies consider user expectations and budget to help them decide which GPUs to use. To illustrate, Menard described an example of an AI service that has a free tier and a paid version.

“You may decide that for the freemium version you’re going to use an AI model that can be inferenced on older, cheaper hardware with lower performance,” he said.

That’s likely good enough to create an initial experience for users. Then, some customers might migrate to the paid version. At that point, you tap into a more powerful AI model that requires newer GPUs to deliver a superior user experience.

“We see a lot of these opportunities,” Menard said. “Not everything is a nail requiring one single mega-model running on the latest and greatest GPU.”

Open-source + older GPUs

Some AI services are less compute-intensive and can be run on open-source models, such as Alibaba’s Qwen, DeepSeek, or Meta’s Llama offerings. One example is speech-to-text services (such as the transcription service I used to transcribe my interview with Menard).

Older or less-capable models can be run on older GPUs, while still providing valuable intelligence for AI services that customers will pay for. (Business Insider pays for those transcriptions, for instance).

As more startups embrace cheaper open-source models, older GPUs could actually be used even more. “An open model may be absolutely great and give a more cost-competitive structure,” Menard said.

Older GPUs are cheaper

Older GPUs use more energy to produce the same amount of intelligence, so another investor concern is that newer GPUs will always be preferred, aggravating the depreciation problem.

That’s actually not true either, according to Menard. Older GPUs are cheaper to buy, so even though they consume more energy, they are often cheaper to run once all costs are taken into account.

“The driver for a given GPU is going to be cost, first and foremost,” he explained. “So we go to the older ones because they’re cheaper.”
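A simple total-cost comparison, with made-up prices and power draws, shows how a lower acquisition cost can outweigh a higher energy bill:

```python
# Hypothetical total-cost comparison: an older GPU draws more power,
# but its lower purchase price can still make it cheaper per hour overall.
# All figures are assumptions for illustration, not real vendor pricing.

def hourly_cost(capex: float, life_hours: float, watts: float,
                price_per_kwh: float) -> float:
    """Amortized hardware cost plus energy cost per operating hour."""
    return capex / life_hours + (watts / 1000) * price_per_kwh

LIFE_HOURS = 6 * 365 * 24  # ~6-year service life, running continuously

old = hourly_cost(capex=8_000, life_hours=LIFE_HOURS, watts=400, price_per_kwh=0.08)
new = hourly_cost(capex=30_000, life_hours=LIFE_HOURS, watts=700, price_per_kwh=0.08)
print(f"old ${old:.2f}/hr vs new ${new:.2f}/hr")  # old $0.18/hr vs new $0.63/hr
```

Note this sketch ignores that newer GPUs do more work per hour; for workloads that don't need that extra performance, which is Menard's point, raw cost per hour dominates the decision.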

What’s an L40?

So, I asked Menard for an example of an old GPU that Crusoe uses. He described new modular data centers Crusoe has built that are powered by recycled EV batteries from the startup Redwood Materials.

“I can put L40s from Nvidia in these data centers,” Menard said. “Because the whole deployment is energy-first in its design, I’m going to be able to make an impact.”

I hadn’t heard of L40s and had to ask him what they were.

“That’s an old GPU,” he said, laughing.

Sign up for BI’s Tech Memo newsletter here. Reach out to me via email at abarr@businessinsider.com.



