
Has AI Created Any Economic Value Till Now?

Yes, but it's narrow and uneven. Companies like JPMorgan ($1.5B in value), Klarna ($40M savings), and Shell ($2B savings) show real gains, but only 9.3% of businesses use AI in production. We're in the investment phase before the productivity boom—like electricity in 1905. The value exists, but economy-wide transformation is still 5-15 years away.

Tags: artificial-intelligence, economics, productivity, cloud-computing, gpu-market, ai-strategy

There's a simple question that cuts through all the hype: Has AI actually created economic value, or are we just burning money on an expensive science experiment?

Before we can answer that, we need to understand what "economic value" even means. And more importantly, we need to know what happened the last few times humanity invented something that supposedly changed everything.

Because this isn't the first time we've been here.


What Economic Value Actually Means

Strip away the jargon and economic value is simple: you take inputs (labor, materials, time) and transform them into outputs that people value more than the inputs cost.

A baker takes $2 worth of flour, water, and yeast. Makes bread. Sells it for $5. That's $3 of economic value created.

Scale this across millions of transactions and you get GDP. But the metric that actually matters for whether people's lives improve is productivity: how much value each worker produces per hour.
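The baker's arithmetic generalizes into two one-line formulas. Here's a minimal sketch in Python using the baker's numbers from above (the half-hour labor figure is an illustrative assumption, not from the example):

```python
def value_added(revenue: float, input_cost: float) -> float:
    """Economic value created: outputs worth more than the inputs cost."""
    return revenue - input_cost

def productivity(total_value: float, labor_hours: float) -> float:
    """Value produced per worker-hour -- the metric that drives wages."""
    return total_value / labor_hours

# The baker: $2 of flour, water, and yeast becomes $5 of bread.
print(value_added(5.00, 2.00))  # 3.0 -- three dollars of value created

# If that batch took half an hour, the baker produced $6 of value per hour.
print(productivity(value_added(5.00, 2.00), 0.5))  # 6.0
```

Everything in the rest of this piece comes down to moving that second number.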

Why does this matter? Because productivity determines wages. If workers produce more value per hour, companies can afford to pay them more. If productivity stagnates, wages stagnate. Living standards stagnate.

This is the only thing that matters in the long run. Not stock prices, not market caps, not how many billions get raised. Can we produce more value with the same amount of human effort?

That's the test.


The Last Time This Happened: Electricity

Let's talk about electricity, because it's the closest historical parallel to what's happening with AI right now.

Electric motors were invented in the 1880s. By 1900, everyone knew electricity was the future. Investors poured money into electrical companies. Newspapers wrote breathless articles about the coming transformation.

But here's what nobody talks about: productivity didn't budge for decades.

Between 1900 and 1920, even as electricity spread through American factories, productivity growth stayed flat. Not spectacular. Not revolutionary. Barely measurable.

It wasn't until the 1920s (40 years after the invention) that productivity exploded.

What took so long?

The problem was that early adopters were doing it wrong. They were taking their factories (designed entirely around a massive central steam engine with belts and shafts running through the ceiling) and just swapping out the steam engine for an electric motor.

Nothing else changed. The building layout was still dictated by the mechanics of steam power. The workflow was the same. The management structure was the same.

They were using revolutionary technology to do the exact same things in the exact same way.

The productivity boom only came when someone finally asked: if every machine can have its own small electric motor, why do we need these massive drive shafts at all?

That question changed everything.

Suddenly you could design factories around production flow instead of power transmission. Single-story buildings with natural light instead of dark multi-story structures. Assembly lines. Flexible machine placement. Entirely new ways of organizing work.

That's when the value showed up. Not when the technology arrived, but when people reorganized around it.

The pattern repeated with computers. Invented in the 1940s-50s, commercially available in the 1960s-70s, but the economist Robert Solow famously said in 1987: "You can see the computer age everywhere but in the productivity statistics."

The gains didn't arrive until the 1990s. Another 20-30 year lag.

This isn't a bug. It's how general-purpose technologies work. The technology arrives first. The reorganization comes later. Sometimes decades later.


So Where Are We With AI?

ChatGPT launched in November 2022. We're about three years into this.

If AI follows the same pattern as electricity and computers, we shouldn't expect economy-wide productivity gains until somewhere between 2027 and 2035.

But that doesn't mean nothing's happening. Value is being created right now. Just not evenly, not everywhere, and not yet at the scale that justifies the spending.

Let me show you what I mean.


The Value That Actually Exists

JPMorgan Chase publicly states that its AI systems have created $1.5 billion in value. Not "could create" or "will create." Created. Past tense.

Their fraud detection system cut false positives by 95%. Think about what that means. Thousands of analysts were spending hours reviewing transactions that turned out to be legitimate. That time is now freed up for actual productive work.

Klarna's AI customer service handles the equivalent work of 700 full-time agents. It saves them $40 million annually. Average resolution time dropped from 11 minutes to under 2 minutes. Repeat inquiries fell by 25% (meaning the AI is actually solving problems better than the humans were).

Shell reports $2 billion in annual savings from AI-driven predictive maintenance across their oil and gas operations. Equipment failure rates down 40%. Unplanned downtime down 35-40%.

These aren't projections. These are numbers from financial statements.

Software developers using GitHub Copilot complete coding tasks 55% faster. That's not a 10% improvement. That's a fundamental shift in what one person can accomplish in a day.

Customer support agents using AI see a 14% productivity improvement on average, with less experienced workers seeing gains up to 35%. The technology doesn't just speed things up (it levels the gap between novices and experts).

The value is real. The problem is it's narrow.


Why the Big Numbers Don't Show It Yet

Only 9.3% of U.S. businesses actually use AI in regular production. Not "employees played with ChatGPT at home." Not "we ran a pilot project." Formal, institutionalized deployment where AI is built into how the business operates.

Compare that to electricity in 1900: about 10% adoption. By 1920, it hit 30%. It didn't reach 80% until 1930.

We're in year three of a thirty-year transformation.

Here's the other problem: even when companies deploy AI successfully, they're mostly doing the "electric steam engine" thing. They're plugging AI into existing workflows without changing anything fundamental.

A junior employee used to spend an hour drafting a document. Now they use ChatGPT and do it in 15 minutes. Great.

But the document still goes to a manager for review, who sends it to a director for approval, who sends it to legal for clearance. The workflow built for humans operating at human speed is still there.

The time saved on drafting gets absorbed by all the unchanged friction in the process. The productivity gain disappears into the organizational structure.
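This is just Amdahl's-law arithmetic applied to an org chart: speeding up one step barely moves the end-to-end time if everything around it is unchanged. The drafting numbers come from the example above; the four hours of review and approvals is an illustrative assumption:

```python
# One step gets 4x faster with AI; the approval chain stays the same.
drafting_before, drafting_after = 60, 15  # minutes: one hour becomes fifteen minutes
review_and_approvals = 240                # manager, director, legal: unchanged (assumed)

before = drafting_before + review_and_approvals  # 300 minutes end to end
after = drafting_after + review_and_approvals    # 255 minutes end to end

print(f"step speedup: {drafting_before / drafting_after:.1f}x")  # 4.0x
print(f"end-to-end speedup: {before / after:.2f}x")              # 1.18x
```

A 4x improvement in one step shrinks to an 18% improvement overall. That's the gap between the demo and the income statement.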

This is why 95% of AI pilots fail to reach production. Not because the models don't work, but because companies aren't ready. The data is a mess (unstructured, siloed across legacy systems from the '90s). The verification requirements are strict. The integration costs are high.

And there's a deeper issue: AI is probabilistic. A chatbot that's 90% accurate sounds great until you realize that in finance or healthcare, a 10% error rate is catastrophic. So companies keep humans in the loop to check everything, which negates the whole point.

The technology works. The scaffolding around it doesn't exist yet.


The Signal in the Labor Market

While GDP numbers stay quiet, labor markets are screaming that something is happening.

Employment for workers aged 22-25 in AI-exposed occupations dropped 13-20% between 2022 and 2024. Not mass layoffs. Something subtler: companies just stopped hiring juniors.

The entry-level work that juniors used to do (basic coding, drafting documents, data entry) is now automated. Companies keep their senior staff and skip the apprenticeship layer.

Freelance writers and translators saw earnings drop 5% immediately after ChatGPT launched. Contract volumes down 2%. The market for commodity content is collapsing toward zero.

But high-end strategic work? Still valued, often more than before.

This is economic value creation, just not the kind we usually measure. It's deflationary. Doing the same work with fewer people. Producing the same output at lower cost.

That value goes somewhere (either to company profits, or to consumers through lower prices, or to shareholders through higher returns). It's value. It's just being created through elimination rather than addition.


The Infrastructure Spending Question

Microsoft, Amazon, Google, and Meta are collectively spending over $350 billion annually on AI infrastructure. Data centers, GPUs, energy systems, networking equipment.

This is real economic activity. Construction workers building facilities. Electricians wiring systems. HVAC specialists installing cooling. Semiconductor foundries running at capacity.

That's GDP. That's employment. That's value being created right now in the present tense.

But there's a catch.

Goldman Sachs estimates the AI industry needs to generate $600 billion in annual software revenue to justify this infrastructure spending. Current estimates put actual revenue in the tens of billions.

The question everyone's afraid to ask: what happens if the applications never catch up to the infrastructure?

We're building a highway system before we know if the cars will come.


The Productivity J-Curve

There's an economic concept that explains what's happening: the Productivity J-Curve.

When a transformative technology arrives, measured productivity initially drops or stagnates before it explodes upward. The shape looks like a J.

The mechanism is simple. Companies spend enormous amounts of money and time on adoption (buying hardware, training staff, cleaning data, redesigning processes). All those costs show up immediately on financial statements.

But the benefits lag. The new workflows aren't built yet. The workers aren't trained yet. The complementary technologies don't exist yet.

For a while, you're running two systems in parallel: the old way that still kind of works, and the new way that's expensive and not quite ready. Both cost money. Neither is fully productive.

This is the bottom of the J-Curve. This is where we are now.
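The mechanism can be sketched in a few lines. This toy model (all numbers are illustrative, not estimates) pays the adoption cost immediately while benefits only start after a lag, then compound:

```python
def net_value(year: int,
              adoption_cost: float = 10.0,
              benefit_ramp: float = 4.0,
              lag_years: int = 3) -> float:
    """Toy J-curve: costs land up front, benefits start after a lag
    and then grow each year. Illustrative numbers only."""
    benefit = max(0, year - lag_years) * benefit_ramp
    return benefit - adoption_cost

curve = [net_value(y) for y in range(8)]
print(curve)  # [-10.0, -10.0, -10.0, -10.0, -6.0, -2.0, 2.0, 6.0]
```

Years of measured losses, then a crossover, then accelerating gains: the J shape. An observer standing at year two sees only the losses.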

The upward swing comes when the intangible capital finally pays off. When the new workflows are in place. When the workers are trained. When the organization is actually structured around the technology instead of treating it as an add-on.

That transition doesn't happen overnight. Historical precedent suggests 5-10 years minimum, often much longer.


What History Tells Us to Expect

The pattern is consistent across every general-purpose technology:

Steam power (1760s invention to 1800s value creation): around 50 years
Electricity (1880s invention to 1920s productivity boom): around 40 years
Computers (1960s adoption to 1990s gains): around 30 years

The lags are compressing, but they're still measured in decades, not quarters.

And there's another pattern: the initial economic effects are often negative for workers, not positive.

During the early industrial revolution, workers' living standards declined. Machines displaced skilled craftsmen. The "displacement effect" dominated over the "reinstatement effect"—the creation of new jobs.

It took generations for the balance to flip, for wages to rise, for the average person's life to actually improve.

We might be in a similar period now. The technology is eliminating entry-level positions faster than it's creating new roles. The gains are accruing to companies and investors, not workers.

That could change. It historically does. But the transition period is painful.


So Has AI Created Value?

Yes. Billions of dollars worth. Measurable, documented, real.

Is it enough to justify the valuations and the spending? Not yet.

The value exists, but it's concentrated. In specific tasks. In specific companies. In specific use cases where the conditions happen to be right.

The rest of the economy is still in the messy middle—spending money on adoption, running pilots that fail, trying to figure out how to make this technology fit into the reality of their business.

We're not in a productivity boom. We're in the investment phase that precedes the boom.

Think of it this way: in 1905, if you asked "has electricity created economic value?", the answer would be technically yes, but trivially compared to what was coming. The factories that had adopted it were seeing gains. But most of the economy hadn't reorganized yet. The real transformation was still 15 years away.

That's where we are with AI. The technology works. The early adopters are seeing real gains. But the economy-wide transformation—the moment when productivity statistics actually start moving—that's still ahead of us.

The question isn't whether AI will create value. It already is.

The question is whether we have the patience to wait for the organizational and social changes that allow that value to compound into something economy-transforming.

Based on history, we should. Based on market behavior and quarterly earnings pressures, we probably won't.

And that tension (between the inevitable long-term transformation and the impatient short-term capital markets) is where all the interesting chaos lives right now.
