Computation does real work, generating revenue and profit


Summary: AI skepticism is everywhere, and some of it is understandable, but we think most of it misses the point. AI tools have become truly productive, and that is creating massive growth in usage. AI is consuming computing power at a scale we've never seen before, and that's not a temporary surge. It's a one-way door. This demand is creating a sustained, multi-year tailwind for the companies building and supplying the infrastructure that makes it all run, and those companies are in your portfolio.

(Jargon Alert: When we use terms like “computational” or “computational resources,” we simply mean the computer’s raw working power—its ability to process information and get things done.)

Last year, in No Longer Idling, we wrote about a significant shift underway in how we use computers. In that letter, I shared a screenshot of my Mac's resource utilization while running several applications simultaneously. Despite all that activity, the computer's computational resources were mostly sitting idle.

Compare that to asking a large language model (like ChatGPT) a single question, and the computational intensity involved in just that one task. As the computer thinks through its answer and generates a response, it nearly maxes out both its short-term memory and its computational resources.

By this point, most people have tried ChatGPT, Claude, or Google Gemini. And while we've gotten some interesting answers, and maybe even had a full conversation, there has been reasonable skepticism about the value of chatting with an AI model. Is it really worthwhile to consume all this computing power and energy just to chat with a machine? Are these chats really productive?

Then late last year, Anthropic released Claude Code, an artificial intelligence tool that autonomously gets work done—in particular, writing code for developing software. Unlike the chat interface, Claude Code operates directly on files and folders. It can read code across multiple files, make changes, create new files, run tests, and work on its own for several minutes at a time. AI tools like Claude Code that operate with greater autonomy and productivity represent a fundamentally different category of capability, one increasingly described as "Agentic AI."

Here’s Groq CEO Jonathan Ross with a succinct definition:

“Let’s try and explain agentic AI really simply. Who here finds AI useful in getting tasks done? [all the hands in the room go up]. Okay, so would you be surprised to find out that AI finds AI useful in getting tasks done? When AI invokes AI to solve the task, that’s agentic AI.”

Instead of doing just one task (chatting), Claude Code can do many tasks, often simultaneously. As we showed in the screenshot above, even a single AI task involves a lot of computation. Now, with AI agents doing many tasks, and longer tasks, computational intensity has taken another leap forward.

There simply isn’t enough computing hardware capacity to meet the growth in demand. Two weeks ago, Arm Holdings announced a new datacenter CPU to help make a dent in the exponential growth in demand for computational hardware. Here’s Arm CEO Rene Haas on why the company decided to create a new chip:

“And what has changed in the last number of months has been this explosion of agents… Now, why is this important? Why am I talking about this? Because as we move to agentic query, the number of tokens [the units of text an AI model processes] per human goes up by 15x, if not greater…they don’t sleep. They’re at it 24/7.”

As AI agents take on more computing work, computers won’t just stop idling—they will run 24/7, creating a backdrop of continuous computation consumption. Computer technology has walked through a one-way door toward greater computational intensity, and we will not be going back. That means massive demand for computational resources, demand that will outstrip supply for several years or longer. The result is a sustained, multi-year tailwind for the companies building and supplying that infrastructure.


Best regards,

Evan McGoff


Disclosure: Dock Street Asset Management, Inc. and/or our clients may own Nvidia (NVDA) and ARM Holdings (ARM). This article is not intended to be used as investment advice.

Dock Street Asset Management, Inc. is an investment adviser registered with the U.S. Securities and Exchange Commission. You should not assume that any discussion or information contained in this letter serves as the receipt of, or as a substitute for, personalized investment advice from Dock Street Asset Management, Inc.

It is published solely for informational purposes and is not to be construed as a solicitation nor does it constitute advice, investment or otherwise.

To the extent that a reader has questions regarding the applicability of any specific issue discussed above to their individual situation, they are encouraged to consult with the professional advisor of their choosing.

A copy of our Form ADV Part II regarding our advisory services and fees is available upon request.

Our comments are an expression of opinion. While we believe our statements to be true, they always depend on the reliability of our own credible sources. Past performance is no guarantee of future returns.