Both mean more GPUs are needed, not fewer.
Thanksgiving is around the corner, but on the New York Stock Exchange, it’s Groundhog Day.

Earlier this year we highlighted how the stock market massively misinterpreted the Chinese AI reasoning model called DeepSeek (Nvidia was down 17% in one day). In short, this language model was quite smart despite being smaller and cheaper than other cutting-edge models. That led to the following incorrect conclusion: "if AI models are cheaper to make, then we won't need so much computing hardware."
Last week, Google released Gemini 3 and took the crown as the best-performing AI model across a battery of tests. Since Google offers this model quite cheaply and trained it on its own in-house hardware (TPUs), the market is once again concluding: "Google's model is so good, no one is going to need to train models on Nvidia GPUs."
The DeepSeek interpretation was wrong for at least two reasons. First, cheaper models mean more accessible models: they get used more and drive more total demand for computation. Second, DeepSeek was a reasoning model, and reasoning models use far more computation every time you interact with them. Several times more, in fact.
The Gemini 3 interpretation is wrong because Gemini proves that bigger models trained with more computing resources still make for better models. For anyone who isn't Google, Nvidia is essentially the only game in town for training a state-of-the-art model. And Nvidia's latest Blackwell systems are the ideal hardware for training this next generation of larger, smarter models.
All roads point to more computing being used for AI. And all signs point to more computing making models smarter. In short, larger models are smarter models, and smarter models can create more economic value.
Both DeepSeek and Gemini 3 support the notion that there simply isn't enough computing hardware to meet demand for everything people want to do with it.
The market reaction to DeepSeek was wrong. The market reaction to Gemini 3 is also largely wrong (except for the part where being at the cutting edge is, of course, good for Google).
Best regards,
Evan McGoff
Disclosure: Dock Street Asset Management, Inc. and/or our clients may own Nvidia (NVDA) and Google (GOOG). This article is not intended to be used as investment advice.
Dock Street Asset Management, Inc. is an investment adviser registered with the U.S. Securities and Exchange Commission. You should not assume that any discussion or information contained in this letter serves as the receipt of, or as a substitute for, personalized investment advice from Dock Street Asset Management, Inc.
It is published solely for informational purposes and is not to be construed as a solicitation nor does it constitute advice, investment or otherwise.
To the extent that a reader has questions regarding the applicability of any specific issue discussed above to their individual situation, they are encouraged to consult with the professional advisor of their choosing.
A copy of our Form ADV Part II regarding our advisory services and fees is available upon request.
Our comments are an expression of opinion. While we believe our statements to be true, they always depend on the reliability of our own credible sources. Past performance is no guarantee of future returns.

