Stack the Deck

AI's gonna be a thing. That's a cold take. AI's hot. You *may* have noticed. Investors have already made plenty on that theme. The edgy call now is to go short the bubble.

Is it a bubble? Surely. OpenAI's said it'll spend over a trillion — with a "T" — dollars over the next handful of years. It makes only $20 billion or so in revenue, and it loses money on that. So, there are signs the suds might be stretched.

Will it all pop tomorrow? Probably not. Production capacity for some pieces of the puzzle, like memory, has already sold out through 2026. Popping bubbles are much more interesting in hindsight anyway. Hindsight isn't how investing works. We can only look forward.

Bubble or not, AI will be around. It's already changed a lot. It already creates value. Even if OpenAI loses money today, Nvidia, TSMC, CoreWeave and others in the value stack are making money. Take all the pieces in that stack together and AI produces value. That will grow too.

In that stack, AI is unbundled. Each player specializes in one or two pieces. Lamplighter's talked before about the two ways to make money: unbundling and bundling. If you're looking for a bundle, that's Google.

Hot hand

A handy thing about bundles is that you have more cards to play. There are a handful of cards in the AI deck where value could wind up. Here are some of them: chips, data centers, data, and models.

Ante up

When anyone talks about AI chips, they mean Nvidia. Everyone wants to use them. No one else makes as many AI chips. Next to Nvidia, the next biggest AI chip designer isn't a chip designer at all. It's Google. It uses its tensor chips ("TPUs") internally for its AI work. It deployed more than 2 million in 2023, making TPUs the second most used AI chips behind Nvidia's. Nvidia is flirting with a $5 trillion market cap. Figuring out AI chips has been a nice business.

Google's chips show up mixed in with the rest of its business. They're harder to see than Nvidia's kit. But they're competitive and a viable alternative to Nvidia. Midjourney uses TPUs for its images. Anthropic recently announced a deal worth tens of billions for one million TPUs to train its models. Google's the first and, so far, only chip maker to chip away at Nvidia's corner.

Pot committed

Then there's the thing chips go in: data centers. Amazon and Microsoft run the lion's share of these. Google's in third place with 13% or so of the market. Altogether the three handle 60% of the data center load. AWS and Azure favor handling their customers' workloads. Google runs more internal work, but its cloud business is its fastest-growing segment, and all three companies struggle to supply enough racks to the market.

Google's data centers run more of its own hardware than the others' do. Broadcom helps with the design and makes about a 50% margin on those chips. Everyone else has to pay the Nvidia tax. Nvidia takes 80-90% margins on its high-end chips. By using its own chips, Google carved a pathway to being the lowest-cost AI operator. That, though, depends on some other cards in the stack.
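The margin math is worth making concrete. A rough back-of-the-envelope sketch, assuming the quoted gross margins apply to the whole chip price (which glosses over packaging, volume discounts and the like):

```python
# Back-of-the-envelope: what an 80-90% vs. ~50% gross margin implies
# for the price a buyer pays per dollar of manufacturing cost.
# Margins are the figures quoted above; real pricing is more complicated.

def price_per_cost_dollar(gross_margin: float) -> float:
    """Selling price per $1 of cost, given gross margin = (price - cost) / price."""
    return 1 / (1 - gross_margin)

nvidia_route = price_per_cost_dollar(0.85)    # midpoint of the 80-90% range
broadcom_route = price_per_cost_dollar(0.50)  # Broadcom's roughly 50% margin

print(f"Nvidia route: ~${nvidia_route:.2f} paid per $1 of chip cost")
print(f"TPU route:    ~${broadcom_route:.2f} paid per $1 of chip cost")
print(f"Rough hardware cost ratio: {nvidia_route / broadcom_route:.1f}x")
```

On those assumptions, a dollar of silicon costs a Nvidia customer roughly three times what it costs Google going through Broadcom — the shape, if not the precise size, of the "Nvidia tax."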

The turn

Then there's data. So far, most frontier models (ChatGPT et al.) trained on "the internet." They trained on the whole thing. There's nothing left. No open sources of data remain to improve the models. There are plenty of "closed" sources though. Companies have loads of internal documents they could use to train models for their specific needs (think law firms).

And, as it happens, plenty of people have Google accounts for emails, docs and other stuff not out in the open. Google's also operating autonomous vehicles IRL through Waymo, which collects massive amounts of data no one else can use. But the big — maybe the biggest — source of data for training is YouTube. Google owns that too. Google's AI already supports closed captioning, but you can see what's possible in its recent releases of Flow and Genie. Google's the only one of the frontier model makers — OpenAI, Anthropic, DeepSeek — to also have this depth of native and captive data it can use to train new and better models.

I call

Then there are the models themselves — the chat bots and AI agents the world uses to access AI. OpenAI's ChatGPT has captured users' and investors' attention. It's potentially preparing for an IPO in the coming year. Meanwhile, Anthropic performs better for coding (Google owns 14% of Anthropic). Google also runs its own frontier model — Gemini. It also has a more experimental "AI Mode" for search, plus Nano Banana, Flow, Genie and other AI tools it releases often.

Lamplighter uses many of the models — ChatGPT, Anthropic and Gemini. It's easy and fast to run the same task through multiple models to get more nuanced information and check responses. So far, the LLM layer looks least defensible. It's easy to switch to whatever's working. There's a lot of value in this layer, but most of it looks like it might go to "AI researchers," not necessarily to investors.

Full house

Google designs its own chips. This gives it a cost advantage. It runs its own AI data centers. It has loads of data no one else can use. It builds its own models (and has that stake in Anthropic). Is Google the perfect AI investment?

ChatGPT says that’s an “overstatement.” Lamplighter agrees. Nvidia is the world's biggest company. It makes A LOT of AI chips. OpenAI is the leader in the LLM category through sheer force of will. Microsoft and Amazon run many more data centers. There's lots of competition to worry about.

And there's the cost. Google raised $25 billion in debt this week. This after increasing the investment it expects to make in the last three months of the year by, oh, just $10 billion. The debt adds to the $22 billion already on its balance sheet. Google makes a lot. It can afford it.

Elsewhere, AI financing has veered into "creative" lanes. This sometimes happens in late-stage bubbles.

Google, though, is a leader at each layer in the stack. It's already operating at massive scale. It has a deep bench of AI talent. AND it has a handful of cash-generating businesses funding most of this. Those all continued to grow right through the disruption of the last few years. So maybe not a perfect AI investment, but modest expectations for the product of all this work put the shares in very attractive territory.

Disclaimer: None of this is investment advice. It's meant to illustrate ways LCM thinks about investing. Things that LCM decides are good investments for LCM and its clients are based on many criteria, not all of which are covered here. Some or all of LCM's ideas may not be suitable for other investors. LCM does not recommend investing either long or short any position mentioned. LCM may own positions in some of the companies mentioned. Some of its ideas will lose money — investing entails risk. See full disclaimer here.
