CNBC Television
How Silicon Valley's 'tokenmaxxing' is juicing AI demand
Channel: CNBC Television
Date: 2026-04-09
Duration: 42min
Views: 3,226
URL: https://www.youtube.com/watch?v=2OHMstRVqdE

Tokens are the basic unit of AI usage. Engineers across Silicon Valley are now competing to burn the most tokens, a trend called "tokenmaxxing." What does this mean for the $1 trillion+ infrastructure buildout? Ramp CEO Eric Glyman announces a new product that tracks AI token spend across the enterprise for the first time. Investor Dan Niles weighs in on whether the AI trade can recover when nobody can separate real demand from noise.
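The spend-management idea discussed in the conversation below — routing each task to the cheapest model that is still performant enough, instead of always using a frontier model — can be sketched in a few lines. This is an illustrative sketch only: the model names, per-token prices, and capability tiers are hypothetical, not Ramp's or any lab's actual figures.

```python
# Task-based model routing: pick the cheapest model whose capability tier
# covers the task, then compare the routed cost to always using the frontier
# model. All names, prices, and tiers below are hypothetical.

# (model name, price per 1M tokens in $, capability tier it can handle)
MODELS = [
    ("frontier-large", 15.00, 3),   # the "Ferrari"
    ("mid-tier",        3.00, 2),   # the "Prius"
    ("small-fast",      0.25, 1),   # the "bike"
]

def route(task_complexity: int) -> tuple[str, float]:
    """Return (name, price) of the cheapest model whose tier covers the task."""
    candidates = [m for m in MODELS if m[2] >= task_complexity]
    name, price, _ = min(candidates, key=lambda m: m[1])
    return name, price

def monthly_savings(tasks: list[tuple[int, int]]) -> float:
    """tasks: list of (complexity, tokens). Routed cost vs. always-frontier."""
    frontier_price = MODELS[0][1]
    routed = sum(route(c)[1] * t / 1e6 for c, t in tasks)
    naive = sum(frontier_price * t / 1e6 for _, t in tasks)
    return naive - routed

# e.g. mostly simple tasks (email edits) plus a few genuinely hard ones
workload = [(1, 2_000_000)] * 8 + [(3, 5_000_000)] * 2
print(f"saved: ${monthly_savings(workload):,.2f}")
```

The point of the sketch is the one made repeatedly in the segment: most of the savings come from the bulk of simple tasks, not from touching the hard ones.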

Hi everyone. It's Thursday, April 9th. I'm Deirdre Bosa with another CNBC live stream. So, there's this thing happening in tech right now called tokenmaxxing, and it tells you a lot about where we are in the AI cycle. Engineers are competing to consume the most AI, measured by tokens. It's almost like a sport at this point. Jensen Huang, CEO of Nvidia, said he'd be alarmed if a top engineer was not burning $250K a year in AI compute. Shopify told me they use it as a performance signal. And Meta employees reportedly blew through an estimated $900 million worth of tokens in a month. That's nearly a billion dollars. This all raises a pretty obvious question: how much of that actually did anything? It's like measuring a trader by how many trades they make instead of whether they're actually profitable. Volume, in this case, doesn't equate to value. And we have literally seen this before. Amazon used to grade call center reps on how fast they got off the phone, so reps started hanging up on customers. The metric went up, but service went off a cliff. So Bezos killed it immediately.

Something similar is happening here, except that the stakes are a trillion-plus dollars in infrastructure spending. So if you own any AI stocks, you might want to ask: what if a chunk of the buildout is inflated? Is all of that demand real? Now look at two of the biggest AI labs. OpenAI is making AI cheaper and easier to use, so more people consume it. It needs the usage numbers to justify the spending. Anthropic, meanwhile, is putting limits on how much people can use and making them pay for it, maybe because it wants to know whether the demand it's seeing is real. One of these companies is right; the other, maybe not so much. And we want to get to the heart of that question: how do you separate real AI demand from the noise? That is exactly what I want to get into with Eric Glyman, the CEO of Ramp, who's launching a new product today to do exactly that. Welcome. >> Thanks so much for having me.
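The cheaper-tokens, more-consumption dynamic in that framing of OpenAI's strategy is essentially Jevons-paradox arithmetic, which the interview returns to. A minimal sketch with made-up numbers (not either lab's actual pricing or volumes):

```python
# Jevons paradox with hypothetical numbers: if the price per token falls 10x
# but cheaper tokens unlock 30x more usage, total spend still triples rather
# than shrinking.

price_before, tokens_before = 10.0, 1_000    # $ per 1M tokens, millions of tokens
price_after,  tokens_after  = 1.0, 30_000    # price cut 10x, usage up 30x

spend_before = price_before * tokens_before  # $10,000
spend_after  = price_after * tokens_after    # $30,000

print(spend_after / spend_before)            # 3.0: spend grows despite cheaper tokens
```

The open question the segment keeps circling is whether observed token growth reflects this kind of real demand elasticity or metric gaming.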

>> So nice to have you in person. >> Yeah, even better. >> First off, how's HumanX? How have you found the conference? >> It's super interesting. I mean, look, having a big conference centered on AI right in the heart of where AI is being developed means there are just a lot of great conversations. So it's been a lot of fun. >> Who do you think has the momentum right now? Like, who's most talked about? >> Oh my god. I mean, you can't walk a block in San Francisco without hearing about Anthropic and Claude. I think it's undeniable. This is a company that three years ago made their first dollar in revenue and last month passed over $30 billion a year in revenue. It's got to be them, right? >> Last year, were you in Vegas for HumanX? >> I didn't go last year. >> You weren't there, but you come here often from New York. Do you feel like that's changed? Like, a year ago it was OpenAI. >> Absolutely. Look, I think even a year or two ago people would joke that Claude sounds like a mid-century modern name for some interior design thing, to now, you know, I feel like Claude has become many people's best friend, at least in the city. So it's a complete shift. >> It's good to hear that from you, someone who travels between New York and San Francisco, because I feel like being here, all I hear about now is Claude. I think that's getting through to Wall Street as well, certainly. I think we had someone who vibe-coded their own app on TV this morning, which Jasmine and I have had a little experience with, too. It's a lot of fun. >> It's cool.

>> Okay, so yes, the talk is all about AI all the time. We have known that, but something has changed a little bit. We started talking about tokenmaxxing, and I love the way you guys put this, your token tracking. Explain what that means and the product that you have. >> Perfect. So first of all, there is an incredible amount of transformation, clearly, in what AI can do for people, and there is a lot of excitement and fervor to get people using these tools, which is great. I think it leads to faster revenue growth for companies that use this well. We can see it very clearly in the data. And I think in general this is a good thing. I think more people should experiment and dabble with this. But our customers are CFOs and finance leaders, and their job is to budget for where the business is going to go

and make sure they don't miss. And if you look across Ramp's data, token and AI spend has grown by 13 times over the past year, about 50% a quarter. You look at this, and what is very clear is that no one knows how to budget for this. On top of that, to use an analogy, one could say, you know, I want to get groceries delivered, so I buy a Ferrari and have it go pick up my groceries. >> You can use the most advanced model on the planet to edit your email, but maybe you don't need to. >> That's a great analogy. >> You may just need a Prius to deliver those groceries. The Ferrari would be fine, but you can wait till next week. It'll be okay. >> Or a bike. It doesn't matter. >> Even a bike, you know. And that is a good lens for what's happening in these companies. There is an incredible amount of usage of models, where today even models that are out of date, six months old, are perfectly performant for

most of the tasks being done. And so you're seeing this intersection where CFOs are saying, I need help, we need to manage this better, how do we get on top of this? So what we launched today is token tracking, but really it's AI spend management. It's the ability for CFOs and finance teams to drill into and understand where their spend is going, what the nature of the task is, whether other models may be more performant, and how much could be saved by shifting volume based on the right level of task. I think Ramp is a leader in this, but I also think this is going to be a mega theme for the next year. >> How do you figure that out on your end? How do you figure out if someone's using Opus for, you know, an email, and they should be using Sonnet? How do you break that down and even know how that's being split up? >> Yeah. So first of all, what makes Ramp's data unique is that both the card and bill payment spend, as well as the invoice or the receipt for the spend, comes through every month for accountants that close their books. They add these records, and we're able to get to an itemized level of detail.

>> Itemized even to the level of questions within a chatbot? >> So it's itemized on the spend side of it, to the model and the nature of the task being used, and you can connect it even to the entity. You can also add on software that functionally tracks and kind of drills in to understand the nature of the task. And so we're able to show, okay, based off of this, here's the department where it's being used, here's the nature of the service, and actually this other model is likely to have been performant. On day one, that's the level that's being built out, dramatically, each and every week. >> So how do you think that is going to change the way engineers are using tokens? I mean, you've heard these stories. I don't know if you know anything about it, but like the leaderboards, right? Internal, companywide leaderboards. And I feel like it would be the same for engineers, for people in a company. They say, "I've used so many tokens. Look at how much I'm adopting AI." CFOs could do the same thing as well, right? What is the incentive? Obviously, costs, to lower

it. Are you starting to see that? Do you think we'll see that? >> It's real, right? There was a lot of discussion even on X a few days ago about incentives at Meta. And I think people are pointing out this idea of Goodhart's law, which says a measure of performance that becomes a goal ceases to be a good measure of performance. You know, once you incentivize people to get off the call, people hang up quickly. Once you incentivize using as many tokens as possible, you'll see engineers go and count all the prime numbers or all the digits of pi and burn tokens, and it goes on and on. >> Another way I've been putting that is, if you focus on a metric, the metric stops being useful. And that Bezos example actually came from Dan Niles, our next guest. But that's exactly what you're saying, right? Once you start using it to measure, people are going to take advantage of it. >> That's it. And look, I think that great management teams in any generation, 100 years ago to even now, focus on exactly what you highlighted. What's the impact? What's the revenue growth? Are we shipping products faster? Are we

delighting customers with less effort and more frequently? And we find there is a clear connection. I think this is part of why so many management teams are mixing this up: often the folks who are adopting this incredible new tool fastest and using the most tokens actually are the highest performers. But you can't say, I want you to go use the tokens first. That's setting it up incorrectly, in our view. >> I guess that leads to the question of wastage then, and this is what Jasmine and I are trying to figure out. If you have these internal leaderboards, if you're focused on this metric, how much of that is wastage? But you're saying you're seeing in the data that these are actually top performers. And the way Ali Ghodsi put it to me a few days ago here at the conference, he had this analogy that some amount of wastage is necessary when you have a new technology, like when people were typing it up on a typewriter. >> I mean, I think to produce any great work requires drafts. You know, you look at how designers craft

new products: they make 10 different versions, throw them away, and it's the 11th version that works. You have rough drafts of everything. It's like the classic Mark Twainism, I don't know if he actually said it, but: if I had more time, I would have written you a shorter letter, right? It takes time to distill and really simplify things, and I think going through the motions really does work. I think it's really effective. But as time goes on, you learn efficient motion. You get better with your moves. And what I would say is, we're clearly going through this, and part of why we're focused on this is not that we think the spend is wasteful. It's actually the opposite. We think that AI spend is going to grow dramatically, but to get the benefit of it and to make it affordable, it needs to generate a return on your investment. And so we're seeking to provide the tools for companies to better deliver and connect token spend to ROI. >> So maybe wastage isn't the right word; maybe it's efficiency, right? And with the data you have, and I know you're releasing this product, do you think

that companies become more efficient? Do you think what you're seeing, demand picture-wise, is accurate right now? >> The most compelling data that we've seen for this has to do with data we put out about two weeks ago around the emergence of what we're calling the K-shaped economy. So across over 50,000 businesses, we separated out the bottom quartile of spenders on AI, people who are spending almost nothing, maybe you'd find some in there who have never used one of these models, and the top quartile, people who are spending the highest percentage of their revenue on AI-related tools. This isn't just tech companies. This could be construction firms, roofers, people installing windows. And what we found shocked us. In the bottom quartile, over the past three years their revenue grew by about 12%, so call it 2 to 5% per year, essentially flat. The top quartile more than doubled. >> Wow. >> And the rate of growth has grown each year successively. And so what I would say is, it is real. The exciting part is that companies that adopt this

well are able to grow faster and, frankly, hire more and invest more, and it becomes kind of the fuel to go and do this. But both can be true: you can be using this, but you could be getting far more performance out of it. And so in general, even though I'm based on the East Coast, I'm with a lot of these West Coasters. I think that spend on AI is great. People should spend the time. But it's also true there is a better way to track the token spend and get more out of it, and we're excited to make it easier for people. >> So it's almost like, keep spending, but what you're introducing is a way to have some discipline. >> That's right. >> And that's really important, for Wall Street, for CFOs, for your customers. It's interesting when you look at two of the top AI labs, OpenAI and Anthropic. They seem to have different approaches to this, right? OpenAI is just building up as much compute capacity as possible. They're preparing for a world in which demand continues in a straight line up and to the right. Anthropic feels a little more disciplined, even in some of the things they're doing numbers-wise, right? In

terms of cutting off third-party access in some cases, charging, setting limits. Who's right in that case? It feels like Anthropic is more what you're talking about, building a lot but being a little more disciplined, whereas OpenAI is just throwing everything at it, and Meta too, maybe. >> Look, our whole mission as a company is to help everybody get more from every dollar and hour. I think performance and efficiency matter. I think setting aggressive goals, but understanding what the return on investment will be, and under-promising and over-delivering, is often the more prudent long-term strategy. The businesses that have made it to today are the ones that didn't just do well in one year but survived each and every year. And I think that Krishna Rao, the CFO of Anthropic, is a very proven finance leader. They have far more data than I do. I think that approach certainly resonates, I would say, from

an East Coast and Wall Street mentality, right? Less so from a West Coast mentality. >> We will see. >> Yeah. We'll see who's right. I would say my natural inclination is closer to that, but, you know, it's an interesting world. We'll see who's right. >> A question from the audience, because we're doing a live stream, so this is great. Carson Allen, hello. Thank you for joining us. She's a regular viewer. If you're a company, wouldn't you expect to use a lot more tokens at first, until you get a grip, and then rapidly trim down the amount of tokens used? If so, does this make AI revenues inflated if you assume optimization? Great question. >> It is a great question. And actually, I think it's the right mental model for how learning takes place. When you look at how learning actually occurs in the brain, there's all kinds of activity. Like, the first time you play tennis, your stroke is all over the place. You're missing things. But once you see a tennis player who's played 20, 30 games, you know, it's a very efficient neural

pathway firing. They can put the ball right where they want to. Motion gets more efficient. And I think learning anything is kind of like that. You should take the time to play 20 games to get good; you should use all these tools. So I think that's right. But there's also this effect in business where, once you start using these tools and tokens more, it allows these labs to provide the service at much lower cost and get more efficient, and suddenly you see the parallel: the cost per token is dropping dramatically. And as the cost per token drops dramatically, there is a clear pattern: people tend to spend a lot more. So what I'd say is, there's absolutely waste. But if the cost curves keep dropping and people can also spend more efficiently, I think there is a strong economic case that you might see a lot more spend growth to come. The Jevons paradox, essentially. >> Jevons paradox, that is the term, right. That's really interesting. So

when you say, though, when companies start using this tool and they can see, okay, our engineers are wasting tokens using the best frontier model when they could be using something more efficient. Chinese models included, actually. Have you seen some of this OpenRouter data? It's kind of amazing, right? >> Yes. >> I mean, the frontier models have gone from, I think, over 20% of the share of tokens used down to 4%. Did you see that? It's incredible. >> Yeah. >> What does that tell you about adoption and what models are going to be used? >> I think that is a harbinger of what's to come. It's very interesting seeing this development coming out of China. It is, I think, a place that classically has been focused on efficiency in the production of things, and so it's a culture that's very attuned to this. But when spend becomes so great, when it goes from experimental to suddenly $30-plus billion, hundreds of billions of dollars soon on inference, incentives drive outcomes, and I think

that companies will take a similar approach. And there's an open question of whether it will be open-source models that are self-hosted, or whether some of the hyperscalers, a Microsoft or an Amazon or others, will actually make it easier to separate out the harness and the model. >> And choose the model for you, right. >> That's exactly right. And then leverage something like Ramp token tracking to complete both ends and programmatically say what you need done; in the background it will get routed to the right source. I think that is going to be a mega theme to come. >> Then what's the business model for OpenAI and Anthropic, strictly AI labs whose moat is essentially creating frontier models? >> You know, it's a super interesting thing. One, I think there's a long history of companies that are on the bleeding edge, and as long as they can stay there, they do fine. >> There will be an audience for them even if it's shrinking. >> Even if it's shrinking. And look, I think the bull case is, you think about some of the leaders today. The Apple iPhone 1, you could barely use it compared to today, but at the time it was state-of-the-art. It led them to the 2, 3, 4, and over

time they've been able to create products that service all sides of the market, both the very frontier and also the ordinary cases, and I'm sure they have folks working really hard on that. What's fascinating, and I think in some sense scary too, is now there are a lot of eyes on this from a lot of big companies. So it's going to be very fun seeing how it evolves. >> Yes. And to see how they're pricing it and how much people are willing to pay for it. Last question, from my producer and partner Jasmine. This is what I was wondering too. She puts it this way: is it that humans need to use models more efficiently, or that models need to consume tokens more efficiently? I think you're getting at that with the Amazon example and the Ramp tools to figure that out. But is it ultimately this multi-model world? And are OpenAI and Anthropic good at that, or do you need a Perplexity or an Amazon that can choose for you? >> This is, I think, one of the most interesting questions you can ask. One, I do believe in the multi-model version, right? These are models that can reason, and if you make part of the optimization function

to deliver maximum performance at minimum tokens with the most efficient schemas, I think these models are clearly smart enough to go and do this. The interesting question comes down to the classic business question of the innovator's dilemma. >> Right. >> If your business model is predicated on extracting the maximum amount of spend, do you want to do it? And are you willing to make the sacrifice? Maybe they will, or maybe they won't. I think it sets the stage, and an opening, for a great third party to come in and keep things in check. And so we're excited to pursue it. >> Are you saying that OpenAI has an innovator's dilemma? >> They just might. >> Wow. We have gone full circle, from Google having the innovator's dilemma to now OpenAI, as the incumbent, having to figure that out. I know this space moves so fast, but it's incredible that that is now really a legitimate question. Eric, it is always so wonderful to talk to you. I'm so glad we got to do this in person. Thank you. >> Thanks a lot. I really appreciate it. >> Okay, we'll let you go back off to the conference, and we're going to line up

our next guest. Thanks again, Eric. >> Sounds great. Thanks again. >> Come back. We'll do it again in studio. >> All right, cool. >> So, Eric is building tools for companies to actually see what they're spending on AI. But the biggest question for investors, and we touched on this, is what happens if a lot of that is waste and gets pulled back eventually. Is the demand signal that we see right now trustworthy? Founder and portfolio manager Dan Niles of Niles Investment Management is here, and he's been saying that the AI trade needs more discernment. Dan, thank you so much for being with us. >> My pleasure, Deirdre. >> Okay, I'm not sure if you were listening to this conversation with Eric, but it's funny. He said the thing that I've been quoting you on all day, and that is: when you focus on the metric, the metric stops being useful. Are we there right now with token tracking? >> Yeah, we are. But the way I think about it is this. If that gets everybody in your organization to start

using AI, then that's a good thing. And you're going to get some wastage; you're going to get people gaming the system. You know, it always happens. You and I talked last night about the Jeff Bezos example with Amazon and call centers, and I think you mentioned it in your intro. But I think ultimately what every company is trying to do is say, I need to get my employees using this stuff. And if token tracking is a way to at least get the ones that don't want to do it doing it, then I think that's a positive. >> So then how much of the demand signal right now do you actually trust? I mentioned this earlier, but the AI trade in public markets has been going sideways for a number of months now. What is that telling us? Is it telling us that the street still wants to know how much is actually useful, or that they're skeptical of it? >> Well, I think they're two separate things. If you're talking about

the stocks, that's a question of differentiation. Really, since the launch of ChatGPT at the end of 2022, the assumption was, well, all these companies are expending exorbitant amounts, and every press release was viewed as positive for the entire industry. Then you got to about October of last year, and you saw the software index peak out. You saw a lot of questions around, yeah, OpenAI is signing all these deals, but at the time they had about $20 billion of annualized run-rate revenues while the capital commitments were $1.4 trillion. And I've been saying since about the middle of last year that I don't think they'll ever be able to get close to that, because they don't have the cash flow. And so you've had that starting to affect the stock prices, and then people starting to question, well, which companies are getting return on investment from that

and which companies have the cash flow to support that. That's why you've seen names like Oracle get absolutely crushed, because over half their backlog is related to OpenAI, and OpenAI is expected to burn $220 billion in cash through 2029 until they get to profitability, hopefully, in 2030. Meanwhile, you have other companies like Google that are still massively free cash flow positive, less than they were before, obviously, but they can fund all their ambitions with the cash they are generating organically. And so you've seen that split. And with Microsoft owning 27% of OpenAI, there's a reason that stock is down 20% year-to-date versus a Google that's actually up year-to-date. So it's a combination of all of those things coming together, I think. >> Right. Okay, but there are maybe two ways to view free cash flow, right? You point to Oracle and some of the others, but then you look at the example of, let's take an Amazon and a Netflix

right, which were not free cash flow positive for so long, but they saw the demand picture in terms of what people were ordering or what people were watching, and they ended up being very profitable. I know Amazon is going to go cash flow negative, or is expected to. So how do you square that? How do we know which one OpenAI is? >> Well, the thing is, with every major revolutionary industrial technology over the last several hundred years, when you have an opportunity that big, by definition you're going to have a bubble, because if you end up being the one who comes out of it as the winner, you're going to accrue a lot of benefits to yourself. So there's going to be overinvestment. The problem is, much like we saw most recently with the internet bubble, you have to have the cash flow, or the profitability, or the business model to survive as you're going through this. And that's why you're seeing these software stocks just continuing to get killed. And you

you brought up Amazon. Don't forget, Amazon in 1999 had about $1.6 billion in revenues. By 2001, their revenues had almost doubled to $3.1 billion. But from the peak of their stock in late 1999 to the bottom, the stock went down 95%. So even if you had picked the right company, you still had to suffer through a 95% drawdown, which means you have to be up 20x on your money from the bottom just to get back to even. So for me, it doesn't matter if you have a great model if you can't survive in between, and you had other companies like Webvan obviously go to zero. That's why I look at that cash flow number and go, by OpenAI's own admission they're going to burn $220 billion, and I don't have to worry about Google because they're cash flow positive. And so that's why I look at OpenAI as in a much trickier spot, and

I think they're just caught between Google on one end with consumer, and then you've got Anthropic really killing it at enterprise. I think OpenAI is in a squeeze between the two of them, and they're going to have huge issues because of that. And you're seeing it in actions, too, with things like OpenAI cutting prices to try to get customers, while, as you mentioned in your prior segment, Anthropic is actually raising prices and cutting people off, because they have so much demand that they're trying to keep people from using too much of it. And to me, that shows a company that's feeling the pressure between Anthropic and Google. >> Right. And narratives aside, that's what the numbers are showing you in their actions. We've talked about Amazon, but I know a lesson you like to bring up too, because you've been covering the space for so long, is Cisco, right? Yes, you could be an Amazon and go through a turbulent period but make it all back and

emerge stronger and bigger than ever, a mega cap, but you could also be Cisco and take 20-plus years to get back to that point. What do you think OpenAI is? And do you think it's possible for them to go public this year, given everything that you're saying? >> Well, let's forget about what I think. What came out publicly, well, publicly is maybe the wrong word, but there have been a lot of stories that the CFO of OpenAI is not sure whether they can get public, which is part of the reason why the CEO is not taking her to some of the meetings he's had recently. That's not a good sign. And Sarah Friar is a great CFO. She's been around Wall Street for a very long time. So to some extent I look at actions, and they speak louder than words. Not taking your CFO to meetings tells you something, especially when the CFO is thinking, maybe we're not ready.
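Niles's Amazon arithmetic from a moment ago, down 95% means up 20x just to break even, generalizes to any drawdown: the recovery multiple is 1 / (1 - d). A quick sketch:

```python
# Recovery multiple needed to get back to even after a fractional drawdown d.
# A 95% drawdown (d = 0.95) requires a 20x move off the bottom, per the
# Amazon 1999-2001 example.

def recovery_multiple(drawdown: float) -> float:
    if not 0 <= drawdown < 1:
        raise ValueError("drawdown must be in [0, 1)")
    return 1.0 / (1.0 - drawdown)

print(round(recovery_multiple(0.95), 2))  # 20.0
print(round(recovery_multiple(0.50), 2))  # 2.0
print(round(recovery_multiple(0.78), 2))  # the Nasdaq's 78% drop needed ~4.55x
```

The asymmetry is the point: losses compound against you, which is why Niles keeps returning to cash flow as the survival condition.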

>> Who ultimately wins that battle? Sam Altman wants to go public, and let's say other people within the organization, maybe Sarah Friar, want to hold off. What happens there? And can they let Anthropic go first? >> Yeah. Well, I mean, you saw Oracle get a new CFO, a surprise change, and they didn't reiterate guidance, which always concerns me when you have a surprise change of CFO. But here's what you know for sure: OpenAI is going to go to zero if they don't raise more money by the end of the year. So that means they either, A, have to get public, or, B, if they can't get public, they have to raise money. If it's the second scenario, you're talking about a very big problem, because if you can't get public, I doubt you're going to be raising at the valuation you want. And maybe all of this doesn't matter, because one of the things we haven't talked about yet is that demand right now for AI is exploding higher, because you've had this new thing called agentic AI really

show up in a big way this year. I mean, your viewers have probably heard about Claudebot, which came out at the end of November and, you know, went through a couple of iterations and finally got solidified into Open Claw at the end of January. And what you've seen for token demand, and I think you have a chart there that you can maybe flash up, but if you look at OpenRouter and you look at that chart, in the two months prior to the end of January, token growth was up about 20% or so. In the two months after Open Claw got formalized, token growth has grown about 130%. And so in that environment, where now you've got people able to actually use agents to do all sorts of interesting things, this is really what everybody's been waiting for. And so the demand is this curve function. >> Is the curve sustainable? I mean, you see it, I see it all over Twitter, and

I'm even, you know, playing around with my own OpenClaw and trying to figure out ways to make it more efficient. That kind of chart growth that you see, is that sustainable, or does it start to level out because people get better at using it and, you know, the models? >> It obviously has to level out at some point, but right now you're not even a year into it, right? So my thought is, by early next year, that starts to level out from an industry standpoint. That doesn't mean, like what we just talked about, that you're not already seeing situations where Google and Anthropic are really thriving, and you've got an OpenAI that's gone from, last year, right, even to the end of last year, you saw deal after deal after deal, and now they're going, well, we're going to have to cut back on these side projects and focus on our core. That's a big change in how they've been operating, really, since we first heard about ChatGPT at the end of 2022. So you can have both of those things happening simultaneously,

where, you know, in '01, '02, internet growth was still doubling every year, but it wasn't doubling every three months, which was the problem, and which is why you saw the Nasdaq go down 78% over two and a half years, and you saw a thousand internet companies go to zero, go bankrupt. And, you know, ultimately that was good, because the next generation of amazing companies was founded on the back of that cheap fiber in the ground, cheap internet access. So Amazon, Google, you know, the giants of today, Meta, Netflix, etc., they came out of that destruction, off of that overbuild. But it's the same thing that happened with canals, railroads, electricity, radio, you know, all the big industrial revolutions. You had the overbuild, and then you had a great boom on the other side of it. >> Okay, let me argue sort of what I

hear from people here in tech, right? And they say that this time is different than any other time. Jevons paradox, all of that. Talked a little bit about this with Eric before you. What if OpenAI is right and we do need all of this compute capacity, and Anthropic is wrong and they haven't built out for that scenario where demand is not inflated and it just keeps going up and up? >> Well, let me draw an analogy, right? You had five years, arguably. So Netscape Navigator, which was the first internet browser, came out at the end of 1994, and then you had a huge build in internet infrastructure capacity, and then you had that implode. And as I said earlier, I mean, do you think AI is not talked about enough, right? It's talked about plenty. Everybody knows that fortunes are going to be made and lost in this. Um, you're looking at how fast OpenAI is growing,

how fast Anthropic is growing, the ramps that we've never seen before, even during the internet era, in terms of how fast these companies are growing. There is zero doubt that there's overinvestment. There's just going to be. It's either that, or you don't think people are hyped up enough about AI. It's one of the two. And I think there's plenty of hype around. >> But, I mean, in public markets, it feels like the hype has died down. The AI trade seems to be suggesting what you're saying, that the market isn't totally bought into this demand curve. >> Well, again, it goes back to, I think you can have both things be true, right? Which is, do you think every company that's started to focus on AI is going to win? >> Of course not. >> Yeah. >> So now all we're talking about... >> Okay, so give us your pick, Dan. Well, my pick, I mean, you've kind of said Anthropic seems to be winning, but go ahead. Yeah, especially with the public ones.

>> Yeah. I mean, I think Anthropic is winning. Amazon's obviously hosting them, and I think they're going to do well. Amazon also has the advantage that they've got massive physical infrastructure, which will really benefit from robotics and AI. If you've got robots wandering around your factories moving things, that's going to help a lot, because they do have that e-commerce business, and Amazon's the biggest public cloud vendor out there, and they're hosting a lot of that Anthropic workload. So I think they're going to do well. I think ultimately Apple will do well, because they've got this huge installed base of a billion and a half iPhone users out there. And, funnily enough, because they've been so bad on AI, they sort of get to benefit, because all these other guys are killing each other by investing hundreds of billions of dollars. They get to just sit back and say, "You know what? We're going to license this from Google for a billion a year, and you guys can go spend all that money, and we've got the devices to get it out

to people like Deirdre Bosa and Dan Niles." And so I think at some point, and it seems like the product, you know, the foldable iPhone, might be getting pushed out, it might not be, but I think as you get into next year, you're going to see them start to benefit from that. Microsoft I would be wary of, just because, as I said, it comes down to how you think this is going to play out with OpenAI. Microsoft owns 27% of them. I've got some real questions around that. Oracle you could put in the same bucket, with over half their backlog related to OpenAI, and so you have to be careful with that. I think the one big thing with agents, which is different than the last three years, is you need orchestration. So, in other words, you're not doing the same thing over and over again with an agent. Like, let's say you're running something, Deirdre. It may need to go out to CNBC to get some data. It might go to the SEC website to get some financial stuff. It might go open up your Excel spreadsheet. It might go ahead and

populate that. And so it's doing a lot of different things. >> Yeah. >> And so microprocessors, which really got killed over the last three years because the value was accruing to GPUs, which do repetitive things over and over again really, really well. When you go to agents and you've got to do a whole bunch of different things, all of a sudden you go from, say, seven gigawatts of GPUs for every one gigawatt of CPUs to a ratio of maybe like four to one. And so all of a sudden, companies that have been given up for dead, like an Intel. Obviously you saw Elon Musk saying, hey, you know, looking forward to working with Intel. Um, today you saw Google in a deal with Intel. And I think you're going to see something that people thought was dead kind of come back to life, which you're starting to see. And so those are just some of the names. Obviously optical is another area. You need more memory, because it's got to remember, oh, Deirdre asked me to do this before, oh, this is in addition to

that task she had me do. And so there will be other parts of it. But that's kind of how I'm thinking about it. And I'm very wary of anybody who has excessive OpenAI-related exposure, and I'm much more positive on companies that have a lot of Google-related exposure. >> Right, you kind of brought us full circle. We started talking about efficiency and tokenmaxxing, and what you're saying, a similar argument, is that in the next phase of AI, different things are going to matter. Um, I like what you said about GPUs versus memory. Um, finally, Dan, last question. If we get three blockbuster IPOs this year, SpaceX, OpenAI, and Anthropic, which ones are you buying? >> Uh, the way I invest is I like a high margin of safety. I think Elon Musk, to be clear, is the Thomas Edison of our generation. We haven't seen anybody like him, arguably, since Leonardo da Vinci. Like, he's in health care with

Neuralink, you know, making blind people see. He's catching, you know, rockets with chopsticks, and he's got EVs. He does so many things well. But the problem is, I look at SpaceX and I go, at a hundred times sales, it's going to be impossible for me to think that I want to take any kind of view on that. Um, and it really comes down to valuation, right? Like, if you said, "Hey, Dan, you can buy my house for a billion dollars," I'd be like, "You're out of your mind." If you said, "Oh, you can buy my house for a million dollars," I'd be like, "Sure, I'm there." And so for me, I'm a value investor. >> You could have said that about Tesla. You could have said that about Nvidia. You could have said that about a number of companies in this sort of retail-investor era, too, that cares less about valuation and more about a generational company. >> Well, yeah, but you could have said that in 2000 as well, and that's worked out so well with Cisco, and we just got back to those levels recently, right?

So, you know, it's all well and good. >> Maybe my house is going to space, Dan. We don't know. >> Huh? >> Well, yeah, but at a hundred times sales. >> Maybe my house is going to space. So, yeah, I understand your point, and I know you're also in and out of these markets, right? You trade. So I know that with your positions, you might find a better time to get in. Um, what about, just quickly, on Anthropic and OpenAI? I'm guessing for OpenAI you're not buying into that IPO. So I guess the valuation is always a question, but assuming it goes for a high valuation, would you have more forgiveness for a higher valuation for Anthropic? >> Well, remember, I can hedge positions, so I can be short one company and long another. And so for me, if you're in a position where people are starting to differentiate more between the companies in a space, then I can be long one and short the other. And that's like the perfect thing for someone like me, where you go, okay, I can try to hedge my risk, because I don't want to lose

money, right? Like, I'm sure you're going to find people that say, "I owned Amazon all the way through, and look at how great it's done." And I can say, well, that's great, but how about the person who owned Webvan through that whole thing, or owned AOL, or owned Yahoo, or owned Motorola, Nokia, IBM, you know, run through the list. >> Did you just mention that one? >> Yeah, yeah. I mean, we have a slide on that too, right? I don't know if you have that, but it shows you don't know who the ultimate winner is going to be. Obviously, just by math, somebody would have picked, oh, this one's going to survive. But I'd rather buy Amazon when it's down a lot. I've owned Tesla in the past, by the way, after big sell-offs. And so for me, it's always about adjusting, you know, what is your risk versus reward? If you get it right, how much can you make? And the valuation plays into that, right? Because, as an example, for the last few weeks or so, I've been saying, hey, you know what, this whole agent thing,

it's going to be good for microprocessors. And just over the last few weeks, you've seen Intel go absolutely vertical. But it was at a really good valuation before all this started. People hated it. It's going to be put out of business by Nvidia, the fab strategy is not going to work. So for me, there's that, and then there's you saying your house is going to be in outer space, right? So that's what I want to avoid. >> You want numbers, Warren Buffett-style value investing, right? And to your point, Intel's on top. Eric just said that OpenAI has an innovator's dilemma. This space is just moving so quickly, and, Dan, I know that you're always, you know, taking the latest data, the latest news, and looking at this. So always grateful for your insights. I know we've got to get you out of here for the market close. Thank you so much for spending the time with us. >> I always love having conversations with you, Deirdre. So thank you for having me on. >> Thanks again, Dan. And thank you to everyone behind the live stream. Jasmine, Drew, Carrie, Tim behind

the camera. Um, Janice, and of course, Morgan in the control room. Thank you, Morgan. Thanks a lot for watching, and we'll see you again next week.