The neXt Curve reThink Podcast

Silicon Futures for April 2026 - The AI CPU craze, Qualcomm's custom AI, Google TPU 8 explained

Leonard Lee, Karl Freund, Jim McGregor Season 8 Episode 16




Silicon Futures is a neXt Curve reThink Podcast series on AI and semiconductor tech and the industry topics that matter.

This month shall be dubbed the month of specialization, as the tide of AI compute makes a hard turn toward inference and agentic AI. What is all this madness about CPUs? The Silicon Futures trio unpacks the mystery of the month.

In this episode, Leonard, Karl and Jim talk about some of the top headlines from April of 2026. 

➡️ What's up with the CPU mania in AI?
➡️ Why is everyone going to custom, specialized AI systems? 
➡️ The diverse velocities of AI change & digestion. 
➡️ The many roles of the CPU in an agentic AI system. 
➡️ Cloud demand driven by AI system complexity and speed. 
➡️ The AI narrative has graduated from the "chip". 
➡️ Is the AI boom sustainable? 
➡️ Clarifying the Google TPU 8 Enigma. 
➡️ How will AI system specialization and general purpose play out? 
➡️ The hyperscale red flag for NVIDIA. 
➡️ Where are the AI business model transformations? 
➡️ A quick thought on Terafab by Intel & Elon. 
➡️ Impressions on Qualcomm's custom AI silicon & systems. 

Hit Leonard, Karl, and Jim up on LinkedIn and take part in their industry and tech insights.

Check out Jim and his research at Tirias Research at www.tiriasresearch.com.
Check out Karl and his research at Cambrian AI Research LLC at www.cambrian-ai.com. Check out Karl's Substack at: https://substack.com/@karlfreund429026

Please subscribe to our podcast which will be featured on the neXt Curve YouTube Channel. Check out the audio version on BuzzSprout or find us on your favorite Podcast platform.

Also, subscribe to the neXt Curve research portal at www.next-curve.com and our Substack (https://substack.com/@nextcurve) for the tech and industry insights that matter.

NOTE: The transcript is AI-generated and will contain errors.

DISCLAIMER: This podcast is for informational purposes only.

neXt Curve

Next curve.

Leonard Lee

Hey everyone. Welcome to this episode of neXt Curve's reThink Podcast and our special series, Silicon Futures. I'm Leonard Lee, Executive Analyst at neXt Curve, and I'm joined by Karl Freund of Cambrian-AI Research. And we have, he who shoots all his bullets before we hit the record button, Jim McGregor of the famed, AI-augmented Tirias Research. How did I do? Pretty good.

Jim McGregor

Pretty good. Hey, I'm the Southwest cowboy. I'll admit that. I'm the cowboy here.

Leonard Lee

Yeah, he's quick on the trigger. You know, he's pretty trigger-happy. Oh my gosh. You folks don't even know. They don't know what happens before we hit record.

Jim McGregor

As I was called one time by a reporter, I was called the Outlaw Analyst.

Leonard Lee

I buy into that 1000%. So in this episode, we're gonna be talking about the hot headlines of the month of April, and of course, this is 2026 if you don't know the year. And before we get started, please remember to like, share, react, and comment on this episode. Also subscribe here on YouTube and on Buzzsprout to listen to us on your favorite podcast platform. Opinions and statements by my wonderful guests are their own and don't reflect mine or those of neXt Curve. And we're doing this for informational purposes only, to provide an open forum and debate. And we shall have a debate in this episode on all things AI, quantum, and semiconductor industry stuff, chips, and all these things that we love. So we're gonna get started, and we're gonna start off with Jim's favorite target of the month: the CPU and AI. So why don't you explain to us what is going on? Why is everyone going bonkers about the CPU in the world of AI computing?

Jim McGregor

There is the theory that, as CPU performance increases and AI models get more efficient, you're gonna be doing more of the AI processing on the CPU. Now, I will agree that I do think CPUs will evolve over time. Matter of fact, we're starting to see new instruction sets out of the AMD and Intel partnerships that will lead to that. Obviously, you've got other vendors like IBM that have integrated an NPU directly into their CPU. You also have that, especially on the mobile side, with companies like MediaTek and Qualcomm. So I think that over time you will see more AI workloads running on the CPU, but I don't think there's a major shift. And I think that's the message we've gotten from a lot of people: oh, well, you can run everything on the CPU. Well, if it's not really designed for AI, it's still basically gonna be that orchestration engine. Matter of fact, that's what we're seeing, especially with a lot of the Arm processors, with Grace and Vera from Nvidia, as well as Arm's own designs. They're designed to be that orchestration processor. Could they run AI workloads? Yes. Are they designed to run AI workloads? No. IBM probably has the best balance, where they have that AI accelerator on the CPU and they have separate discrete accelerators, so they can divide the workloads. But in the data center, I don't see a lot of the AI workloads moving to the CPU. I do think that, especially as you have more inference processing, you may need more of that orchestration power tied to the AI accelerator. But the AI accelerator is still gonna be more efficient, no matter what it is. As you move closer to the edge, though, where you're not necessarily gonna have that AI accelerator, that's where it makes sense to run the AI workload on your CPU.

Karl Freund

So I would take issue just a little bit, because I don't think people are saying you're gonna run the AI on the CPU. What they're saying is that agentic AI requires a heck of a lot more orchestration and planning and scheduling, and all the work that goes on around the AI models. That's what I'm hearing people say: the agentic AI shift is adding more workload to the CPU. And in fact, some folks have claimed that instead of having, let's say, one or two CPUs to eight GPUs, it's going all the way down to a one-to-one ratio, where you'll need a CPU for every GPU, or whatever your accelerator is. And so that's what's driving the increase in CPU, which will benefit Intel clearly, and I think Nvidia with the Vera Rubin platform, where you've got a really good CPU that's doing all this upfront, in-between, and ongoing work. Remember, agentic AI is running 24/7 in the background, and what's running is a combination of CPUs and GPUs.

Jim McGregor

See, I agree with that statement, although the one-to-one I don't agree with. I think maybe one-to-four makes more sense.

Karl Freund

One or maybe

Jim McGregor

One-to-four, maybe one-to-two, but one-to-four makes a lot more sense and is more efficient. But I think that some of the market and some of the industry, especially Wall Street, is taking that message way too far.

Karl Freund

Well, it'll be interesting to see what Qualcomm announces, 'cause they haven't really told us much. Not yet. At Qualcomm's quarterly earnings report, Cristiano Amon said that he's got a hyperscale customer for whom they're building a custom

Jim McGregor

yes.

Karl Freund

rack solution. So I think this is a sign of the further specialization of AI for specific workloads. So if you could imagine somebody like a Meta or Amazon, or a neocloud hyperscaler, having a competitive advantage because they, and only they, have access to a custom processor that solves some particular problem. Maybe it's recommendation engines, maybe it's video. I don't know.

Speaker 4

No,

Karl Freund

but that can give you a competitive advantage to drive increased margins, increased revenue, because you have that. And I think we're gonna see a lot more of that kind of specialization. Yeah. And I can't wait to see what they're doing.

Jim McGregor

I think that's the message for the month, because we've seen a lot of announcements. We saw Qualcomm announce that they're doing custom CPUs and AI accelerators; we don't know what the announcement is that they're gonna make around their financial analyst conference coming up in June. We've seen Google, who, I don't know if I want to be their partner right now. First they announced two new TPUs for training and inference. Then they enhanced their partnership with Nvidia. Then they announced that they're gonna sell the TPU to select customers. We've seen Amazon, slash AWS. We've seen Microsoft. We've seen everybody enhance their story around their silicon, especially for inference, but also for training. Even yesterday, Tenstorrent had an event talking about their Blackhole processor and the Galaxy clusters that they're gonna be able to provide, that are completely air-cooled, custom AI, or I should say specialized AI, solutions. So it's still AI, AI, AI, and it's really driving the silicon message right now.

Karl Freund

Even to the point of some companies providing silicon for a specific model, right? Mm-hmm. Taalas, for example. They've released a chip that only does one thing, and it's a small model. Was it 7

Jim McGregor

billion.

Karl Freund

7 billion, yeah. So it's only for small models right now, but their vision is that they're gonna have a custom piece of silicon that will run a model just for you, and that's all it'll do. And you're gonna have to upgrade that processor every time the model changes; that's all baked into their financials. It's still massively cheaper and more productive. Question: will people adopt such a rapidly changing infrastructure?

Leonard Lee

Well,

Karl Freund

the idea of specialization, I think, is the key. Yeah, the key to the month of April.

Leonard Lee

And then also longevity. A lot of developers can't keep up, and they don't want to keep up. They want a baseline, and they want to be able to live with that baseline for a really long time.

Jim McGregor

That's

Leonard Lee

not gonna happen as you get closer to the edge.

Karl Freund

I disagree.

Leonard Lee

Yeah. Oh no, I hear you. I agree with you about the developers.

Jim McGregor

I agree with you, but I don't think it's gonna happen.

Leonard Lee

Oh no, no, it won't happen. But I think that demand is going to change, the demand profile for technologies and products that support longevity, versus things that are on a very, very fast cycle. And so I think with inference we're gonna see a different cycle, especially as you get into enterprise, than what we see in model training, what all these AI labs are doing. That's a different game, a different velocity, than what enterprises are gonna be comfortable with, and more importantly, what infrastructure players and service providers are gonna be comfortable with. They can't keep up. They can barely make the shift toward cloud native, and now you're going to introduce a turbocharger on a whole field of technology that they can't absorb. I think at some point there's gonna be a digestion issue, and I think there's probably already a lot of that factored into some of the slower adoption we're seeing in infrastructure, these autonomous infrastructure narratives and expectations. It is not going as fast as people think. If you look at telco, it is not getting absorbed as quickly. We're still at level three autonomous X, Y, Z versus level four. We've been talking about level four forever. It's still a challenge, and it's usually still a very narrow enablement. But I want to go back to Google really quick, and your comment about the CPU for agentic, because I think that is the big theme shift that's happened in the last six months. I agree with both of you on the different layers that you've described in terms of the role of the CPU, but there's another role that people don't talk about, which is: what actually hosts an agent, and what does the entire agentic system look like? And the reality is, a lot of it has nothing to do with AI. A lot of it has to do with traditional compute functions, like hosting the container for the agent. Right?
And basically, what we heard coming out of Google was that they partnered with Arm to be able to support sandboxing, right? Secure, what I call zero-trust execution containers, in a very similar fashion to what we see with Nemo Claw. But these things don't just appear out of nowhere. You need compute, and that's gonna run on a CPU. So as we see these claims of billions of agents being deployed out there and running constantly, well, guess what? You're gonna need CPUs to instantiate, support, and provide a runtime for those agents.

Jim McGregor

In my mind, that's all part of the orchestration. Yeah.

Leonard Lee

The orchestration is more of a control-plane function, in my view. And then there's the physical stuff, let's call it the fabric, right? The hypervisor that you still need to implement, whether it's serverless or it's container-based. That's a traditional cloud computing requirement. So when we start to look at agentic, it's a blend of, quote unquote, accelerated computing and traditional computing. And a lot of the tooling is going to be orchestrated and managed by functions that are likely gonna run on a CPU. And then most of the tools, especially the stuff that hits traditional data center backends, most of that is running on CPU. So I think that's where we can expect a broader demand for CPUs, and I don't think that's what Wall Street's reacting to, because a lot of the folks that I talk to don't recognize this. They're still focused on the AI computing element; they haven't extended their thinking out toward this holistic view of what the agentic system looks like.

Jim McGregor

I think we need to redefine this, because, yes and no. I mean, if you look at the CPU as the entire SoC, then yeah, it's gonna continue to gobble up more blocks. Like in the data center, you mentioned everything has to run on the CPU. That's actually not true, because we're seeing the emergence of the DPU to offload a lot of that overhead workload, for security, for network management, for stuff like that. And quite honestly, could that DPU eventually be a hard-coded function within a CPU, or within a processor or an SoC, as one of those functional blocks? It could be, eventually. So I think that we still end up going toward more customization to do each specific workload and each specific task.

Leonard Lee

Oh, yeah, yeah,

Jim McGregor

And to your point about zero trust, in some cases that's actually a separate functional block. If you look at some of the mission-critical applications for automotive or aero or stuff like that, they actually have separate blocks that are zero-trust blocks that make sure there's no bleed-over between applications, between data, between even I/O, to ensure that it is completely zero trust.

Leonard Lee

But then that's like that secure enclave

Jim McGregor

Yes. Type

Leonard Lee

of concept. Very much so. But I'm talking about levels above it: what kind of computing requirements are closer to the software and application level that are driving demand. And so that's where you really have to look holistically at how we are evolving from standalone LLMs to agentic systems. But you're making a really cool point about the CPU, and how the CPU itself is not just the cores. When people think about, oh, we run on a CPU, they think of the cores for some reason. Right? The CPU cores. But to your point, there's a diversity of cores that we're seeing. You have ASICs embedded, co-packaged into a CPU. Right.

Karl Freund

We shouldn't miss mentioning the big picture of this whole trend, which is that everything we're talking about is very complex. And if you add to that the thought that you're gonna be turning this silicon very rapidly, all of this is gonna drive increased growth in cloud computing. Nobody wants to manage the myriad of SKUs in their data center. If you're going to be taking a problem and breaking it up into lots of different pieces, each one on a specialized processor, CPU or GPU or ASIC or whatever, they don't wanna deal with all that stuff. And so I think enterprises will now shift even more rapidly toward cloud computing. If you look at the earnings announcements last week, we saw tremendous growth across all cloud computing platforms, and it's lifting all boats, because the neoclouds are also absorbing a lot of these bandwidth requirements. So I think cloud computing will increasingly become the utility of the future. Yeah. And the thought that you have to stand up your own data center, that's gonna become obsolete.

Jim McGregor

Well,

and

Karl Freund

with exceptions.

Jim McGregor

I think what Leonard's trying to describe is what I call workflow processing.

Karl Freund

Mm-hmm.

Jim McGregor

Where instead of thinking about platforms, you have to think about workflow. And my best example of this is physical AI, where you have to think about what you put on device, because you've got limited power, battery, and processing capabilities; what you put in the network; and what you put in the cloud. I think overall we are moving there, and I think that's one of the biggest changes. And quite honestly, that's drastically changing the requirements for networking technologies, just to be able to handle this. But we have to start thinking about a single workflow that may cross all of those. It may cross the device, it may cross the network, and it may be in the cloud. It may be all three of those working together, and understanding that is a huge challenge. It's really a workflow play.

Leonard Lee

Yeah, I agree with you on that. And that, for the most part, is a big missing piece.

Jim McGregor

Mm-hmm.

Leonard Lee

There are still people talking about chips, and these companies selling chips. No. They sell systems, at a minimum, if not an entire cluster. Right? We've said that, you know, Jim

Jim McGregor

and the software and the tools and everything

Leonard Lee

else. Yeah. Like for Google, when they ship these TPUs, it's not just the chip, and that's not the hard part. They already have generations and generations of TPU. The DNA that they have is basically delivering not only their model-specific, the Gemini stuff, the stuff optimized for Gemini, but generationally optimized AI compute stacks, right? And then the software, all this stuff evolving together. And this is something, Karl, you especially tuned into and highlighted almost nine months ago, right? When we were at Hot Chips. You got really excited about TPU all of a sudden. But I think it's an important point for everyone to tune into. Mm-hmm. Because I still see a lot of commentary out there in the media, as well as in the investor community, where they're still fixated on the chip. And it's no, you probably should be thinking about networking more than the

Karl Freund

chip. Well, maybe, and maybe you should be thinking about memory and storage. Yeah, boring old storage. SanDisk is just the top-performing S&P 500 stock. Right. Really? Yeah. And guess who's number three or four? Micron. Which provides both a geopolitical hedge as well as the memory needed for high-performance computing and AI. They're eating it up. They're just sold out for 2027.

Jim McGregor

And it's a whole data layer that's constantly changing. We didn't have SOCAMM until last year, and it just became a standard this year. Yeah. SOCAMM, sorry, not SoCal, but SOCAMM, where

Leonard Lee

I live in SoCal.

Jim McGregor

Exactly. SOCAMM. But the fact that you have to think about how you cross from SRAM to HBM, to DRAM, to SOCAMM DRAM, to storage, blah, blah, blah. It's an entire data layer.

Leonard Lee

Yeah.

Karl Freund

That data layer is what has become the fastest-growing segment in the landscape, as well as the new choke point. And as it becomes a choke point, guess what happens? There's not enough supply. So what happens to prices? They go up. They've literally doubled in the last 12 months. So margins explode, because instead of running a 40% gross margin business, you're in a 50 to 70% gross margin business. So the data layer has become the hottest part of the market.

Jim McGregor

Yeah. I got a question for you guys. Our industry is still charging along despite the memory issues. Yeah, we're seeing maybe pullbacks in some of the smartphone announcements, or the number of SKUs being offered and stuff like that throughout the year, but our industry is still cranking. I gotta ask you: do you really think that, from a global economic perspective, we can keep this up? Because

neXt Curve

no,

Jim McGregor

we've got Iran. Even looking at semiconductors, we still need a lot of helium out of the Middle East to be able to do advanced semiconductor manufacturing. I'm just looking at this: between tariffs, between the cost of everything going up, from food to chips to systems, I just don't believe this is sustainable.

Leonard Lee

I think there are the constraints from a supply and manufacturing perspective: what is your ability, at a global scale, to meet the perceived demand? But then on the other end, I think one of the most telling moments during the earnings season was a question an analyst asked Amy Hood and Satya Nadella. They asked the ROI question: who's gonna pay for all of this? How is it gonna be paid for, economically? And they still couldn't give a good answer. It's still all the hypothetical stuff, and I'm not picking on Microsoft, but pretty much everyone, with maybe the exception of Google, it's still difficult to tell. 'Cause it's one thing to look at surface numbers; you have to really drill down into the financials and then try to look through and see through the murkiness of some of the AI numbers, right? Everyone claims they have massive AI numbers and growth, but then they're not very transparent about it. You have to guess what it is that's actually happening underneath that number. But maybe with the exception of Google, because I think, honestly, they are probably best positioned to drive some degree of monetization. But that's advertising stuff, right? That's not productivity in terms of pumping out widgets or shipping

Karl Freund

I would

Leonard Lee

bushels,

Karl Freund

I would take a little different view. And that is, if you look at various segments, let's say the consumer segment, where agentic AI is finally gonna give people the productivity tools that AI has been promising for a decade now. That is hitting an inflection point and will drive increased demand. Another segment is programming, where you need fewer and fewer programmers to get the job done. So layoffs are coming, because

neXt Curve

yeah,

Karl Freund

the ability of AI to write code is pretty astounding. I ask my favorite, Perplexity, to do all kinds of things for me now. And you know what it does? It writes programs to do those things and links them all together for me, and all I had to do is ask it to solve a problem. So you could have massive productivity improvements. Now, another segment would be very specific markets, let's say genetics-based drug discovery, where we're starting to see some very significant drugs come to market that will have huge impact on the marketplace. So I view it differently, Leonard. I view it as each of these segments hitting its stride at about the same time, and that's gonna drive the economic benefits that everyone's been asking for, justifiably so.

Jim McGregor

But to your point, and I agree that we're having these innovation waves: if you're laying off very high-priced talent,

Karl Freund

Yeah.

Jim McGregor

does that also have an impact on the overall economy? Absolutely. Elon thinks that everyone's just gonna get a universal income. I don't see any company or government paying that anytime soon. If all of a sudden we see unemployment going up because of AI, if we see geopolitics limiting our industry and capacity limiting our industry, and let's face it, the tech industry's been the bright spot of the overall economy for 2026. I'm not an economist, but I do track the economy, and I'm still looking at this and saying I'm very fearful that toward the end of 2026, we could be looking at a recession.

Leonard Lee

Well, companies like Vertiv and Jabil, companies that are contributing to the infrastructure buildout that are not tech. These aren't intrinsically tech companies. They might call themselves tech now because everyone

Jim McGregor

are

Leonard Lee

a tech company. I know. But you know what I'm saying? That's why I hate the term tech.

Karl Freund

They just move fluids instead of...

Leonard Lee

Company, dude,

Jim McGregor

PCBs. They build systems. They build power solutions.

Karl Freund

It's amazing when you go to these tech conferences now and they start looking like plumbing shows.

Jim McGregor

Yeah.

Leonard Lee

Exactly. And then now you're gonna tell me,

Jim McGregor

or power shows. Yeah.

Leonard Lee

You gotta tell.

Jim McGregor

It's tech.

Leonard Lee

no, it's tech. Okay. All right.

Jim McGregor

okay. Where's that slap button I was looking

Leonard Lee

for? No, we still have to invent that. Maybe Karl can get Perplexity to code that

Karl Freund

What was that in the background?

Jim McGregor

I was trying to find a slap function. Oh, slap.

Leonard Lee

Okay. Hey, I wanted to ask you both a question really quick, because there seems to be a lot of fascination about the fact that Google has two rack systems: they have a TPU 8i and a TPU 8t. Yeah. Why don't you explain to everyone what's really going on here with, let's call it, this split. They're not really splitting the chip; they're really just creating two variations.

Karl Freund

When you look at it at a system level down to the chip level, everything for inference is going to look different than it does for training. You could still do inference on a machine that's designed for training; it's just not as cost-effective as one that's really designed to solve the inference problem. And as we've learned, there's not just inference. There's the decode segment, and there's the prefill segment. And oh, by the way, you can even take the decode segment and split that into two, which is what Nvidia is doing. The language processing problem you're solving, as it continues to change, needs specialized hardware and specialized software for the orchestration of taking the inference process, splitting it out, and disaggregating it across different, very different architectures. Now, Tenstorrent disagrees with that. Jim Keller was quite explicit yesterday. Mm-hmm. When he said, hey, guess how many people are gonna be doing disaggregation by the end of 2026? He said

Jim McGregor

Zero. Zero.

Karl Freund

That's a very profound statement. And if he's right, you're gonna see a change in leadership in the hardware industry. So the specialization, as we said when we started this whole conversation, continues to expand, and I personally think Jim is wrong and that the specialization will produce more cost-effective solutions. However, if he's right, and a single architecture can do a better job if you just let it do the job, instead of doing all this orchestration and splitting the job up into multiple workflows, that would be economically very powerful. We'll see. Right now they haven't published any LLM data. They've only published video benchmarks. But wow, the video numbers are impressive.

Leonard Lee

Really quick, in defense of Jim,

Jim McGregor

that's

Leonard Lee

Jim

Jim McGregor

Keller, not Jim McGregor.

Leonard Lee

Yeah, but Jim McGregor's point is that it depends. Inference and model training will be categorically different, and I think the treatment of your comment is probably gonna be different in those two categories. My view is that Nvidia is still the dominant force in model training. As soon as you get out of that, inference starts to look a lot different, especially if you have generational model-to-hardware alignment, efficiencies, and optimizations that you can engineer. That's what Google's doing, what Microsoft is doing. And we're seeing these transitional shifts in the architectures as well as the chips themselves, as they try to make

Jim McGregor

well,

Leonard Lee

a shift over toward inference,

Jim McGregor

right? I would link this to where we are, and the technology cycles that we go through. When you're in an early period of the cycle where there's constant innovation, constant change, and everything else, it's really hard to put things into a single hardware solution that satisfies the whole thing, 'cause standards are changing, especially with AI. Models are changing, how we process models, where we process models, all of that's constantly changing. So I think that's still driving a huge amount of innovation. We're nowhere near that kind of maturity stage where it says, okay, now we have standards and let's design around those standards. We're not there yet.

Leonard Lee

Yeah.

Jim McGregor

Yeah. So I think there's a key message there. Yeah, it would be great if we had something that could do everything, but just look at the number of companies that are now offering custom silicon. First off, the hyperscalers are doing their own silicon, and they're working with partners like MediaTek, Marvell, Broadcom, Qualcomm now

Leonard Lee

Qualcomm,

Jim McGregor

AMD and Intel have always said, if you have a big enough checkbook, we're willing to do custom silicon for you. And we're seeing it at Huawei, obviously. So we're seeing a constant expansion of companies saying, listen, we need to adapt to the market, we need to adapt to the customers. The good thing is, with the foundry services and the advanced packaging technology, it's not cheap, but it's easier to produce a custom chip today than ever before.

Leonard Lee

Yeah. But this is where I think it is an eye-opener and a potential warning sign for Nvidia: hyperscalers have large pools of value that they can tap into, especially on the inference front. Google, AWS, Amazon for their internal requirements, right? These are large-scale needs and demands where their investments in their own model- or stack-optimized infrastructure can have a benefit. This doesn't bode well, and I'm talking about the infra side of the equation, not the model training. They'll still, in all likelihood, rely on Nvidia for the model training stuff for a long time. But once they figure out what they can scale out for inference and production use, they'll probably go with their own stuff. And that's probably what Apple's doing with Google. God knows, they're probably using Nvidia GPUs; they're just not admitting it.

Karl Freund

I think they're using GPUs.

Jim McGregor

There's never been a single processor architecture for the data center. You want to know why? Because no two workloads are the same. Even if you go back 20, 30 years, you had different RISC processors, you had x86, you had custom processors,

Karl Freund

all these.

Jim McGregor

for doing processing, for doing storage, for doing networking, for doing blah, blah, blah. I don't think that's ever gonna change. I think that, with the level of innovation we have in our industry, you're always gonna have the general-purpose solutions that are really going to be the high volume, and you're always gonna have customized or very niche solutions for other applications. But I definitely wouldn't count out anybody, especially anybody that has the expertise, the size, the capabilities, the partnerships, the ecosystem, like an Intel and an AMD and an Nvidia. Because these guys, they know what they're doing. They know how to do it. And they can have multiple designs going simultaneously. They do have multiple designs going simultaneously. That's the only way you can come out with a new chip every year.

Leonard Lee

Yeah, yeah. But then, think about the Google team. They're not

Jim McGregor

too shabby, right? They are not shabby, and they are focused very specifically on their TPUs. It'll be interesting to see. One of the things that's hard to predict right now, and it's funny because I ask this of all the tech companies, and it worries me when a CEO doesn't have an answer to this question. I ask them: I know how you want to use AI, and you want AI to change your company, improve your productivity, change your products and services, blah, blah, blah. But how is AI going to change your business model? And this gets to the heart of the point that you bring up all the time, Leonard. Yes, the ROI. We don't know what the final business models of these companies are going to be yet.

Leonard Lee

They better figure it out soon, is all I gotta say. But yeah, I mean, the clock is ticking, you know? Obviously a great point. As much as they're accelerating on a one-year cadence for their tech, they better get on a one-year cadence for the business model too.

Karl Freund

Everybody's trying to figure it out. Picks and shovels are where it's at.

Leonard Lee

Oh yeah.

Karl Freund

Right.

Leonard Lee

Yeah.

Karl Freund

Somebody will find some of that gold at some point. And consulting, like IBM, right? Where they help their clients figure it out. And meanwhile, no matter what the outcome is, there's a lot of picks and shovels going into the field, so,

Leonard Lee

yeah.

Jim McGregor

Yeah. And I'll be honest with you, the best answer I've gotten to that question was probably from Jensen Huang. I asked him, and I said, I gotta credit you, you keep going after all these opportunities. And let's face it, they're like any other tech company. They've gone after some stuff, like their game consoles, that just really hasn't panned out. But they keep evolving. And I said, okay, so you went from being a chip company to a device company to a systems company to a service company to a software company. What does Nvidia look like in 10 years, especially in the AI era? And he looked at me and said, Jim, I have no idea. But the good thing is he's constantly thinking about that. You don't know what the business model looks like, but you have to be ready, and you have to be changing your business and your organization around what's happening in the marketplace, what's happening with technology, and what's happening with your company.

Leonard Lee

Well, don't forget investment firm and VC house. They're becoming that as well.

Jim McGregor

Well, it's funny you mention that. Nvidia has probably become one of the biggest VCs in the industry over the past year, with the amount of money they're making. I started a list of the partnerships and investments they've made, and I think I'm up to like 58 over the past three years. It's almost impossible to keep up with. They're making announcements like every two weeks.

Leonard Lee

Yeah. And it's full stack. It's not just the chip; it's not core technology like Apple.

Jim McGregor

across the ecosystem.

Leonard Lee

Yeah. What you see Apple do, they rarely make acquisitions above their pay grade, right? It's usually technology investments and stuff like that, lower-level stuff that contributes to their products and their technology foundation. But Nvidia's going freaking straight up the stack, all the way up to Anthropic and everything else. I mean, this is end-market platforms. Right.

Jim McGregor

And they're investing in EDA vendors, IP vendors, the chip vendors, systems, cooling, power, blah, blah, blah. Yeah. Everything.

Leonard Lee

Do they own TSMC now, or are they still a customer?

Karl Freund

No,

Jim McGregor

I was just gonna mention,

Karl Freund

I was just gonna mention.

Jim McGregor

I wouldn't call them a customer. I'd call them a partner, because they heavily invest in future capacity.

Karl Freund

They have not put their toe into that water, no. Into the manufacturing water, other than providing models that do amazing things in photolithography. But they have not built any kind of position, and I'm not sure they want to. I think for right now, they're happy just to specify the requirements and how many wafers they need, and go down that road.

Jim McGregor

That brings up the whole Terafab concept.

Karl Freund

Yeah. Terafab.

Leonard Lee

oh my gosh. This is like so much to talk about.

Jim McGregor

There is so much,

Karl Freund

Hype. Terafab seems like a lot of hype to me. I don't think anything real is gonna come out of that, but I hope I'm wrong.

Leonard Lee

It's like one

Jim McGregor

I think a partnership will come out of that. I think eventually most of the capacity targeted towards Elon Musk's Terafab idea ends up being Intel. Yeah,

Leonard Lee

it's more like a Stargate-ish kind of announcement, right? Yeah. Okay, there's a partnership, let's see how it goes. You float a big number or create some inflated expectations; let's see how much of it is actually real. And that's where maybe we don't get as excited, like you aren't all that excited about the announcement. I think that's probably a better attitude toward these types of press releases. But really quickly, I just wanted to get both of your takes on Qualcomm's hinting here. Jim, you're gonna be at the investor day. I'm gonna be in Amsterdam. No, Copenhagen. I don't know where you're gonna

Jim McGregor

Amsterdam. I was gonna ask him what he's gonna be doing,

Leonard Lee

But some quick takes: what are you hoping to see coming out of that event in June, and what are you reading into some of these indications? Because the CPU thing was new. They didn't really formally announce or indicate that they're gonna push out a data center CPU.

Jim McGregor

They haven't. They've hinted around it. Yeah. So it would be great, and this is a laundry list of things that we'd love to see from Qualcomm: that data center CPU to be paired with their AI accelerators; definitely learning more about who their hyperscaler is, and I'm feeling that there's more than one company they're working with right now. I know they were also bidding on some stuff with Meta that ended up going to Arm, that ended up being the AGI CPU. So at least learning about that first customer that they plan on being in production with by the end of the year. And hearing more about their whole IoT build-out, because if you look at their earnings, especially the last couple of quarters, yeah, they keep increasing despite the seasonality of handsets and the projection for handsets, mostly because of memory this year. They keep increasing in IoT and automotive every single quarter.

Leonard Lee

Yeah.

Jim McGregor

That's

Leonard Lee

automotive. Wow. 38, was it 38%

Jim McGregor

year over

Leonard Lee

Yeah. It's at like 1.3 billion. I think by the end of the year it'll be about a $6 billion business. So hats off to Nakul. I mean, you know, the IoT stuff is a tough business.

Jim McGregor

Mm-hmm. Well, IoT for them is a big bucket. Yeah, it's a big bucket, but it's everything. I would like to see them break that out a little bit more too.

Leonard Lee

call it

Jim McGregor

IoT. Yeah. It'll be interesting what they have.

Karl Freund

I think Cristiano's strategy is to have a full hand of cards he can play.

Jim McGregor

Mm-hmm.

Karl Freund

And one of those cards is customization for very large customers. Another card he can play is the data center AI200, and the AI250, which we haven't heard anything about since the initial announcements, but I think we will hear more about that. I don't think it's an either-or. I think it's an and.

Leonard Lee

Yeah,

Karl Freund

It's an and, both of those, as well as what, as you correctly say, is erroneously called IoT. I think he's got a very good hand to play, and this year we'll see him start to play it.

Leonard Lee

And then maybe leverage. I'm really interested in seeing how they leverage the Alphawave acquisition and those assets. Mm-hmm.

Karl Freund

Yes. Well that's really the engine behind their customization, right?

Leonard Lee

Yeah. And then the networking side of things, the networking interconnect. I mean, just how all that plays out into some of the systems that they're gonna be co-developing with the hyperscaler customer that they have.

Karl Freund

I don't think we'll have to wait till Hawaii. I think we'll hear more, maybe at Computex. We'll hear a lot more detail from Qualcomm. So, okay, what's your real strategy here, guys?

Jim McGregor

Yeah, we have a couple of punches coming from them. First Computex, then their investor day,

Karl Freund

Then investor day, then

Jim McGregor

and then the Snapdragon Summit.

Leonard Lee

Maybe they need to have a new conference. That's what we need. They're so diverse, it's like ridiculous. But yeah, and it's also timely, given MediaTek is making some serious inroads into this space as well. Mm-hmm. And someone told me, oh yeah, they don't really have a presence in the data center. It's like, no, the AI 100 Ultra. They coupled that up with Cerebras, right? Cerebras was using it, I think, for decode.

Jim McGregor

yeah, I think, I

Leonard Lee

think in

Jim McGregor

terms of market share, I don't think we've seen the needle move

Leonard Lee

yet. They haven't, right, right. But in terms of having done the investments, having the technology, having the experience, they've been working the data center on the accelerator side with NPUs for at least three, four years, right? Mm-hmm. So, Jim, you probably have a more accurate number in terms of the years, but they're not newbies. No, they're not newbies. There's a continuum here that they're riding on,

Karl Freund

and I think their credibility's been stretched as they keep making claims and announcements that don't pan out. So I think this is the year they gotta put up or shut up. Yeah. And I think they're gonna put up, I really do.

Jim McGregor

And you have to remember that another key asset they have here is Nuvia. Nuvia was working on data center processors; they weren't working on mobile processors. Yeah. They were shifted towards that when they were acquired. So we still expect to see more coming out of that group.

Leonard Lee

Yeah. Yeah. Oryon. And who knows, with all this focus on inference, I'm really interested to see how they build the NPU story, if they can find a role for that, maybe elevate it from what we're hearing on the edge, especially on the industrial side, to the data center. So anyway, any other things you guys wanna hit on before we call it?

Karl Freund

We gotta move on to May. Trying to move on to May. Let's see what happens.

Leonard Lee

man.

neXt Curve

Cinco de Mayo.

Leonard Lee

Just kidding. Oh my gosh, you guys. Yay. Well, hey, it was a great episode. Really great stuff. Yes, we had a little bit of a debate, a disagreement, and I learned a lot from this session, so thank you, gentlemen. This has been,

Jim McGregor

I'm still looking for that SLAP app, so yes,

Leonard Lee

Karl, it's on you, man. I'll go figure out what that prompt

Karl Freund

did. Yeah, we'll do a slap out. How

Leonard Lee

Yeah, just send me the, yeah, send me the plugin and then I'll add, no, send it to Jim and he'll add it. Anyways, thanks everyone for tuning in. We hope you found the episode informative. Gentlemen, thank you so much for jumping on every week and sharing your wonderful insights. And if you want to tap into Karl Freund's wonderful insights into AI, quantum, and the semiconductor industry at large, hit him up at Cambrian AI Research at www.cambrian-ai.com, and he is also on Substack and Forbes.

Jim McGregor

The purple guy? It's more than me. We have a whole team. We are the purple team,

Leonard Lee

Jim McGregor.

Jim McGregor

Yes.

Karl Freund

How did you learn to

Leonard Lee

do that?

Jim McGregor

Getting better. Matter of fact, look for us on Forbes, look for us on EE Times, and a couple of other publications, which I'll talk about maybe next time. And we'll be launching our own Substack channel here pretty quick as well.

Leonard Lee

Oh, awesome. Yeah. Wonderful. Tirias Research.

Jim McGregor

And look for our latest AI and TCO model forecast,

Leonard Lee

and you still owe us the whole demo, the results, the OpenClaw stuff that you're,

Jim McGregor

We're still working on it, and it is impressive. We should probably have a podcast just on that,

Leonard Lee

Yeah,

Jim McGregor

kinda share what we've found.

Leonard Lee

And remember to reach out at www.tiriasresearch.com. And then please subscribe to our podcast, which will be featured on the neXt Curve YouTube channel. Check out the audio version on Buzzsprout, and find us on your favorite podcast platform. Also, subscribe to the neXt Curve research portal at www.next-curve.com for the tech and industry insights that matter. Gentlemen, we'll see you next month, and maybe even sooner on the road.

Jim McGregor

Cheers.

Karl Freund

Take care. Thank you.

Leonard Lee

Have

Karl Freund

a lot.
