The neXt Curve reThink Podcast

The Coming of the AI Workstations Age: Lenovo P Series (with Anurag Agrawal)

Leonard Lee, Anurag Agrawal Season 7 Episode 30


neXt Curve’s ComTech Podcast series focuses on the latest and emerging trends in commercial and enterprise tech catering to IT buyers ranging from SMBs to global enterprises. 

Anurag and Leonard discuss the rise of the AI workstation as the AI PC continues to struggle to find footing with the thin-and-light laptops that have been the hope and the rage of the last two years. Find out why the AI workstation might breathe life into the AI PC and provide a bridge for GenAI to cross from the cloud to the enterprise.

Anurag and Leonard touch on the following topics in this episode:

➡️ What happened to the AI PC? It was supposed to be the big thing to bring PCs back. (2:45)
➡️ What is the AI workstation and where does it fit in the pocket to cloud continuum and the enablement of hybrid AI? (9:19)
➡️ Anurag shares the key points of his research note "Realizing AI Potential: Why Lenovo Workstations should be at the heart of compute strategy." (10:05)
➡️ Getting over the security and confidentiality adoption hurdle of enterprise GenAI. (12:49)
➡️ What are the leading AI workstation workflows and use cases? (19:55)
➡️ Why the AI workstation could be the bridge between GenAI in the cloud and enterprise AI. (22:18)
➡️ AI Workstations - Are they a solution for shadow AI or a catalyst? (26:18)
➡️ The secret to enterprise AI - Don't chase whales (an AI Moby Dick story). (29:11)
➡️ How AI workstation deployments can foster safe and secure enterprise GenAI applications and hybrid systems. (31:18)

Hit both Leonard and Anurag up on LinkedIn and take part in their industry and tech insights. 

Check out Anurag and Techaisle at www.techaisle.com.

Please subscribe to our podcast, which will be featured on the neXt Curve YouTube Channel. Check out the audio version on Buzzsprout or find us on your favorite podcast platform.

Also, subscribe to the neXt Curve research portal at www.next-curve.com for the tech and industry insights that matter.

Leonard Lee:

Hey everyone. Welcome to this neXt Curve reThink Podcast episode, where we break down the latest tech and industry events and happenings into the insights that matter. I'm Leonard Lee, Executive Analyst at neXt Curve, and in this neXt Curve reThink episode of the ComTech series, which is the Commercial Tech series that I do in collaboration with Techaisle and Tantra Analyst, we'll be talking about a really hot topic. This is a topic that is starting to get hot, which is the AI workstation. And of course, I'm joined by the amazing Anurag Agrawal, who is the Chief Global Analyst of Techaisle. Right? Not just a chief analyst, you're global.

Anurag Agrawal:

You know, that's a fancy way of saying that we are a global organization. That's all it is.

Leonard Lee:

Yeah. So I think I need to promote myself this year to executive or global executive analyst. That's what you should do. Absolutely. You should do that. I think I need to do that.

Anurag Agrawal:

I think I deserve a promotion this year. In fact, I've seen that many, many of the independent analysts have actually copied a similar title now. Really? Yeah. We are a global analyst. Okay. All right.

Leonard Lee:

Wow. Yeah. So we'll have to think about the next-level stuff here very shortly, because if everyone becomes a global analyst, then we might have to elevate things up to the galactic level. Yes, yes, yes, absolutely. So before we get started, remember to like, share, and comment on this episode, and subscribe to the reThink Podcast here on YouTube and Buzzsprout to take us on the road and on your jog. Listen to us on your favorite podcast platform. Also, check out and connect with Anurag and the amazing work that he does at Techaisle. And also remember, all opinions expressed by my guests are entirely their own and don't reflect those of mine or neXt Curve's. We're here to foster an open forum for discussion around the hot topics in tech, especially commercial as well as consumer tech, right? I mean, they kind of ride a fine line and oftentimes use the same hardware. But let's get started, Anurag. Yeah. I wanted to start off with this. We're going to be talking about the AI workstation, but I want to ask you: what happened to the AI PC?

Anurag Agrawal:

It is a form factor that is still waiting to capture the imagination of both the consumer and the commercial segment, waiting to take off. I don't know when that waiting period will be over, but it's not there. Oh, okay. And it is not coming up very soon.

Leonard Lee:

Yeah, I tend to share your sentiment, and I think that's something that the industry has started to come to a reckoning with. As we all might recall, it was just a year ago that the Copilot+ PC was introduced, and it was introduced with Qualcomm's Snapdragon X Elite. Actually, it was the series, right? Because I believe they also had the X Plus released at the same time.

Anurag Agrawal:

It's the X series.

Leonard Lee:

Right. But even before then, I mean, the AI PC topic was a big one, and a lot of it actually really stems from the x86 or the PC world trying to catch up with what Apple had done. So I think a lot of times we forget about that. And technically the first AI PC, or one with what you might call an AI accelerator, was AMD's, although you'll have Intel and everyone argue that they already had one way before AMD. But when we look at the latest generation, they're really rooted in this race to catch up with Apple, because Apple, with Apple silicon, had introduced an NPU with their first M-series SoCs for the Mac. So just a little bit of background for folks on how we got to this AI PC conversation. And it got supercharged, right, Anurag, as you recall, when ChatGPT was announced. And then there was this whole idea that, hey, we can also run these large language models and other types of generative models on device. And I'd have to say that a lot of that was spearheaded by Qualcomm, just because they have that genealogy or that legacy in mobile that's common with Apple. But it's interesting, that conversation was largely dominated by the NPU, right? And I thought the CPU was more of the transformative thing, right? Because the X Elite really changed the conversation around battery life. I don't know what you thought, Anurag, but

Anurag Agrawal:

It is true, and I think battery life absolutely matters a lot. Mm-hmm. And Snapdragon's X series, which as you rightly pointed out is the X Elite and, I think they call it, the X Plus processors. Yeah, that's right. Yeah. The Snapdragon 8 series is for mobile. Yes. The Snapdragon X Elite and the Snapdragon X Plus are the Arm-based chipsets, so they were pretty ideally suited. And I think we've tried different machines, right, from Dell and Lenovo and all. Right. And they have also got a pretty industry-leading AI engine with the Hexagon NPU along with that. Right? Right. And they have got this 5G capability and the Wi-Fi 7 capabilities and things like that. And then came along Intel, right, with their Core Ultra series. Yeah, the Intel Core Ultra processors, I think the Series 1, and there is a Series 2 which is coming up, right, which have got the Intel AI Boost NPU and things like that. And then AMD came out with fanfare, right, with their Ryzen AI processors. Yeah. I think they were the Ryzen 7040 or 8040, and the Ryzen AI 300 series, I think, is what we call it. Yeah, that's right. So after ChatGPT, it did capture the imagination of the commercial segment, right? But to quote an oft-repeated statement, the value prop is not really coming through. Right? Yeah. We can write as many case studies as we can as to how the AI PCs really drive productivity or improve security and things like that. But the reality is the very vast number of organizations don't even look at those things when refreshing their PCs. Yes, battery life makes a whole lot of difference. But if you look at our own study, sorry to quote my own study. No, do it, please. When you look at a study that we did last year, the intent to buy AI PCs was extremely high.

Leonard Lee:

Yeah.

Anurag Agrawal:

Right. But we recently completed another study with about 2,000 SMBs, not the commercial segment, not the enterprise segment. The last study included the commercial and the enterprise segment. This year, when we did the study only with the SMBs, we found that of all the features and functionalities that an SMB looks at in deciding what type of PCs to buy, AI PC capability is number 11 on the list. Oh, really? Right. And again, it goes back to the same point that performance, the CPU, the NPU, the RAM, the storage capabilities always come out on top, and then the security features and all that. So I think the onus lies on the PC OEMs, including Microsoft. Right. Because they are pushing their Copilot+ PCs.

Leonard Lee:

Yeah.

Anurag Agrawal:

To really connect the dots. Right, right, right, right, right. Performance with the AI PC. And I think they have to be juxtaposed in a way that they cannot be separated, because I think everybody has gotten caught up in the fact that for AI PCs we need to have a certain application. But once you establish that the performance, the CPUs, the NPUs, and all are directly connected to AI, then other things can start to fall into place at a later stage.

Leonard Lee:

Yeah. Okay. And so, the AI workstation. Actually, for a while I've been looking at this category, largely because we've been having a lot of conversations with our friend Rob Herman, the VP of Workstations at Lenovo. Lenovo has this whole concept of pocket to cloud, right? So they're one of a few companies that actually has a portfolio that spans handsets, so your smartphone, all the way to these hyperscale or uber-scale AI data centers in the cloud. And they have this notion or concept of hybrid AI, right, that overlays this. And so I've always thought that the AI workstation would genuinely have a very interesting role for a lot of reasons. But you recently wrote a piece which I read and I thought was really well done. It's called "Realizing AI Potential: Why Lenovo Workstations should be at the heart of compute strategy." I thought that was really an enlightening piece. Thank you. You beat me to the punch, by the way, because I'm actually writing a piece.

Anurag Agrawal:

Leonard, I can never beat you to the punch. Don't say that. As I told you, right, it was just maybe a timing thing. Right. But I cannot. No, no, no. See, my analysis, our analysis, is basically rooted in data, and your analysis goes very much down into the technology, the vendor strategy, and so on and so forth. So I can never beat you to the punch.

Leonard Lee:

Oh, no, no, no. That's not true. Hey, everyone, don't listen to him. Okay? He's being way too modest. Oh, geez. Come on. So the audience wants to know: what are the insights that you got out of this? And I really want to talk about some of the key points that you've made here, because I think they're very interesting and will definitely shape the conversation around AI workstations going forward.

Anurag Agrawal:

Absolutely. So I think your observation and your bringing out the point about pocket to cloud is a very relevant one. And what we are finding is that workstations per se, as a category, are seeing a resurgence, right? Yeah. I think in 2022, 2023 there was like a downward trend. But now AI workstations, they are beyond the basic AI PCs. Yeah. They have a very core role in AI development, and they have the capability to really handle compute-heavy tasks locally, on-prem so to say, right? Mm-hmm. Such as model training, image generation, real-time analytics, and so on and so forth, right? So I think the piece that we wrote was based on several interviews that we did, qualitative or depth interviews with about 20 or 24 odd different companies that are using AI workstations. Yeah. Specifically looking at how they are using Lenovo's AI workstations, to really understand where they're being used. So in a way, what I'm saying is that they actually fit into the whole continuum strategy, right? Yeah. You have AI PCs on one end of the spectrum, then you have these workstations, then you have the compute servers and HPC, so they fit in very neatly.

Leonard Lee:

Yeah. And I think one of the big factors that you do cite in your paper is security and data sovereignty. Yes. It depends on how one thinks of data sovereignty, but definitely the ownership of data at different scales, right? But my view is, and based on my research, one of the biggest challenges with bringing these generative AI applications, models, and systems to the enterprise is the challenge you have with security. And I don't mean just generic or high-level or network-level access control; we're talking about at the knowledge level or the data level, right? When we think about data, we typically think about either unstructured data that's managed in, let's say, a content repository or a content management system. Okay, they're tagged. I mean, it's all the old-school web stuff, right? And then the RDBMS, yeah, right, which is a relational database. And what we need to recognize, and a lot of enterprises, a lot of companies that I talk to, and a lot of vendors I speak to, don't realize, is the difference between security at that level and, let's call it, enterprise security, the broader security, you know, security at a network or app level. This is really important, and for some reason this has been overlooked for more than two years, because still a lot of folks don't know about this. And one of the reasons why vector databases are starting to fall out of vogue is because you cannot institute, let's say, embedding-level controls, role-based controls, on these things. And so when you look at how different groups within the organization can leverage these tools at a larger scale, you really do need to figure out how to bring them into confidential and trusted environments, right? It's really weird, because these systems can overshare. Generative AI applications can overshare like crazy. And so in order to design these systems in a way where they are truly secure and don't create data, security, and confidentiality risk for your company, you need to be able to place these things on premise or within the control of the enterprise, right? And so this is where I think the argument that you make in the paper about security is really, really important.
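The knowledge-level, role-based control described above can be illustrated with a small sketch: each retrieved chunk carries an access tag assigned at ingestion, and is filtered against the requesting user's roles before it can ever be stuffed into a prompt. The `Chunk` structure, role names, and `retrieve_for_user` function are purely illustrative assumptions, not any specific product's API.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    text: str
    allowed_roles: frozenset  # roles permitted to see this chunk/embedding

# Hypothetical corpus: each embedded chunk is tagged at ingestion time.
CORPUS = [
    Chunk("Q3 revenue forecast: ...", frozenset({"finance", "exec"})),
    Chunk("Public product FAQ: ...", frozenset({"finance", "exec", "support"})),
    Chunk("Pending reorg plan: ...", frozenset({"exec"})),
]

def retrieve_for_user(query_hits, user_roles):
    """Drop any chunk the user's roles don't authorize, *before* it reaches
    the model's prompt. This is the knowledge-level control that plain
    network- or app-level security does not provide."""
    return [c for c in query_hits if c.allowed_roles & set(user_roles)]

support_view = retrieve_for_user(CORPUS, {"support"})
# Only the public FAQ survives, so the model cannot overshare the rest.
```

The point of the design is that the filter runs on metadata attached to the data itself, which is exactly what is hard to retrofit onto a vector store that only indexes embeddings.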

Anurag Agrawal:

Yeah, we found that security obviously comes up at the top because of the capability that the workstations offer, but you can also argue the other side: you know, what prevents these organizations from using on-prem compute servers? Right? Yeah. Those can offer the same capability, but the on-prem compute servers with AI capabilities are a lot more expensive than a typical workstation. For example, since we are talking about Lenovo, the ThinkStation P Series and the ThinkPad P Series fit very, very neatly, yeah, just before the compute servers, right. And what we found is that when we talked to the R&D folks, right, or when we talked to the graphics and media generation folks during this study, they said that the workstations are actually very helpful, because they manage these workstations, manage security, but they also have these powerful GPUs, which we can use for rendering special effects, complex simulations, and so on and so forth. And we talked to a bunch of pharmaceutical firms too, right? For them, security is paramount, right?

Leonard Lee:

Yeah.

Anurag Agrawal:

They said that we are using workstations for drug discovery and even materials science, right? Yeah. Like molecular simulation or protein folding, virtual screening. And so those are the elements which are really, really important, which kind of take advantage of the capabilities that an AI workstation offers in terms of security and its other capabilities.

Leonard Lee:

Yeah. And one of the things that I saw at Computex this year, which was really interesting, is this actually pretty prominent movement toward trying to bring, let's say, a large model, like DeepSeek's R1, to a workstation form factor. And as you very well know, workstations are typically much higher powered in terms of their compute capability, yeah, than a typical desktop or laptop, right? And so what you're seeing are two things. Number one, the models are getting smaller. They're also becoming much more efficient. So a lot of the Chinese models have created, let's say, this, I don't want to call it a revolution, but this inflection point in the economics and efficiency of large-scale reasoning models, where now you can actually deploy them on a workstation, albeit you still need a good amount of compute. But that is a huge statement, in that, as you already brought up, hey, an enterprise server might be something that you would want to deploy a workload like that on. But actually, you're increasingly able to deploy very capable generative AI workloads, applications, and models on a workstation, right? And so this is where I think the capability gets married with the security benefits that you can realize with deployments on workstations, right? And so let's say that you have a department or a group that's working with sensitive information or confidential information, and you want to isolate the access to those generative AI applications just to that group. Then you can deploy it conveniently on a workstation, versus having to deal with all the security hoops that you have to jump through, maybe, with a cloud deployment. You know what I'm saying?

Anurag Agrawal:

You are absolutely right. And I think what we find is that these AI workstations serve as the essential foundation for these hybrid AI strategies, which you brought up at the start of this podcast, right? And they act as the on-ramp and a powerful local hub within a distributed ecosystem. They enable agile experimentation, local development, and seamless integration with cloud and HPC resources. In fact, what we are also finding is that they are actually suitable for three different types of AI workflows. One is what we call the research workflow. Number two is the development workflow. Number three is the production workflow. This is where we are seeing some of the elements that you are talking about. The research workflow is experimentation, open-ended goals, tolerance for failure, things like that. The development workflow is where you have real-world deployment and full control over the pipeline, yeah, from local data curation to training and simulation. And the production workflow, because they can enable containerization, consistent image deployment, things like that. So they have a really, really good role to play. Right. Normally what happens is, it is like the AI compute continuum, right? As I said, it ranges from AI PCs to workstations to servers and HPC clusters. But one can also argue that there is significant overlap between all of these. So the optimal placement of AI workloads across these tiers, or this continuum, is dictated by factors like cost-effectiveness requirements, security, data sovereignty, scalability, and complexity.
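The placement logic Anurag outlines, choosing among AI PC, workstation, server, and HPC tiers based on factors like data sensitivity and compute demand, can be sketched as a toy decision rule. The tier names, parameters, and thresholds below are illustrative assumptions, not findings from the Techaisle study.

```python
def place_workload(sensitive_data, vram_gb_needed, needs_cluster_scale):
    """Toy routing rule across the pocket-to-cloud continuum: sensitive
    workloads stay on local tiers, while compute demand pushes them up-tier."""
    if needs_cluster_scale:
        # Massive training jobs outgrow any single box.
        return "on-prem HPC cluster" if sensitive_data else "cloud"
    if vram_gb_needed > 24:
        # Too heavy for an AI PC's NPU/GPU; fits a workstation-class machine.
        return "AI workstation" if sensitive_data else "cloud"
    return "AI PC"

placement = place_workload(sensitive_data=True, vram_gb_needed=48,
                           needs_cluster_scale=False)
# Confidential, compute-heavy, single-node work lands on the AI workstation.
```

A real placement decision would weigh more dimensions (cost, latency, sovereignty, orchestration overhead), but the shape of the rule, security pinning workloads locally and scale pushing them upward, is the continuum argument in miniature.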

Leonard Lee:

Yeah.

Anurag Agrawal:

And one of the points I would just like to bring up, you know, sometimes people say that workstations are not scalable, right, compared to compute servers or the cloud. Yeah. But if you think it through, right, and giving a plug to the Lenovo ThinkPad P Series and ThinkStation, I think it's the P7, P5, or PX series: when you pair them with NVIDIA AI Workbench, or use Intel's AI software stack, I think they call it OpenVINO.

Leonard Lee:

Yeah.

Anurag Agrawal:

Right. Or oneAPI. Yeah. They provide real flexibility and scalable development environments, right? Yeah. To manage these fluctuating workloads, right?

Leonard Lee:

Yeah. And you're making a really good point here, because as an enterprise looks at their AI pipelines and their ops holistically, and they look at, let's call it, the infrastructure fabric, from pocket to cloud, that they want to deploy, they need to look at this stuff holistically. But I think the point that we're trying to make, and this is where I think the industry is also coming to the recognition, is that the workstation has a unique role. Yep. Right? And you're spot on. It's like we're reading each other's minds. It's weird. You're coming from a data perspective, I'm coming from a technology perspective, and we're having a meeting of the minds here. Because I describe it as a sort of beachhead vehicle, right? It can establish the beachhead for the enterprise, so that that enterprise, of whatever scale, has the ability to bring these capabilities that are being built up in the cloud, right, down into the enterprise in a secure and, let's say, securable way, right? Yep. And when I talk about security, I'm going to go back to what I said earlier: security is not just about network security and securing the app; it's about down at the embedding or the knowledge level, to prevent oversharing and to be able to truly secure these applications, right? Which at the moment isn't happening, which is why you're seeing generative AI really not making it down into the enterprise at the pace and level that a lot of folks had anticipated. Right? Because I still hear about POCs. Well, we're doing POCs. I mean, we're three years into this thing, right? And we're still doing POCs. And people have to recognize it wasn't ChatGPT that started the generative AI exploration by enterprises. This started a year, two years before. And you know, I've been exposed to this stuff for like seven years, right? Because I work with a lot of the semiconductor companies that have been working with the research organizations that have been building this stuff, right?
So, I think we are onto something though.

Anurag Agrawal:

Yes. Yes. I think this point about POCs is interesting, right? Depending upon whose data you see, some say that only 10% of the GenAI projects, POCs, go on to become full projects. In somebody else's data, you see that 70% of the organizations have already put GenAI projects in production. But I think most of the organizations make a mistake as to which POCs they start to work with, right?

Leonard Lee:

Mm-hmm.

Anurag Agrawal:

You know, there is a highly accomplished and recognized executive, consultant, and thought leader in the field of GenAI. Her name is Sol Rashidi. I like her description that before you really embark on any kind of a POC, you need to plot your projects or goals or objectives. On the x-axis, you can think about, you know, what is the criticality of this GenAI project. Criticality means which one of these will have the maximum impact on the organization. And on the y-axis, you plot complexity. And once you have the complexity and criticality, then you decide which one you want to attack. And I think those POCs then become full projects, right? If you go with the highly complex and highly critical, then that becomes a failure.
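The criticality-versus-complexity prioritization described above can be sketched as a simple scoring pass: rank candidate POCs so that high-criticality, low-complexity projects surface first, while the highly complex and highly critical moonshots sink. The project names and 1-5 scales are made up for illustration; this is one possible reading of the approach, not Rashidi's actual method.

```python
# Hypothetical GenAI POC candidates: (name, criticality 1-5, complexity 1-5).
candidates = [
    ("Contract summarization", 4, 2),
    ("Autonomous supply-chain agent", 5, 5),
    ("HR policy chatbot", 2, 1),
    ("Molecule screening assistant", 5, 3),
]

def poc_priority(candidates):
    """Favor high criticality (x-axis) and penalize high complexity (y-axis).
    The highly complex *and* highly critical projects rank last, mirroring
    the advice that starting there tends to end in failure."""
    return sorted(candidates, key=lambda c: c[1] - c[2], reverse=True)

ranked = poc_priority(candidates)
# The 5/5 "whale" project lands at the bottom of the list despite its impact.
```

A real exercise would plot these on the two axes and pick a quadrant rather than collapse them to one score, but even this crude ranking makes the point: which POC you start with largely determines whether it ever converts to a full project.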

Leonard Lee:

Yeah. But then I have a third axis, okay? It's feasibility. So whatever it is that your board is pressuring you to do, or some evangelist is telling you is possible, is it truly possible? Are you looking at it full lifecycle? Because what we have a tendency of doing is listening to folks who are talking merely about the implementation at that point in time, and they don't talk about the full lifecycle of managing this thing. So I think as we see the adoption of enterprise AI, especially with AI workstations, I think there are going to still be some challenges, because one of the things the security industry is now really starting to get concerned about is shadow AI, right? Yep. So the question is: do these AI workstation capabilities open up the aperture for shadow AI? And what is it that IT organizations and leaders can do to control and improve the governance around that? And then on the other hand, you also have fragmentation of your AI portfolio. How do you get a handle on that, so that you can scale and ensure the ongoing quality of all the AI applications, models, and infrastructure within your portfolio? So these are all still big challenges, especially as you start to move toward this more mature hybrid AI deployment within your organization, right? Which I think is still pretty low in almost every organization out there. So these are really greenfield things that I think Lenovo and their peers are going to be challenged to address.

Anurag Agrawal:

You bring up so many different and interesting points here. I think this shadow AI, yeah, it's a reality, and it'll be a reality for a few years, if not forever. It's quite simple, right? Because GenAI project deployment directly addresses the business goals of the line-of-business leaders, whether it be in sales and marketing or HR or customer service or operations or facilities or finance. So what is going to happen, and what has already started happening, is that many of these line-of-business leaders are hiring GenAI experts within their own organization, reporting to them and not reporting into IT.

Leonard Lee:

Yeah,

Anurag Agrawal:

Yeah. And when 74% of the budget for GenAI deployment is coming from the line of business, this shadow IT is going to happen. You cannot avoid it. Yeah. So what does IT do? IT has to be more agile, right? But how can it be more agile? AI per se can reduce the time that IT really spends on support and maintenance activities, and 90% of IT's time is spent on support and maintenance activities. Can they use GenAI to really optimize those support and maintenance activities, and then hire enough skill sets to really be a partner to that line-of-business buyer? I think that will take a long time, right? I mean, you can call it AIOps, AI DevOps, or whatever it is, right? And the other piece you were talking about: one of the things I tell the channel partners, right, is that in AI, do not chase the whale opportunities. What I mean by that is, do not go after the million-dollar or multimillion-dollar contracts, because those are going to be in very short supply. The end customer is looking for smaller deals, smaller deployments: see the ROI, then extend it. So they are opening up windows of opportunity.

Leonard Lee:

Yeah.

Anurag Agrawal:

It goes back to your point about how many POCs can really convert. But if you have smaller POCs, deploy them, showcase the ROI, then you can have a lot more opportunities there.

Leonard Lee:

Yeah, I tend to agree. In fact, I've recently written a piece where I've been advising organizations: let other people make the mistakes and learn from them at this stage in the game, right? Because as the solutions start to mature, that means that the learning has already happened and other people have made the mistakes. I think we've already highlighted this, actually. The people who started off with these massive investments in AI POCs and programs early on ended up with almost nothing, right? But the folks who sat on the sidelines and waited and watched, they're seeing where the value is landing, how the technology is progressing, and what some of the ground realities are. But all that being said, okay, there is value to be had. I guess what you and I are trying to say, and if I'm putting words in your mouth, you can tell me, is that the AI workstation in particular has a very interesting role that it can play in catalyzing the adoption of generative AI within the organization.

Anurag Agrawal:

A hundred percent. Because I see four different benefits there. You know, one is obviously the enhanced security and confidentiality, right? Then obviously the long-term cost effectiveness, and maybe sustained performance, and data proximity: you have data within your control. And then you can also talk about the fact that it obviously takes care of the latency issues.

Leonard Lee:

Yeah. So this is gonna be interesting to watch, to see how this particular interest evolves into something bigger. My hypothesis is that this is gonna be definitely an integral piece in that whole hybrid AI continuum. And definitely, for Lenovo, I think they're particularly well positioned to make something happen here. Right. And yes, absolutely, because of their pocket-to-cloud footprint. Yep, yep. Fabrics. We like to talk about fabrics. Some people don't like the term fabric, but in this case we need to have a compute fabric. You have to figure out a way to get all that stuff in the hyperscale, ridiculous cloud down into the production environments, right? And the production environments are across the edge, the edges. So

Anurag Agrawal:

The challenge with hybrid AI is also how you do the orchestration and management, right? I think that's a critical point. How do you do that? Right? So, yeah.

Leonard Lee:

These are the big problems to solve. And with orchestration, the capabilities you need are always gonna change, right? I think you made the best point earlier about the bridge. If the bridge doesn't exist, there's nothing to orchestrate across, right? But if indeed the AI workstation can serve as that bridge, then we can start talking about this pocket-to-cloud orchestration, which I think is gonna be that tough nut to crack. Because on the other end of it, if you can't do that, then you can't scale and you can't secure stuff, right? And there may be something I might be missing, but thank you so much for jumping on. This was great.

Anurag Agrawal:

Thank you for the opportunity.

Leonard Lee:

What do you mean? No, my pleasure. I always look forward to the chance to speak with you and collaborate with you. To our audience: if you don't know who this gentleman is, he is a global analyst. He's not just any analyst. He's the Chief Global Analyst of Techaisle, one of the leading research firms out there, especially when it comes to the SMB market for tech. So definitely, if he's not on your LinkedIn follow list, get him on there, and also follow Techaisle. Reach out to them at www.techaisle.com. Right? Did I get that right? Thanks, everyone, of course, for sticking around. Hopefully you learned something here in our chat and discussion on AI workstations. Anurag and I both think it's gonna be a thing.

Anurag Agrawal:

Yep, I agree. I have to live up to that Chief Global Analyst title that you have lauded. It is a pleasure.

Leonard Lee:

I don't think you would've given yourself that title unless you were already there, my friend. So, yeah. Please subscribe to our podcast, which will be featured on the neXt Curve YouTube channel as well as the research portal. Check out the audio version on Buzzsprout, and also subscribe to the neXt Curve research portal at www.next-curve.com for the tech and industry insights that matter. We'll see you next time, and thank you so much, Anurag. Thank you.
