The neXt Curve reThink Podcast

2024-The Year in Security & Trust (with Debbie Reynolds)

Leonard Lee, Debbie Reynolds Season 6 Episode 53


Debbie Reynolds of Debbie Reynolds Consulting LLC joins Leonard Lee of neXt Curve on the reThink Podcast to recap a crazy, tumultuous, and disturbing year for security, trust, and privacy, in which we saw some of the most catastrophic data breaches and cybersecurity incidents in recent memory.

Leonard and The Data Diva parse through the security and trust events and happenings that mattered in 2024:

  • Leonard's impression of security & trust in 2024? (1:30)
  • Debbie's highlights in security & trust in 2024 (2:54)
  • The cybercriminal economy's catalytic moment (6:35)
  • GenAI: cybersecurity friend or foe? (8:15)
  • Data breaches find normalcy (9:20)
  • Losing a foundation of security - privacy (11:30)
  • The EU AI Act: Foundation for responsible AI? (15:33)
  • The future of decentralized obscurity (23:13)
  • Debbie's hot security & trust picks for 2025! (26:00)
  • Leonard's hot security & trust picks for 2025! (28:10)
  • We need to give less to suffer less (32:10)

Connect with Debbie Reynolds at www.debbiereynoldsconsulting.com 

Hit both Debbie and Leonard on LinkedIn and take part in their industry and tech insights. 

Please subscribe to our podcast which will be featured on the neXt Curve YouTube Channel. Check out the audio version on BuzzSprout - https://nextcurvepodcast.buzzsprout.com - or find us on your favorite Podcast platform.

Also, subscribe to the neXt Curve research portal at www.next-curve.com for the tech and industry insights that matter.


Leonard Lee:

Hey everyone. Welcome back to neXt Curve's reThink Podcast series on security and trust, where we talk about the hot topics in the world of technology, privacy, and trust that matter. In this episode, we are going to be wrapping up the year 2024 in security, privacy, and trust. And of course, I am joined by my good friend and the legendary Data Diva, Debbie Reynolds of Debbie Reynolds Consulting LLC. How are you doing, Data Diva?

Debbie Reynolds:

Thank you so much, Leonard. I love this show and I love to be on the show with you. So thank you.

Leonard Lee:

Oh, absolutely. This is where we get our thought leadership out, right? And you do that every day anyways, but this is just another channel for folks in the industry, organizations, and even consumers to get the latest on what's going on in security and trust, and in particular privacy, especially in this age of generative AI that we've been talking about for quite some time, right?

Debbie Reynolds:

Yeah,

Leonard Lee:

It's true. Yeah. Yes. Should we get things kicked off here? Absolutely. You start.

Debbie Reynolds:

You start.

Leonard Lee:

Okay. Well, what I'm gonna say is this has been a really, really jam-packed year of incidents and events across what's becoming a broader and broader expanse of security, trust, and privacy. It just seems like there's nonstop change. And as we look at incidents in terms of breaches and privacy compromises of consumers, as well as the confidentiality of enterprises, I don't think we've seen a more insane year. Last year was pretty crazy. This year, quite literally, my impression was that the vendors are overwhelmed. A lot of the companies that offer solutions, whether it's cybersecurity or privacy- or trust-related technologies, from silicon all the way up, have their work cut out for them, and it's not going to be easy. That was just a high-level impression. I don't know what you think. You don't think that this was a really mellow year, right?

Debbie Reynolds:

It wasn't. It wasn't, actually. The cyber threats and the cyber breaches have been escalating exponentially for a number of years. This year, though, I thought there were three that stood out to me for being unusually bad in different ways. Those three were the UnitedHealth breach, the 23andMe breach, and CrowdStrike. CrowdStrike, that's really about supply chain, and it was really hard for a lot of businesses to deal with. I think there's still litigation. I think Delta Air Lines is suing for $500 million, if I'm not mistaken, because they had to cancel something like 7,000 flights. It was something really bad. And SolarWinds happened about four years ago, right? So to see CrowdStrike go out in the same way was very concerning, especially in terms of the impact on people, because we had people who had to restart computers that were locked down so they couldn't put thumb drives in, but they still had to get the fix onto the machine. It was just ridiculous, right? Yeah. Then 23andMe. This is about someone's identity, their DNA, and these are things that can't change. I was really concerned, because people were like, well, what do I do? And it's like, well, you really can't do anything. It was abysmal. I think it showed the lack of actual redress that people have in these types of situations, because the company was like, we're going to give you not one, not two, but three years of credit monitoring. It's like, well, what does credit monitoring have to do with your DNA? Exactly, right? Yeah, it just totally sucked. And then another prospect still looming out there is that this company has lost a lot of money and they're thinking about selling. People are concerned that if their data sells to another company, that company legally doesn't have to uphold the obligations of the initial company, right?
So they're like, well, what are they going to do with my data? Except for Illinois. Illinois has the Biometric Information Privacy Act, so if 23andMe sells that data to another company and they have data on someone in Illinois, they have to delete that data unless they get permission. That's the reason why that law is very important. And then the final one. Even though there were a lot of breaches, UnitedHealth was bananas, because even though we seem to be desensitized to these types of breaches, this was one where, I think, they're like the biggest healthcare provider in the country.

Leonard Lee:

yeah.

Debbie Reynolds:

Yeah, because of that breach, people couldn't get medication. Doctors had to cancel surgeries. Medical professionals weren't getting paid. This is horrible, right? And I think some senators in Congress want to pass more legislation around this, because they feel like anything that happens within healthcare should almost be considered critical infrastructure. I hope that, maybe in the future, we will be talking more about resilience in systems. Not just the systems themselves, but what do you do when they go down?

Yeah,

Debbie Reynolds:

You know, let's not pretend like we always had digital systems. People got surgeries before the internet or technology; they got medication. So the fact that you have systems set in place where you don't really have a good backup plan, that's not good.

Leonard Lee:

Yeah. And that, I think, is what a lot of this emerging breed, or actually they've been around for quite some time, the folks who leverage ransomware to attack pretty much everyone, have found: the foundation of their economy, right? Which is the cybercriminal economy, something that Microsoft has made a point of through the course of the last two years. Actually, last year, or maybe it was the beginning of this year, they cited that this digital criminal economy is the third largest economy in the world, and they estimate, I don't know where they got these numbers, but around $8 trillion. Okay? So behind only the U.S. and China. At the most recent Ignite conference, they stated that the cybercriminal economy is now $9.2 trillion, so it's growing at a much faster rate, and there's more of an incentive now. If you take Microsoft's numbers to heart, there's a massive incentive for this economy, for people to participate and to continue to leverage the technologies that they can innovate on. This is innovation, right? People talk about innovation. Well, we're innovating. Well, guess what? You can innovate in ways that are actually quite detrimental. And so this is the other thing we need to consider as we look forward to this future where cybersecurity is really being overwhelmed by the capabilities of the cybercriminals. And it's not a joke, because I'll start off with my temperature test coming into 2024. Compare RSA Conference last year to this year. Last year, everyone was really excited about generative AI, jokingly plugging it as the panacea that's going to solve all the problems. It was quite comical to hear some of the people up on stage talk about how these models, LLMs, were going to sniff out all of the vulnerabilities and be able to help organizations defend themselves against increasing cyber attacks, right? This year, the tone was totally different.
The industry is scared, and generative AI, as they've tried to apply it in the tooling as well as the technologies that they deploy out across networks, and the various mechanisms to institute zero trust and all these things that we love, guess what? It's not as effective yet. The threats continue to evolve and morph very quickly, and I don't even know where to start with the breaches this year. It's insane. Even today, I was looking on the web. I just Googled data breaches. I mean, they have five or six. It's crazy. It really is insane how many new threats and breaches are coming online or have been instituted by various actors, some of them, if not many, foreign state actors. For instance, T-Mobile: the U.S. reached a $31.5 million settlement with T-Mobile over breaches. This was in September, and T-Mobile has gotten hit multiple times in the past couple of years. We had AT&T and Snowflake. That was really huge. When I

Debbie Reynolds:

I forgot about that one.

Leonard Lee:

Because there are so many occurring, each even bigger than the one before, and they have shed light on the new vectors that, let's call them the evildoers for lack of a better word, are instituting to cause harm. It is harm, because this is not productive innovation. These aren't productive things that support, let's call it, the regular economy, the global economy, right? This is like a negative force. Stealing, really. I don't know, is that a product? And does it contribute to GNP, or GDP, I guess? So as long as those thieves are buying yachts and, man, it kind of does not make sense.

Debbie Reynolds:

Yeah. It's a stateless economy, right? We talk about state actors, but these are stateless actors. And so once we get out of the idea that the bad actor is a state or a jurisdiction, it really is a cadre of evildoers that are coming together, and they're being very successful in what they're doing.

Leonard Lee:

So the way I look at it, and I guess this is sort of a revelation. I mean, I kind of knew it, but it's just something that really hit home for me this year: privacy is so fundamental, and this whole third-party exchange, or commerce, of our personal data, I think, contributes to what these cybercriminal actors are able to institute and how they can innovate, right? Because if we talk about data being the gold or fuel for AI, generative AI is perfect to train on, or RAG on top of, let's say, the dark web, right? And it doesn't have to be 100 percent accurate. It can hallucinate. The thing is, if you're instituting a massive attack, then the odds are in your favor versus the potential victim, right? And I think that's the dynamic that is quite frightening. And at the risk of being labeled a Luddite, which I'm not. I'm a practicalist. Is that a word? I'm practical about stuff, right? As a practitioner, building systems, we have to worry about this stuff. But that's the problem, because we took security for granted for years. So if you think about all these systems that are out there, legacy environments, they're largely protected by firewalls, right? And the level of security, and how these applications are locked down, is probably not as sophisticated as you might think. Right? And so this is what is really concerning.

Debbie Reynolds:

Yeah, I think for a long time within enterprises, we had security by obscurity.

Yeah.

Debbie Reynolds:

And that's gone now, especially when people went into the cloud. We tried to move our castle thinking into the cloud and these digital spaces. And what these criminals are doing is taking apart these defenses and people's weaknesses, and then throwing AI on top, where basically any company I can think of was trying to hoover up as much data as possible. So the thing that maybe you had hiding in plain sight that no one cared about, that was not properly secured, they want to suck it into these models and make it part of their training set. But then, I tell people, technology cuts both ways. A lot of times when people are creating these tools, they're only thinking about the good way. They're not thinking about the bad way that technology can cut. So I'll give you an example: Strava, the running app. Strava had a thing called a heat map, and the heat map would basically crowdsource your location and help you share your routes with the people you run with. It sounds fairly innocent, right? Like, oh, my neighbor runs this route, or whatever. And what they found, and this is actually one of the reasons why the U.S. put that executive order in place trying to stop data brokers from selling Americans' data to countries of concern, is that people were using it to stalk people. I think a couple of people got killed because they were stalked via their real-time location, or someone found them. You think you're anonymous, but because they have so much information in the app, they can find where these people are. So they had to re-engineer the way they did that app, and I think they got fined. I can't remember what it was. But again, there needs to be more talk about what you are doing to protect the safety of the people that you're dealing with.
And a lot of this third-party data sharing is part of the marketing and ad tech economy, but it's not just selling products, right? It's creating a situation where people can be harmed, whether that be them being denied things, or inferences made about them that aren't true, or a bad actor being able to use it to create some unsafe situation for someone. Yeah.

Leonard Lee:

One of the big things that happened this year was the EU AI Act, right? Speaking of, let's say, inaccurate inference or biased inference, or what have you, that could materially impact your life. That's the other thing that occurred this year, right? That was another big thing. And I know that you've written quite extensively about it. What are your thoughts there in terms of what kind of influence it will have globally? Or is it going to be something that the EU uses to institute as a, let's call it, defensive measure locally in their markets?

Debbie Reynolds:

I think it's going to have a huge impact, because they were really the first to come out with a comprehensive regulation around artificial intelligence, and the way that they structured it is around figuring out what tools do in terms of what the potential human harm or impact will be. We've not seen any jurisdiction try to do a regulation in that way, where they're like, hey, we think this type of use case will cause unacceptable risk to someone. For example, emotion AI, that's a good one. You see this in movies, right? Like, oh, that guy has a frowny face, so he's going to do something bad. That's just quackery. That's not even science, right? But that doesn't stop people from creating tools based on it. And they're like, we, the EU, we don't want this used here. Just like the General Data Protection Regulation was very influential globally in the way people think about privacy and data protection, I think the AI Act will be similarly influential, even if jurisdictions don't enact similar things. They've basically planted their flag on the moon with this, so anything that comes after it will be compared to that act.

Leonard Lee:

It's also an example of how foresight can be applied to structure a regulation that could reduce, or at least mitigate, some harm. Again, like we've said before in previous episodes, regulation usually follows the harm, right? But it's also true that there are perspectives out there that are more informed than others and can anticipate harm. And I don't think it's true that innovation for the sake of innovation is necessarily a good thing. There are people who know enough and can read the tea leaves to put in policies that foster positive and responsible innovation, because that's one thing that you and I have talked about related to AI: responsible. And responsible is not, oh, we're going to put the technology out there and we want the developers to be responsible in how they apply it. If you're creating Frankenstein, you need to be responsible and not let the monster out of the castle on the assumption that somehow the villagers are going to do the right thing and not piss the thing off and have it go on a rampage. That's irresponsible. And I'm sorry. Right?

Debbie Reynolds:

No, no, I'm sorry.

Leonard Lee:

Are you trying to figure out the,

Debbie Reynolds:

yeah,

Leonard Lee:

I'm just trying to figure out the analogy on the fly, but I knew it had something to do with Frankenstein.

Debbie Reynolds:

Yeah. It was a good analogy. It's a good analogy.

Leonard Lee:

You improvise and something cool comes out. I didn't know whether or not I would land and stick that one.

Debbie Reynolds:

It was good.

Leonard Lee:

Cry sometimes. Right.

Debbie Reynolds:

It was good. It was good. You know, as you were talking, it just solidified in my head what the difference is before and after generative AI, right? AI has been around for decades, so it's not a new thing. I think the new thing that we're seeing is this: just like you're saying about being responsible with your innovations, artificial intelligence before generative AI was like, okay, whatever we build, we have to take responsibility for it, right? Now what we're seeing is this push toward innovation where it's like, you know, we're going to take the money and we're going to take the resources, but you're going to take the risk. Right? A lot of this innovation wouldn't happen as quickly if they were putting some of those guardrails in place, and if they were trying to make sure that they had those safety and responsibility things in place. That's what we saw before generative AI that we're not seeing now, and that's a huge problem, right?

Leonard Lee:

Yeah, but even before then, there was a discovery process where people were starting to understand the problems with bias, accuracy, relevancy, all the things that challenge generative AI today. None of those things disappeared. Actually, they all persist. And so it's almost as if, with this new, hyped revolution, everyone forgets what they learned before, and somehow this new, quote-unquote breakthrough resolved all of the fundamental issues and challenges that we have with AI. Actually, it didn't do that. But we also underestimated its potential in accelerating innovation of other kinds, which typically is what seeds itself before anything else that is supportive of that theoretical revolution, right? The revolution in the cybercriminal economy, that's not part of the deck, right? But that is clearly what it's being used for, to significant effect. And, you know, there's always this fine line between digital marketing and cybercriminal tooling or enablement. Cybercrime is all about surveillance, spying on people, and then figuring out where the vulnerability is so you can attack and do whatever nefarious thing you're intending to do. Digital marketing is about knowing as much as you can about the quote-unquote customer, in what ends up being a very invasive manner, in order to control them. You want to control them; you want them to buy. Right? And so this is the fine line that we have to ride, and it's going to be challenging, given all the interests on both sides, to keep this dynamic going. But I think this is where there's a huge market for champions of privacy, trust, and security. Trust and privacy, those are foundational. Security is just protecting you from anything after the fact.
If you don't have a trust basis, and you don't have privacy-first in place, the stuff that you have to handle at the security level is just going to be so much more overwhelming, because all your stuff is on the dark web, and it's accurate. These nefarious actors have a lot of fuel to go after you, and the security defenses that you put up have to be much more robust than they would otherwise need to be.

yeah,

Leonard Lee:

It's just what I realized surveying all this insanity. I look at a lot of stuff, right? I mean, we research a lot of different areas.

Debbie Reynolds:

I think in the future, people are going to give less. Right? I think we're going to go more toward decentralized, not just architecture, but data. A lot of the pushes we're seeing, and I've talked a little bit about age verification, I think that's going to shrink a lot of apps and a lot of services, because people aren't going to continue to give their personal data out to these companies that are continually being breached. Right? They're going to think really hard about who gets their information, or who gets their trust. I actually saw a stat just before this session that said app downloads are down, but app revenue is up, especially on Apple, and that's mostly due to subscriptions. I think they say only about 15 percent of apps on the App Store take subscriptions, but they're driving that revenue. So that tells me that people, instead of downloading every app, are finding things that really help them. And if they're subscribing, they may have a long-term need for that thing, and those brands are building trust with people. So going back to your trust point: if you're not building trust, I think in the future it's going to be harder for you to get good data.

Leonard Lee:

Yeah. Yeah. And the AI training. So I tagged you on a recent post on LinkedIn about Google, right? On YouTube, as a creator, they prompt you now asking for your consent to share your content with third parties for AI training. And of course, guess what I did? What was my choice?

Debbie Reynolds:

You said, no,

Leonard Lee:

Yeah, of course. But, no, semi-kudos to Google for at least asking. Going back to your statement before, it's like, hey, look, you're already monetizing my content in one form. If you're going to be monetizing my content by selling it to be trained on a model by a third party, why don't you pay me? I think that's where creators, and even consumers, need to ask for that compensation, because the whole idea of a lot of these privacy and data-sharing regulations is to limit, let's say, the unintended or unconsented use of your data, right? And the third-party use of your data is oftentimes obscure. But if you're going to monetize that data, you should give us a cut, if not the whole cut. Anyway, so what are the hot topics to watch out for in 2025, in your mind, in the landscape? What can we anticipate?

Debbie Reynolds:

Oh, wow. Let's see. I guess 2025 will probably be about the not-sexy things. This is kind of the eat-your-vegetables stage. I see a lot of companies, especially the ones who were super hot and gaga about doing AI, realizing their data was crap and their governance was horrible. And so they're going back to the drawing board, and they're really trying to go deep on governance, trying to get their data into the type of shape where they can leverage some of these other tools, not necessarily generative AI, but AI in general. So instead of being enamored with the shiny new object, they're figuring out what really fits with their company, what fits with their maturity, how to get to a maturity level, and whether they even need it. I've seen big companies say, we don't really need this; it doesn't work with our systems. So hopefully there will be more coming down to earth for companies about what they really have and what they can do. But especially because there's so much capital expenditure on AI in general, those are big bills that people will be paying. And companies don't make capital expenditures that often, right? So the people who bought stuff last year and the year before probably aren't buying that much this year. So I think a lot of the shovel sellers will definitely lose some revenue. Maybe they'll get two private jets instead of three. But I think this is where the rubber will meet the road, in terms of companies figuring out what their next strategy is and trying to find a way to use AI, or not, depending on what makes sense for their companies.

Leonard Lee:

Yeah, I agree with that, especially cleaning your data up, but also ensuring governance from a security perspective. Right? That's essential, especially as they're looking to push some of these generative applications into their corporate environments. They're going to find, or they are discovering quite honestly, that security is actually very challenging, and that these open public models, number one, have limited utility. If you can only access so much of the corporate data, then you have holes in the data, which some people haven't thought about. That's what having access-control limitations on data can do to you sometimes. Right? You don't see everything. And so whatever this LLM is quote-unquote reasoning on top of is not the whole picture. Right? Just do the simple math, and that doesn't turn out all that great. But obscurity, you mentioned obscurity. I think a lot of companies are going to seek that. And I've said it not so jokingly, but people thought I was joking: this whole Battlestar Galactica dynamic will start to play out, especially as security threats become so intense and the risk of being compromised escalates because of these novel ransomware attacks and the economic enablement of cybercriminals. People are going to try to figure out, how do I leverage confidential computing to secure my runtimes and execution environments? And how do I then just completely isolate this thing, and then figure out very, very controlled ways of sharing data outside of this obscure zone? Right? It's starting to happen, and we're hearing more about confidential computing and other, let's say, zero-trust-enabling technologies. I think certain sectors in particular, especially those that serve critical functions, like the healthcare industry, insurance, and financial services, are really going to take this to the next level in the next couple of years.
But 2025, I think, is really going to be that aha moment, as people start to realize, oh, wait a minute, this is what's really happening with generative AI: innovation at scale, in what is a counterproductive mode, which is hydrating the cybercriminal economy. Because, you know what? Hey, think about it. If Microsoft is right, and this economy is growing faster than any economy on the planet and is already the number three economy, right? What does that say? Is Microsoft full of it? I don't know. Maybe they're just being self-serving. But the thing is, you definitely see it in a lot of other reports, and you hear it from operators, you hear it across the board, and then we see the results. I mean, just look up cybersecurity breach or data breach on Google, and you will find massive incidents that have happened in the last 24 hours.

Debbie Reynolds:

Yeah,

Leonard Lee:

I don't know. The evidence is there. To sit there and say, oh, you know, it doesn't really matter, you're just freaking out over nothing? Yeah, I totally forgot about freaking out about the fact that my data keeps getting compromised by these companies. Right? We entrust our data to these companies, and we expect them to be responsible. And these things happen. What is there not to get freaked out about as a consumer?

Debbie Reynolds:

Right? Yeah, I get it. I get it. And Microsoft did Recall.

Yeah,

Debbie Reynolds:

You know, Microsoft, they're still trying to do Recall again. My thing is, we need to collect less data. We need more curators of data, not just hoovering of data. The more data we give, the more risk we take as consumers. And so we're just tired of it; more companies want our data than we want to give. Right. And so I think there's definitely a pushback. Like, it was so weird. I have a DocuSign account, and for some reason, they put a location setting in there. Like, why do you care where I physically am when I sign a contract? That makes no sense.

Leonard Lee:

Yeah, well, you know what? Here's the irony of it: they use it for fraud detection. That's why they want it. Isn't that ironic? They have to creep on you to make sure that the untrusted entities aren't creeping on you. And because, you know, I get fake DocuSign phishing emails all the time, they want that traceability. But you bring up a perfect example of where we are and where things are going, and it's not getting better. Right? It is not getting better. And so to be passé about it and dismiss it, especially if you're a vendor or a product company that puts out digital services and products, to dismiss this stuff is irresponsible, right? I mean, that's what we've been talking about regarding responsible AI, responsible tech. It's not about the developer or the user. It's about you, the provider. Are you responsible? Right? Are you taking care of all the things, or are you just shipping junk out the door? An MVP that may have all kinds of security holes and is basically a cybersecurity time bomb for your customer. You know?

Debbie Reynolds:

yeah, we are fed up

Leonard Lee:

Shouldn't we care? I guess we haven't for a long time. And unfortunately, there are folks out there whose mentality is, yeah, you know, it is what it is. It's like, yeah, it is what it is because you think that way. But anyway, I guess that's a good way of finishing off 2024. I think we got a little emotional toward the end, right?

Debbie Reynolds:

Yeah. We're in our feelings,

Leonard Lee:

In our feelings. And it's been a long year, and a lot, I mean, just going through this account of all the breaches and all the challenges that the world faces when it comes to digital security, privacy, and trust, it really strikes a chord, right? I mean, it should. But anyway. Debbie,

Debbie Reynolds:

Yeah, I think it's gonna be an interesting year, 2025.

Leonard Lee:

Yes. Obscurity, the quest to be obscure. Oh, what? Yeah.

Yes.

Leonard Lee:

Yeah. So, Debbie, thank you so much for joining me here at the end of the year to recap 2024. Really appreciate it. I really appreciate you; you do such wonderful work. I don't think people even know; they know about half of the story. I've seen you in action. It's actually a beautiful thing to witness. And why don't you share with our audience how they can get in touch with you and tap into your wonderful insights and access your services? Because you do a lot of work in the areas that we talk about.

Debbie Reynolds:

You're the wind beneath my wings, Leonard. You're doing a great job. People can contact me on LinkedIn, just type in Data Diva or Debbie Reynolds, or they can go to my website, debbiereynoldsconsulting.com. I have some interesting things coming up in 2025, so check me out.

Leonard Lee:

Yes. And congratulations to you and your podcast, which reached 500,000 downloads recently, right? So congrats.

Debbie Reynolds:

Yeah, it's great. It's great knowing that, well, maybe 10 years ago no one cared about this topic, so the fact that we're getting a snowball effect, that people are really listening in on it, is great.

Leonard Lee:

Oh yeah. And everyone should know that you are a pioneer. You are the thought leader in this. You should get all the credit that you deserve. Once again, thank you. And thanks, everyone, for joining and listening to this podcast. We really appreciate your viewership as well as your listenership. So remember, please subscribe to our YouTube channel. The easiest thing to do is subscribe to our research portal and media center at www.next-curve.com, and once you subscribe, you will be tapped into all of the tech and industry insights, especially those related to trust and security, that matter. And so we will see you in 2025. Happy New Year!
