
Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:35  

Hello, Justin Daniels here. I am a corporate M&A and tech transaction lawyer and equity partner at the law firm Baker Donelson. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  1:00  

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, today we have a really special guest, and we have had so much fun talking that we’re just going to dive right on into it.

Justin Daniels  1:43  

Okay, we’re not going to discuss that this is a new shirt you’re wearing today.

Jodi Daniels  1:47  

No, not really. We’re going to skip that.

Justin Daniels  1:50  

It’s a great new shirt. Alrighty, then.

Jodi Daniels  1:52  

We’re going to bring back to the show Michelle Dennedy, who is a longtime privacy professional. She has held many leadership roles in data strategy and privacy at Sun, Oracle, McAfee, Intel, Cisco, and startup companies. And she is now CEO at PrivacyCode and Partner at Privatus Online. So Michelle, welcome back to the show.

Michelle Dennedy  2:18  

Oh, thank you. It’s good to be back. I love when our mutual friends are like, you guys should really meet Jodi and Justin, and I’m like, oh, yeah, they sound really smart. And they’re like, yeah, yeah, yeah. And so I say, hey, guys, I’ve been re-introduced to you again.

Jodi Daniels  2:35  

And we are grateful, but there are so many exciting things that you are working on. And so Justin’s gonna kick it off for us.

Justin Daniels  2:43  

Indeed I will. So for those of our listeners who didn’t catch the earlier episode last year, you can check that one out. Can you briefly tell us about your incredible privacy career that brought you to where you are today?

Michelle Dennedy  2:58  

Yeah, briefly — briefly is hard for me. I will be as brief as possible. My partner is laughing; she’s like, Michelle, don’t do brief. I am trained as an attorney. I was a patent litigator last century. And at the turn of the century — we won’t talk all through 24 years of it — I switched into data privacy, because I was at Sun Microsystems. And it really will zoom us through this timeline, because it was the synergy of patent law, copyrights, trademark, M&A practice, technology, IoT, distributed computing, recession, resurgence. And now. So speeding forward, it’s like, what does a privacy professional do through this weird tunnel — if you’re old enough to remember the Six Million Dollar Man running through the tunnel? Michelle Dennedy was there, sometimes as a chief privacy officer, sometimes as a chief governance officer, building something that we called the grid, which we renamed the cloud, and suddenly people thought it was cool. And then I stood up a security and privacy sales practice, and I went on to try to glue together the cultures of McAfee and Intel and do a privacy reboot at McAfee once they were owned by Intel, in that incarnation of the company — its culture, its strategy, its brand and people — and it was a long haul in that culture and business imperative. In that era of time, that’s when all the Snowden revelations hit and the kind of collapse of the vague detente of European and US privacy relations started to break apart, so Schrems I happened. And we started to realize that not only could we not just paper our way through privacy, we had to really integrate privacy technology and pragmatic policy. And so, really, the rise of the Data Protection Officer, the rise of the privacy professional, and the awareness of businesses. And of course, in 2018, the granddaddy, GDPR, popping up, turning this into a real profession with board-level awareness of privacy.
And so in that period, we published The Privacy Engineer’s Manifesto, and a couple of standards have kind of spun out of that work. And what brings us to today is, when I left Cisco, I first did a visual data analytics company — it was a little too soon, a little too fast for the privacy world. So we said, let’s go back down to basics, and let’s build an operating system for privacy. And so that’s what we’ve done. We’ve built a company called PrivacyCode. It was built with machine learning — we use large language models to actually do our work and accelerate not just the size of the requirements library that we have, but also the speed and agility, so that anyone can come in as a new person in privacy, security, data quality, or ethics, and instantly you’re working like an expert. And if you’re coming in as an expert into the platform, you can be a unicorn with sparkles coming out of your horn very quickly.

Jodi Daniels  6:20  

Well, I really like being a unicorn. So that’s cool. And you brought us into AI, which is now the conversation that’s happening everywhere, especially in privacy. So what are some of the big privacy and security risks that companies might not be thinking about yet?

Michelle Dennedy  6:39  

Yeah, so I mean, I can’t help myself and say AI hasn’t actually happened yet, to be really dorky about it. Because AI is supposed to be artificial intelligence, and you’re supposed to pass the Turing test. So if you want to nerd out with your techie people, go in there and tell them that AI doesn’t exist, and they will love you. And then get over yourself and call it AI again, because that’s what we’re calling it. What are we talking about with AI, and in particular generative AI? AI, as we’re talking about it right now in the kind of wide, wide world, can be considered as very large data learning sets coming together. And they’re no longer the Googles and the Facebooks and the Experians, who have been gathering data on their own platforms — every time you click, you give data; every time you use your credit card, you give data, whether you like it or not. These large models have been created by crawling the net as it exists today, whether or not they have the right to do so, gathering data together, and then presenting an interface — particularly I’m talking about generative AI, like ChatGPT or Bard or some of these platforms today — so that a consumer of this platform can simply prompt it, not even just do a search, and say, please write me a letter of apology for talking too much on the podcast today. And it will send this thing in my voice. It will be too long, of course. But it will send me a nice letter, and I can send it to you guys after the pod. And the beauty of that, from the consumer perspective, is: ah, it takes away so much administrative work. It looks and sounds like me. I don’t have to do cover letters, I don’t have to do NDAs, I don’t have to do contracts. I’m not gonna call Justin anymore — it’s going to tell me what the citation of the law is. Except it’s gonna lie to you, people. It’s gonna lie! So that’s risk number one: these models were created to sound really smart.
They are that guy that you sit in a meeting with, who is always talking too much, who says a lot of nonsense. And you’re sitting there going, I’m pretty sure that’s bull — but it sounds so good. Or I’m with an English guy — trust me, you put that David Attenborough soundtrack on it, he could tell me it’s nighttime right now. ChatGPT sounds so factual. And if it gives you a citation, it sounds so real. So it presents facts that are unpeeled, in a way that even Wikipedia — which already was sort of a dangerous place, because it feels like there are citations there — doesn’t. And it gives facts where, you know, when you’re reading the National Enquirer, you kind of have an expectation of the quality of journalism, or you’re going to, you know, The Lancet for medical advice. So some of the dangers are source, some of the dangers are accuracy, some of the dangers are that we, the people coding it, haven’t considered things like bias. And for us privacy professionals? Where do you get all that data? Where do you get it? It’s not legal — a lot of that is not legal. So that’s — I mean, there’s a billion more risks, but that’s a good grab bag.

Jodi Daniels  10:27  

I had someone tell me yesterday — they were so proud of themselves — they used ChatGPT to develop a cookie policy and a privacy policy. And they were all set and posted it. My response was, I highly recommend that you have that reviewed by a privacy professional, whether internal or external. But please have someone review that, because there’s so much that you would need to ask that it’s not going to know; there’s so much nuance of what might be happening on a site that would not have been captured. Maybe it’s a great starting point, and easier to edit than starting from scratch. Right? That’s kind of where we are — you have people who are using it, and they think that that’s —

Michelle Dennedy  11:08  

— one and done. I think you raised the perfect point. I love that people are at least trying it, instead of creating a myth in their mind of other voices saying, well, who cares about privacy anyway — because they’re really afraid of it. It’s like an insecurity. Because when you get down to it, when you talk to people about privacy, every single person cares about privacy — of course they do. But there’s like an insecurity that they don’t really understand what it is, or what it entails, or what to do, and I might have to call a lawyer. And it’s also inconvenient — it’s almost like taxes, or writing my will. And so I like that you can go into the quietness of your chat and create a cookie policy and create a policy. But you have to understand — exactly what you’ve just said, Jodi, is bang on. Take that rough draft, take it to your lawyer, take it to your technologist, and say, let’s start the conversation with this very rough draft. It’s like when you bring in seven hairstyles to your hairdresser and you say, I want this. And I have done this a million times. And —

Jodi Daniels  12:27  

You’ve brought multiple hairstyles to the barber, haven’t you?

Justin Daniels  12:32  

What do you mean, bowl cut number three?

Michelle Dennedy  12:34  

Right, exactly — like, a two on the sides, five on the top.

Justin Daniels  12:37  

There’s no hair here. I can’t do the pigtails, put the ponytail up, make it straight. I’m on bowl cut number three, or whatever AI wants.

Michelle Dennedy  12:48  

Well, there are things I cannot do. I can bring a picture of, like, Taylor Swift’s legs — they cannot install those legs on me. I am a corgi, and a corgi I shall be. So you can bring that policy in to Jodi and go, this is my new policy. And she’s like, oh, no, no, no, no — you’re a startup that does mental health care for children, and you take in videos; you cannot just do a disclaimer. Or you want to do a piece of hardware — a TV that has sensors in it that gathers data about everyone walking in my room. You can’t just say the price of the TV is all your data. True case, out in the wild.

Justin Daniels  13:28  

So I guess, Michelle, here’s what I want to ask about a little bit. I’m sure you read about last week, where seven of the big CEOs went to the White House and they came up with voluntary principles for AI.

Michelle Dennedy  13:42  

So sweet, indeed.

Justin Daniels  13:49  

And my question is: with all this money, you know, sloshing around Silicon Valley, and all these companies wanting to be the market leader in AI — and given our history, where you see how privacy and security get short shrift, with social media or whatever technology we pick — what gives you any comfort, as both a privacy professional and as an entrepreneur, that there won’t be pressures brought to bear, or just market forces saying, hey, I’ll worry about this privacy and security later, when it’s a cost of doing business, so I can be a market leader?

Michelle Dennedy  14:20  

It’s already —

Justin Daniels  14:24  

— I’d love to get your perspective on that.

Michelle Dennedy  14:25  

Yeah, it’s already happened. I mean, I think it was fascinating to watch — and I won’t say the name, because we all know who it is — one of them trips over without any sort of mission statement, without a broken-down statement of ethics, without an understanding of what framework of ethics. You know, is it Western? Is it Eastern? Is it Hobbes? Is it Kant? Is it Aristotle? All of these things have meanings, structurally as well as philosophically, to your build. What is your point of view on what this means? Because this is a fragile technology in many ways, and it is a bomb in many ways. There is an existential threat in facts that don’t have any context but sound like they’re coming, you know, written on stone tablets from God. And that’s where we’re at, right? Technology has sort of become our secular religion. I think the biggest existential threat of a lot of this stuff is that it’s really easy to propagate bad facts with known idiots. Now, take the idiot layer away and give it, you know, Helen Mirren’s voice? Oh, hell. So I think hearing people go to DC and say, in earnestness, we need a special agency — and then, not more than five days later, turn up in Brussels and say, we can never be regulated, how dare you? Huh? And then not see anyone in Africa, and not see anyone in China — like, do we think that there aren’t any humans that live there? Is this the 12th century again? No. This is something that we really need to have a debate over. I’m not saying that it can’t be regulated. I’m not saying that it should not be regulated. But someone who’s already unleashed something that they haven’t been willing to disclose transparently hasn’t been able to put a debate on the table of what regulation would look like — other than, if you’re not a really, really rich guy backed by billionaires that work in secret, you don’t get a say. That seems to be the only rule right now. And that’s silly.
We all know that’s silly. And silly is not even a good word for it. What that is — it’s not technology, it’s not mathematical. So the scientists should be angry about that. Parents should be angry about that. The lawyers should certainly be angry about that. So I think we need to figure out: how do we want to shape this thing? And the only thing that I am very, very loudly advocating against is doing nothing and waiting for someone else to do it, because they will do it to us.

Justin Daniels  17:14  

I guess what I wanted to add, kind of on your point, Michelle, is, you know, in our country it seems like the one agency that has power already would be the FTC, with unfair or deceptive trade practices. But I struggle, given the history of how social media went down. I mean, Section 230, public policy, HIPAA — those laws are from 1996. It’s almost like Congress has abdicated its role to put guardrails around the 21st-century economy, and AI will be the culmination of all of the bull that went before it. So just as an entrepreneur who’s in this space, and you’re trying to do the right thing because you’re very privacy-focused, it has to be very frustrating to look at what some of these other companies that are deep-pocketed are doing. Back to your point: they go to the US and say one thing, but now they’re heading over to Europe, in front of the regulators, and saying the exact opposite. I feel like that whole thing at the White House was just propaganda to pay lip service — we’re going to do something — but the reality is no.

Michelle Dennedy  18:26  

I mean, it was bizarre to me, because I thought, did they not know? Like, Twitter is not in great shape, but some of us still at least get the feed. Did he think we wouldn’t see it? It’s like a little boy hiding behind a fig leaf — oh, you can’t see me, I’m in Europe now. It was just weird. I think the thing that should offend every person in power — I mean, forget the powerless ones; they don’t even know what’s coming, it’s a tsunami. The ones in power should be very upset that we need to bring in not just the FTC, because I think they are so overwhelmed by what has to be done just with the laws that they have — and I think 230 is a great example of that — and that there are people that are paid millions and millions of dollars, you know, with a portmanteau of some other job, whose only role is to say that we are just a pipe and not responsible for any of our actions. I think the interesting player in this, at least in the US, is the SEC. I’m interested that they are talking about the SolarWinds responsibility. I think it’s interesting, you know, talking about what is material at a board level — and is materiality strictly dollars and cents to the shareholder, or is materiality something else? I mean, I think that debate is very difficult, because ever since the ’33 and ’34 Acts it’s really been about, you know, return to the shareholder or fraud against the shareholder. Even in the Elizabeth Holmes case — she’s not in jail because she hurt people. She hurt people. People thought that their cancer was gone, and she told them it was back. People were scheduling their funerals and calling their families to say goodbye — all just to get another round of funding. But that’s not what she went to jail for. She went to jail for defrauding what’s-his-face and what’s-her-face. It’s offensive. So I think we need to figure out: what is the SEC going to do? They do strike, and they strike fast.
The FTC takes a long time; by the time they strike, they give a 20-year, you know, naughty slap on the hand. And it gives time for these companies to grow a security business. I mean, Microsoft reacted the way you want regulation to make a company react: they lied about their security posture, the FTC gave them a 20-year sentence, and out of that grew a really strong security practice. So that’s good. But how often does that happen?

Jodi Daniels  21:26  

So let’s bring it to what companies can do today. Tell us more about PrivacyCode — what can companies do?

Michelle Dennedy  21:33  

Yeah, so — yeah, I know. You put me in a dark place, my friend, a dark place. The good news — the great news —

Jodi Daniels  21:44  

Blame Justin, not me! Okay, I’m the happy, happy person.

Justin Daniels  21:48  

You’ve caused rancor between the cohosts.

Jodi Daniels  21:53  

I’m very happy. But we’re going to bring it back to Michelle. Michelle is going to talk to us about what companies can do, and more about PrivacyCode.

Michelle Dennedy  22:00  

We’ll have a little therapy along the way — I love this. We’ll bring it all back together; everyone will just be mad at me. That’s typically how this goes. No, I think the good news is, A, there have been people working — not just on AI, but on machine learning, on computer vision, on the trolley car problem and ethics in AI and machine learning — for years and years and years and years. I mean, I think it’s amazing to talk to folks that have been doing this. They’re sort of alarmed at the publicity, sort of amazed by it. Like, I’ve been sitting here — you know, Montreal has this amazing AI group, and Stanford, and some of these places, and these guys have been begging for grant money for 30 years. And they’re like, what? So the good news is, we have people with boots on the ground, and, you know, they called it things like data quality, systems engineering, wicked problem solving, statistical fidelity, fairness, bias — all of these things. There’s a lot of deep practitioner-level and particularly academic research. So interestingly, because we do have these types of machine learning things, there’s a lot of academia that has been sort of stuck. And so, ironically — and I think it’s going to both irritate and delight them — we can actually free up some of this thinking with generative AI, some of this academia of how we actually think about: what are the rules and parameters of bias? What do we mean when we say bias? And this isn’t this idiotic perversion of the term wokeness — I don’t even know what it means. It used to be something nice, that I was awake and aware of other people’s feelings; now, apparently, it’s something terrible, that I’m overly sensitive. So it’s not about correcting for bias to be politically correct, or harming one group to help another. It’s really understanding, A, what is your data model actually doing? We can do that today.
So for one thing, as privacy professionals, we can understand the source of our data. Is the source of our data the laws about privacy? And is the goal of our data to illuminate the tasks that must be unlocked by breaking down those laws and those regulations and those recommendations within those laws, so that they can be read by a practitioner? If that is our goal, and that is our data set, it is not only possible — it has been done — that you can put that into a library, and you can assign it weighted values, and you can put it into a place where you can read in a policy that came off of a chatbot. And it can say, here’s what that policy says in GDPR-speak, or here are the activities that it would imply — if you’re gonna say this out loud in your policy, this is what must be happening in your enterprise, across your enterprise and across time. And then you get to work. So it’s possible to first use generative AI to our favor. And then the second part is understanding and breaking down AI — instead of treating it as this monolith of risk and fear and telling your teams not to use it, breaking it down and looking at the data supply chain that is available. Because all of your CEOs want to use this technology; it is irresistible to have the voice of God telling you which tennis shoe to buy, because your buyers are gonna want to buy that shoe. So you’re not going to say, don’t use this to write your catalog descriptions — you’re going to potentially want that. So you’re gonna want to break down: what is that supply chain? Do we have access to this? Do we have the proper rights to use this stuff? Are we refreshing the library? Is there anyone who knows what the provenance of the data is? Are we mixing? For decades, we’ve been doing data swamping and putting things efficiently into large-scale compute bubbles. Access and control is critical. Again, security is absolutely paramount — understanding the who, and having a system of record.
So you know, over time, the who, what, why, where, when — and you have that system of record. So that, God forbid, if the Daniels family wants to go on vacation, that doesn’t mean that all the wisdom about your data center goes to the lake to go fishing. It actually lives somewhere, in a place where you can look at what the program looks like, what the products look like, what the services look like. And you can unpeel: who built this? What vendor was that? What does it mean? And you can look at the supply chain of data. I think that’s the promise of AI: that you can have specialty work. Instead of saying that we’re going to anonymize our way out of privacy — that never was going to be possible, because we humans like to know each other — you can actually have services where you really know somebody, and you really commit to personalization, and you protect and honor that circle of trust with the proper tools and the proper honor that it deserves. And then it’s gone, and that relationship is over. If I go to a really bougie restaurant, that waiter’s not going to follow me home. I don’t get to show up in the kitchen tomorrow night and grab something out of the fridge. We’ve had our moment; it’s over. We know when it’s over, and it’s all cleared out. And your datasets should work in a similar way: there’s a beginning, there’s a middle, and there’s an end. Everyone’s had a lovely experience. It’s documented. Yeah. Done.

Jodi Daniels  28:03  

It’s just as simple as that — everything managed and documented, done. Okay. See how easy this is? Indeed. There’s some conflict.

Justin Daniels  28:18  

So Michelle, with all of that in mind — when you’re not out there doing privacy, thinking very thoughtfully about all this — tell us about a fun activity you’ve enjoyed, or are about to enjoy, this summer.

Michelle Dennedy  28:34  

I am going to be glittering. I’m going to see T-Swizz, which I didn’t know was a thing, and I’ve become a Swiftie, which I also didn’t know. I have a clear plastic purse.

Jodi Daniels  28:44  

That is very important to have. I did not know that was a thing — I’m going to date myself. We still believe in privacy, but we believe in physical security, hence the clear purse, to be able to show everything.

Justin Daniels  28:57  

And that could be a great privacy tip — really, an anti-privacy tip: the clear bag, to make sure you can get into the concert or the NFL game.

Michelle Dennedy  29:07  

Yes. I didn’t know that this was a thing until last time. I lied to you guys before we got on the call — I thought the last concert I went to was U2 in the ’80s. But I actually went to a Luke Bryan show with my friend Cynthia, and I had a really nice bag — I don’t have many nice bags — and they were like, oh, that’s gonna have to stay here. And I was like, oh, no, it’s not. No, no, no, no. And there was like a line of really, really nice bags, so I was like, I don’t want to go to the concert — let’s just pick the bag we want and run. So I did get this, because I’m gonna go see Taylor Swift from the second row. I am going to be waving up at Taytay going, bravissimo!

Jodi Daniels  29:49  

Well, that means I need some video, because I think I’m the only mom in Atlanta who did not take her kids to the Taylor Swift concert.

Michelle Dennedy  29:57  

Okay — I don’t like people, but you could have come with me if you were out here, Jodi. I would have taken you.

Jodi Daniels  30:00  

I know. Maybe someone has, like, a private jet and I can fly across the country — because if I make it to the Atlanta airport, it will take me the entire time just to make it through the airport, it’s so crazy here. But I’ll probably just have to stick with the video. So — we didn’t even cover everything; there’s so much. Everyone listening: Michelle is a wealth of knowledge. You want to make sure that you are following her and learning more about PrivacyCode. So Michelle, where can we send everyone to keep up with all of your amazing information?

Michelle Dennedy  30:29  

Go to the PrivacyCode website. The information there about the product is still a little bit coy — we’re trying to get a little bit more descriptive about the product. But there’s a demo button: hit that demo button. And if you’re building a program, if you’re trying to go deeper in privacy engineering, if you want to know how to build a process to do responsible AI, or if you want to do a data governance council — all of this stuff is related. We have libraries for all of that, and we have really good know-how partners, like Red Clover, to help you walk through the process.

Jodi Daniels  31:07  

Well, we are so grateful that you are here to be able to help explain AI — the beginning, middle, and end, all the documentation, everything that you’ve described, and your very fun analogies. We’re so glad that you were here.

Michelle Dennedy  31:20  

It was so PG today — I was so well behaved. Well, you know, we —

Jodi Daniels  31:24  

— are a PG-rated podcast, so it worked out well. Yeah. Well, thank you again.

Michelle Dennedy  31:31  

Yes, thank you guys. It’s always a pleasure.

Outro  31:38  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.