The Importance of a Strategic Privacy Program
Michelle Dennedy is the CEO of PrivacyCode, a privacy engineering SaaS platform that translates complex privacy policies for developers. She is also the Co-founder and Partner of Privatus Consulting, a company that assists clients with privacy engineering and governance, WickedPrivacy leadership solutions, and ESG metrics.
Michelle works closely with families, executives, and innovators at all levels and with businesses and organizations at all stages to support the combination of privacy policies, practices, and tools. She has held many leadership roles in data strategy and privacy at Sun Microsystems, McAfee, Intel, and Cisco as well as startup companies.
Here’s a glimpse of what you’ll learn:
- Michelle Dennedy shares her background in privacy
- Major privacy challenges companies face
- What is WickedPrivacy?
- The role of privacy engineers and officers in systems and programs
- Advice for implementing a privacy budget
- Michelle’s top personal privacy tip
In this episode…
Data is becoming increasingly complex and nuanced, making privacy and security integral components of an organization’s enterprise — but many companies fail to budget and plan accordingly for these policies. So, how can you implement privacy strategies into your business plan?
Michelle Dennedy recommends adopting a problem-solving framework known as WickedPrivacy. This involves executing immediate, systematic approaches to address complex and uncertain privacy challenges, including ethics, public safety, user data, and shareholder access. PrivacyCode helps privacy engineers and officers identify use cases to integrate and deploy privacy programs utilizing a compliance and soft systems method.
In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels interview Michelle Dennedy, CEO of PrivacyCode and Co-founder and Partner of Privatus Consulting, about developing strategic approaches to privacy. Michelle explains the major privacy challenges companies face, the WickedPrivacy methodology, and advice for implementing a privacy budget.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: firstname.lastname@example.org
- Michelle Dennedy on LinkedIn
- Privatus Consulting
- Michelle Dennedy on Twitter
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.
You can get a copy of their free guide, “Privacy Resource Pack,” through this link.
You can also learn more about Red Clover Advisors by visiting their website or sending an email to email@example.com.
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:22
Hi, Jodi Daniels here. I'm the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I'm a privacy consultant and certified privacy professional, providing practical privacy advice to overwhelmed companies.
Justin Daniels 0:36
Hi, Justin Daniels here. I'm passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.
Jodi Daniels 0:53
And this episode is brought to you by Red Clover Advisors — for anyone wondering, the drums were my head this time. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together; we're creating a future where there's greater trust between companies and consumers. To learn more, and to check out our new book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Are you ready for some fun? I'm so excited, because we have been talking to this week's guest for a very long time about scheduling this. We're here because we have Michelle Dennedy. Now, before I explain the fanciness of who Michelle is: when I joined privacy, Michelle was one of the very first people that I talked to. I said, I just want to meet people and learn more, and she's been a wonderful mentor and friend ever since. I'm so grateful and excited that we're having this conversation today.
Michelle Dennedy 2:17
I remember those very first days — she was like, "Oh, what do I do?" — and, like, 10 minutes later she was in the thick of everything, and still is.
Jodi Daniels 2:26
Thank you. So Michelle is a partner at Privatus, with a specialty in WickedPrivacy, metrics creation, and privacy engineering strategies. She is the CEO of an early-stage privacy engineering automation company and has held many leadership roles in data strategy and privacy at Sun, Oracle, McAfee, Intel, Cisco, and some startup companies. And like I said, she's just Michelle, and we're so excited that she's here.
Michelle Dennedy 2:57
Yay! Thank you so much for having me. I'm so excited we finally got this to work.
Justin Daniels 3:04
Let's dive in. How did you get started in your privacy career?
Michelle Dennedy 3:08
Yeah, my privacy journey. I was a patent litigator last century, and I moved into a company that many may remember fondly, Sun Microsystems. I got hired in late '99 and started in 2000, right after our dear Scott McNealy uttered the famous and infamous line, "You have no privacy," and followed up with a hearty "Get over it." So as a result, there was an intellectual property lawyer joining the company, and every grown-up looked at privacy, said, "Wow, the CEO says this is garbage," and ran the other direction. And I looked at it and I thought, well, I've been spending the last six years in legal practice really breaking apart intellectual concepts and figuring out, for each claim in a patent, what does it mean? What does this mean to intellectual property? Looking at branding, looking at trademarks, looking at copyrights — what are the fruits of the mind, and what are the stories that they tell? And when I approached privacy — I really approach it still in that same sort of curious way — if you look at privacy in the way we treat it casually in our communities, the way that regulators are approaching it, the way it behaves in commerce, it's not dissimilar to these types of intellectual property. So if you look at privacy as the authorized sharing, the authorized processing, deciding how you want your story told according to fair and moral practices — that's privacy to me. So I kind of took to it natively and just dove in headfirst. I came from a long line of geeks; both my parents were patent litigators, but my dad was also a systems engineer and a security architect. So every time there'd be a technological thing that said, "Oh, privacy is dead," I'd call home and say, "Hey, Dad, what's up with that?" And he'd say, "Oh, now you can segment this, you can put containers on that, you can encrypt over there."
And so that's where we started working together, really working on a methodology to say: what is the value to a company like Sun Microsystems — you have Sun Ray, you have Solaris, you have containers, you have early telco systems — of segmented and processed information as a differentiator? So why wouldn't you do privacy? And so that's how I got in: I built a business plan, moved out of legal, and became Sun's first Chief Privacy Officer. That was a long, long time ago, and many, many adventures ago. But that was the origin: nobody else wanted it.
Jodi Daniels 5:37
That is a fun story. And our kids could tell you a little bit about privacy and security now, so that gives me hope that maybe when they're grown-ups, they'll do something in this field and call us up, and we can say, "Here's how."
Michelle Dennedy 5:51
My oldest daughter's at Wayfair doing privacy right now. So, you know, whether it's an occupational hazard or a family problem, I don't know.
Jodi Daniels 6:00
I just bought something from Wayfair yesterday, so I'm helping to support her role. So let's fast-forward to today, where companies do care about privacy and there are a lot of regulations out there. What are you seeing as some of the biggest privacy challenges that companies are facing today?
Michelle Dennedy 6:21
Yeah, so I think the complexity of privacy, data infrastructure, our economic climates, and changing patterns of work and culture — these themes keep emerging in my career, right? I don't see that being any different now. So when the dot-com bubble burst and everyone was stuck with a lot of extra capacity and hardware, we started thinking, oh, we can build applications and sell them in piece parts. Now we're thinking, wow, everybody went home for a couple of years, and the way we're working and approaching work is different. How are we managing human beings? There's going to be a lot of regulation about surveillance of employees at home. What does it mean to have constant facial recording and voice recording everywhere, all the time? That's going to have privacy implications. And finally, we're reckoning with these territorial differences that we never really have addressed. I had hoped in 2000 that by 2022 we would have some treaties about what basic safety means online and transferring information — here's what has to happen for lawful access. Instead, we are still wildly, wildly divergent. We're going to have Schrems 16 before we know it. So I think it'd be —
Justin Daniels 7:39
Like a movie sequel — a movie, Schrems 16.
Michelle Dennedy 7:43
So I keep saying that — like, it shouldn't be the next Privacy Shield; we should just call it Schrems. So it's Schrems versus Schrems, and it's all very dysfunctional. But it's fascinating, because that's an actual dude — a former intern at Facebook. This young kid decided to start suing the US government and has been incredibly successful and disruptive in the market. That's where we are in privacy. It is individual players still making big moves, revealing big secrets, and a lot of companies, a lot of investment, and a lot of motion trying to say: what do we want? That's the real question. I think a lot of the concern — you know, doing a DSAR, having your data subject have access — is that people are worried about the cost of compliance, and they're not asking the fundamental question, which is: what do we want? How do we want to communicate? How do we want to convey ethics? How do we want the lifetime of our stories to last? Is this something that should be ephemeral? Should there be something on the blockchain forever? And how does that fit into the rest of our society? So this is where we get into what we call WickedPrivacy. A wicked problem isn't just, for people from Massachusetts, like "wicked cool" — wicked, actually. I learned this from Colonel Bart Bradley, my partner at Privatus. Wicked problems are a way of approaching and solving problems that are akin to privacy. I'll give you the example of climate change. There are many, many, many stakeholders, big and small. Imagine a village with a river running through it. People need to drink, people need to wash, people need to poop. Where do you do that along the river? And does everyone do it in the same place in the river? No — different needs and desires. You've got farms that want to deliver food, you want people maybe to generate electricity — all of these different perspectives about the same landscape and the same resource, and they're changing over time.
Almost anything you do is a compromise in a wicked problem. So think about privacy in a similar realm. We've got users who want access to entertainment, to communication, to community. You've got managers who want to make sure that they're not going to get fired for not managing their employees. You've got stakeholders like shareholders. And then you've got all sorts of public safety and ethics conflicts — a lot of different stakeholders, a lot of different compromises. And the only way you start to address a wicked problem is by doing something. And by doing something, you have a reaction somewhere else. So by starting to think about privacy as an organic system — a soft system with various levers and timing and change — we can approach the strategy in our businesses and in our lives in a very different way. And so that's the way we look at strategic privacy: as a wicked problem. You're not a victim; it's not "privacy is over." While we still have people, we still have privacy. But it is a wicked problem, and it does need to be actively managed over time. That was kind of a mouthful.
Jodi Daniels 10:58
No, it was a very well-thought-out and complete assessment of where we are. A lot of companies look at the tactics first: I have a regulation, and because someone said I have to go do all these things — and if I don't do them, here are the consequences, and many companies will only look at fines as the consequence — then they have all these tactics to do. And then you have other companies who will say, well, I need to look at this holistically. I have to think about my customer. What are my values? How do I match this with my strategy? How do I manage these tactics, which you actually have to do? And that's what we do all day — we're helping companies with those tactics. How do you actually do those, but in a way that matches the customer and the values, so it all makes sense?
Michelle Dennedy 11:52
And your tactics become meaningful when they're attached to a strategy. Of course you want to communicate with your stakeholders. You know, the person who asks you what data you hold — what a golden opportunity to have a conversation. You can treat it like it's something terrible that you have to do because the law tells you to, or you could say, well, someone is interested; we pay money to advertise for people to be interested, so why not make that experience something that is fruitful for both parties, where you're both learning something? So that's where I look at all of these things that we do, the tasks: if they're done in a rubric of WickedPrivacy as strategy, everything turns into an opportunity — even the stuff that's not as much fun as the other stuff.
Justin Daniels 12:38
We're seeing the rise of the privacy engineer. Can you share more about why this is a must-have for companies?
Michelle Dennedy 12:43
Oh, I'm so excited about this, I can't even tell you. Like I said, I was raised on a raised floor with my dad and all his dorky friends. I think that the time to paper over, or apologize for, or try to policy away what's actually going on with data is well over. And I think that's happening for a couple of reasons, one good and one bad. We'll start with the bad. Look at the Upton Sinclair work at the turn of the last century, and look at it economically. A lot of people say privacy is dead because it's expensive, and people don't care because they're using all these data-rich services — so obviously they don't care, and you look economically at what they're doing: they're buying things when they see ads, et cetera. If you look at the turn of the last century, at the behaviors of poor people in Chicago and New York, they were eating rats, they were eating sawdust, and they were eating bleached meat. Why? Because that's what was in the sausage. Did they know that was what was in the sausage? No, they did not. And it wasn't until there were some investigative revelations about the abattoirs and the behaviors of the people selling meat into these communities that we started to see things like the FDA and food safety and supply chain, and an understanding that industrialization of food comes with risk. So industrialization of data comes with risk. And instead of serving people, you know, rat-dropping sausages, we now are being asked to actually have some wholesomeness. Where are you getting the data? Is it appropriate data? Is it not just permissioned, but is it, in context, something that someone would expect? And as we go further and further into machine learning and pattern matching — because of the world's complexity, and because of the bulk and the sheer momentum of the data — we can't review every single file one by one. So we're making more and more assumptions.
So as we make more and more assumptions, think about AI and ML as industrialized food, where you really have to figure out: what is the provenance of the ingredient? What is the expectation of the outcome? Are we testing to make sure that we're still on track to have something that is adding to the overall community? And I think every privacy engineer — from the identity people all the way to the shredders, and all of the statisticians in the middle — everyone has a role to play in making sure that we have wholesome data that's going to serve our societies.
Jodi Daniels 15:25
So can you share a little bit about where you're seeing privacy engineers? The question I always get is, where should privacy sit? Well, where do privacy engineers sit?
Michelle Dennedy 15:36
Well, I'll give you the glib answer: a lot of them sit in engineering. I think some people are converting themselves, and there they are, and they might even call themselves privacy engineers. Some of them are simply invested in safety for purpose, or safety under context, or security for a certain type of mission — like healthcare, for example. And they may actually be doing privacy engineering, because they're not just locking things down, keeping bad guys out, keeping leakage from happening — they're actually concerned about consent, about consensual and contextual flow. So when you talk about data flow, even under CDOs — I'm seeing privacy engineers under Chief Data Officers; you're seeing some of them there. Now, some companies actually have them dedicated under their chief privacy officer. I mean, that is a glorious thing when that happens. I had a couple on staff when I was at Cisco; we had to beg, borrow, and plead to get those guys going, and they were really becoming evangelists in sort of a center of excellence. So we do see some companies having a mixed team of legal staff, public policy staff, as well as technical staff, to be a center of excellence. And then at other companies we're seeing sort of a champions program, or a group of privacy engineers that are spread across the product set. And then there's a final camp — I call it the Berkeley camp; I guess it can also be looked at as the Carnegie Mellon camp. It took me years to figure out why people were so offended by The Privacy Engineer's Manifesto, which really is about soft-systems engineering for privacy outcomes. It's because there's a class of privacy engineer that believes that only statisticians and mathematicians are privacy engineers. I do not believe that's true. I think that all of these obscurity and anonymity techniques are great — they're privacy-enhancing technologies and tools; we love them.
But outside of the whole system, it really doesn't get you all the way there. It's kind of like the old-fashioned privacy engineer who was only doing encryption: without key management and identity and all the things you guys do, encryption is kind of meaningless. And it's kind of the same way here. So anytime there's a competition of "I'm more privacy engineer than you," I sort of nod patiently and go, okay, get a couple years of experience, and you'll come back to the system — because it always comes back to the system.
Justin Daniels 18:07
Companies run proofs of concept. How should the chief privacy officer and privacy counsel be involved in these activities?
Michelle Dennedy 18:14
Oh, I think — well, I mean, I'm biased, since I run a company called PrivacyCode, and it allows you to be in on the requirements setting. You have to be in on the requirements setting, or you are missing the boat. Trying to bolt on security was an impossibility — we proved that for 25 years. Trying to bolt on privacy, which is much more complex and nuanced and situationally bound, is impossible after the fact. So we want to have people who are actually figuring out: what is the outcome we want? Is it a business thing? Are we trying to get in compliance with some new law that's coming down the pike? Both of those things might require some tuning and some checking. I'll give you a couple of examples of where this might happen. People are doing their privacy impact assessments; some people do them by survey. Once you get that attestation of who's got the data, where's it going, where's it flowing, what are the controls we think we have — including the soft-system controls, which may be appointing a DPO, which may be having a policy that shows safe handling of data — then there are the mechanics of those controls. Have you implemented them? Who owns this? How does it actually get deployed? Et cetera, et cetera. That is an ongoing flow and governance system. So the privacy officer and privacy counsel should be at least getting reporting, if not having active participation, in figuring out what those early requirements are. And even if the system is already built, there's going to be change management. Every single time you're doing your RoPAs or your PIAs or even your DSARs, you should be going back through those systems again and checking the various controls: are they still effective, are they still relevant? And so that's really where you can start to see the modern, forward-leaning privacy officers being a part of the momentum of the business. You're not selling out to the business.
Because you're doing safe and secure and private and ethical business, you're actually being a better advocate for your business by being that engaged.
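The assess-then-govern loop Michelle describes — attest to controls in a PIA, then track ownership, implementation, and re-review on every RoPA/PIA/DSAR cycle — can be sketched in a few lines. This is a hypothetical illustration, not PrivacyCode's actual data model; the class and field names are invented.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Control:
    name: str                            # e.g. a soft-system control like "DPO appointed"
    owner: str                           # team accountable for the control
    implemented: bool = False
    last_reviewed: Optional[str] = None  # ISO date of the last effectiveness check

@dataclass
class PrivacyAssessment:
    system: str
    controls: List[Control] = field(default_factory=list)

    def gaps(self) -> List[str]:
        """Controls attested in the PIA but never implemented or never
        re-checked -- the items to revisit on the next governance cycle."""
        return [c.name for c in self.controls
                if not c.implemented or c.last_reviewed is None]

pia = PrivacyAssessment("customer-analytics")
pia.controls.append(Control("DPO appointed", "legal", True, "2022-10-01"))
pia.controls.append(Control("Safe-handling policy published", "privacy-office", True))
pia.controls.append(Control("Access logs retained 90 days", "opsec"))
print(pia.gaps())  # → ['Safe-handling policy published', 'Access logs retained 90 days']
```

The point of the `gaps` check is the "change management" step she mentions: an attested control with no implementation or no recent review is exactly what a re-run of the assessment should surface.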
Jodi Daniels 18:14
Now, for those controls that you just mentioned need testing — who do you find is best served to do that testing? Because oftentimes you might have one person complete a PIA, and you might have a different person making sure this whole program is running. What about the testing of controls? What have you found?
Michelle Dennedy 20:46
So again, this is deeply biased, and this is why I designed PrivacyCode the way it is: we fractionalized all of those efforts. So if you are an identity management person and you are in charge of role-based access, I expect that the person tuning that control is a combination of the HR people who are actually implementing and executing on who is allowed, based on their job code and description, and how that person actually gets and loses access — we forget to get people out of those apps once they get permissioned in. These are shared requirements, and each one is tiny, because it is: what is your role-based access policy? Does it exist? That's the task. How are we implementing it in the system? Do we have a person who decides this? Do we have, hopefully, guidelines that everybody understands and knows? And then how does that fit with your OpSec people? Is it actually happening? And are you then checking? So each one of these tasks might be a different group: you might have someone who's doing OpSec, you might have someone who's doing the role-based policy, and then again you have managers who are responsible for saying, hey, you know, Jodi's got this access now because she's become a manager of people, so she needs this additional access. Oh, she's decided to go and take an IC role, as an individual contributor — now she doesn't need that access anymore. So if you handle all of this in a narrative, it becomes very janky and hard to handle. If you handle it in a system, task by task — especially if you happen to have PrivacyCode — then it's self-mapping for you, and you have a dashboard. And you know I love a data dictionary. Love a dashboard.
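The role-based-access flow she breaks into tasks — HR owns the role assignment, the access system derives permissions from the current role, and a role change automatically grants and revokes access — might look like this toy sketch. The role names and permission sets are made up for illustration; this is not PrivacyCode or any real IAM product.

```python
# Permissions are attached to roles, never directly to people, so revocation
# on a role change is automatic rather than a task someone forgets.
ROLE_PERMISSIONS = {
    "manager": {"hr_dashboard", "team_reports", "expense_approval"},
    "individual_contributor": {"team_reports"},
}

class AccessSystem:
    def __init__(self):
        self.roles = {}  # user -> role; the single source of truth, owned by HR

    def set_role(self, user, role):
        if role not in ROLE_PERMISSIONS:
            raise ValueError(f"unknown role: {role}")
        self.roles[user] = role  # old permissions vanish with the old role

    def can(self, user, permission):
        # Users with no role (e.g. offboarded) get the empty permission set.
        return permission in ROLE_PERMISSIONS.get(self.roles.get(user), set())

acl = AccessSystem()
acl.set_role("jodi", "manager")
assert acl.can("jodi", "expense_approval")
acl.set_role("jodi", "individual_contributor")  # moves to an IC role
assert not acl.can("jodi", "expense_approval")  # access revoked automatically
```

Deriving access from the role, instead of granting per-person permissions, is what makes the "get people out of those apps" step happen by construction.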
Jodi Daniels 22:27
Everyone should have both. Everyone should have both. And to have all these software tools and people, you need a budget — which, you mentioned, is sometimes hard to find. Privacy is still fighting for budget, and sometimes even a seat at the table. What would you suggest? What could you offer for privacy teams to help get their share of dollars and resources, and even just exposure? Like, "Hi, —
Michelle Dennedy 23:01
we care." I think this is a whole session we should have, just on this, because I put it on LinkedIn the other day and I really do mean it. I hear it all the time: "Oh, I don't have budget," and then that's the end of the conversation. "I don't have budget" in a multibillion-dollar corporation is an admission of guilt; it is not an excuse. If your company is doing business, and you can afford to have enough people to be doing a billion dollars of business, you've got privacy budget. It's like my dad used to say: there's only two kinds of companies that need privacy and privacy budget. Company number one, a company that has customers or employees. Company number two, a company that wants to have customers. Those are the only two kinds of companies that need to have budget for privacy. So I think — and, you know, all tongue-in-cheek aside — it's not easy. And particularly if you're calling yourself a chief privacy officer, whether you're at a tiny company or a huge one, part of that "chief" means you are taking some risk. And part of taking that risk is, if you're new in seat, looking around the business, reading that 10-K if you're in the US, and figuring out: what is the business imperative — the imperative that brings most benefit to the business, and what brings most risk? If you look at some of these 10-Ks, you know, California privacy and safety law is listed as an essential, critical risk that we're telling every shareholder about. Well, you've got an in. What are we doing with information? What are we doing with information about employees and children, et cetera? And who is going to benefit the most by being able to target or talk to those people? If that person can't share a budget with you for a small project, you move on to the technical support for that team. If they can't support it, you go over to the legal people and you say, what are the downside risks for this team? And you keep punting, and then you do a project that's a winner.
You start in the room setting these requirements, and they find out, oh my gosh, you're not here to say no to everything and just be a buzzkill. You're not here to stop me from doing business — who are you, anyway, and why do you call yourself a privacy person? And once you start to do that sort of stuff, they realize: once I have a firm foundation, and I'm using data from people who want that data used — really using safe, statistical data — and I can prove that, I actually have a lot more freedom. It's like working out your body: it's a pain to do every day, but now you have a lot more freedom because you've got that muscle built. And so that's what I say to people: yes, it's hard, but that's why they pay you the big bucks to be at the top of the stack. And even small companies — you know, I have two small businesses now; I run Privatus with Brian and I run PrivacyCode with Christie — it's easier for us to recognize where our data is at this point. It used to be that small business didn't have to deal with privacy, and I've always thought that is the most ridiculous thing I've ever heard. We have less data, we have less complexity — so fix it now, so that you can grow. And if you can't fix it now, don't be one of those two companies: don't have employees, don't have customers, because you don't deserve them.
Jodi Daniels 26:14
I like the "you don't deserve them" part. And on the small-company piece, I completely agree — it always boggles my mind. Customers don't know the nuance of privacy laws; privacy people know that, lawyers know that. But the average Suzy, whether she's B2B or B2C, doesn't go, "Oh, you know what, you're small, but that's okay." It's like if you went to the doctor's office: "Oh, you know, you're small, that's okay. You can just take whatever you want, put it on social media, that's great. I mean, you don't have to follow HIPAA, right? Like, you could do anything you want."
Michelle Dennedy 26:42
You might want to check on that, but go ahead, that's fine. You'd expect —
Jodi Daniels 26:47
the same from a solo person's office as you would from a major hospital. The expectation is exactly the same, and it should be when it comes to data, too.
Michelle Dennedy 26:56
And we know how to do it, right? I mean, I go to a dry cleaner and I give them my shirt; I expect my shirt back. That's privacy. That's an interaction of knowing who someone is, keeping track of that someone, only doing what you should do for that person in context. So we use a lot of flowery — "floridy," and I just made that up as a word — fancy words to describe privacy and its nuance and all of the human rights. But at the end of the day, particularly if you are a small business, it's about understanding how to be respectful, telling people your version and your culture of respect with data, and then following it up with your systems. Your systems should absolutely mirror your values. You are not a slave to your technology; it should be your slave. Absolutely. So —
Justin Daniels 27:51
What is your best personal privacy tip you can share with our audience, given your breadth of experience?
Michelle Dennedy 27:59
You should always license from PrivacyCode. No — I think my best privacy tip for privacy practitioners is to always stay open-minded and curious. I don't think many people go into any kind of data processing intending to harm people, even if they end up doing that. So understanding what that is all about is really important — staying sort of non-judgmental, and going in and saying, okay, what's next? What's the step? What do you think's going to happen here? And particularly people like myself — I mean, I've been around engineers my whole life, but I'm not an engineer — do not be afraid to ask people to stop using their jargon and explain what the functionality is and what the outcome is. Lawyers and technology people need to have a communication. We're used to being experts in our own domains, and it's really scary to walk over to the other side and be the new kid on the block when you're supposed to be some anointed, fancy lawyer. And it's really hard for a technologist to slow down and explain: this is how this works, and this is where it works — and me saying that you can't do it this way with my mouth doesn't mean that the machine's not doing it on the back end. So I think that curiosity and constant communication and experimentation are really, really important — one, to keep this really interesting and fun (I've been doing this a lot of years), and two, to keep building forward. I don't think we're at the place yet where we have a stack of privacy tech that serves us. I don't think we have the type of awareness at the C-suite that we need. And we definitely do not have privacy officers sitting on public boards — I think we have one or two sitting on small public boards. And if we're going to be a digital society, I don't know how we operate without privacy expertise at the board level. It's what the investors are investing in.
Jodi Daniels 30:04
I couldn't agree more. I'm hopeful that, as privacy continues to get its cool factor, it'll join the ranks up on the board. Michelle, when you are not building two different companies at the same time and helping to mentor and advise up-and-coming privacy peeps, what do you like to do for fun?
Michelle Dennedy 30:35
Oh, is there time after that? I mean, I mom a lot. I think that's what I do, and I think it's why one of my kids is doing privacy now. I've heard of work-life balance. I've read about it somewhere. I really don't do it. Such a lame answer, but I mom a lot.
Jodi Daniels 31:00
It's a good answer. I mom a lot, too.
Michelle Dennedy 31:03
Yeah. And, you know, I have a kid who's just graduated from college, I'm super proud of my Riley. And then Quincy is 16. And I tell you what, these kids challenge me in ways. On my privacy side, these guys teach me, first of all, how deeply young people care about privacy. They don't call it privacy, but they care about who sees what, and it changes hour to hour: who your best friend is, who your crush is. And if every one of our employees were as secure as a 16-year-old girl with the knowledge of who her secret crush is, we would not need cyber professionals anymore.
Jodi Daniels 31:49
That's a good story that can be used in a lot of conversations, and an interesting analogy.
Michelle Dennedy 31:56
They're jumping from platform to platform, they get it, you know, and they will retaliate if you don't. So I think that's part of it. And then I cycle a lot. I like to look at my world and remember that everything is not so serious. It's a beautiful world out there. People want to be good to each other. Even in the middle of all of this cray-cray craziness of the last couple of years, we've seen the worst of humanity kind of on blast. But secretly, if you just walk around, get on your bike and look around, walk your pet, you see the joyfulness. You see people who are not loud, but they are kind. And that's why we're in the privacy business: because we respect human beings and humanity, and we think their stories are interesting and fun.
Jodi Daniels 32:47
Yeah, that makes me think of one of the things that people like so much about the privacy community, especially when they're new and joining it: how kind people are and how welcoming it is. I think there's a connection, because generally those that are joining it do care about others, and as a result are respectful to their peers.
Michelle Dennedy 33:05
I think so too. I mean, it's so nice. We both have consulting businesses; we send things over to Red Clover all the time that are a much better fit, and vice versa. There's so much to do, and there's so much to learn, that it's always been such a collaborative space. And part of what I'm building now is really for legacy purposes. I've been blessed that I could always pick up the phone and talk to any CPO anywhere, and that's where I got my answers. Well, we need to get that knowledge out of the little phone-line network and out into the community, so that we can build on what we had before. It shouldn't be an insider's party; it shouldn't be something that's a secret. And right now it's a differentiator if you do privacy well. I think it'll become like food safety: you should be able to expect to get into a digital environment and have nutrition and value and be safe.
Jodi Daniels 34:02
Absolutely. Now, Michelle, where can people connect with you and learn more about your various companies?
Michelle Dennedy 34:08
So I'm on LinkedIn to be semi-professional. I do a little snark, but mostly business over there. If you want the full, wild, insomniac crazy, you can see me at @mdennedy on Twitter. And please hit me up at privacycode.ai or privatus.online; I'm in both places all the time. Like I said, I guess this is what I do for fun. It's fun. I need to get out more.
Jodi Daniels 34:38
We'll include all of that in the show notes. Michelle, it's been so much fun to chat with you today. Thank you so much for sharing all of your great wisdom and the work that you're doing, as you said, to set a legacy going forward for privacy.
Michelle Dennedy 34:51
Thank you so much. It's so good to talk to you guys. I love the left brain, right brain, male, female, the whole experience. The Red Clover team is amazing. Thank you.
Justin Daniels 35:04
Have a great day.
Thanks, you too. Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes, and check us out on LinkedIn. See you next time.