Intro: 00:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels: 00:21

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.

Justin Daniels: 00:35

Hello, I am Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels: 01:00

This episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there is greater trust between companies and consumers.

To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. It is very hard to try and do an intro next to you, because you have your little smirky smile and I am about to burst into giggles, and then I have to rerecord the whole thing.

Justin Daniels: 01:44

Yes, but as we all know, laughter makes people what?

Jodi Daniels: 01:48

Happy, and that is good. I do want to be happy, and Justin, I can’t have a case of the giggles again.

Justin Daniels: 01:52

Well, I guess you can factor that into renegotiating my co-host contract, which is up at the end of the year.

Jodi Daniels: 02:00

Oh, so sorry. You know what? You’re really stuck with the attorney who might not have written that contract. Oops. What a shame.

Okay, back to our regularly scheduled program. I’m so excited for today’s guest, because we have a longtime friend and just an absolute dynamo of a privacy professional. Here we have Christin McMeley, who is the chief privacy and data strategy officer for Comcast Corporation, a role that involves partnering across Comcast’s business units, spearheading the execution of enterprise privacy and data governance strategies with a focus on responsible use of data and artificial intelligence. Christin, welcome to the show. I’m so excited that you are here.

Christin McMeley: 02:44

Hi, Jodi. Hi, Justin. Thank you for having me. I’m excited to be here.

Jodi Daniels: 02:49

Also, welcome to the sillies. According to Justin, if we’re not laughing, we’re not having fun, and we’re not happy. So we want happy people. Happy, happy.

Justin Daniels: 02:56

I just find laughing makes me happy.

Christin McMeley: 03:02

Well, what I often tell my business people is, if you weren’t laughing, you’d be crying. So we might as well laugh about things.

Jodi Daniels: 03:08

I would much rather be laughing than crying, that is true. Unless it was so funny that you have tears because your stomach hurts, because it was crying laughter. There’s that too.

All right. Back to privacy.

Justin Daniels: 03:19

Okay, so, Christin, we always like to start out by understanding your career journey to where you’re at now.

Christin McMeley: 03:28

So my career journey started longer ago than I’d probably like to admit. And I’ve known Jodi for a long time. I used to work for a boutique telecommunications law firm, and, you know, the Telecommunications Act and the Cable Act have had privacy provisions since the 80s. I have not been practicing since the 80s, but close. And so part of the early work that I did was advising cable clients on some of those privacy obligations.

I also did some intellectual property work and worked on the DMCA. There were conflicts between the Cable Act and the DMCA. And then in September of 2001, after September 11th, there were a lot of law enforcement requests coming in related to cable customers. So we started down this path of looking at how to reconcile some of the federal privacy laws with the Cable Act. And I was able, through litigation, regulatory responses, and then actual government advocacy, to start working diligently in this area. I then moved into a cable company where I built their privacy program, and went back to the firm.

I can’t decide exactly where I want to be, although I’m very happy at Comcast right now. I have spent the last six years at Comcast, in the cable division, as their chief privacy officer, and only within the last couple of months have I moved into the corporate role, where I’m trying to work across all of the entities to align and find efficiencies in their privacy practices, and just data generally. Nobody does just privacy anymore, right?

Jodi Daniels: 05:18

And there’s a lot of data that is in your purview, for sure. And if we think about that, you know, privacy can’t work in a vacuum, and the business needs to know what its obligations are. So privacy and the business really need to work together. What I’d love to talk about is: can you share how you balance required privacy obligations with, you know, the interest in moving quickly with new technology deployments?

Christin McMeley: 05:47

Yeah, I mean, with technology deployments, there are two different sides, right? There’s the technology that the workers are bringing in to use to build our products and services or to run our businesses, and then there are the products and services that we’re building, the technology that we’re building. And I think in both instances, you know, the very first thing a company has to do is really think about its risk profile.

And when I say that, sometimes people are like, oh, risk? We are a law-abiding citizen, you know, we comply with all of these laws. And I’m like, that’s not the risk that I’m talking about. When you think about the privacy laws in the United States over the last six years, for example, since California and all of the other states that have passed laws since then, you know, it is a very quick-moving regulatory and legal landscape, with changes all of the time. And if you’re waiting for 100% perfect compliance, I think that will really slow a lot of companies down.

But if you can figure out, you know, we’re good with 80, 90%, and we’re going to keep working to improve over time. And couple that with a culture of consumer trust (Jodi, I think you guys will appreciate this). Right. Like if you’re really worried about being a good corporate citizen, establishing trust with your workforce and with your customers, and, I mean, this is going to sound so cliche, but really having a culture of kind of do the right thing, and then some foundational education on, you know, what the general privacy principles are or the data governance principles are. I think all of that gets your business teams there.

If that’s really ingrained in your culture and you understand where you want to be from a risk threshold, you’re 80% of the way there. Your business teams can move, because the lawyers or the privacy operations teams aren’t slowing them down, and it actually allows the business to bring the edge cases to the privacy lawyers or the privacy operations team, so that they’re focused on those edge cases, or the more sensitive data processing, versus the 80% of routine work that is going to be done within the business.

Jodi Daniels: 08:06

So with that in mind, what might be some of the best practices to get the privacy and business teams to really collaborate well, to bring you those edge cases? Some people will come on here and say, we don’t want to be the team of no. And sometimes people won’t come forward because they’re concerned it’s just going to get the big red no stamp. What could you offer to help with collaboration?

Christin McMeley: 08:32

Well, Jodi, since I know you come from a cable background, it’s very different from a historically regulated industry where you can just say no. And having worked in a law firm and seen many different clients, I think that’s part of understanding your culture as well. In any business today, you can’t just be the Department of No, but you do have to have some boundaries. You have to understand where the boundaries are.

The business has to understand where the boundaries are. So what would be a hard no? Where are the places that you’re just not going to go, or the things that you’re not going to do? Understanding that. And then, where are the more sensitive areas?

So if you’re going to be dealing with children’s data, for example, or if you’re going to be doing something on a marketing or advertising front that’s going to require some type of consent, I think getting the business to really understand that those kinds of processing activities are going to take more time, so that they’re prepared for it, because nobody likes surprises. I think that really helps set the groundwork for those edge cases. And, you know, it’s all about relationships.

It’s just, you know, making sure that you have regular touch points and, you know, are working together with the teams.

Jodi Daniels: 09:54

I have to emphasize what you just said about relationships, because I think that is a massive miss in so many companies. They’re often focused on their documents, their checklists, their assessments, and they actually miss the human touch, whether it’s in person or virtual, just literally having the conversations and getting to know the person or the team. That is where you learn so much about what is actually happening and build those levels of trust.

Christin McMeley: 10:26

Yeah, for sure. And I think, too, you know, part of it is, in building that trust, the privacy lawyers have to meet the business where they are. And, you know, hopefully the business meets the privacy lawyers where they are as well. But here is what I have found in the work that I’ve done.

I know how bad the privacy laws are. Not bad, but how complex they are. And I know the risk that is associated with them and the potential fines, and when it’s the only thing that I work on, that’s my whole world, and, you know, privacy operations, we’re very focused on that. When I work with an engineering team, for example, I am one concern of many. And, you know, even with the other lawyers in the legal department, you have to understand, you have to put the risk in the right framework, so that you’re looking at risk across everything, versus thinking that this is the only thing that the business teams have to focus on.

And I think that really helps: when you have an appreciation for everything that your business customers have to deal with, and, you know, you’re balancing this with them, you become a good partner. I think that’s really helpful as well.

Justin Daniels: 11:51

So what are your thoughts on how companies integrate privacy into their business strategy? And maybe what you’re doing with artificial intelligence might be a helpful use case?

Christin McMeley: 12:03

How do companies do this? I think one thing that you mentioned a few minutes ago, Jodi, was compliance checklists, and, you know, how people can get so down in the weeds with the documentation. All of that is 100% necessary. And, you know, to go into AI a little bit, I think our teams, our privacy operations teams, have really been focusing on: what are these kinds of routine types of assessments that we can bring artificial intelligence in to help us with, to at least get the baseline?

Then, for the lawyers or the privacy ops teams to review: how can we start to do more scanning and more information collection activities from our systems? How do we build compliance checklists into some of the work that the engineers are doing, for example? And that actually takes a lot of the work off of the teams, because privacy and security and everything else that we deal with is, you know, just one of many things that they have to consider.

I think that the more you can make things routine and automated, the better, and that is a huge focus of Comcast’s teams: how we are going to automate a lot of these processes. That is really important. All of that being said, though, I think one of the most important things for integrating this into your business is, again, having the teams have a general understanding of what privacy is. Because if you’re taking all the work away from them, if you’re automating everything, or if it becomes truly just a compliance exercise where they’re checking boxes, you end up in a place where no critical thought is being provided. And in some instances, if people are thinking, oh, I’m building this product and it has the potential to do X, Y, and Z, and then they start to put consumer trust first, you know, thinking about the data that’s being used or how it might be misused. Right. They can start to think a little bit more critically and build more privacy controls into those products, and more consumer choice.

Like anytime consumers have a choice, our engineers are thinking about, you know, how can we give our customers more control over their information?

Justin Daniels: 14:35

I guess when I listen to you say that, I’m just wondering how that works with AI and the checklists and all that, because the regulatory landscape is far behind. You know, we’ve got the FTC, we’ve got the EU AI Act, which Comcast has to deal with, and the Colorado AI Act. And yet it’s almost like you have to sit down in a room and say, okay, we want to do this, but what are some of the other unintended consequences that could happen here? Because if you wait until the regulations catch up, like you said before, you’re not going to get to that 80% or 90%. So if you’re going to start implementing stuff, how do you start to think about what that 80% or 90% looks like with AI, where some of the unintended consequences aren’t so clear?

Christin McMeley: 15:22

So I think, I mean, we are still in nascent stages, and it depends on the type of AI that you’re talking about. Is it, you know, just general artificial intelligence? Is it generative AI? Or is it traditional artificial intelligence like machine learning, you know, or some of the large language models? You’re looking at technology that we’ve used in our systems and that many companies have used. Like, any company that uses IVRs, you know, for routing calls when they come in, there’s a certain type of artificial intelligence that is used for that. Or if you think about the remote control, right, and the preferences: you know, if you ask for one type of programming, how you find other types of programming that might be similar. I think a lot of companies have used that for a long time, and have had to go through that exercise of thinking about how, you know, how does this work?

What do I want to surface up? If I request one type of programming and I’m going to have a Super Bowl party, you know, what’s going to be displayed on that screen in front of everybody else? And what are the privacy implications of that? So I think that is a lot of the critical thought that I was talking about earlier. You can’t just go off of checklists, and you can’t just look at what the law is covering, because the law is going to lag. And, you know, if all you’re looking at is what you have to do, you’re not thinking about those unintended consequences.

So I think that goes back to the culture. Right. Like, how is this best going to be used by my customers or by the workforce? And I think you just have to ask those questions about what the potential unintended consequences are, and you’re not always going to get them. But, you know, it really helps if you have a diversity of views in the room.

People from different backgrounds, thinking about different types of issues. I think that really helps as well.

Jodi Daniels: 17:35

That’s a really good point, making sure you have different perspectives in the room. Now, you had mentioned before the complexity of all these laws. If we think about it, we have 19 state privacy laws that have passed, you have, you know, cable and communications laws that are here. There’s a long list of laws that companies have to deal with. What might you offer on how you can simplify this for the actual teams who need to absorb them?

Christin McMeley: 18:05

I think it’s really difficult to try to simplify this, and I think a lot of companies struggle with it. One of the things that we’ve done within Comcast is we’ve actually built a system, a platform, that has over 400 privacy, security, and AI laws already input into that database and then tagged by various domains. And then you can actually search and pull out business guidance: when do I have to provide notices? What has to be on my website?

You know, there’s a category for assessments that are needed. So I think that was a huge undertaking, and it’s a lot of upkeep. And we’re looking at bringing artificial intelligence into that platform to see how we can automate it and make it even more efficient. But I think you can break it down into, I mean, almost the traditional approach of, you know, what are the different categories, and which audiences need to get this information. I mean, there’s a lot of information that is really privacy by design, which you’re giving to the engineers as they’re building things.

Or there’s a lot of information that is related to marketing and advertising. So being able to segment what people need to know, and getting that information out to the right audiences, the right amount of information to the right audiences, is really helpful.

Jodi Daniels: 19:43

I’d love to push a little bit further on that. To me, that brings up training and education and awareness. What have you found to be effective in getting that information out, so it’s really part of the daily activity? I’m in my role and I’m really thinking about privacy, and maybe I go to that database, or maybe I’m the edge case and I know I need to call someone.

Christin McMeley: 20:04

I think that our cyber team actually does a really great job, and the privacy team is learning from them. I think gamification is really helpful, and they’ve been really great at that, and we’re looking at how we incorporate it. I mean, hopefully at this point all of your listeners have some sort of annual required privacy training that is going to, you know, hit all of the high points of the general fair information practice principles.

But we all know that that’s not enough, right? Like, you can’t give a 20- or 30-minute training once a year and expect people to remember it throughout. So it’s the combination of everything that we’ve talked about. It’s the touch points with the privacy team members, whether it’s the lawyers, the compliance piece of the team, or the privacy operations team members. It’s getting those people out regularly, having those touch points to have conversations and questions. You know, at Comcast, the privacy teams have office hours so people can come and ask questions.

It’s the daily activities sometimes, you know, depending again on the audience. It can be, like, a question as one of the agents logs on that reminds them of something, you know, or that asks a certain thing. And again, drawing from cyber, we’ve started having privacy tabletops. So we’ll work with various business units on a hypothetical, whether it’s building a product or a marketing campaign or whatever it is. We actually sit down with the team members and go through a hypothetical scenario to help them think about things in different ways.

Jodi Daniels: 21:53

That is a really good idea, and I’m excited that you talked about that. A privacy tabletop. Very, very good one. And now I also want to know the types of prizes. Do I get a good prize if I take one of these security or privacy trainings?

I mean, if it’s a game, I should win something.

Christin McMeley: 22:08

You get, like, mugs. Once a year, I think, the company does give out the privacy cookies that we’ve seen. You know, cookies are always a very welcome gift in our world.

Jodi Daniels: 22:25

Those are good. My favorite food. Only good chocolate chip cookies, though.

Christin McMeley: 22:28

No no no. Ice cream is your favorite food, right?

Jodi Daniels: 22:31

No. Chocolate chip cookies, really. Oh, but when in Italy, you do as the Romans do. So then you walk around and you eat the gelato cone.

So. Very fair point. In Italy it is gelato. But in the US, where I don’t have my Italian gelato shop on every corner, it is a good chocolate chip cookie.

Christin McMeley: 22:51

And now we have a little bit more personal information about you, Jodi.

Jodi Daniels: 22:54

There you go. Honestly, anyone who meets me in the first five seconds typically learns that I love chocolate chip cookies, so it’s out there.

Justin Daniels: 23:03

Well, you brought up something interesting about the privacy tabletop. Now, I’ve done a bunch of security tabletops. But when it comes to starting to integrate all that with AI, I have found a really helpful way to train people is to basically sit them in a room, have them bring up use cases for AI, and we prompt in front of them, and then you can start to see it.

And I guess, you know, I’ve never thought about a privacy tabletop. What does that do and what does that look like?

Christin McMeley: 23:33

Well, it can be, you’re building a new application, and the application is going to have some adult content and some children’s content. And, you know, you just basically start to go down a decision tree: when you create this, are you going to have profiles? If you’re not going to have profiles, how are you going to comply with some of the new laws that require you to differentiate, not just for under 13 anymore, but under 16, under 17, under 18? What does a 17-year-old look like versus a 20-year-old? You know, what is the difference here? How would you tell? Well, if you can’t tell, what are your obligations?

So you just kind of walk through the whole build of the program, or whatever it is, the product that they’re building or the service. Or the same if you’re going to do a marketing campaign, or, you know, as we work with our web developers, you’re going to build this website. What are you putting on it? What kind of cookies are going to go on it? How are you collecting data? You just go through each step of the way.

So, I mean, whether it’s an actual tabletop the way cyber does it, where you start with a problem, we can do that now in privacy, because we’ve seen enough enforcement actions. We can say, okay, you get a letter from the California attorney general and it says X, Y, and Z. How do you respond? And I think that could be a good exercise as well. But we’re trying to be more on the proactive side versus reactive.

Justin Daniels: 25:11

Well, it sounds like what you’re really doing with this privacy tabletop, and the example that you just shared with us, is furthering the principle of privacy by design, because it’s in the design phase. Okay, what are the profiles? We’re going to have adults.

We’re going to have kids. So that way, privacy is baked into the design of the application, as opposed to them saying, hey, we’ve designed this app, and then you go and you’re like, well, what about this, this, and this? So it really sounds like this is a great way to promote privacy by design principles.

Christin McMeley: 25:43

Yeah.

Jodi Daniels: 25:43

Hopefully. What comes to mind for me is that it’s like a live privacy impact assessment. It’s the key questions that you might have in an assessment, but you’re asking them with the real-life people here to think through what it might look like.

Christin McMeley: 25:57

Yeah. And I think it’s really helpful to do it on a regular basis, because we get new team members all the time. It helps them understand the way things work, and this goes back to the collaboration, right? Because it gets your business team members and the privacy team members thinking together and understanding how each other is approaching a particular situation. And then also, honestly, having the lawyers in the room.

There are very different perceptions from business people about, you know, what might be acceptable, and it helps to have the lawyers there saying, this is how a regulator would look at this situation, and, you know, helping sensitize them to that as well.

Jodi Daniels: 26:42

Now, as a privacy professional, you know a lot about privacy and security. So we ask everyone: what might be the best personal privacy tip you would share with friends?

Christin McMeley: 26:55

So my nephew and his girlfriend were just visiting, and they started talking about advertising and how they’ll talk about something on a phone call or with a friend, and all of a sudden they’re getting ads for it, and they haven’t done, you know, any kind of web search. And they’re convinced that it’s just showing up from their conversations. So where we went from there is: have you looked at the privacy settings on your phone? It’s very difficult to manage every website that you use, you know, but just start with your phone. Who has access to your location information? Who has access to your microphone?

And they opened it up and they’re like, oh, why does that app need access to my microphone, or my camera, or my photos? Like, these are all good questions. So go take a look and figure it out.

Jodi Daniels: 27:53

Did you teach them how to read every word of the privacy notice as well?

Christin McMeley: 27:59

I like the privacy notices where you can find, you know, like the recipe for cookies in the notice. Yeah.

Jodi Daniels: 28:06

No, that’s a good tip. It’s a really good thing if you’re standing in a line at an airport, kind of bored. It’s a great time to just go through those things when you don’t have anything else to do.

Justin Daniels: 28:17

So when you’re not serving as a privacy pro, what do you like to do for fun?

Christin McMeley: 28:24

I never in my life thought I would be saying this, but I actually like camping and hiking. I mean, I’ve always liked walking, but actual time in the woods, it’s kind of meditative. My husband has figured out it’s the one place he can take me where I don’t have cell phone coverage and I can, like, truly get away. But this was a hobby that started during the pandemic. And other than that, I love cooking.

Actually, Jodi, sometime I will make you my iced sugar cookies. I love to decorate sugar cookies.

Jodi Daniels: 29:03

That is really fun, and I hope you have some good nature walks planned soon, this week and into the fall with this beautiful weather. Well, Christin, we are so glad that you came today to share all that you have. I just absolutely love this privacy tabletop idea, and I’m very excited for more people to hear all about it. If people would like to connect with you, where might they go?

Christin McMeley: 29:28

They can go to LinkedIn and I’m the only Christin McMeley there. So search for me and yeah, I’ll look forward to it.

Jodi Daniels: 29:41

Well, Christin, thank you again. We appreciate it. Thanks.

Outro: 29:49

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.