
Brittney Justice is the Global Head of Privacy at Valvoline Inc., leading the company’s privacy strategy. She works at the intersection of data privacy, technology, and AI, advising on governance and risk at scale. Brittney also serves on the IAPP Privacy Law Advisory Board, shaping the future of privacy law.
Here’s a glimpse of what you’ll learn:
- Brittney Justice’s career journey from litigator to data privacy and security lawyer to the Global Head of Privacy at Valvoline Inc.
- The importance of building a consistent global privacy program with jurisdiction-specific requirements
- How privacy and security pros can position themselves as trusted internal business partners
- Strategies for evaluating new AI tools and managing related privacy and security risks
- The role of an executive AI deepfake simulation in educating employees about impersonation risks
- Brittney’s personal security tip
In this episode…
Privacy and security leaders operate in an environment where innovation moves quickly, and risk evolves just as fast. That’s why global companies need to maintain one consistent privacy program and layer in jurisdiction-specific requirements as privacy laws evolve. At the same time, organizations are adopting new AI tools while deepfakes and executive impersonation threats introduce new reputational challenges. How can companies enable innovation while staying ahead of emerging privacy and security risks?
When privacy and security teams are pulled into projects early, relationships strengthen, and teams no longer hesitate to involve them in new initiatives. Instead of being seen as gatekeepers, they become part of the conversation, strengthening trust and collaboration across business teams and prompting proactive issue spotting. That same discipline applies when evaluating and managing AI tools, where privacy leaders need to coordinate with business teams to understand what the tool will accomplish and how it could affect the company. This requires asking: what problem is being solved, what data is involved, and what the real impact would be if something goes wrong, especially when third-party vendors and model training are involved. That same mindset is critical to educating employees about AI deepfakes and executive impersonation risks, as coordinated response planning can reduce impact.
In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels talk with Brittney Justice, Global Head of Privacy at Valvoline Inc., about building a globally consistent privacy program while supporting business growth and managing emerging AI risks. Brittney explains her approach to building and maintaining one strong global privacy program without creating separate versions for every applicable jurisdiction, and the importance of embedding privacy and security teams into projects early to identify risks. She also shares tips on evaluating new AI tools, managing third-party and AI model training risks, and using executive deepfake simulations to strengthen employee awareness and establish clear escalation paths.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Brittney Justice on LinkedIn
- Valvoline, Inc.
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Powered by Rise25 Podcast Production Company
Intro 0:00
Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:21
Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.
Justin Daniels 0:35
Hi, I’m Justin Daniels, and hopefully I can read this message. You’re funny. Not so funny. So I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.
Jodi Daniels 1:04
And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together. We’re creating a future where there’s greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, in case people are wondering, yes, we actually do our intros live. And I was teasing Justin, could he read our little notes document here? Because sometimes it’s a little small on the screen, and I’ve been scolded before. But Justin can actually read, just in case anyone was curious.
Justin Daniels 2:02
I don’t know why you provided context. I think it’s funny, the irony of it: the guy reads for a living. Can you read? Hopefully.
Jodi Daniels 2:10
Well, that’s why. But maybe not everyone knew that; we could have a new person who just learned who you are.
Justin Daniels 2:17
I understand. Well, let’s turn to our guest today. We’re going to have an interesting conversation with our guest, Brittney Justice, the Global Head of Privacy for Valvoline Inc. She leads privacy strategy throughout the company and works at the intersection of data privacy, technology, and AI, advising on governance and risk at scale. She also serves on the IAPP Privacy Law Advisory Board, shaping the future of privacy law.
Brittney Justice 2:45
Hello! Hi, guys. I’m so excited to be here.
Jodi Daniels 2:51
Well, we’re really glad that you are, and we always start with understanding people’s career journey to where they are today.
Brittney Justice 2:59
So mine, this is always my favorite to talk about, because mine is kind of interesting. When I was in law school, I thought I was going to be a Fourth Amendment lawyer. I loved anything Fourth Amendment: search and seizure, privacy, all of that. And at the time when I was in law school, there was no way to come out of law school and go directly into a data privacy role in a corporate context. So I started out at a Big Law firm doing litigation, and then I quickly pivoted into data privacy once the work started to pick up, GDPR, all of that started to come online. I moved to Baker McKenzie, so another Big Law firm, doing data privacy at scale. And I always tell people, you know, as kind of rough as Big Law firms can be, it really was the best boot camp that I could have ever asked for in terms of what I do now, working in-house, because nothing scares me anymore. Guys, nothing scares me. I can see problems of every size, scale, and jurisdiction, and I’m like, guys, I’ve seen this before, but 10 times worse. And so, yeah, I think starting out at law firms and then switching in-house after, I think it was maybe four or five years, really set me up for my in-house experience, because I feel like I can stay calm and focus on solutions at all times, and I feel like I’ve seen everything. So I’ve kind of done it all, right, the law firm experience, the in-house, and I think I’m gonna stay here. I love it. I love it.
Jodi Daniels 4:32
I really like your perspective. I think the “nothing scares me” is awesome, and how you’re able to approach problems. That’s really a unique perspective and very valuable. So thanks for sharing.
Brittney Justice 4:45
Yeah, yeah. Normally I feel like people, you know, they’re like, oh, law firms are horrible. And I’m like, guys, you see, I mean, Justin, I’m sure you’ve seen this, but I’ve seen small clients who have incredible programs inside: privacy compliance, security. And then I’ve seen very large multinational companies who are disasters, right? And so you’ve seen it all, right? So I feel like I’m well equipped at this point to really handle anything, like a rock that can’t be moved.
Justin Daniels 5:17
I think what’s interesting with what you’re saying, Brittney, is you’ve developed a level of, I feel the same way, been there, done that. So when you get into a situation that you’re not quite familiar with, you’re really able to use a reservoir of experience and knowledge: hey, I’ve been through this before, I can get through it. And I think that’s really important, especially as we’re going to talk a little bit about artificial intelligence. And, you know, for people like you and me, it is incredibly disruptive to the way that legal professionals deliver value.
Brittney Justice 5:50
Oh yeah, oh yeah. And it’s moving quickly, right? I think data privacy laws coming online, that also happened pretty quickly, but not at the speed we’re seeing with AI, right? And I think for business folks too, I mean, it’s the new shiny product. So I think, with a lot of in-house attorneys who might be listening, it’s being pitched left and right. And so you’re trying to manage risk at the same time as you’re trying to, obviously, you know, build out innovation, and you want to make sure, of course, to be a profitable company, right? You’ve got your risk takers, right, your business folks who are trying to, you know, be risky and innovative and kind of move the ball forward. But it can be tough sometimes to do both at the same time and ultimately maintain yourself as a good, well-trusted legal advisor.
Justin Daniels 6:41
So as many people may know, Valvoline operates internationally across many jurisdictions with very different privacy expectations and regulatory regimes. So can you educate us a little bit about how you think about building a privacy program that is globally consistent but still locally practical?
Brittney Justice 7:01
Yeah. So to be honest, I approach it the same way I did when I was in private practice, which is maintaining those kind of overarching global principles, but still maintaining that local execution, depending on the jurisdictions that you operate in. And so I think you’ve always got to have your core privacy framework that applies everywhere, right? Those basic principles like transparency and purpose limitation and data minimization and accountability, right? I feel like that’s kind of the backbone of privacy, and then you have to start layering on those jurisdiction-specific requirements, especially in the United States, with all these new state laws coming online, and Canada as well, and Latin America. And so I think the goal is to not create 1,000 different programs, right? Whenever I talk to our board of directors and our CEO, you know, I’m like, look, we’re trying to make this as streamlined and easy as possible, and to create one really strong program with very thoughtful local adaptations, if that makes sense. And so I think if that foundational base is solid, adding that regional nuance isn’t as difficult; it’s a little bit more manageable, I would say, when you take that approach, or at least I have found it that way. And so, I’m not going to say it’s easy. It’s getting harder, Justin, as you were saying, with AI, right? Like, now we’re not only dealing with privacy laws, but there are state AI laws that are coming online. And so we’re seeing kind of the same thing that happened with privacy, but on the AI front. And so it’s tough. It’s not easy. Everything is moving very quickly, but I think, again, kind of maintaining that backbone privacy program, and then layering on additions where they need to be. Like right now, we’re kind of trying to figure out what to do with the CCPA regulations that came online January 1.
And in a lot of ways, we have processes that are already in place that we can kind of adapt, right? Like privacy impact assessments; we were already doing those. And so you can kind of tweak things so that you’re not creating an entirely new process. But you just have to stay, and I’ll probably say this a lot throughout the program, but you have to stay agile and flexible. I think that’s the key to all of it, because you never know what could happen. There could be a new jurisdiction or new state, or, who knows, maybe a federal privacy law that we’ve all been potentially thinking about that never happens. Who knows?
Jodi Daniels 9:30
I love that you use privacy risk assessments as an example, because I feel like I use that one all the time, because you have a lot of different jurisdictions that require some form of a risk assessment. And what we find is that sometimes companies will create one big template and have sort of offshoots with specialized questions if they need to, sort of using a lot of if-then logic. And other times they try and look at the majority of the laws and take the conservative approach to get the questions that they would need to make sure that they’re in compliance holistically, and take that approach. So it sounds fairly similar to what you were saying, which is, take an existing process and then you can tweak it a little bit as you need to, whether it be for AI questions or a new jurisdiction that comes along. Exactly, that’s exactly it. Now, at the same time, we have a lot of companies where people think that privacy and security are the blockers, and especially in a complex company, how do you try and frame privacy and security so the business thinks, okay, they’re not so much obstacles, but they could actually enable what I’m trying to do?
Brittney Justice 10:48
Yeah, so I’m so glad this is a question that I’m being asked, because this is something that I take really seriously. I will say, going from private practice to in-house, it’s kind of tough at some points, right? Because I feel like I was initially taught to point out risk, right? Be, you know, extremely conservative a lot of the time in my advice. And then you kind of get thrown in-house and everybody’s like, hey, we need to get this done, we have 24 hours, right? And so I would say, now this is a really big deal to me: maintaining those really close partnerships and relationships with the business. So I never show up with a no. I never show up with a no answer to anybody in the business who has a question or a new proposal for me. I always show up and I’m like, hey guys, you know, let’s talk about what you’re trying to accomplish, at all points in the process. I just want to know what you’re trying to get done, right? Like, what’s the business purpose? What’s the goal? What are we trying to do here? And I have found that when you take that approach, to understand the business initiative first, you become a partner, instead of that gatekeeper or that blocker, Jodi, that you were just saying. And so instead of saying no, I find that a lot of the time my response is like, hey, maybe we can’t do it exactly the way that you’re pitching it to me or proposing it, but I have some other ideas that can still get you where you’re trying to go, right? Like, there’s never a complete no answer. It’s just like, let’s tweak it a little bit, right? And I think that mindset changes everything, because we don’t want to stop innovation, right? Like, my job is to define risk, to issue-spot risk, right, to limit risk.
But there’s also people in the company whose job is to innovate, right, and to be risky and come up with really cool, big, grand ideas. Like, the people with big ideas and big brains in the company, they’re getting paid too, and they have a job too. And so the goal is, how can we come together and work together and make sure that the business is able to move fast and be innovative, but in a safe way, right? And so when you approach it that way, at least I have found that people pull me in earlier. Like, nobody’s afraid of me anymore. I think when I first joined, they were like, oh god, there’s a privacy person here now in legal. And now they pull me in earlier, and they’re not afraid of me, and we have great relationships. And interestingly enough, I feel like I’ve formed such great relationships with folks in the business, who don’t see me as a blocker, that sometimes they issue-spot for me now. Like, if there are calls that I’m not on that are very early in a process or a pitch or proposal, they’ll call me, or they’ll Teams me afterwards and be like, hey, Britt, this is happening, I think you need to know about it. I’m like, this is incredible. This is exactly what I’m trying to build out, right? But it really does take a lot of relationship building and trust. And I think once you get there, you really are able, from the security and privacy side, to start to have your fingers in everything and keep a pulse on the company, again, while being a trusted member of the business as well. Like, I don’t view myself as somebody just sitting in legal. I also view myself as somebody who works in the business, right? Like, I’m there to innovate as well and help them move the ball forward too.
Jodi Daniels 13:52
I love that you said you have people thinking about it first. I had a call earlier today, and the company was sharing that since we’ve been working together for the last few years, the product team is so privacy-minded that they are thinking privacy first, a really similar thing to what you were just saying: almost solving and really thinking about the issues first. Which, for anyone trying to build a privacy program, is a massive win. I feel like we earned our own Olympic medal over here.
Justin Daniels 14:22
I guess, Brittney, the other thing I’m hearing when you talk this way, it sounds like you have defaulted to a perspective of: how can I enable the business? And you have made a conscious investment in your time and how you wish to be perceived that is helping to build a culture around your business teams of, hey, we do want to call Britt, because she is an enabler. She gets us. She’s always trying to figure out how to do it. And that’s their default view of you.
Brittney Justice 14:52
Exactly right? Like, I want them to realize that I’m going to make your life easier, right? Like, I’m not going to slow things down on purpose, right? Sometimes things will have to be slowed down just because of the nature of the risk, but we all know this, right? The earlier you pull in your security folks, your legal folks, and kind of embed us in the process, then things do move faster, I think, and things are more efficient, right? Because we’re able to catch things and pivot faster, rather than waiting until the last-minute rollout, right, and legal coming in like, well, hold on, this is all messed up, we can’t roll this out, right? And it seems straightforward, but it never is, right? And so I find that when you build this kind of holistic program where it’s focused on relationship building and trust, it really does eliminate a lot of the risk, right? And things are more efficient. It’s incredible.
Jodi Daniels 15:45
So one of the big areas where people are innovating significantly is AI. And as you were talking about earlier, how you’re trying to embed yourself in those daily conversations with the business, how has that translated to AI, and how you’re thinking about managing the risk from that new technology?
Brittney Justice 16:14
Yeah, so I won’t lie, it’s hard, and it’s hard because it’s still evolving. And I don’t know if either of you feel the same way, but I feel like once I’ve kind of gotten it under my thumb, and I understand how to evaluate risk, right, there’s a new product that’s launched or a new capability that’s rolled out, and I’m like, okay, now I have to figure out how I analyze this and how this can affect the company. And so at this point, I try to keep it as broad and basic as possible. So I always have three questions that I ask: What problem are we solving, and what solution will this AI product bring? Of course, as the privacy person, what data is involved, which is key for me, and I think always where the biggest risk lies. And then, what’s the real impact if something goes wrong? And that’s always going to shift based on what kind of product it is. Is it an HR product? Are we building internal AI systems? What is that, right? And I think it’s tough, because AI products can improve efficiency and experience, which is great, but they can also bring a ton of risk. And so at this point, we’ve been trying to pressure-test data flows, which is really complicated with LLMs and model training and retention and oversight, human oversight. I would say the biggest area that’s been tough, and that I feel like a lot of people don’t talk about, is vendor controls. So if you’re talking about vendors who use AI, the contracts are complicated, and that takes up a lot of my time, right? Like, again, maintaining a good contract, ensuring that I understand what the product does, and ensuring that we have those provisions in place in that contract that are going to protect the company, especially with model training, right? Like, hey, are you using our data to train your models? Is your product secure?
Because a lot of times we’re not talking about the Microsofts, right, or the Googles. We’re talking about startup companies. Like, a lot of these AI vendors and products are startups, which is really risky, right? They don’t have an ISO certification. There’s just a lot of risk, right? But the company’s pushing for it, and so I just try to stay pragmatic about it and let people know that, look, not every single shiny AI tool needs to be adopted. There are some things that are game changers and will be really great for the company, but some, I always say, are just expensive toys that have a ton of privacy risk. And so we try to analyze it that way. And you know, one month from now, this might be completely different, right? Because things are just moving and shifting so quickly. But I try to stay neutral. I try to stay AI-neutral. I find that some of my friends in privacy are either AI enthusiasts, right, like, they’re going nuts over Claude and ChatGPT and all these new products that are coming out, or they’re fearmongers when it comes to AI. And I try to be neutral, and I find that when I approach it that way, again, the business trusts me a little bit more, and I can maintain that more pragmatic approach when I give advice. And I think that has worked up to this point, because, again, this stuff is evolving so incredibly quickly, especially with state laws. I mean, we have no idea where this is going. We really don’t.
Jodi Daniels 19:31
I might have to borrow the shiny toy comment. I love that.
Brittney Justice 19:35
It is, it is
Justin Daniels 19:37
So, Brittney, before we came on, you and I were talking a little bit about deepfakes, and, you know, we’re seeing a rise in deepfakes and AI-driven impersonation targeting senior executives. So how does a global brand like Valvoline think about protecting executive identity and corporate reputation in this insane environment?
Brittney Justice 19:57
Yeah, yeah. So I love this question because it is, interestingly enough, something that we have been paying really close attention to, mainly because, look, we’re a 150-year-old company, a huge global brand, and deepfakes are becoming more and more of an issue, especially at that executive level. And the reason I love this question is because last year, and this is really cool, working with a third-party vendor, we created a deepfake video of our CEO launching a fake product, saying that we were going to create some type of technology and start selling used cars or something, something completely unrelated to our business, right? But, y’all, it was so realistic. We got it all prepared, and we launched that video at one of our workshops that had, you know, over 1,500 store employees. And I was in the room when the video was played, and you could hear the gasps. The gasps. Everybody in the room was like, oh my god, we’re going into the car sales business, like, what is going on? Everybody was freaking out. And then, it was really cool, at the end of the video, our CEO starts speaking German, and then she starts speaking Chinese, and everybody’s like, what is going on, right? And y’all, nobody caught it. Nobody caught it, until the very end, where she starts saying, hey, this is a deepfake, this isn’t really your CEO. And it was incredible. And the reason we did that was for educational purposes. We wanted to educate our employee base and let them know that, hey, this happens, this is possible, and here are some red flags that you need to look out for. And y’all, everybody loved it. They ate it up. They loved it. And so honestly, in this day and age, I think the best thing that you can do is focus on education, especially with your employees.
And then beyond that, you know, we build in layered controls in our systems: authentication processes, internal verification procedures, very strong incident response playbooks, which I think is important, and then really close coordination with our security teams, privacy team, and our comms teams. But I think the key here is, you can’t prevent everything, you really can’t. This stuff, again, is incredible: how real it can look, how realistic it can look. But I think you can dramatically reduce impact by preparing people, especially your employee base, and having those clear escalation paths. Because I think that is key when reputation is on the line. And again, these videos could say anything, right? Like, ours was very, you know, elementary, kind of funny, but that was the point, right? To just be like, god, this stuff is realistic, so keep an eye out, right? Like, you can’t believe everything you see, which is really interesting this day and age.
Justin Daniels 23:01
So, Brittney, I wanted to get your take on this. I ran a deepfake tabletop at a client’s conference last week. And where mine differed from what you guys did was I had a fake CEO who was literally the CEO who ran the event. And it was a deepfake video of a leaked video chat with her talking about concerns about a product that they had, from a cyber and privacy perspective, literally two hours before they’re going to announce a huge merger. Oh, wow. And what I learned from that was a couple things. One, your typical incident response playbook doesn’t work, because now your response is measured in minutes and hours. Like, the tabletop, you know, literally could happen in three hours, where this goes viral and it’s moving the stock price. And then the other thing I was curious for your opinion on is, even though a deepfake could be fake, if it’s a deepfake of the CEO having concerns about a product, and then the customers are like, hey, maybe we need to turn this product off, and you’ve never stress-tested doing that, the deepfake creates a legitimate concern about a product that may not have been stress-tested. And I was just curious if you’d thought about anything like that, or what your reaction is, because I was kind of floored when I got into it. I thought I’d have one set of outcomes, and I learned several other new lessons I hadn’t even thought of.
Brittney Justice 24:30
Yeah, yeah. I think you make a great point that your typical incident response playbook is not going to be the one you can follow in these types of scenarios. I think also, I mean, when we were thinking through the deepfake campaign we were doing, you’re going to rely on comms a lot more than you typically would in any other type of incident, right? You have your communications team involved. I think it almost expands the people that are involved in the process. And again, we’re a publicly traded company too. I mean, that just raises the stakes, right? And so it’s tough. I think, again, you can’t follow the typical incident response playbook, but just as we were having the conversation earlier about using existing processes and building them out and tweaking things, that’s how we’ve handled this, right? So, of course, education first, making sure everybody understands that, hey, this is a real threat, but then tweaking processes that we already currently use in the cybersecurity context, right, and expanding those. And again, one of the teams that we partnered really closely with was HR and comms for this, because we’re like, hey, you guys need to be educated on this too, right? Like, when you have tabletops now, I mean, the room is getting bigger and bigger. I don’t know, Justin, if you’ve noticed that, but I’m like, guys, we need an auditorium at this point, because it feels like the stakeholders that need to be involved are just expanding and expanding and expanding. And so it touches everybody. Everybody needs to be involved. Everybody has some sort of responsibility with this type of stuff. Even if you’re in retail or, you know, have stores, right, like, even your regular employees, everybody needs to be aware. Everybody kind of has a role in this.
Justin Daniels 26:19
One final point, you brought it up about comms. The thing I learned in doing the deepfake is you almost have to have a higher level of trust and decentralization of decision making so that you can respond much more quickly. Like, if you know it’s a deepfake and it’s coming out, comms may need to respond immediately to say, this is a deepfake, we’ll have more details to come, because then at least it’s part of the narrative. Because when these things go viral, as you demonstrated, even, you know, with what was shown to all your employees, people just assume it’s true. They’re not really discerning about it yet, and it just spreads like wildfire.
Brittney Justice 26:56
right, right? And can happen amongst your employee base too, right? Like, if we’re not even just talking about the stock market or outside parties customers, right? It could be your own employees who are like, Oh, well, I guess we’re doing this now, right? And so it just, there’s just a lot of factors at play with deepfakes. It’s, it’s terrifying, but just so fascinating. At the same time,
Jodi Daniels 27:18
Knowing what you know about privacy and security, what is the best tip you might offer to someone when you're hanging out at a party or playing around the padel court?
Brittney Justice 27:32
Whenever people find out I'm a cybersecurity slash privacy lawyer, they're like, oh, that is so cool, and they want to hear about it. And I always tell them: assume everything that you touch digitally has value to somebody. That is always my advice. Your data, your credentials, the systems you have access to, it all matters. It all has value to somebody, even if not to you. You're like, I'm just a low-level employee. I'm like, no, you're not. Oh, I'm just somebody who lives in Idaho. No, you're not. Everything has value to somebody. And I like to sometimes take the security perspective, because I don't get to talk about security that much these days. So again: strong passwords, MFA, slowing down before clicking. Man, we have some pretty cool phishing campaigns that we do internally, and sometimes you'll look and see who fell for the phishing attempt, and it'll be someone on the security team. We're like, guys, what is going on? So: phishing campaigns, slowing down before clicking, verifying unexpected requests. Justin, I'm sure you deal with this all the time doing tabletops, but when you're talking to people in corporate, most breaches don't really start with any type of sophisticated hacking. They typically start with somebody being busy, trusting the wrong email, and clicking something really fast. So I always tell my folks, whether they're in privacy or not: just be cautious. Everything you touch digitally has value. That is always my number one piece of advice to everybody, especially in this day and age. And that's only going to expand and grow, I think, with AI. Data is money, y'all. Data is money.
Jodi Daniels 29:26
Indeed. Show me the data. Show you the data.
Justin Daniels 29:32
Justin, yes. Well said. Oh, you knew that one. Wow, impressive. So, for 15,000... okay, yes. So Brittney, outside of work, what do you enjoy doing for fun?
Brittney Justice 29:44
So I kind of saved this for last. The reason I'm not on camera is because I have a gnarly black eye from playing padel, which I feel like we've been dropping hints about throughout the podcast. Outside of work, my favorite thing is to just be outside, be outdoors. I'm a big yogi, and I love traveling, but my hyperfixation right now is playing padel, which, for all the listeners who haven't played it before, is a mix of squash and tennis. You play with a tennis ball, but your racket is solid and has little holes in it. And it is just so much fun, y'all. It's the best cardio workout, and it's good for stress. If you're a stressed-out lawyer, you've got to play it. When I'm stressed, y'all, I go to the court, and 30 minutes later I feel like a new person. My security folks, if you guys are stressed, there are 24-hour courts you can play at any time of day. It's great in all respects, and it is the fastest growing sport in the world. So whatever city y'all are in, you probably have a court popping up soon. Keep an eye out. Ten out of ten, recommend to everybody.
Jodi Daniels 30:52
I see we have a big sport battle for popularity. We have our pickleball, we have our padel. That's lots of courts. It's gonna be crazy.
Justin Daniels 31:02
I'm gonna have to try it now, because I've heard... yes, another thing.
Brittney Justice 31:08
I think this episode should be called Padel and Privacy. Ah, there you go.
Jodi Daniels 31:15
Well, Brittney, we are so glad that you joined us. If people would like to connect and learn more, where could they go?
Brittney Justice 31:21
Yeah, yeah. Connect with me on LinkedIn. I'm not a LinkedIn influencer and I don't post a lot, but I share things that are interesting, and I always respond to messages. So connect with me on LinkedIn. I'm also going to be at IAPP; I'm always there in April in DC. So connect with me there, message me. I always love to get coffee and chat about these things. I'm a nerd for it outside of work too.
Jodi Daniels 31:45
Maybe you could organize a privacy padell game in DC.
Brittney Justice 31:50
See, you know what? I might do that, if I have enough time. See you guys on the court.
Jodi Daniels 31:57
Amazing. Well, thank you so very much. We really appreciate you joining us today. Bye, guys. Thank you so much.
Outro 32:07
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes, and check us out on LinkedIn. See you next time.
Privacy doesn’t have to be complicated.
As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.






