Sarah Stalnecker is the Global Privacy Director at New Balance Athletics, Inc., where she leads the integration of privacy principles across the organization, driving awareness and compliance through education, streamlined processes, and technology solutions.
Here’s a glimpse of what you’ll learn:
- Sarah Stalnecker’s 20-year career journey from digital marketing to Global Privacy Director at New Balance
- How to build privacy programs that balance business needs with regulatory compliance
- Tips for using consumer personalization expectations to guide privacy conversations
- Why privacy teams are naturally positioned to lead AI governance and mitigate AI risks
- Methods to embed privacy requirements into company workflows
- How to evaluate and select privacy technology tools
- Tips for measuring privacy program success beyond traditional metrics
In this episode…
Operationalizing privacy programs starts with translating legal requirements into actions that work across teams. This means aligning privacy with existing tools and workflows while meeting evolving privacy regulations and adapting to new technologies. Today’s consumers also demand both personalization and privacy, and building trust means fulfilling these expectations without crossing the line. So, how can companies build a privacy program that meets regulatory requirements, integrates into daily operations, and earns consumer trust?
Embedding privacy into business operations involves more than just meeting regulatory requirements. It requires cultural change, leadership buy-in, and teamwork. Rather than forcing company teams to adapt to new privacy processes, organizations need to embed privacy requirements into existing workflows and systems that departments already use. Leading with consumer expectations instead of legal mandates helps shift mindsets and encourages collaborative dialogue about responsible data use. Documenting AI use cases and establishing an AI governance program also help assess risks without reactive scrambling. Teams should also leverage privacy technology to scale processes and streamline compliance to ensure privacy becomes an embedded, organization-wide function rather than a siloed concern.
In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels chat with Sarah Stalnecker, Global Privacy Director at New Balance Athletics, about operationalizing privacy programs. Sarah shares how her team approaches data collection, embeds privacy into existing workflows, and uses consumer expectations to drive internal engagement. She also highlights the importance of documenting AI use cases and establishing AI governance to assess risk. Sarah provides tips on selecting and evaluating privacy technology and how to measure privacy program success beyond traditional metrics.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Sarah Stalnecker on LinkedIn
- New Balance
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Intro 0:01
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:21
Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.
Justin Daniels 0:36
Hi, I’m Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.
Jodi Daniels 1:00
And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together. We’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, we’ve had so much fun in our pre-show. I know this episode is going to be fantabulous. Why are you laughing at me? It’s fun. I don’t think it’s fun to laugh at me. And I like the word fantabulous. Okay, it’s a real word, and fabulous. That’s totally a real word.
Justin Daniels 1:49
I think we’re gonna have a really good show. We are. I enjoy the company. We’ve had a great pre-show discussion. So, Jodi, why don’t you hop to it?
Jodi Daniels 1:57
Oh yes. Well, today we have Sarah Stalnecker, who is the Global Privacy Director at New Balance Athletics. I might have quite a collection of New Balance shoes in our household. Sarah is the Global Privacy Director, as I just said, where she leads the integration of privacy principles across the organization, driving awareness and compliance through education, streamlined processes, and technology solutions. Well, Sarah, we’re so excited that you’re here. Thank you guys for having me.
Justin Daniels 2:26
So why don’t you tell us a little bit about your career journey to where you’re at now.
Sarah Stalnecker 2:32
So this is my favorite question, because I started my career 20 years ago. I promise this won’t be me hitting, tick for tick, every experience that I’ve had in those last 20 years. But if you would have told me I would have ended up in a legal department as a non-attorney, I would have laughed. And I think, you know, what’s really interesting about the privacy space is you see so many people coming from either a legal background or a non-legal background. But essentially, I got my start in marketing, so I spent the better part of 15 years, really, in media agencies, thinking strategically about how do we make sure that we’re showing up right for brands like McDonald’s and Oracle. What was really fascinating is, when I started my career, digital marketing was just getting going, and so it was a time where everybody bought TV and print, and that was very well known. And then there was this disruption of digital marketing and figuring out, how do we embrace it, and how do we make sure that we’re using it in a way that makes sense? And so that kind of continued to be a theme. I then went over to brands like Luxottica and Anheuser-Busch, and there I worked in a variety of functions, but it was this idea of really understanding how and why things like digital marketing work within companies, and really this explosion of data. So, you know, you went from I’m going to buy a piece of content and make sure that my brand is adjacent to a piece of content, to really shifting to say I’m buying audiences. And so the world, I think, significantly changed, where suddenly there was first-party data, there was third-party data, and I think a lot of brands were grappling with how do we make sense of it all, and how do we make sure that, you know, we are staying on the cusp of what’s happening. So what was really interesting in those roles is you start to go from things like search engine marketing, convincing brands that they no longer needed print Yellow Pages, to programmatic buying, to buying fans on Facebook, which now feels really old. And then about 10 years ago, I started at New Balance. And I started at New Balance in a marketing analytics function. It was a new function within the organization, and they were trying to figure out, how do we measure brand performance? And so it was putting together the framework of what’s working, what’s not working in our marketing activities, and why. That really quickly evolved into consumer data, so really understanding our customer. And in order to do that, you have a bunch of different disparate data sets that we had to bring together in order to say, okay, I know who Jodi is, for example, because I know how she interacts on our website, I know that she’s visiting a retail store, and then I know, like, baseline demographic profiles of her. But the reality is all of that was stuck in different systems, and so bringing that together was really the focus that I had for the better part of three or four years: creating infrastructure where we could drive those kinds of insights, because we had a consolidated view of people, and then also enable things like personalization. And while we were doing that, obviously all of these privacy regulations started to pop up. And so what happened was, I was working on a task force where they were like, look, we’ve got to figure out, how do we maintain our ability to deliver relevance to our consumers?
Because we know that’s what they’re demanding in a world that’s changing very, very rapidly. And so we had, at the time, something that was called pods, but essentially it was like a cross-functional working group that was figuring out, how do we tackle this thing of privacy? And through that work, the legal team said, hey, look, we’re actually hiring somebody to help build the privacy program. Is that something that you want to do? And so that’s where I found myself building a privacy program about three years ago. And you know, I think what’s really interesting, and I was saying this in the pre-show, is in my analytics function, it was really this idea of, how do you translate between what the business is trying to do and what technology is capable of delivering? And in order to do that, you really needed a translator function, someone that was able to navigate between those two worlds in order to deliver against the goals. And I see privacy sort of filling that same role, but just adding the third piece, which is the legal landscape: how do we make sure we’re continuing to enable the business to progress the way that it wants to progress, leveraging the existing technology set, but obviously abiding by the fact that there’s these really changing privacy laws happening in the background.
Jodi Daniels 7:07
Well, Sarah, I love your background, and we are very similar. I had worked at media companies, so I very much remember looking at charts of when digital was going to overtake the traditional media of newspaper, print, and radio. It was very, very fascinating to look at where it is, and then I also had the opportunity to work in marketing. So, as you just mentioned, you’re kind of being that translator. Help us, share a little bit: what is your approach to building a privacy program that’s trying to balance all these new privacy laws while retail is trying to move quickly?
Sarah Stalnecker 7:45
Yeah, I mean, you know, I think what’s been the number one challenge is making people care, making people care about privacy. And what’s been effective, I think, is recognizing the fact that the world has changed dramatically in the last 10 years. We consistently show a chart that says, you know, 10 years ago, like 10% of the population was covered under a modern privacy reg; it’s now close to 80%. So that’s such a seismic shift that’s happened in a very short amount of time. And for companies, what that means is, I think we used to feel like once the data entered our four walls, it was our data, and we could do whatever we want with it. And I think now this reality has set in: no, the data still belongs to the individual. They still have rights with respect to that data. So how do we start going about the change management that needs to occur? In particular, it’s not saying you can’t deliver data-driven experiences, you can’t deliver relevance, but you need to do it in a way that is transparent and really thinks about, what is the data that we actually need, versus collecting everything that we can, right? So I think companies used to be like, let’s collect everything, because we might need it down the line, and the analytics team might want to use it for something. Now, everything is really purpose-built and purpose-driven, and so I think it’s just getting people comfortable with the fact that the world has changed, and that means we now have obligations, and we have to be even more thoughtful about the way that we collect and otherwise process data. And I think it’s just a learning curve that every company is, you know, going through right now. And none of us are going to be perfect from the jump, but I think it’s this recognition that we just need to make sure we’re changing the way that we think about data and think about the people and the experiences we’re trying to power with it.
Jodi Daniels 9:39
Is there a story that you might be able to share about how you were able to get people to care? Because I hear that a lot, and I had the exact same thing; it was very hard to get people to care about privacy. Just curious if you have something you can offer listeners.
Sarah Stalnecker 9:57
Yeah, I’ve got the same problem. Well, no one really wants to talk about the law. Yes, that’s number one. But I think talking about the consumer has been a really helpful way in, and so really thinking about, what are consumers’ expectations? I consistently share, like, here’s consumer feedback as it relates to personalization and privacy, and it’s a really dynamic tension: people expect privacy and relevance simultaneously, right? Like, it’s dead center. And so what that means for, I think, brands and businesses is this idea of, how do you deliver against that relevance expectation, but do it in a way that honors their privacy? And one of the things that we say is we have an unofficial don’t-be-creepy policy, and that seems to really resonate, where you repeat it back. Someone goes, hey, we want to do X, Y, Z, this is the personalization experience that we want to power. And you say, play that back, and as a user, would you find that creepy, right? Like, is this an experience that’s valuable enough, and you think is relevant enough to the consumer, where they’re going to go, yes, that’s an expectation, and I’m okay, right, with the data exchange that’s required to power that? So I think starting with consumers, and really then putting people back in the position where they’re thinking about it through their own lens, versus I’m a brand, I want to do this. It’s, hey, if you were on the receiving end of that, how would that play? And I think that’s driven a lot of traction and actually shifted, I think, the mindset of seeing privacy as the roadblock versus privacy as sort of an enabler, right? Like, let’s have the conversation about how to do it in a way that honors that consumer expectation. I also use a lot of memes. Memes and GIFs seem to really work in this world. So the more you can find those, I think it just kind of puts people into a more comfortable place of having conversations about this topic.
Jodi Daniels 11:59
I love it, memes and GIFs. Why are you giggling over here?
Justin Daniels 12:02
Because it’s like privacy to forestall the creep factor.
Jodi Daniels 12:05
Yeah, “don’t be creepy” is also an official phrase in the privacy space, absolutely.
Justin Daniels 12:12
Well, I actually want to pull a bit of an audible and kind of peel away a little bit of what we talked about in our pre-show, which is, at least in my practice, every day I’m either using, encountering, or advising on things related to artificial intelligence. And one of the things that’s really interesting to me is how much, particularly at enterprise clients, privacy is playing a real central leadership role. And Sarah, I was wondering if you’d talk a little bit about how that has played out within the confines of your organization?
Sarah Stalnecker 12:47
Yeah, I think, as I was mentioning, you know, privacy teams had done a lot of work on what was happening in terms of data processing across organizations, right? So to catch up with all the documentation requirements in privacy laws, we were doing a lot of quick work to make sure we were understanding what was happening across, in our case, a global organization. And then you have AI coming, and what we were noticing is that every vendor that was being reviewed had some sort of AI capability or AI component, and it got to the point where we really needed to say, look, we can look at this through the lens of risk, but we also need to look at it through the lens of what’s the strategic benefit to the organization. And so really creating a governance process where the right people are getting in the room to talk through risk and risk management strategies, and then also just ensuring there’s transparency, so that when we are taking some sort of a risk, it’s a strategic risk versus something that’s just happening in the background where people don’t even know, right, that they’re assuming risk on behalf of the org.
Jodi Daniels 13:56
I feel like you have thoughts brewing in that head of yours.
Justin Daniels 13:59
No, I just wanted to hear more about this, because the challenge with AI is, similar to our podcast, you have to think about it across privacy, cybersecurity, intellectual property, bias, transparency, and then you have to figure out how all of that applies to various different use cases. Because you might be able to use AI, and I think it’s being used for your privacy program, but then the risks start to shift if you’re going to use it to, say, be customer-facing, or you want to create intellectual property.
Sarah Stalnecker 14:34
Yeah, and I think we learned a lot, you know, coming out of what happened when all the privacy laws passed. I think a lot of companies were in a hurry to figure out what was happening: how were we, you know, using data across the organization? And I think the AI piece gave us an opportunity to say, wait a minute, let’s take a step back and make sure that we at least have an idea of all of the use cases as they relate to AI. So we have that documentation, and as these laws pass, as they’re enforced, we’re not taking some sort of knee-jerk reaction of, oh no, we don’t even know how AI is being deployed across the organization. No, we have that transparency, and we can make adjustments as needed. And so I think that piece is really powerful, to the extent that you have that level of documentation, so that you aren’t overreacting, because this space is going to change. It’s going to change rapidly, and we can’t govern what we don’t know about.
Jodi Daniels 15:31
Yeah, Sarah, you mentioned earlier that there’s this balance of trust and relevancy, and that you also are moving towards purpose-driven data collection, which could still be a lot of data, and we’re trying to make privacy kind of infused all the time. How have you tackled that? Just like AI is this multidisciplinary approach, privacy is too. Curious if you can share a little bit about your experiences and how that’s working.
Sarah Stalnecker 16:04
So, I’d love to say we have one process and it meets everybody’s needs. The reality is that’s not the case. Practically speaking, what we’ve done is kind of met the different departments where they are. And what I mean by that is, if our e-commerce department, or a direct-to-consumer team, uses a certain technology and we need to embed a privacy process within that, we do that. So we’re not always saying you need to go use our tools, but rather we say, hey, look, we know you work, in the case of our e-commerce team, in JIRA; that’s your ticketing system. So for something like our website tracking, where we have documentation of all the website tracking across our global websites, we put that in JIRA, because we know that’s where the team is used to working and that’s where they’re going to work on this activity. And so if we can embed the privacy requirements there, versus forcing people to go to a different place, it just means that we’re going to generate adoption much faster. And so I’d say meeting people where they are has been, I think, the single biggest opportunity that we’ve kind of taken advantage of, so that we’re not trying to push, but rather trying to kind of bring people along. Similarly, for marketing, we make sure that we have discussions where, again, we lead with the consumer, we lead with trust, and we try to make sure that we’re having a conversation in a way, and using language, that makes sense for the audience, so that it does not feel, again, like a roadblock. My goal is to make sure we’re not a roadblock, because everyone wants to circumvent a roadblock, but rather get them to understand the why and ensure that our processes are adapted for them. And I think that has been a much more successful way to just generate adoption.
Jodi Daniels 17:58
Yeah, change management for people is hard, so hard. And what you’ve shared is that instead of just trying to completely change how everyone has to do everything, you’re adding in some change to an existing process. And I love how you said using language. I imagine that is true throughout the whole organization: when you’re talking to the people team or the finance team, you’re using the different language that they use, because, again, it’s really hard to change. It’s hard, and we’re not trying to make all these different functions be privacy experts. That’s your team, right? We just need them to understand a little bit. That’s very, very helpful, what you shared. So thank you.
Justin Daniels 18:39
So what role does privacy technology play in your program today? And then how do you decide when you’re gonna adopt a new tool versus sticking with an existing process? And to me, AI would be a great example. I can imagine you could dump in all kinds of data about Jodi buying online, Jodi coming to your store, and her demographics, and, using AI, start to make new associations that could create a marketing program. So then you’re saying, oh, this AI seems great, but wait a second, how do I vet this tool? Or maybe we’re not quite ready for that because it’s unknown. How do you approach that?
Sarah Stalnecker 19:16
Yeah, so to clarify, are you talking about other people’s use of tools and how we ensure privacy is embedded, or the privacy tools that we use and how we evaluate our privacy tools?
Justin Daniels 19:30
I think it’s what privacy tools do you use. And when you come to say, hey, do I need a new tool, or do I have a process? And maybe I inartfully tried to use AI as an example of something that’s used as a tool but has significant privacy implications.
Sarah Stalnecker 19:43
Yeah, so, you know, I think for a lot of us, we got started with some of the privacy tools just to meet the needs for things like cookie compliance, and that very quickly evolved to things like, how do we handle privacy rights requests? So the idea is, where does the technology come into play where it’s actually making our lives easier? And in our case, we use it because we know we need to scale process globally, and to do that, using some of these privacy tools is really the only way to achieve that scale. We’re constantly reevaluating the providers. One, the space is changing really quickly. Like, I think of this as very similar to how marketing technology was in the early 2010s, right? Where you saw that terrifying LUMAscape of 3,000 vendors that had entered into the space. I sort of feel like the privacy world over the course of the last five years is in that similar domain, obviously not at that scale, but that same idea of all of the software popping up to try to solve these problems of complying with the laws. We’re in a space now where we’re reevaluating what we have in place, and then, how do we streamline process? Right? Because, like what you were talking about before, information security has its own process, particularly as it relates to third-party vendor management. Are there ways for us to collapse and use similar processes, similar vendors, and both achieve what we need to achieve through that work, and then also have a centralized framework where we’re thinking about risk management? So to the extent that we can consolidate, that’s also something that we’re looking at. It also makes it easier for the business, because we’re using a common lexicon; they sort of understand what the vendor is and what the vendor is doing, and we’re not trying to introduce so many disparate vendors in this space. So I’d say that’s really our goal right now: consolidate and make sure we have the right need, and then also always making sure that the technology is helping us versus just making it more difficult for us to work through some problems.
Jodi Daniels 21:52
I always compare, or often compare, the privacy tech space to the marketing space. Maybe it’s our marketing backgrounds, probably. I just see incredible commonalities between them, and I had this little smile when you started to compare, because I think it’s so true, and I think it’s going to remain that way. I think you’re going to end up with a few big behemoths, kind of like what you have in the marketing space, and then some really niche products that might bolt on, they might integrate, they might not. Some people really like a smaller niche. And the other thing I have found is that some people prefer an all-in-one; it’s a little bit simpler from a vendor perspective. And then others prefer, nope, I want product A, product B, product C, because it meets their needs a little bit better. And there’s not a right or wrong; it’s just literally the way the company operates best. So I think it’ll be interesting to continue to see that privacy tech market evolve. Which brings us to, how do you measure success of this program? You have tech, you have process, it’s continuously changing. My prediction of more laws did not happen quite this moment, but instead we got amendment city. So now we have to go back and figure out, how do all these amendments impact things? We have new AI regulations, so it’s constantly changing, right, and we’re kind of building a foundation and trying to evolve it. So how do you measure any of this?
Sarah Stalnecker 23:20
It’s a great question. And I think a lot of companies start with the same thing, because the things that you would traditionally look at, like how many privacy requests are we fulfilling, things that are inherently very measurable, I don’t know, do they tell you meaningfully whether you are changing the trajectory of how your brand or your organization is embracing privacy? I’m always heartened when I hear, and I’m not on the call, but someone will tell me, that someone said data minimization. We were talking through an integration of two systems, and someone said, well, wait a minute, we can’t send all of that data because data minimization principles need to be followed. And so a lot of what I think of as success is this: are we generating adoption of these principles? Is it just becoming part of workflow, part of process, versus people trying to circumvent us? And so that’s what I think we’ve paid the most attention to. You know, our information security team always goes, well, privacy is a thing here, which is good. So to me, it’s like that unofficial measure, but it helps to say, are we changing the culture? And I think that, for me, is the biggest win: when I see this adoption of principles, and not seeing us as, hey, we’re going to have to tick off that box at the end with the people that, you know, want to kill all of our fun, but rather, oh, we’re doing this process soup to nuts, we need to involve privacy. And I’ve seen more and more traction on that over the course of the last couple of years. And that, to me, says change. Change is happening. It’s not going to be overnight, but I’m heartened to see how much change we’ve seen in, you know, the last three years.
Jodi Daniels 25:09
That’s so exciting and probably very rewarding.
Sarah Stalnecker 25:13
I mean, I think, you know, when you bring it back to the human and you bring it back to trust, people in an organization respond to that, because we’ve spent over 100 years building trust with customers and consumers. It can be undone so quickly, as marketers all talk about all the time. But it’s this idea that if there’s a value exchange, and we’re collecting the data people expect us to have, and using it in a way that people expect us to use it, that trust continues. And I think that’s resonated very much with the brand and the business, because there’s this level of understanding on a human level of, like, yeah, I wouldn’t want my data used for XYZ, or I wouldn’t expect that company to have that piece of data. So that’s where the focus, I think, has really been: just making sure people understand it’s ultimately about trust. It’s complying with laws, but it’s also trust.
Jodi Daniels 26:10
Yep. Now when you are not building a privacy program, what do you like to do for fun?
Sarah Stalnecker 26:15
Right now, I’m just sitting on baseball fields endlessly because I have two boys playing baseball, so that’s taking up pretty much every weekend, day and night. But outside of that, I love to travel, love to walk around new cities. Having time to wander is like the ultimate luxury, right? Where you have no destination in mind. So I’d say if that’s available, that’s the thing I want to do.
Jodi Daniels 26:44
That sounds super fun. One day, maybe I’ll be right there. It’s something to aspire to. There’s always a long list, always something to be thinking about; at least for me, there is. If people would like to connect and learn more, where could they go?
Sarah Stalnecker 26:59
So I am active on LinkedIn, and I’m more than happy for people to connect with me on LinkedIn, and I do respond to those messages. So I think that’s probably the best spot.
Jodi Daniels 27:10
Awesome, proof of life is over there. It’s proof of life. We’re so excited that you joined us today and shared how you have been building your privacy program. I know it’s been really valuable to everyone. So thank you so much. Thanks for having me.
Outro 27:30
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.
Privacy doesn’t have to be complicated.
As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.