
Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:36  

Hello, Justin Daniels here. I am a corporate equity partner at the law firm Baker Donelson. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:56  

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more and to check out our new best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com.

Justin Daniels  1:35  

That's our first broadcast back since Alaska.

Jodi Daniels  1:39  

It is. We had so much fun visiting the beautiful landscape in Alaska and eating Mickey Mouse waffles, and then we had to come back to reality. Yes, there's no one making Mickey Mouse waffles and a gourmet dessert every day, or leaving towel animals on my bed.

Justin Daniels  1:55  

Yes, I miss the Italian food. Anyway, well, we're back with kind of a bang since we have an awesome guest today. So today,

Jodi Daniels  2:05  

our guest is awesome. That's true. We have an extra special guest, because he's like a mirror image of all the fun activities you like to do. Remember, we have to talk about privacy today, not just skiing in Colorado and Utah.

Justin Daniels  2:18  

Thanks. So today our guest is Ed Britan. He is the leader of Salesforce's global privacy team. It is a global team with 35-plus pros all over the world covering privacy for Salesforce and its customers. Ed is based in DC. He previously spent seven years on Microsoft's global privacy and regulatory affairs team covering global privacy and AI legal and policy issues. Welcome, Ed.

Ed Britan  2:46  

Thank you. Thanks for having me. As a privacy geek, I love these types of discussions, and I really appreciate the opportunity to talk privacy with other folks who like hearing about privacy. Lord knows my family is tired of hearing about it, so I'm happy to talk to you all. And I've got my "Data to the People" shirt on, which, you know, is what privacy is all about for me. It's all about power: empowering people, empowering organizations, and really giving power to those that are less powerful, the vulnerable populations of the world, and giving them the ability to control their data and control their personas. That's what gets me excited about it, and I'm happy to be here today to talk privacy.

Jodi Daniels  3:26  

Now, the most important question is: where do we place our order? Because we want a t-shirt like that.

Ed Britan  3:31  

You've just got to stop by the DC office. Tableau is one of our Salesforce affiliates, and this is the Tableau shirt from the DC office of Tableau. If you stop by, I'm sure the team will have one for you.

Jodi Daniels  3:42  

I like that. I’ll have to add that to our upcoming trip over there.

Justin Daniels  3:46  

Maybe we should have a t-shirt like that for Red Clover, especially with the new year coming.

Jodi Daniels  3:51  

Oh, yes, yes, yes, yes, maybe we will. All right. So remember, we're not doing skiing, and we're not doing t-shirts. We're going to talk about privacy today. Okay. Now, Ed, you are super passionate about privacy, and we always like to understand how people got to where they are today. Can you share a little bit about your career journey and how you arrived at Salesforce doing all things privacy?

Ed Britan  4:13  

Sure, yeah. I've always been interested in law that's driven by policy, which is where privacy sits, because privacy is such a fluid and rapidly evolving field. You always have to have your eyes on what's percolating from a policy perspective globally, to understand not just where the law is, but where the law is going, and it's going there rapidly. I started my career at a great law firm, Alston & Bird, an Atlanta-headquartered firm, and I was able to work with Bob Dole and Tom Daschle, two of the preeminent policy minds of our era and two former Senate majority leaders. I got to work on all things DC-related for corporate clients, and privacy was a hot topic. I enjoyed the clients and enjoyed my time at the firm, but my clients were constantly annoyed by companies like Microsoft and Salesforce pushing privacy law forward. It was making their lives more difficult and causing them to use me and the firm more to get them out of sticky situations. And I thought to myself, it would be fun to be on that side of the equation. So that's what inspired me to go in-house at companies that are really in the B2B space. When you're in the B2B space, you can really lean in on privacy, because the whole business model is around protecting data: not monetizing our customers' data, but empowering our customers to own their data and control their data. Salesforce is a really great company for that. We're a pure B2B company focused entirely on empowering our customers, so we can really lean in on privacy and trust, which is great for me from a personal perspective. It's good for our business model, but it's also something I'm passionate about.

Jodi Daniels  5:57  

That makes sense. And I love how you emphasize the B2B side. Some of my favorite conversations are with clients in the B2B space, because they are often the companies who will evaluate an organization like Salesforce and want to make sure that when they give data to a Salesforce or equivalent, they can rest assured that their data is safe. So it's always fun to work with those companies.

Justin Daniels  6:21  

So, building a little bit on what we talked about in the prep and how the Salesforce team is set up: how does privacy play a role in how your team helps the product people design your CRM products?

Ed Britan  6:40  

Absolutely. Well, for starters, we sit right next to our product team. We're all in the same organization, which I think is a really helpful org structure. We report to the same manager, Lindsey Finch, who kind of created the privacy team and practice at Salesforce and made it what it is; she's moved up in the organization and now leads privacy, product, and technology. Having the product team right next to us, under the same manager, is super helpful. We collaborate all the time and work very closely together, and we get the product truth from them, which is so important for us to be able to analyze effectively and give good advice to the business. Our business model relies on trust and protecting data, as I said; we don't monetize our customers' data or generate our own value out of it. So we are constantly working to empower customers to do more with their data, including from a privacy perspective, really promoting responsible use of their data and nudging our customers in that direction, because that's what's going to be better for their business models, given how integral trust is to driving business in the tech industry. Our products are designed to put customers in total control of their data, so they can generate insights based on the trusted relationships they have with their customers. We have a lot of products that do this. Some that I'm most proud of are the Privacy Center product, which is really a digital headquarters for privacy management (managing data lineage, managing preferences, things of that nature), and the Data Cloud product, which is a really cool product for me from a privacy perspective, given that it enables our customers to have a holistic view of their data. When you're working with CRMs and various business applications, data can become very siloed, which isn't good from a business value perspective, but it's also not good from a privacy perspective, a data management perspective, or an IP protection perspective. The Data Cloud product enables our customers to see everything holistically and manage their data through a single interface. And then obviously, with generative AI now at the fore (it's in every conversation I have these days, and I know we'll get into it more), our products are very focused on ensuring that we can provide this innovative technology in a privacy-first, trust-first way. I can get into that in a bit, but that's kind of the differentiator for enterprise companies: ensuring that our customers get the same commitments we provide them across all of our services in the generative AI offerings, controlling their data, and not having their data used in ways that they wouldn't want it to be used.

Jodi Daniels  9:12  

I think it is so interesting that the privacy team is with the product team. I love that. Often people think privacy is just a legal or compliance function, and it's stuffed in those organizations, and it's a big challenge to get to the product team. That's a complaint I hear all the time: the product team doesn't talk to me, they go and do all these things. That you're part of the same organization, I think, speaks volumes to the success of how you've been able to integrate those two functions.

Ed Britan  9:43  

Yes, and it also relates to the product legal team. Being close to the product legal team is helpful because they have the clients in the product organization, so we interact with them regularly. And I agree, it's something that's been really helpful to me at this company, and I'd recommend it for other companies to emulate. Oftentimes, if you have product attorneys that are embedded with the business, they can be very defensive; they view their job as protecting their line of business. Our product attorneys are very much of that mindset as well, since those are their clients, but they also have a broader view of protecting the company and doing what's right for the company, and they work with us directly under the same manager. So I think it enables us to get to better results more often.

Jodi Daniels  10:28  

Speaking about some of those challenges, and you mentioned Data Cloud: your customers want to use data to better understand their customers, ultimately create better relationships, and sell more stuff. And so there's that tension between products and privacy. Can you share a little bit about how you work through some of those challenges?

Ed Britan  10:50  

Sure, yeah. You know, I think the challenge for all privacy professionals is to get folks viewing us as a value add and as a differentiator, as opposed to a cost center and a wet blanket on issues. That's something we constantly strive for: demonstrating our value. I think everyone in the tech industry is coming to understand how heavily regulated we're becoming, and doing privacy right and getting it right can give customers the confidence to use the services more, to share more data with the services, to interact more with the services, and to get more value out of the services. It's a threshold issue for using a lot of the functionality across the organization. And I think the more we can speak to that up front, the more we can open the door: to the product team doing more innovative things, to the sales team having more ability to sell. I think we've been doing a good job on that. And I think where regulation is headed is really helpful for our company, given our business model and given the global focus on third-party data and the current advertising model as it is. Salesforce, given that we're a customer relationship management company, helps customers get value out of the first-party data they collect directly from their customers. By its very nature, that's a more privacy-protective way to interact and engage. So there are a lot of advantages to us as a company, and my goal is to harness those advantages to be more privacy-protective and drive the business model. And it just so happens that the regulatory landscape is leading a lot of customers in that direction as well.

Justin Daniels  12:27  

Now, I want to shift our conversation a little bit to what you alluded to earlier, which is a real big focus not just in privacy but in all of technology: the promise, but also the risk management, around artificial intelligence. We'd love to get your thoughts about how you, as a privacy leader, are starting to think about AI and how it will impact your organization and how you deliver your products to your customers.

Ed Britan  12:53  

Absolutely. I think AI, and specifically generative AI, is permeating everything; it's the most transformative technology I've seen in my career. I wasn't working yet at the advent of the internet, but it's that kind of game changer, and it's going to be part of all of our products and services. It's something that everyone's very interested in. But this technology is not going to achieve its full potential unless we prioritize trust by addressing all the concerns that folks have, and whenever there's a new technology, there are always concerns. I think some of those concerns will go away as people get more used to the technology, but there are legitimate concerns around accuracy, bias, inequality, privacy, security, sourcing of content, and safety. So we have to really demonstrate, through how we roll this technology out and integrate it, that we're addressing these concerns and that we're providing this technology in a manner that keeps customers in control of their data. As an enterprise company, that's a big thing, and it's what customers are used to. I think we've driven trust in the cloud, and it's an amazing achievement. I remember when cloud technologies were nascent and going to conferences where there was a lot of concern about whether data is safe in the cloud. I think we've crossed that hump, but we need to maintain that trust when it comes to new services and new functionality. I think it's also important to remember that there's a lot of discussion about AI regulation; it's coming, and it's going to be a field of law in itself. But this technology is also currently being regulated under privacy and data protection law globally. European regulators in particular are demanding changes with respect to how ChatGPT is delivered in those markets. So we need to deliver this technology with trust, keeping customers in control, and in a fashion that gives customers confidence that they comply with global regulation.

Jodi Daniels  13:57  

You shared that point about demonstrating the ways you're integrating AI in a manner that customers can trust. A lot of smaller companies who might be listening today don't have huge teams like you all do, and they're trying to figure out: how do I incorporate AI and do it in a safe way? Can you share a little bit about how you're thinking about doing that type of demonstration, or what the process has been to evaluate AI tools, and ultimately how you wrap that into a good AI policy?

Ed Britan  15:36  

Sure. I think, for starters, we try to do as much as we can on our end as a service provider to protect our customers with respect to how we're delivering the services and technology: for instance, by restricting third-party access and ability to benefit from the data, keeping customers in control of their data, and ensuring that with prompts we're only using the minimum amount of data, implementing the data minimization principle, and not sharing more data than necessary to deliver the content that customers are asking for. I think that's all really important for us as an enterprise company, in terms of meeting the commitments we make across our services and applying those to AI. From a customer perspective, I would encourage even small customers to focus on transparency and accountability. It's clear that assessments are going to be part of the legal regime; they're a focus of the EU AI Act, and assessments are already a core part of GDPR and global data protection law. I know a lot of customers are nervous about assessing their AI in this way, but I've found data protection impact assessments, or algorithmic assessments, to be critical tools for identifying and addressing potential bias and helping to instill trust in the technology. Sometimes you don't really know what you think about something until you put it in writing, and so many times in my career, just putting things in writing and thinking about things that way has enabled us to find better ways to do things. Customers are concerned that putting things in writing increases liability, or it's just such a big task that they get paralyzed. I always encourage folks: especially in European law, there's a proportionality principle, and small organizations aren't expected to do assessments at the level of a Microsoft or a Salesforce, but they are going to be expected to do something; it's a legal obligation. I would encourage them to conduct this sort of assessment in good faith and build that sort of program to be able to demonstrate to regulators, because the first thing a regulator is going to ask for, if and when there's an inquiry or investigation, is this sort of documentation. Having it, and getting it done at the level that even small organizations are capable of, will improve their practices and help with compliance.

Jodi Daniels  17:58  

When you conduct those different privacy impact assessments, who are the team members that are typically a part of that? I know that's a question a lot of companies are trying to figure out. Maybe the privacy person knows that they need to conduct them, but who else should they bring to the party?

Ed Britan  18:15  

You've got to bring the product truth to the party; you can't just do these things as privacy pros. I think the best model is that you need lawyers, people who understand privacy, to do the analysis, but you need the facts, and you need a conversation between the engineers, the business, and the privacy teams to get to the truth. So the best approach, to me, is building a process that creates that sort of conversation and feedback loop, such that privacy asks the questions, or a responsible AI team does; at Salesforce, we have an Office of Ethical and Humane Use. Whatever team you have leading the assessment asks the questions, gets the input, and then asks more questions. For me, that's always really helpful because you can get to the nub of things by asking the whys, the five whys: okay, we're doing it this way, why do we have to do it this way? Oftentimes you can uncover different ways to do things that don't impact what you're trying to provide to the customer; that's what I've found. But you really need that close partnership with the business and the clients, and you have to build trust to get them to build that sort of relationship. It takes time, but it works really well if you put in the effort.

Justin Daniels  19:50  

Ed, I want you to put on your policy hat for a second. I wanted to talk to you a little bit about, you know, the six more privacy laws that got passed this legislative session, so we're up to, I think, 11. And based on your experience, as we start to talk about AI, what are your thoughts around seeing state or federal legislation around AI? Obviously the Europeans are moving pretty quickly, relatively speaking, with their AI law. Do you have any thoughts, from your policy perspective, on how AI regulation might develop on a state versus federal level?

Ed Britan  20:28  

We need to start with a comprehensive federal privacy law that is strong and respected globally, and is interoperable with the global standard that's already been set in most of the world. I mean, Europe is moving forward with AI regulation, and we should move forward with AI regulation as well, but I think that regulation should be built on top of a strong, comprehensive privacy or data protection law. Europe's EU AI Act won't go into effect for a while yet; they're in trilogue discussions right now to finalize the legislation, and even once they pass it, it will take a couple of years to be effective. But the Europeans are regulating now, as I said, and they're using their comprehensive data protection law to do that. I think it's really important for the US to have a voice in those conversations. You know, we appreciate regulation at Salesforce, given our model, because it instills trust in technology, and I think some of the changes that are already being demanded by European regulators will be helpful for driving trust. But really, the US needs to have a voice in those conversations as well, and the quickest way for us to have a voice is to pass a comprehensive privacy law that regulates personal data across the board as a baseline. I'm not saying that's sufficient; there also needs to be AI regulation. But I would hate to see us waiting for AI regulation before we regulate data.

Justin Daniels  21:52  

While I agree with Ed, I just struggle with the fact that we can't seem to regulate cybersecurity, and we continue to have state privacy laws without a federal one. I guess, just based on your experience having been in policy and in the epicenter of policymaking in our country: what do you see that's different, that might spur Congress to act in the privacy area where it seems unable to act in a bunch of other technology areas?

Ed Britan  22:19  

Yeah, I mean, I'm not naive. I've been in DC a long time, and I know that it's hard. We need a federal privacy law, but we shouldn't wait for a federal privacy law while the states are acting. A lot of the state laws are really good, and in fact, ever since California, we've seen the same sort of model, one that really aligns with the global standard for data protection, pass in every state subsequently, to varying degrees. There are always slight differences between the state laws, and some laws are stronger than others in impactful ways, but it's a model that looks a lot like what exists globally. That's been good to see, and I think it's advancing the US model for privacy protection and data protection. The states should continue to act, and I hope they do so in a way that's interoperable and doesn't create a patchwork that makes life for our customers exceedingly difficult. I've been to a lot of privacy conferences; I know a lot of companies are really concerned about the patchwork, and they're already finding it difficult. But to my mind, I actually think it's been helpful that the states haven't deviated as much as they could have. They've been following generally the same basic approach, with core principles around data minimization and impact assessments, and the privacy rights that exist in global laws, such as rights to access, delete, and correct data. So the states are acting, and what I think is most promising and keeps me excited (I wouldn't say optimistic, but maybe hopeful) is that we've come to a point of sophistication in this country where the privacy laws are really strong and sophisticated and reflect what's happening globally. We're also building on what's happening globally in a uniquely American way, which I think is really awesome to see. I've been in the privacy debates in DC for years, and the debate was always limited to whether we should do an opt-in or an opt-out. Now it's so much more sophisticated and interesting, and it reflects what's happening globally. I testified at a hearing of the House Energy and Commerce Subcommittee on Innovation a few weeks ago, and what I was most struck by in that hearing was hearing Democrats and Republicans, in a bipartisan fashion, discussing these issues, both recognizing the importance of regulation and largely agreeing on what that regulation should look like. I know politics can still get in the way of things, and I know a lot of people have issues with, and there's a lot of concern around, the American Data Privacy and Protection Act, but it's a comprehensive privacy law that, by all estimates, is strong and I think looks good compared to the global model. It passed the House Energy and Commerce Committee in resoundingly bipartisan fashion, a 53-to-2 vote, last Congress. So there does seem to be a building consensus around what a privacy law should look like, and around the fact that we need one. Even though it never looks promising, and I would never bet on them getting something done, we have to keep working towards it, and I do think that something will get done eventually.

Jodi Daniels  25:38  

With all the knowledge that you have about privacy, when you're hanging out with your friends or at a cocktail party, maybe talking to people and trying to convince them they should have a privacy law, what is the best privacy tip that you would offer?

Ed Britan  25:52  

I kind of focus on the need for privacy, and I explain to them why we need it, because they don't always understand. I think it's important to protect your own privacy from a personal perspective: not always accepting things, paying attention to the user experiences and what you're agreeing to, making sure you have that kind of knowledge, and taking control of your own persona online. But a lot of my friends don't have the time, or the energy, or the desire to do that, and while I'd love to shake them up and get them into it, I think those people should be protected as well. I think the best way to protect us all, in our busy lives, is to have strong regulation here in the US that aligns with what exists globally and protects people regardless of whether or not they make, you know, decisions with respect to user experiences affecting themselves. People should be protected regardless of the point-in-time decisions that they may or may not make. So those baseline protections are so important, and accountability is so important. I've seen it firsthand: when the GDPR went into effect, I saw the whole industry, all the big tech companies, invest more in privacy than I've ever seen in my life. We need that sort of understanding and investment in all US companies across the board here in this country. The tech companies are trying to help and nudge organizations toward more responsible practices, but there's nothing that's going to shift culture at scale more than passing a strong, comprehensive privacy law.

Jodi Daniels  27:33  

It is an interesting

Justin Daniels  27:34  

conversation, at least after three or four shots.

Ed Britan  27:38  

Well, I want people to get active. I mean, I think the only way we're going to get a federal law is if people ask for it and demand it, and polls show people want it. But we need to get people thinking about it, talking about it, and understanding why it's important. Right?

Justin Daniels  27:55  

So when you are not managing privacy for a global brand, what do you like to do for fun?

Ed Britan  28:03  

I like to ski as much as possible. My goal in life is to get as many days skiing with my family as I can. It's just a passion of mine. I love being in the mountains, and I love being there with my kids: steep trails, challenging them, watching them get over their fears, hitting powder days. But otherwise, I just like being in nature, being on the water, hiking, or doing things as a family. That's what keeps me going. I'm clearly passionate about privacy, but, you know, not every day is champagne and strawberries. Some days are hard, it's grueling, and in those times my family gets me through it.

Jodi Daniels  28:51  

And trying to balance it all. We also learned that all the talk about skiing and powder days is music to Justin's ears. Well, Ed, we're so grateful that you shared all that you did with us today. If people would like to learn more, or to learn from you, where's the best place to connect?

Ed Britan  29:12  

On LinkedIn. I'm pretty active on LinkedIn, so feel free to shoot me a message. I love talking about this stuff, and I'd love to meet people, chat with people, and hear different perspectives. That's what's so great: there are so many different perspectives and so many different aspects of privacy law and data protection law that it's endlessly fascinating to me. I know the folks in our community are equally passionate about these things, and I love talking with folks who are passionate about these issues. So I'd love to hear from you on LinkedIn, chat it up, and meet in person.

Jodi Daniels  29:48  

Well, thank you again for sharing your perspective with us today.

Ed Britan  29:53  

Thank you so much for giving me the opportunity.

Intro  29:59  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.