
Intro  0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:36

Hello, Justin Daniels here. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:53

And this episode is brought to you by — hello, we’re jumping ahead, slowpoke — Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Why am I excited? We’re gonna talk privacy engineering today.

Justin Daniels  1:36

Yes, it’s quite clear, you’re very excited.

Jodi Daniels  1:39

Well, I like happy topics, and privacy is fine. And we have a fun and engaging guest. What’s not to be happy about? It’s sunny outside.

Justin Daniels  1:48

I think what’s driving you today is the weather.

Jodi Daniels  1:51

The weather, I think so. Well, let’s get started. Are you gonna — who’s gonna kick us off?

Justin Daniels  1:59

No, I’m going to — I’m going to give the conn to you.

Jodi Daniels  2:03

Well, today we have Jay Averitt. Jay began his career as a software engineer, then went to law school and practiced for 10 years as a corporate attorney specializing in software license agreements. Jay got exposed to privacy during this time and has loved it ever since. So, Jay, we are so excited to talk to you today about the technical side of privacy. Welcome to the show.

Jay Averitt  2:28

Hi, thanks so much for having me. I’ve always enjoyed your podcast, so it’s great to actually be on it.

Jodi Daniels  2:37

And you get to see the banter in real life.

Jay Averitt  2:41

It’s so much more fun to actually see your faces. From listening to the show, I never realized that you guys were actually in the same room. For some reason I thought you’d be split screen, but it makes —

Jodi Daniels  2:55

This way I can elbow.

Justin Daniels  2:59

She can send me nasty post-it notes.

Jodi Daniels  3:05

Yeah. You’ll never listen to the show the same way ever again. All right, that’s — your turn? My turn? It’s your turn.

Justin Daniels  3:13

Okay. Well, Jay, given the introduction we gave you, please tell us a little more about your career journey that led you to your work today.

Jay Averitt  3:23

Yeah, um, thanks so much. I mean, I always loved tech, from an early age. I asked for a computer when I was in sixth grade, when computers cost ridiculous amounts and nobody actually had them in their house, and really got into tech. But I also always had some fascination with the law, because my dad was in risk management and was dealing a lot with lawyers. And then for some reason I fell in love with John Grisham books and thought practicing law was actually like a John Grisham book. So I was struggling over what kind of career I was going to pursue, and my parents luckily convinced me to major in something technology-related, Management Information Systems. I did that and ended up starting as a software engineer, until the bubble started bursting during the dot-com era. And I was like, “Oh, God, there aren’t gonna be software engineers in the future” — what a crystal ball; that was an idiotic thought. So I decided to de-risk and go to law school, and then looked for ways to bridge my technology background with a law degree, which led me to software license agreements, like we said. That was fine, but I didn’t really feel passion redlining contracts. Really, when I started seeing privacy as a way of using my law degree while also getting my hands back on technology — and when GDPR came out, realizing that was something I could actually do from the technical, operational side, and not just the legal side — I decided that was a lot of fun. And since I’ve made that transition, I’ve been really happy, and yeah, loving what I’m doing.

Jodi Daniels  5:30

Well, it’s funny you mentioned John Grisham — that was the author I loved reading when I was in high school. I’ve not done very many book signings, but John Grisham came, and it was the first time I ever did that. I went by myself. I did musical theater, and I had a dress rehearsal that night. The line was so long — it was hundreds of people long — and I was all the way at the back. There was no way I was going to make it, and after hours standing in line, I couldn’t not get the book signed. So I had to ask every single person in front of me: “Hi, I have this dress rehearsal, I’m in the show.” He was talking to everyone — I don’t know if you’ve ever done a signing, but he would sit and talk and take a photo. Quite lovely, except when you have to go somewhere. So I had to ask every single person so I could get to the front of the line to get his book signed.

Jay Averitt  6:17

Yeah, you know, it’s funny — I think that may be the only book signing I’ve ever been to. My parents actually checked me out of high school to take me to it, and I remember that vividly. It’s funny — before the show, we were talking about the 30A beaches, and in The Firm there was a whole part that took place in Destin. I remember talking to him about Destin. That’s my little memory of that. So yeah, that’s funny.

Jodi Daniels  6:49

Yeah, it is really funny. Well, all right, let’s bring it back to privacy engineering, maybe not as cool as the beaches. But very important today. So can you share what privacy engineering is? And then how companies of all sizes can include these philosophies?

Jay Averitt  7:06

Yeah, I mean, I think that’s a great question, because it’s one that there’s not a great answer to, even though a lot of people have put a lot of thought into answering it — specifically, the definition of what privacy engineering is. You know, I think the IAPP did a decent job of trying to put into words what privacy engineering was. I didn’t think it was fantastic, but what they did do was put it into buckets: these are the things privacy engineers are doing. And I think that’s the hard part — there are people doing a huge array of stuff under the privacy engineering umbrella. We have people creating privacy tools who are really more software engineers with a privacy focus. Then we have people doing more of what I do, which is privacy-by-design work — actually trying to embed privacy within an organization, doing technical privacy reviews, and things of that nature — calling themselves privacy engineers. And then there are people doing all kinds of things, like privacy red teams, who are actually working on the privacy incident side, trying to predict what people seeking your data would do and attacking from that offensive standpoint. So there are a lot of different privacy engineers doing a lot of different tasks. But I think that also makes it interesting, in that no matter what your background is, there’s probably a place for you somewhere under the privacy engineering umbrella. Maybe one day it will be like security, where security engineers have their specializations and there are more distinct classifications, but at the moment I think there’s a lot of room for lots of people under privacy engineering. And I guess that’s the definition.
The second part of your question was how privacy engineering can come to life in an organization of any size, and I think it can. In the startup sense, privacy engineers really can be helpful in building a privacy program, because a lot of startups may have one privacy counsel, or counsel doing privacy work half-time, but nobody really focusing from a program standpoint. And from a program standpoint, the main things I would say are: one, figure out where your data is in the first place, so you can actually build out a program from there. And two, once you’re doing further development, have privacy actually be consulted and have a voice at the table early in the design process of new application development. That way you’re not at the 11th hour asking privacy to sign off on something; instead, privacy is considered early and baked into the design itself. So any organization could really benefit from having a privacy engineer work on that, even if it’s just one.
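
That "figure out where your data is" first step can be sketched as a queryable data inventory. This is an illustrative toy, not a description of any real product or of Jay's process; every system and field name below is hypothetical:

```python
# A minimal data-map sketch: record which systems hold which categories of
# personal data and why, then query it when building the privacy program.
DATA_MAP = [
    {"system": "crm", "categories": {"name", "email"}, "purpose": "sales outreach"},
    {"system": "billing", "categories": {"name", "address", "card_last4"}, "purpose": "invoicing"},
    {"system": "analytics", "categories": {"device_id"}, "purpose": "product metrics"},
]

def systems_holding(category):
    """Return the names of systems that store a given category of personal data."""
    return sorted(e["system"] for e in DATA_MAP if category in e["categories"])

print(systems_holding("name"))  # → ['billing', 'crm']
```

Even a one-person privacy function can maintain something this simple; the point is that later questions ("where do we keep emails?") become lookups instead of archaeology.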

Jodi Daniels  10:48

You’re just staring at me alone.

Justin Daniels  10:51

So, Jay, as you talked about privacy engineering and how you work with cross-functional teams, how have you found success working across these functional teams with competing priorities? As we all know, privacy is usually an afterthought, not a design feature.

Jay Averitt  11:14

Yeah, I think you hit the nail on the head there. The issue that I have is probably the same one you’ve experienced: a lot of engineers are not thinking about privacy early in the process, just as I articulated. So when they bring a design to me, they want me to just say, “Hey, this looks good, let’s go forward with it.” You really have to build those relationships, and I think that starts with making them realize that you’re all trying to achieve the same goal. I’m a huge fan of technology and innovation. I want my company’s products to be the best products out there; I don’t want to slow down the process. And by baking privacy into that process, you’re actually, I think, creating world-class products, because you want your customers to really trust your product. If you haven’t brought privacy in early, haven’t thought through all the ramifications, and just did some kind of last-minute privacy consult, you probably haven’t built world-class privacy into your product. If you can do that, then you have the potential of customers saying, “Hey, this company really cares about privacy, and that’s important to me — I want to support that company,” and you’ve built that customer trust. So I think the best way of doing it is actually showing how you can be a value-add to engineering, and working from there. From the legal standpoint — most lawyers I’ve ever worked with at technology companies, and when I worked as a lawyer myself, never wanted to hear, “Hey, legal killed this.”
Because you’re not going to survive very long if you’re the person in legal killing a bunch of deals — let’s face it, companies need to bring money in the door, and legal killing a bunch of deals is not great. So I think it’s exactly the same here: we’ve got to ensure that all legal regulations are followed, but how can we do that in a way that doesn’t hinder what we’re trying to do with our release cycle? And I think the best way of doing that is bringing everybody in early in the process.

Jodi Daniels  14:08

Jay, I’d love it if you could share an example of what your role was — maybe there was a privacy counsel or an operations person, and whoever else might be there — and how everyone had their unique role but was working together.

Jay Averitt  14:28

Yeah, I mean, that’s something I do almost every day in technical privacy review. I’ll give you a very close-to-real-life example. An engineer is trying to bake some AI into an existing application and wants to insert a prompt into the application, where an LLM is being consulted to find the answer. What can you do to ensure that privacy is actually being considered with that rollout? Well, the engineer really just wants to get this done, because they’re under the gun — management has said we need to get all this out the door as fast as possible. And legal’s thinking, “Hey, crap, this is AI, and there’s a lot coming down the road with AI — we want to make sure everything is covered.” And I’m thinking sort of the same as legal: look, from a privacy standpoint, we really can’t forget about privacy fundamentals when it comes to AI. Let’s figure out how we’re going to ensure all the privacy fundamentals are put into this, while understanding that, look, this stuff is extremely hot right now and we want to get it out the door as quickly as possible. So I worked with legal to figure out their view — for example, is what we’re doing from a data retention standpoint compliant with the GDPR, the CCPA, or whatever the privacy reg is for the geo where we’re rolling this out? And then I’m looking at it from an overall privacy standpoint: are we doing the things we should from a privacy principle standpoint? Are we considering data minimization and not storing additional information that we don’t need? Are we being clear and transparent with our users about what data we’re actually collecting? Things like that are what I would be considering.
And then working with the engineers to figure out a way to implement those privacy controls without, you know, pushing the release cycle way down the line.
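
One concrete control a review like that might land on is minimizing the personal data in a prompt before it ever reaches the model. The sketch below is purely illustrative — the patterns and function name are my own, not from any particular product or from Jay's actual reviews — but it shows the shape of the idea:

```python
import re

# Redact obvious direct identifiers from user-supplied text before it is
# forwarded to an LLM, so the model (and its logs) never see them.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def minimize_prompt(text):
    """Replace emails and US-style phone numbers with placeholder tokens."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

print(minimize_prompt("Contact jay@example.com or 555-123-4567 about the order."))
# → Contact [EMAIL] or [PHONE] about the order.
```

Real deployments would use a proper PII-detection service rather than two regexes, but a gate like this sits naturally between the application and the model call, which is exactly the kind of control engineering can implement without pushing the release date.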

Jodi Daniels  17:13

So helpful, I really appreciate you sharing those extra details. Now, your LinkedIn banner says that you are a privacy evangelist. Can you tell us a little bit more about that?

Jay Averitt  17:29

Yeah, like I said, when I found privacy, I just fell in love. And I think what I love about it is that I feel like I know a thing or two about privacy, but the more I learn, the less I feel like I know, because there’s always some new regulation coming out, or some new technology — like LLMs. AI has been around for a while, but it certainly wasn’t something I was thinking about every single day up until last year, so that’s something new I had to learn and think about. So I love privacy because it really satisfies the lifelong learner in me. And I like to try to spread what little knowledge I have to everyone else, because I feel like privacy right now is at the spot security was maybe 10 years ago. People think about cybersecurity now — even individuals think about cybersecurity. Maybe not everyone is using the absolute best passwords out there, but people are at least cognizant that they should be thinking about it, and you have things like your phone suggesting super strong passwords. Security is definitely maturing to the point that it’s being contemplated. Whereas with privacy, yes, there are some people who hear about big data breaches and think, “Oh, yeah, that’s bad,” but it’s not something they’re thinking about in their daily lives. So whatever I can put out there to make people actually think about privacy — both as consumers and as organizations — that this is something you should really be considering, I think that’s what I’m trying to do to get the message out there.

Jodi Daniels  19:39

Yeah, lots of passionate messages. Mr. Justin? Sounds like you could relate. I can, and am a lifelong learner.

Justin Daniels  19:48

I think, in this day and age, you’re either a lifelong learner or you’re obsolete.

Jay Averitt  19:54

That’s probably true. Yeah, I mean, if you’re stuck in what you learned 10 years ago, then good luck.

Justin Daniels  20:06

Yes, I’ve had many fun evenings understanding what Microsoft Copilot is.

Jodi Daniels  20:14

We appreciate all the great work that you’re doing for us, Jay.

Justin Daniels  20:19

Yes. So doing privacy well is really hard, but it can also be oversimplified, like certain people try to do. From your perspective, what can privacy pros do to make compliance with privacy laws easier for companies?

Jay Averitt  20:39

Yeah, that’s a great question, because the alphabet soup of privacy — GDPR, RoPAs, DPIAs, and all of that — makes everything seem super complicated from a privacy standpoint. And to be honest, a lot of the questions I’m asked I don’t have a clear answer to, and I don’t think anyone really has clear answers, because they’re questions that aren’t easily answered by the GDPR — they’re not answered by anyone. You’re really using your judgment to analyze what the privacy regs may say, or what best practices are from a privacy standpoint. So there are no clear answers there. But I think it can be easy if you look at the underlying principles of privacy, which are pretty much common sense. Data minimization is a fancy phrase for “let’s not take stuff we don’t need from a data standpoint.” If you don’t need somebody’s full address and Social Security number, why are you collecting all that information? What are you going to do with it? Maybe you shouldn’t be doing that. So I think it’s really common sense. Instead of breaking through all of this alphabet soup and what the various privacy regs say — yes, those are important, and you need lawyers to actually understand all that — a lot of it is just: let’s look at privacy principles. Even those words can be complicated, but it really is simple. A lot of times when I’m looking at a privacy problem, the way I look at it is: if I’m the user of this product, is what they’re doing from a privacy standpoint something I would find unacceptable? And if I find it unacceptable, then probably most users are going to find it unacceptable. So, looking at things like that — fairness and transparency are principles baked into the GDPR.
But they’re also just common sense. You shouldn’t be rolling something out that’s going to anger your customers. Big companies look at things like user satisfaction — CSAT — and that’s something they’re measuring. Really, making sure your users are okay with the privacy of your product should be lesson one on the user side.
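
The data-minimization common sense above can even be expressed as a mechanical check: compare what a form collects against what its stated purpose actually needs, and flag the excess. This is a toy sketch of the principle — the purposes and field lists are hypothetical, not from the GDPR or any real review checklist:

```python
# "Don't take stuff you don't need": map each collection purpose to the
# fields it genuinely requires, then flag anything collected beyond that.
NEEDED_FOR_PURPOSE = {
    "newsletter_signup": {"email"},
    "shipping": {"name", "address", "email"},
}

def excess_fields(purpose, collected):
    """Return fields collected beyond what the stated purpose requires."""
    return sorted(set(collected) - NEEDED_FOR_PURPOSE[purpose])

print(excess_fields("newsletter_signup", ["email", "phone", "date_of_birth"]))
# → ['date_of_birth', 'phone']
```

A newsletter signup asking for a date of birth is exactly the kind of thing this surfaces — and the follow-up question is the one Jay asks: what were you going to do with it anyway?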

Jodi Daniels  23:35

Building upon what you just shared — some of those basics, which I really agree with. Obviously there are nuances to all the laws that people want to pay attention to, but at the same time, if you boil it down, there are some really simple concepts in there. Using that as a baseline, what do you think every privacy pro should be practicing in their role?

Jay Averitt  23:58

Yeah, I mean, I think the first part is really what I’ve said before: shifting privacy left in the software development lifecycle. Just having privacy talked to as applications are being developed can solve so many problems, because problems tend to bubble up when engineering has worked super hard on a release, they’re about to roll it out, and then privacy at the last minute says, “Hey, you’ve got a big problem here.” Nobody wants their innovation stopped at the 11th hour to roll it back and fix it, and it’s going to be much more costly and more expensive to sit there and rework something than if privacy had come in early in the process. You may still have to have a lot of hard discussions, but if your organization values privacy, chances are it works out — and most engineers will say they’re not against privacy; most of them just aren’t thinking about it, because it’s not wired into what they’re thinking about. Once they understand the issue, they’re like, “Okay, cool, we’ll try to work that into our design.” So I think the first step is really just doing that — it’s simple. And then, fundamentally, looking at data minimization and overall fairness. If you’re evaluating something from a privacy standpoint, like I just said: don’t collect stuff you don’t need, and try to treat others like you want to be treated. Plenty of things are seemingly okay from a legal standpoint — okay under the GDPR or CCPA — that you maybe just shouldn’t be doing, because you don’t think they’re fair. And I think that is going to create more trust with your customers and ultimately help the company down the line.

Jodi Daniels  26:20

Okay, have you ever had an experience where you wanted to do the right thing — I think many of our listeners can relate — and the people you had to work with were not really open to hearing what you had to say? I’m wondering if you have an example where you were able to convince someone to really listen to the privacy risks you were trying to present. And if there might be a learning there: what made you successful in convincing that person to listen, or perhaps change course?

Jay Averitt 26:49

Yeah, I mean, you have that a lot. Sometimes you’ll say, “Hey, this is a problem from a privacy standpoint,” and the engineers or product managers or whoever you’re working with will be like, “Well, way up the chain they’ve said this has got to roll out — privacy can’t be an issue.” And at the end of the day, a lot of legal problems and privacy problems are more of a risk decision than an absolute stopper, and some executive up the line is going to make the call: “Hey, this is a risk we can live with.” You’ve got to educate that executive on what the potential risks are. And the GDPR actually gives you a pretty good tool to make executives aware of the risk: when you come to them and say, “Look, your exposure is 4% of your revenue,” that’s going to make them listen. That’s not the weapon I want to wield first, but it does tend to get an executive’s attention — you don’t want to be found breaching the GDPR. I mean, that’s not something you want to do. So that is one way of doing it. But a lot of it is what I’ve been talking about before: okay, I get that we don’t want to slow down this release, but let’s talk about what will happen if we alienate our customers and lose their trust by not considering privacy. Chances are — I mean, there are so many egregious privacy issues out there that no regulator is going to touch, because there’s just not enough time, or regulators aren’t seeing everything out there, and depending on the size of the company, they just may not be going after it. So you may not get hammered by some regulator, but you could have a 23andMe situation, where your entire revenue just goes down the tubes because you’ve lost trust with your customers.
So if you’ve ever got to use that dagger of the GDPR because that’s the only way they’re going to listen, fine — but I think the trust angle is the better angle, because that’s actually the more likely harm that you’re going to cause.

Justin Daniels  29:38

I couldn’t agree more. So what is your favorite privacy tip to share when you’re hanging out with friends at a party?

Jay Averitt  29:48

Yeah, well, even as a privacy evangelist, I don’t like boring my friends and family, so I tend not to do that. Most of them don’t know what I do — my wife, I think, has a decent understanding, but 90% of the people I come into contact with don’t know what a privacy engineer does. But my favorite tip: I get super annoyed when I give my phone number out somewhere and start getting random texts. My wife’s friend, the other day — she goes to some spa, and they’re texting her, “Hey, this is your reminder that Botox is coming up,” or something like that. And I was like, that’s a real bad privacy problem, that your phone is constantly soliciting you for Botox and stuff like that. So I feel like you should safeguard your cell phone number. The problem is, everyone asks for it, so what do you do? I think Google Voice, or some other voice-over-IP service where you can give a phone number out, is the answer. You can check that number when you want to see all the solicitations for Botox and whatever else they’re having to sell, but you don’t have to be bombarded with a ton of texts you’re having to unsubscribe from all the time. So, one, use Google Voice or something like that. And then have an email address dedicated to handing out for anything other than super important communication: I have a Gmail address I use for super important communication, but I also have another email address that I give out whenever I’m making any kind of online transaction.

Jodi Daniels  31:53

And Jay, when you are not talking and practicing privacy, what do you like to do for fun?

Jay Averitt  32:02

Um, I like to play tennis. Like you were mentioning, the weather is actually finally getting better — the sun is shining — so I’m excited about getting back out and playing some tennis, because it’s been a couple of months. But yeah, I’ve got a six-year-old and a three-year-old, so they’re keeping me busy on the weekends. It’s also a lot of fun to go to their basketball games and just hang out and enjoy life with two young kids.

Jodi Daniels  32:35

Well, we’re so grateful that you came and joined us today. Where’s the best place for people to connect with you?

Jay Averitt  32:41

I mean, LinkedIn is what I use to evangelize privacy. I used to work at Twitter, and I was on X, but I can’t even figure out how to get back into my account after I left there — so I’m not on there anymore. But LinkedIn — happy to connect with anybody out there. And yeah, thanks so much for having me. Absolutely.

Jodi Daniels  33:04

Well, thank you again. We really appreciate it.

Outro  33:11

Thanks for listening to the She Said Privacy, He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.