
Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:22

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:35

Hello, Justin Daniels here. I am a partner at the law firm Baker Donelson and a tech lawyer. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels 0:59

And this episode is brought to you by Red Clover Advisors, where we help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there is greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit our website. Well, today is going to be super fun. I’m really excited for our guest. Today, we have Aaron Weller, who leads the global privacy engineering COE at HP, where he builds on over 25 years of privacy and security experience to deliver privacy enablement and experiences. And Aaron, it was so fun to see you in person last week at the IAPP PSR (we need some more acronyms) conference. And it’s a delight to have you here today.

Aaron Weller 2:06

Thanks very much.

Justin Daniels 2:08

I wonder if I should have a LinkedIn poll that says if your spouse goes to a conference, should she bring back any swag for you?

Jodi Daniels 2:17

Well, we have quite a collection. So I didn’t think we needed any more. I was being very sustainable. Let’s see, I was helping the Earth. Our younger daughter would be very proud.

Justin Daniels 2:28

She would. So sustainability over all else?

Jodi Daniels 2:31

Yeah. All right. Yes. That was what I went with.

Justin Daniels 2:34

Can I remember that for your birthday? A card? I’ll just have an electronic one. So sustainability doesn’t count for your birthday?

Jodi Daniels 2:41

No. Just conferences. Fair enough.

Justin Daniels 2:43

No. Selective. Alright. So Aaron, how did you get to where you are today?

Aaron Weller 2:50

Yeah, so it’s a long journey. As you kind of hinted at earlier, Jodi, I started off in information security, so I call privacy my third career. I really started off in audit and compliance, and got into information security probably around the late 90s, the year 2000. I started off doing ethical hacking, ran some forensics teams, and worked my way up to running security for a couple of companies in California. At one of those I inherited a privacy incident that turned out to be a class-action lawsuit. And although I was nominally responsible for privacy, I didn’t have that much privacy experience, and we didn’t have anyone on staff who did. So that was really when I started to get more into the privacy space. I got my CIPP/US in around, I think, 2007 or 2008, and then went to work with PwC, helping to build their privacy consulting practice. I was there for a few years, and then helped get eBay through GDPR in that crazy run-up to mid-2018. Then I co-founded and ran a privacy tech company for a few years; we sold that last year. HP was a client of mine, and when I was looking around for something else to do and having conversations with people, as you do, I talked to my friend, who was the chief privacy officer at HP at the time, and he said, hey, we’ve got a role that might be a good fit for you. So that’s how I ended up at HP.

Jodi Daniels 4:24

Well, I’m excited to hear a little bit more about that, because privacy engineering is a phrase that we keep hearing over and over again. I’d love it if you could explain a little bit more about what a privacy engineer is, and also where you find them fitting in the organization. Like, where do they tend to, you know, “sit,” quote unquote?

Aaron Weller 4:46

Yeah, so it’s a good question, and one that I think even the IAPP has struggled to answer. There was an infographic they put out earlier this year with something like eight or nine different flavors of privacy engineer, and it’s everything from user experience design to infrastructure design, actual software development, and privacy enhancing technologies. There are a lot of different bits and pieces, and I think that’s partially why, when I try to hire a privacy engineer, people come to that role with a whole range of different experiences. For me, there are a couple of main flavors. One is somebody who is fundamentally an engineer, so they know how to develop software or infrastructure. And for me there’s a split between people who are good at the plumbing kind of work, where you might have something like a data scanning tool that you’ve got to connect into systems while really respecting the rules around what you can do with the data from those source systems, and then where my team is more focused, which is privacy enhancing technologies. How do we build controls into where we already have the data, or where we want to use it for something we may not have permission for, or want to do in a more privacy-protecting way? So building those capabilities as well. Those are really the main engineering roles that I see: you’re either building the infrastructure or building the ways of manipulating the data. But my team is a little bit broader than that. As you hinted at earlier, I also have somebody who is really focusing on how we enable the other engineers. HP, of course, has thousands of software developers and engineers. How do we get them enough privacy knowledge, or answer their privacy engineering questions, in an efficient way?
So building out those channels, communities of practice, and the sort of guidance that takes “this is what you need to do: minimize the data.” Okay, great. What does that actually mean? And turning that into a how-to guide. So really enabling the broader engineering community. And then the piece that I’m still looking at building out, and have done some preliminary work around, is how we think about privacy experience as a design pattern. When you look at all of the different touch points you have with somebody (everything from the privacy notice or privacy statement, to requesting a DSR, to what consent or preferences you have), particularly in large organizations, it’s all built by different teams. So how do you find someone who’s really got that design mindset, but also the privacy chops, to be able to say: we’re going to build this in a way that’s really transparent, that generates trust, that checks all the compliance boxes, but that’s engineered rather than, I hate to say, slapped together? In a lot of organizations I’ve seen, people just work on different pieces, and they don’t look at it as being one experience, because all of these are run by different teams, and there isn’t necessarily someone with that overall design view. So to me, that’s privacy engineering as well. That UX piece has actually been picked up in some of the stuff the IAPP has put out. But because those are such broad, different skill sets, you rarely get them all in the same person. That’s why it’s hard to give a really short answer to “what is a privacy engineer?”

Jodi Daniels 7:57

I have a really strong passion for that design experience, and I’m excited to see where it’s going to go in the next couple of years, because I think that’s a big opportunity that companies are missing. We spend so much time trying to get me to buy something in particular, and very little on explaining why you’re actually collecting this little piece of data. I think that’s a misstep, because if you design it properly to help explain why I should give you XYZ data, then that’s the opportunity: I’m going to feel more engaged with you, I’m going to trust you more, and I’m going to give you what you’re asking for, as long as you don’t ask me right away the very first time.

Aaron Weller 8:37

Exactly. And, you know, you’ve got people saying, oh, now we’ve got laws that say don’t do dark patterns, so dark patterns bad. But okay, we eliminate the dark patterns, and we’re still missing that upside opportunity. A lot of what I’m seeing in the engineering world is: let’s not do the bad things. But we can also do some good things. We can really enable; we can get more value out of the datasets that we have; we can do the same things with less privacy risk. So that’s where I’m really focused, on the ROI of: we’re not just a compliance function, we’re an enablement function that uses engineering to be able to use the data in ways that are both protective and profitable, right? Because that’s important too.

Justin Daniels 9:21

Yes, I just find it interesting when you say that. I remember your first cup of coffee in San Francisco a couple of years back, or when I was asked for my cell phone number when I just wanted to valet my car.

Jodi Daniels 9:35

Yes, I do remember my cup of coffee. My very first engagement was: we’re so excited that you’ve come here, thank you so much, we’ve automatically added you to the list because you swiped your credit card using a payment system that already knew your email address, and please give me your birth date. That was their first engagement with me. That didn’t go well.

Justin Daniels 9:51

So interestingly, I think of privacy engineers as usually being at bigger companies, because they can afford to have such a skilled person. So how does this work at a scrappy startup? How could they include privacy engineers?

Aaron Weller 9:51

It’s a good question. And to me, you could look at degrees of privacy engineer, right? I have often said that one of the great things about working in privacy is that people come to it with very, very different backgrounds. A privacy engineer can fundamentally be a privacy person who can do some engineering, or an engineer who can do some privacy. In a smaller organization, you’ve probably got a lot more engineers running around, so finding somebody you can give enough privacy knowledge that, even if they’re primarily issue spotting and bringing things back to somebody who is maybe the only privacy professional, something gets built into the system development lifecycle process. That’s probably a good place to start, without having to go and find the unicorn: somebody who’s really a dedicated privacy engineer with a lot of background. Because it may be that you’re not trying to implement complicated privacy enhancing technologies; you just want to know what people are doing with the data, and how they can avoid some of those basic mistakes, and then build from there.


Jodi Daniels 11:15

If someone is interested in getting into privacy engineering, what would you recommend as a starting point?

Aaron Weller 11:22

So there are a couple of different things I’m involved in. One is the IAPP training faculty, as you well know, Jodi, and the new CIPT certification was 50% rewritten a couple of years ago to make it much more, I think, usable and practical for people who are saying, well, how do I build privacy into the design process I’m thinking about? What are some of the things I should look at? But there are also some other great groups out there that put out a lot of open source material, and that’s another good thing about the privacy community: lots of people are really willing to help. One of the groups I’m part of is called OpenMined, which is about how we use data in a protective way; they’ve got lots of open source code that you can go and download and play with. So depending on which flavor of engineer you’re looking at being, there really are a lot of resources out there. And while the maturity of the standards is not the same in privacy as I’m used to in security, there are increasingly standards out there from ISO, NIST, and others that I think give you a good sense of what you need to know about. I was involved in helping to write ISO 31700, which came out, I think, last year now, around privacy by design in consumer products and services. It almost gives a roadmap from when you’re designing something all the way through to when you end-of-life it, including how you train the salespeople about privacy features; we really went broad with the scope. That can give you a good roadmap to ask: which of these pieces are involved in my day job, and how do I start asking some of the questions that have been built into the standard?

Jodi Daniels 13:06

Very helpful, and some of those I hadn’t heard of, so I’m looking forward to learning more.

Justin Daniels 13:10

Aaron had me at NIST.

Jodi Daniels 13:13

Oh, that’s so cute. He had you at NIST.

Justin Daniels 13:23

I’m glad that you mentioned that. Because on every podcast these days, we ask people about AI. So where do you see the integration between AI and privacy engineering as we stand today?

Aaron Weller 13:41

Yeah, so it’s a good question. My team is involved; we have a process around generative AI, with a sandbox that’s managed and constrained, as you would expect, and my team looks at the privacy aspects of the use cases that come through. Both myself and some of my other team members were involved in helping to write the new IAPP AI governance certification, so we’ve got some of that background that we apply to these use cases. But really, one of the things we’re doing is to almost disambiguate what AI means, right? Because we’ve had machine learning models for decades at this point, we’ve got generative AI, which is pretty new, and then whatever is coming next. So it’s looking at the different types of privacy risks and threats that apply to each of these different types of AI. What kinds of reviews do we need to do? Where should we be really concerned about some of these risks? And I would even go back to what GDPR says, right? When you’re using new technologies that you’re still not quite sure about yet, that’s when you should have heightened attention to some of those risks. So we are involved in many aspects there. But to really answer your question: AI specialists who have been around for decades are very, very rare, so having people who do understand and care about the data, and who are looking at some of those same fundamental principles around fairness and bias and some of the other things coming out in the newer AI frameworks, I think that’s a good starting point for privacy professionals who say, I can help with some of this AI stuff. That said, I’ve learned so much in the last year, because there’s so much that I hadn’t been involved in or didn’t know as well. So I do think it’s a good starting point, but there is definitely a lot to learn that is more AI specific, around, you know, the difference between model risk and analysis risk, and how you put all those pieces together.

Jodi Daniels 15:41

And one of the other things you mentioned earlier was privacy enhancing technologies. On one of the panels I was on, we had lots of fun with the PETs, and I feel like you also couldn’t walk around the conference without hearing about PETs. I would be curious: if a company is interested in getting started, where might you suggest they begin?

Aaron Weller 16:05

Yeah, so I think it’s really about understanding the problem you’re trying to solve. Privacy enhancing technologies fundamentally manipulate a data set in a way that reduces privacy risk, but there are also a lot of trade-offs. Techniques like differential privacy add statistical noise into the dataset, and that breaks some use cases, so it’s not going to work for everything. One of the first things my team started to develop is what I call MUPPET, for Many Utility-Preserving Privacy Enhancing Technologies. It’s really a portfolio approach to PETs: what is the problem you’re trying to solve? We have these 15 or 20 different potential PETs; how do we go through the decision tree to figure out which is the right one for you, that we can then work on? Because all too often, one of the things that really does get overused is people saying, well, I anonymized this dataset. And my response is usually: what’s the risk of re-identification? You calculated that, right? No? So when you’re talking about PETs, it’s understanding that no one of these tools will solve every problem you have. It’s really important to understand whether you’re looking at input privacy risks or output privacy risks, or whether you’re trying to prove something. There are PETs, and sometimes combinations of PETs, that can do all of these things. But that’s a challenge if you’re starting off without many resources: you may only have the ability to implement one PET. Which one should that be? So that’s where I’m looking at the research, around how you make that decision easier for people who aren’t specialists, so that we can get the problems defined in such a way that we end up with something that actually works, rather than: well, we preserved the privacy, but all the utility is gone.
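Aaron’s “you calculated that, right?” can be made concrete. Below is a minimal, illustrative Python sketch, not HP’s MUPPET tooling; the function name and sample data are invented for demonstration. It computes k-anonymity, a common first proxy for re-identification risk: if some combination of quasi-identifiers is shared by only k records, each of those people hides among just k rows.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest group size sharing the same quasi-identifier
    values. A record in a group of size k is indistinguishable from only
    k - 1 others, so a small k signals high re-identification risk."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(groups.values())

# Tiny "anonymized" dataset: names removed, but zip + age band remain.
people = [
    {"zip": "98101", "age_band": "30-39", "diagnosis": "flu"},
    {"zip": "98101", "age_band": "30-39", "diagnosis": "asthma"},
    {"zip": "98102", "age_band": "40-49", "diagnosis": "flu"},
]

# The third person is unique on (zip, age_band), so k = 1: anyone who
# knows a 40-something in 98102 can re-identify that row.
print(k_anonymity(people, ["zip", "age_band"]))  # 1
```

Real assessments go further (l-diversity, t-closeness, attacker modeling), but even this cheap check answers the question before a dataset is declared anonymized.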

Jodi Daniels 17:55

Can you expand a little bit more on the example with differential privacy, where there’s a potential that if you add in that noise, it might not work in all use cases? Can you share maybe one or two use cases where that might be a challenge?

Aaron Weller 18:11

Yeah, so it depends, again, on which flavor of differential privacy we’re talking about. Some of the bigger ones would be local differential privacy, where the noise is added on the device. Apple will use this on iPhones for some things, like location data: they add the noise there, and then move the data over when it’s already had the noise added. Or you’ve got centralized differential privacy, where all the data comes in accurate and the noise is added more on the output end. But some of the challenges could be that if you never have access to the underlying dataset, how do you know it’s actually a fair representation? So there’s a lot of testing to do on that front end. The whole point of adding the noise is that it has a statistical nature, so you can try to tune the noise against the fidelity of the output you’re looking for, via the so-called epsilon value and privacy budgets and some of the mathematical pieces of the model. But we’ve got examples where, if you’re trying to match two things and one of them has gone through differential privacy, it’s not going to match: one side’s got noise and the other doesn’t, or you can only do fuzzier matching. So that’s why it’s really important to understand those data pipelines: should we add the noise after the matching? What are the privacy risks at each point, and what is the right technique to apply so we can still get that utility?
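A few lines of code show both halves of Aaron’s point: noisy releases preserve aggregate utility but break exact matching. This is an illustrative sketch of a differentially private count query, not Apple’s or HP’s actual mechanism; `dp_count` and its parameters are assumptions for demonstration.

```python
import numpy as np

def dp_count(true_count, epsilon, rng):
    """Release a count with Laplace noise calibrated to sensitivity 1.
    Smaller epsilon means more noise: stronger privacy, lower fidelity."""
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(42)

# A single noisy release: close to the true value of 1000, but an
# exact-match join against another system's un-noised count would fail.
print(dp_count(1000, epsilon=0.5, rng=rng))

# In aggregate the noise is centered on the truth, which is the tuning
# Aaron describes via the epsilon value and privacy budget.
releases = [dp_count(1000, epsilon=0.5, rng=rng) for _ in range(10_000)]
print(np.mean(releases))
```

Lowering epsilon widens each release’s error but tightens the privacy guarantee; that is exactly the noise-versus-fidelity trade-off being tuned.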

Jodi Daniels 19:49

Thank you. Very, very helpful examples.

Justin Daniels 19:53

So what are some common missteps that you see companies make and what are your thoughts around what makes a successful privacy engineer?

Aaron Weller 20:02

I think the most common example, and I always go back to my security days for this: there was a report that came out every year from Verizon around data breaches their team had been involved in, and they would do the analysis of what went wrong. Every single year, over 90% of the breaches were caused by the failure of a cheap or simple control. Right? It’s not the complicated stuff that usually gets us; it’s doing the basics well. And one of the common missteps that I see is organizations that have data with a certain purpose attached to it, and they use it for another purpose, because they don’t have a way of understanding what purpose it was collected under. Hey, Jodi, we collected your data when you bought a coffee, once you used whatever payment system it was, and now we’re going to use that data when you go somewhere else and start sending you emails. It’s that kind of drift. To be more technical: what was the legal basis that we had? What was the scope of that legal basis? What use cases apply, and therefore what analysis can I do down the road? That’s one of the use cases where we’re seeing privacy enhancing technologies can be useful: we take that data and aggregate it, or properly anonymize it, so that you’ve got a data set that can’t be re-identified back to the individuals, and there’s more we can do with it. But I think one of the major missteps is either the lack of understanding, or the inability to track, what we are allowed to do with this data. Because otherwise, people tend to assume that access controls are king, and that if I’ve got access to the data, that means I’ve got the ability to use the data.
And that’s been, I think, my number one theme throughout my privacy career: trying to re-educate people that access control doesn’t necessarily mean usage control. Those are two very different things.
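Aaron’s access-versus-usage distinction can be sketched in a few lines. This is a minimal, hypothetical illustration (the class and purpose names are invented; it is not how HP tracks legal basis): data carries the purposes it was collected for, and merely holding a reference to it is not enough to use it.

```python
class PurposeBoundData:
    """Wrap rows with the purposes they were collected for. Every use
    must name a purpose; uses outside the original scope are refused,
    even for callers who already have access to the object."""

    def __init__(self, rows, allowed_purposes):
        self._rows = rows
        self._allowed = frozenset(allowed_purposes)

    def use(self, purpose):
        if purpose not in self._allowed:
            raise PermissionError(
                f"collected for {sorted(self._allowed)}, not {purpose!r}"
            )
        return list(self._rows)

orders = PurposeBoundData(
    rows=[{"email": "jodi@example.com", "item": "coffee"}],
    allowed_purposes={"payment_processing", "fraud_detection"},
)

print(len(orders.use("fraud_detection")))  # 1: within the original scope
try:
    orders.use("marketing_email")  # access does not imply permission
except PermissionError as e:
    print("blocked:", e)
```

In a real system the purpose check would sit in the data access layer rather than in each caller, but the principle is the same: access control answers "who can touch it", usage control answers "what for".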

Jodi Daniels 21:49

I say that all day long. And it’s funny you mention my coffee story, because when it happened at the second coffee shop, you could see that said payment processor has a series of template emails. The second coffee shop was using the exact same template, asking for the exact same information. And neither of them got what they wanted. Well, Aaron, with so much privacy and security knowledge, we always ask our guests: what is your best privacy or security tip?

Aaron Weller 22:28

I’ve been a consultant for a lot of my career, so I’ve seen how a lot of different companies work. My best tip would be to go and actually take your privacy statement or privacy notice, because every company has one, read it, and ask: how do I know that this is true? Even just taking that privacy statement and saying, okay, I’ve got a section around what data we collect, I’ve got a section around how we use it; how do I know, or how can I prove, that this is true? That in itself is not really the traditional compliance perspective. But the way I look at an external-facing notice is that it’s the law you get to write for yourself. Certainly in the US, the FTC will hold you to it as much as to any other legal or regulatory requirement. A lot of people, I think, underuse their own privacy statement as a tool to help them work out: is our program actually doing what we say we’re committing to for our customers?

Jodi Daniels 23:34

Makes sense. Thank you.

Justin Daniels 23:36

So when you’re not being a privacy engineer, and thinking about privacy, what do you like to do for fun?

Aaron Weller 23:44

So I have a couple of kids in high school. I wouldn’t say it’s fun driving them around all over the place, but now that the older one can drive, that’s a big weight off. I enjoy going to the games and the concerts and the things that they do, so they take up a big piece of my time for these last couple of years of them living at home before they’re off into the big wide world. Outside of that, my wife and I like exploring the Pacific Northwest, where we live. There’s a lot of great variety within a couple of hours’ drive, so just getting out and seeing new things is always fun, whether it’s in the US or somewhere else.

Jodi Daniels 24:24

Aaron, if people would like to learn more, including anything on the IAPP AI governance efforts you hinted you’re helping lead, where should we send them to connect and learn more?

Aaron Weller 24:40

If you want to connect with me directly, LinkedIn is usually the best way. I can’t promise to be super responsive; my inbox is always overflowing, but feel free to reach out to me that way. And for the IAPP stuff specifically, the IAPP has some great resources. If you can’t find what you need, again, feel free to reach out to me directly and I can get you to the right person.

Jodi Daniels 24:58

Well, thank you so much for sharing all of your amazing knowledge on privacy engineering and privacy enhancing technologies, and, of course, we can’t have any podcast these days without AI. Thank you so much for all your wisdom.

Aaron Weller 25:13

Great. No, thanks for the invite, and it was great to see you in person last week, albeit briefly.

Outro 25:22

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.