HP’s Aaron Weller on Privacy Engineering, PETs, and Information Security

Aaron Weller is the Leader of the Global Privacy Engineering Center of Excellence at HP, an international IT company developing personal computers, printers, and 3D printing solutions. Aaron provides technical leadership for privacy engineering, enablement, and experience for HP’s global operations.

As a seasoned privacy and information security veteran, Aaron has offered his knowledge and experience as a department head for various companies, including PwC and Blueprint. He is also a Co-founder of both Concise Consulting and Ethos Privacy, a consulting firm offering privacy strategies. Aaron is a sought-after thought leader who’s presented at national and international conferences and universities. He’s also been quoted in mainstream publications, including The Wall Street Journal and Forbes.



Here’s a glimpse of what you’ll learn:

  • Aaron Weller shares his career trajectory
  • What is privacy engineering?
  • Aaron offers advice to aspiring engineers
  • Integrating AI and privacy engineering
  • Implementing privacy-enhancing technologies
  • Common missteps in privacy engineering

In this episode…

Privacy engineering is an emerging discipline. What is the role of this profession, and how can companies benefit from its practitioners’ expertise?

Seasoned information security professional Aaron Weller explains the categories of privacy engineering, including user experience, infrastructure design, software development, and privacy-enhancing technologies (PETs). PETs are tools and techniques that help companies and individuals control and protect personal information: they can be used to encrypt data, anonymize individuals, and control access to information. Privacy engineers have various responsibilities, such as implementing systems that provide acceptable levels of privacy. Aaron advises that smaller organizations can integrate privacy engineering by educating their existing engineers and building privacy into their system development lifecycle process.
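To make the idea of a PET a little more concrete, here is a minimal, hypothetical sketch of one common technique, pseudonymization via keyed hashing. The function name, key, and sample values are placeholders for illustration, not anything from HP’s or the guests’ actual tooling:

```python
import hmac
import hashlib

def pseudonymize(value: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Unlike a plain hash, an HMAC cannot be reversed by brute-forcing
    likely inputs (emails, phone numbers) without the secret key.
    """
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# In practice the key would live in a secrets manager, not in code.
key = b"example-secret-key"

# The same input always maps to the same pseudonym, so records can
# still be joined for analytics without exposing the raw identifier.
token_a = pseudonymize("alice@example.com", key)
token_b = pseudonymize("alice@example.com", key)
assert token_a == token_b
assert token_a != pseudonymize("bob@example.com", key)
```

This is only one point on the PET spectrum; stronger guarantees (differential privacy, secure multiparty computation) involve far more machinery, but the trade-off is the same: keep the data useful while reducing what it reveals about individuals.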

In this episode of the She Said Privacy/He Said Security with Jodi and Justin Daniels, Aaron Weller, Leader of the Global Privacy Engineering Center of Excellence at HP, expounds on privacy engineering, PETs, and information security. Aaron discusses the integration of AI and privacy engineering, how companies can implement privacy-enhancing technologies, and offers advice to aspiring engineers.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.

To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.

Episode Transcript

Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:22

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:35

Hello, Justin Daniels here. I am a partner at the law firm Baker Donelson and tech lawyer. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback helping clients design and implement cyber plans as well as help them manage and recover from data breaches.

Jodi Daniels 0:59

And this episode is brought to you by Red Clover Advisors, where we help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there is greater trust between companies and consumers. To learn more, and check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, today is going to be super fun. I’m really excited for our guest. Today, we have Aaron Weller, who leads the Global Privacy Engineering Center of Excellence at HP, where he builds on over 25 years of privacy and security experience to deliver privacy enablement and experiences. And Aaron, it was so fun to see you in person last week at the IAPP PSR conference (we need some more acronyms). And it’s a delight to have you here today.

Aaron Weller 2:06

Thanks very much.

Justin Daniels 2:08

I wonder if I should have a LinkedIn poll that says if your spouse goes to a conference, should she bring back any swag for you?

Jodi Daniels 2:17

Well, we have quite a collection. So I didn’t think we needed any more. I was being very sustainable. Let’s see, I was helping the Earth. Our younger daughter would be very proud.

Justin Daniels 2:28

She would. So, sustainability over all else.

Jodi Daniels 2:31

Yeah. All right. Yes. That was what I went with.

Justin Daniels 2:34

Can I remember that for your birthday card? I’ll just have an electronic one. So sustainability doesn’t count for your birthday?

Jodi Daniels 2:41

No. Just conferences. Fair enough.

Justin Daniels 2:43

No. Selective. Alright. So Aaron, how did you get to where you are today?

Aaron Weller 2:50

Yeah, so it’s a long journey. As you kind of hinted at earlier, Jodi, I started off in information security, so I call privacy my third career. I really started off in audit and compliance, then got into information security probably around the late 90s, the year 2000. I started off doing ethical hacking, ran some forensics teams, and worked my way up to running security for a couple of companies in California. At one of those I inherited a privacy incident that turned into a class-action lawsuit. And although I was nominally responsible for privacy, I didn’t really have that much privacy experience, and we didn’t have anyone on staff who did. So that was really when I started to get more into the privacy space. I got my CIPP/US in, I think, 2007 or 2008, and then went to work with PwC, helping to build their privacy consulting practice. I was there for a few years, and then helped get eBay through GDPR in that crazy run-up to mid-2018. And then I co-founded and ran a privacy tech company for a few years; we sold that last year. HP was a client of mine, and when I was looking around for something else to do and having conversations with people, as you do, I talked to my friend, who was the chief privacy officer at HP at the time, and he said, hey, we’ve got a role that might be a good fit for you. So that’s how I ended up at HP.

Jodi Daniels 4:24

Well, I’m excited to hear a little bit more about that, because privacy engineering is a phrase we keep hearing over and over again. I’d love it if you could explain a little more about what a privacy engineer is, and also where they fit in the organization. Like, where do they tend to, you know, “sit,” quote unquote?

Aaron Weller 4:46

Yeah, so it’s a good question, and one that I think even the IAPP has struggled to answer. There was an infographic they put out earlier this year with something like eight or nine different flavors of privacy engineer, and it’s everything from user experience, infrastructure design, actual software development, privacy-enhancing technologies; there are a lot of different bits and pieces. And I think that’s partially why, when I try to hire for a privacy engineer, people come to that role with a whole range of different experiences as well. For me, there are a couple of main flavors. One is somebody who is fundamentally an engineer, so they know how to develop software or infrastructure. And for me there’s a split between people who are good at the plumbing kind of work, where you might have something like a data scanning tool that you’ve got to connect into systems and really respect the rules around what you can do with the data from those source systems. And then where my team is more focused is around privacy-enhancing technologies. So how do we build controls into where we already have the data, or where we want to use it for something we may not have permission to do, or we want to do it in a more privacy-protecting way? So, building those capabilities as well. Those are really the main engineering roles that I see: you’re either building the infrastructure or building the ways of manipulating the data.
But my team is a little bit broader than that. As you kind of hinted at earlier, I also have somebody who’s really focusing on how we enable the other engineers. HP, of course, has thousands of software developers and engineers. How do we get them enough privacy knowledge, or answer their privacy engineering questions in an efficient way? So, building out those channels and communities of practice, and the sort of guidance that takes “this is what you need to do: minimize the data” (okay, great, but what does that actually mean?) and turns it into a how-to guide. So, really enabling the broader engineering community. And then the piece I’m still looking at building out, and have done some preliminary work around, is how we think about privacy experience as a design pattern. When you look at all of the different touchpoints you have with somebody (everything from the privacy notice or privacy statement, to requesting a DSAR, to what consent or preferences you have), a lot of them, particularly in large organizations, are built by different teams. So how do you find someone who’s really got that design mindset, but also the privacy chops, to be able to say: we’re going to build this in a way that’s really transparent, that generates trust, that checks all the compliance boxes, but is more engineered than, I hate to say, slapped together? In a lot of the organizations I’ve seen, people just work on different pieces and don’t look at it as being one experience, because all of these are run by different teams, and there isn’t necessarily someone with that overall design view. So to me, that’s privacy engineering as well. That UX piece has actually been picked up in some of the material the IAPP has put out. But because those are such broad, different skill sets, you rarely get them all in the same person. That’s why it’s hard to give a really short answer to “what is a privacy engineer?”
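As a toy illustration of turning “minimize the data” into an engineer-facing how-to (a hypothetical sketch, not HP’s actual guidance; the purpose names and field lists are invented), guidance might show stripping a record down to only the fields a given processing purpose needs:

```python
# Hypothetical allowlists: each processing purpose names the only
# fields it is permitted to see (invented for illustration).
PURPOSE_FIELDS = {
    "shipping": {"name", "street", "city", "postal_code"},
    "analytics": {"city", "postal_code"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return a copy of `record` containing only the fields
    allowlisted for `purpose`, dropping everything else."""
    allowed = PURPOSE_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

customer = {
    "name": "Alice Example",
    "email": "alice@example.com",
    "street": "123 Main St",
    "city": "Seattle",
    "postal_code": "98101",
}

print(minimize(customer, "analytics"))
# {'city': 'Seattle', 'postal_code': '98101'}
```

The design point is that the allowlist, not each engineer’s judgment, encodes the privacy decision, which is the kind of reusable pattern a how-to guide can hand to thousands of developers.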

Jodi Daniels 7:57

I have a really strong passion for that design experience, and I’m excited to see where it’s gonna go in the next couple of years, because I think that’s a big opportunity that companies are missing. We spend so much time trying to get me to buy something in particular, and very little on the explanation of why you’re actually collecting this little piece of data. And I think that’s a misstep, because if you design it properly, to help explain why I should give you XYZ data, then that’s the opportunity: I’m going to feel more engaged with you, I’m going to trust you more, and I’m going to give you what it is you’re asking for, as long as you don’t ask me right away, the very first time.

Aaron Weller 8:37

Exactly. And, you know, when you’ve got people saying, oh, now we’ve got laws that say don’t do dark patterns, so dark patterns are bad, it’s like, okay, we eliminate the dark patterns, but we’re kind of missing that upside opportunity. And I think a lot of what I’m seeing in the engineering world is: let’s not do the bad things, but we can also do some good things. We can really enable; we can get more value out of the datasets that we have; we can do the same things with less privacy risk. So that’s where I’m really focused, on that ROI of: we’re not just a compliance function, we’re really an enablement function that uses engineering to be able to use the data in ways that are both protective and profitable, right? Because that’s important too.

Justin Daniels 9:21

Yes, I just find it interesting. When you said that, I remembered your first cup of coffee in San Francisco a couple of years back, or when I was asked for my cell phone number when I just wanted to valet my car.

Jodi Daniels 9:35

Yes, I do remember my cup of coffee. My very first engagement was: we’re so excited that you’ve come here, thank you so much, we’ve automatically added you to the list because you swiped your credit card using a payment system that already had your email address, and please give us your birth date. That was their first engagement with me. That didn’t go well.

Justin Daniels 9:51

So interestingly, I think of privacy engineers as usually being at bigger companies, because they can afford to have such a skilled person. So how does this work at a scrappy startup? How could they include privacy engineers?

Aaron Weller 9:51

Yeah, I think it’s a good question. And to me, you could look at degrees of privacy engineers, right? I have often said that one of the great things about working in privacy is that people come to it from very, very different backgrounds. A privacy engineer can fundamentally be a privacy person who can do some engineering, or an engineer who can do some privacy. And in a smaller organization, you’ve probably got a lot more engineers running around, so finding somebody you can give enough privacy knowledge, even if they’re primarily issue-spotting and bringing those things back to somebody who is maybe the only privacy professional, and getting something built into their system development lifecycle process, is probably a good place to start. That way you don’t have to go and find the unicorn: somebody who’s really a dedicated privacy engineer with a lot of background. Because it may be that you’re not trying to implement complicated privacy-enhancing technologies; you just want to know what people are doing with the data, and how they can avoid some of those basic mistakes, and then build from there.
