Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:22

Hi, Jodi Daniels here. I'm the founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:36

Hello. I am Justin Daniels. I'm a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels 1:00

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, this is fun. We're recording a podcast where I can still see snow on my deck.

Justin Daniels 1:38

Right, in Atlanta. Aren't we heading into, what is this, year five of podcasting?

Jodi Daniels 1:44

Well, from a year perspective, but if you counted, it's like 4.4, says the former CPA, okay?

Justin Daniels 1:57

You sure that's out to the right number of decimal points?

Jodi Daniels 2:00

You know, I was actually at dinner, and people were trying to come up with conversation starters, and someone suggested we could talk about pi, how many decimals it has, and which decimal place they wanted to go out to, and talk about those numbers. So, see, some people really want to do that.

Justin Daniels 2:20

Maybe that's when you needed your AI avatar to have that conversation on your behalf.

Jodi Daniels 2:25

Alright, we did not talk about that, but let’s get back to privacy and security fun. Today we have Aaron Painter, who is a deepfake expert. I cannot speak properly today.

Justin Daniels 2:37

I can’t wait to ask him about his deepfake expertise. We’re gonna have to delve into that.

Jodi Daniels 2:41

Alright. Aaron is the CEO of Nametag, Inc., an identity verification company that is at the forefront of stopping social engineering attacks at the employee IT help desk. And in case anyone was wondering, no, I'm not gonna edit out my fun over here. That is what makes our podcast fun and real. So Aaron, the deepfake expert, we are so glad that you are here with us today.

Aaron Painter 3:07

Thank you. I am excited to have fun with both of you. I feel like I have joined your dinner table conversation, and hopefully we’re gonna have some really good topics to cover.

Jodi Daniels 3:15

I think we are.

Justin Daniels 3:18

So Aaron, would you like to discuss with us your career journey to date, and include anything about deepfakes that you might have to share with our audience?

Aaron Painter 3:30

This is definitely a first; feet have not often been a topic of conversation on podcasts. Coincidentally, though, in the world of Gen AI, feet and hands are sometimes the things that are a little bit difficult for the computational power to put together. So maybe you're onto something. My background was heavily in tech. I spent 14 years at Microsoft. I started in product in Redmond, in the Seattle area, at Microsoft headquarters. I was the first product manager for Office as we brought together the individual apps of Word, PowerPoint, and Excel into "the system," as we called it then, and now 365. The rest of my career at Microsoft was outside the US. I ran elements of international expansion, opening Microsoft in 31 new countries, ran the Windows franchise on the ground in Brazil for a few years, and then ultimately ran Microsoft China for five and a half years, based two years in Hong Kong and three and a half in Beijing. So a lot of my career experiences were getting the lens of how people were using technology in enterprises as cloud technologies were booming and productivity tools were coming on the market in new economies that were trying to think about new ways to solve problems. I loved it. I left. I wrote a book. I then was hired by a private equity firm to sort of run a company that was based in the UK, focused on cloud migrations and helping large enterprises move to the cloud. I lived in London at a time that was just super exciting, but it was a ton of travel, and I decided just before the pandemic to move back to the US. I moved to Seattle about a month before the pandemic started and thought, okay, great, I'll just settle here for a little bit, and ended up spending quite a bit of time there. And it was in that experience of the pandemic landing and starting that I realized there were challenges in the market around identity verification, due to some really personal experiences. And that's eventually what led to starting Nametag almost five years ago.

Jodi Daniels 5:25

Aaron, identity verification is a really big challenge for companies, and it's also an area of vulnerability, and they're not really aware of all the different issues that exist today. So you were just sharing a little bit, and maybe there was a personal story. Can you share a little bit more about the issues that are here today, and what companies need to be mindful of?

Aaron Painter 5:47

Yeah, for me, there was an experience that probably many of us have had, which is, sadly, I had a lot of friends and family who had their identity stolen at the start of the pandemic. All of a sudden, their accounts were taken over, and they were getting these emails: you spent this money, and you authorized this and that. And I said, oh my gosh, it feels like the world is falling apart in so many ways, but we're going to fix this together. We're going to jump on the phone. I'm going to be a good friend. I'm going to be a good son. We're going to call these companies. We're going to sort this stuff out. And every time we called the company, they would ask us these sort of silly questions. And we all know them. We've been through them, these sort of security questions: you know, what's your favorite color? What street did you live on? What's the last four of your social? And it turned out that someone had found the answers to those questions and was able to answer them when they called before us, to do damage on the account. And so I said, how is it in this modern era that we don't actually know who's behind the screen, that we don't have a way to verify who someone is? There are different ways to do identity verification, and it turns out it's a very large category. Security questions are one, social security numbers are sometimes another, and then there are passwords or usernames or PIN codes. And it turns out there's been this market that's existed for quite some time around when you open a new bank account. There's a process called KYC, right, know your customer, typically for anti-money laundering requirements, where the bank or financial institution is required to plausibly have checked the identity of who someone is. Typically, that's done with a driver's license or passport. And so there's technology that developed to be able to do that experience of, hey, take a photo of your ID, take a photo of yourself, and we can do that remotely. But the technology that had existed for these had always been in web browsers. Turns out everyone on the market does this in a web browser, maybe on your mobile phone, but still in a web browser. And that technology meant it was good enough for regulatory compliance, kind of check the box, but it wasn't good enough for security. And that's one of the key reasons we found that when you call up tech support or customer support, they weren't asking you to scan your ID and take a selfie. The bank had you do that to open the account, but when you want to call and do a transfer, they ask you the security question, because the process that was built for KYC simply wasn't built for security. And that's what led us to try and find out: you know, is there a better way? Is there another way this can be solved?
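
To make Aaron's point about security questions concrete: knowledge-based authentication boils down to comparing submitted answers against stored ones, so an attacker holding leaked or scraped answers authenticates exactly like the real customer. A minimal sketch, with hypothetical field names rather than any specific help desk's system:

```python
# Minimal sketch of knowledge-based authentication (KBA).
# The check cannot distinguish the account owner from an attacker
# who has obtained the same answers from a breach or social media.

STORED_ANSWERS = {
    "favorite_color": "blue",
    "childhood_street": "elm street",
    "ssn_last4": "1234",
}

def kba_verify(given: dict[str, str]) -> bool:
    """Pass if every answer matches, regardless of WHO is answering."""
    return all(
        given.get(question, "").strip().lower() == answer
        for question, answer in STORED_ANSWERS.items()
    )

owner = {"favorite_color": "Blue", "childhood_street": "Elm Street", "ssn_last4": "1234"}
attacker = dict(owner)  # identical data, just in the wrong hands

print(kba_verify(owner))     # True
print(kba_verify(attacker))  # True -- the process can't tell the difference
```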

Jodi Daniels 8:12

That’s so interesting. I do hate those questions. I hate them every time. They’re so basic, and I think there has to be a better way.

Justin Daniels 8:20

Well, until there is, you should actually answer them dishonestly, so that people can't find out the information. But the problem with that is, where do you store all of those answers? So you solve one problem, but then you potentially create another. But you know, Aaron, one of the things I wanted to chat a little bit about was some of the things we're seeing now. You know, deepfakes, where you could have your public company CEO saying things that move the stock price that are completely false. Or authenticating the identity of people: hey, Aaron, I gotta send you a wire. Okay, we'll jump on a Zoom call, and everybody on the Zoom call, except for me, is fake. We've seen that in Hong Kong. So just, you know, give our audience a taste of some of the real challenging issues that companies are grappling with now when it comes to identity verification, and, you know, how your company helps to try to solve this problem.

Aaron Painter 9:26

Yeah, let's start with a little education, because I know you usually do a great job in the podcast of educating listeners on new tech and issues. So let's start with deepfakes. The term itself emerged around 2017-2018 from a user on Reddit who was basically referring to adult content created using celebrity lookalikes: take the face of a celebrity, mix things around, and you had a fake piece of multimedia content simulating the likeness of someone else. And that term has really run wild, particularly in the last couple of years, as we've seen the rise of these Gen AI tools, because Gen AI has made it easier to create these deepfakes that can then be used in a whole bunch of different aspects of society. One really big category is announcements and content: was this a video of the CEO, was this a video of this politician, impersonating the likeness of someone else doing or saying something? The other big category is people using deepfakes to impersonate someone for security purposes, and that's usually meant taking over someone's account: typically pretending that you are the rightful account owner and that, for whatever reason, you are locked out of your account. Let's say you've added multi-factor authentication onto your work account or your bank account, or you name it. You upgraded your phone. You've lost your phone. It's simply not working. Whatever it might be, as strong as that security is, the weakness is simply calling and saying you're locked out. Because if you call and say you're locked out, it's left to the hard-working person at the help desk to try and assess whether you really are the account owner. And so deepfakes are being used in both of those categories. The way a deepfake typically works is that there is software that allows you to create some sort of variation using Gen AI tools. A lot of it is extremely accessible, easy, and quite high quality today. You can create video deepfakes, voice deepfakes, and still-image deepfakes. All of it has gotten to the point where, in the category of voice, for example, five or seven years ago, if you read a specific script for 20 minutes, you could train a piece of software on what your voice might be like, using certain words and a lot of input. Microsoft researchers have now proven that with three seconds of someone's audio, you can recreate a voice deepfake of that person. It might not be super high fidelity, but sometimes super high fidelity isn't what's necessary to fool someone. So there are different types of deepfakes, and then there's the way they are deployed. No matter how you create them, in a digital context you're essentially flashing the fake you've made, and there are two ways that's often set up in a virtual process. One is what's called an injection attack. With an injection attack, the easiest way to think about it might be on Zoom.
On Zoom you can select a camera or microphone, and it is just as easy to select a piece of software that, let's say, is creating a real-time deepfake, something called an emulator. You're behind the camera, but this software is changing what you look like to look like someone else; you've simply selected a different camera on Zoom, and that piece of software is projecting the deepfake image instead of projecting you into the video call. That's the concept of an injection attack, and it's something that's gotten very real. The other concept is what's called a presentation attack, and that's simply when you hold up, let's say, a photo or a video, or you hold up your phone that's playing a video or showing a photo to the webcam, right? Or, Mission: Impossible style, you're wearing a mask. You're trying to trick the system by presenting what is otherwise doctored or fake information. Those are the two ways that people use or deploy deepfakes. It's interesting and important to understand how they're used, but also how they're created, voice and so on, and then how they're deployed or put into harm's way. And those are the two common ways.
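
To make the injection-attack idea concrete: from a verifier's point of view, a video call simply delivers frames, and a virtual camera fed by deepfake software produces frames through exactly the same interface as real hardware. A toy sketch of that indistinguishability, with hypothetical classes rather than Zoom's actual internals:

```python
# Toy sketch: a verifier that only sees frames cannot tell a hardware
# camera from an injected (emulated) source, because both satisfy the
# same "give me the next frame" interface.

from typing import Protocol

class FrameSource(Protocol):
    def next_frame(self) -> bytes: ...

class HardwareCamera:
    def next_frame(self) -> bytes:
        return b"pixels-from-a-real-sensor"

class DeepfakeEmulator:
    """Registers itself as 'just another camera' and plays synthetic video."""
    def next_frame(self) -> bytes:
        return b"pixels-rendered-by-gen-ai"

def visual_verification(source: FrameSource) -> bool:
    frame = source.next_frame()
    # All the verifier can inspect is pixel content; nothing in the frame
    # itself attests to which device (or software) produced it.
    return len(frame) > 0

print(visual_verification(HardwareCamera()))    # True
print(visual_verification(DeepfakeEmulator()))  # True -- indistinguishable here
```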

Jodi Daniels 13:20

Okay, well, that was really scary.

Justin Daniels 13:21

Well, I think we haven’t gotten to the really scary part, which is, what do we do to mitigate the impact and effect of these things?

Jodi Daniels 13:34

Well, yes, that is the next part that we should be talking about. So maybe we should start with what companies are doing today. You shared a little bit about that know-your-customer experience; not everyone is doing that, they're doing the password version. So maybe, Aaron, you can share a little bit about where we are today and really where we should be going.

Aaron Painter 13:56

Yeah, so much of what we've seen happen in the last 18 months is rooted in this particular attack at MGM. And poor MGM, they've just become the poster child of this, but it could have been many others. It got so big that 60 Minutes has done an episode on it; it's that kind of mainstream at this point. A bad actor called MGM, pretended to be an employee, and said, I'm locked out of my employee work account, my multi-factor authentication isn't working. The help desk rep did their best, had some questions, went through a process. Roughly eight minutes later, they reset access to that account, giving the bad actor access to that employee account. The bad actor then went in using that employee's credentials, impersonating them essentially, and caused a bunch of harm: deposited ransomware and took MGM kind of offline for two weeks. And this started sort of an epidemic of these attacks over the last 18 months, where hundreds and hundreds of companies of all sizes, including some of the very largest that we interact with on a daily basis, have experienced this type of attack, because it is, unfortunately, very easy to do. It is a human vulnerability, not necessarily a technological one. And so in light of this attack, the leading security company Okta, which actually helped protect MGM in the MGM case, had their CISO come out and say, the advice we recommend is that when a user is locked out of their account, you should do some form of what they call visual verification. And visual verification can mean, let's see the person who's claiming to be locked out in person: come into the office, come into the branch. Or maybe they're a remote worker, and so you could do a video call. You could have a Teams call or Zoom call with them, have them move around, maybe have them hold up their ID, maybe ask them questions, interview them a bit, and do this visual verification before you proceed with letting that person reset the account. It seemed like good advice. The challenge was that these platforms, like Teams, Zoom, and others, weren't really meant to protect against people using deepfakes. They weren't built to prevent injection attacks; selecting a different camera or microphone is a feature, right? That's what makes Teams and Zoom wonderful, but suddenly you could use that feature to cause harm. And so we then saw this very notable example at a large financial services firm. We've since learned a little bit more nuance, but it was reported that a CFO based in London contacted a controller based in Hong Kong. The CFO said, controller, I need you to do a few wire transfers for me. The controller was slightly suspicious. The CFO said, well, a bunch of the leadership team are on a video call, here's the link, why don't you join? The controller joined the call and recognized the faces and the voices of members of the leadership, but they were deepfakes: deepfake emulators being used in a video call. And so the controller, understandably, thought this was a legitimate request. Of course, everyone agreed with it. They went off and processed the $25 million in wire transfers, and it turns out that was a deepfake in a video call.
And so that sort of scared the whole industry, because we're like, oh my goodness, the solution we thought we could rely on, suddenly we can't really rely on. And that's led to this crisis of confidence now, and people saying, again, how do we remotely know who's behind the screen? Can we trust that person? And that really has become kind of a scary thing.

Jodi Daniels 17:11

So Aaron, tell us a little bit about Nametag and your approach to helping solve this problem.

Aaron Painter 17:19

We took an approach that said there's value in trusting credentials issued by governments, like a driver's license or a passport. We've all seen that in the real world, in the in-person world. But what we identified was that the weakness was how that information is captured and collected, meaning it's in a web browser. And so if you take the same experience and move it exclusively to mobile phones, and ask a user to use their mobile phone to scan their ID and to take a selfie, you can get a wildly different level of security assurance in that experience. And so we created identity verification technology that works only on phones, and in a really novel way in how it appears on your phone, but it takes advantage of the security features native to the mobile phone: namely, what's called the secure enclave, the cryptography element that makes a phone secure, and things like the three-dimensional Face ID camera, not for Face ID, but to take a selfie of a person. So you get a three-dimensional selfie, for example, of that person, things that all together give you a much higher level of assurance. So you could do this form of identity verification in a remote way. And we started with this, and we said, all right, we have this really novel way of doing it. And, you know, gosh, like you mentioned with the mission of Red Clover, there's a sense of a crisis of trust; you don't know who you can trust. We can really solve that on social media platforms and other things. And then we got pulled by the market to say, yeah, that's all nice, but I have a very specific problem. I have users who are calling my help desk and claiming to be locked out. Can you solve that? And so we started working with some large enterprises to say, hey, let's give your help desk agents a tool they can use, instead of asking security questions, to send a verification link that can send back a high-assurance outcome on who the person is behind the screen. That worked really well. And then they said, hey, by the way, half of our support desk calls, coincidentally, are people who are locked out. Wouldn't it be better if they didn't have to call the help desk? And so we created a self-service way, first for employees and then eventually for customers, to be able to reset themselves when they are locked out of MFA. For a long time there was sort of a Forgot My Password button on a web page, but when you added MFA, that button went away. And so being locked out of MFA meant you had to call the help desk, and we brought back a self-service way for people to do that in a high-assurance way, such that you didn't have to call the help desk. And for us as a company, that really became kind of our turning point moment. And now we're seeing these other use cases that companies we work with are bringing to us, saying, oh, can I also use it here or there? But that self-service MFA functionality and equipping help desks has really become our bread-and-butter use case of the identity verification that we do.
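
Aaron doesn't spell out Nametag's internals, but the shape of the check he describes, combining hardware-backed cryptography, a three-dimensional selfie, and document validation, and failing closed when any signal is missing, can be sketched. The signal names and threshold below are hypothetical illustrations, not Nametag's actual product or API:

```python
from dataclasses import dataclass

@dataclass
class VerificationSignals:
    device_attestation_ok: bool  # cryptographic proof from the phone's secure enclave
    depth_selfie_present: bool   # 3D capture defeats flat photos and screens
    face_match_score: float      # selfie vs. ID portrait, 0.0 to 1.0
    document_valid: bool         # security features on the scanned ID check out

def decide(sig: VerificationSignals, threshold: float = 0.90) -> bool:
    """Fail closed: every independent signal must pass."""
    return (
        sig.device_attestation_ok     # blocks injection attacks
        and sig.depth_selfie_present  # blocks presentation attacks
        and sig.document_valid
        and sig.face_match_score >= threshold
    )

print(decide(VerificationSignals(True, True, 0.97, True)))   # True: verified
print(decide(VerificationSignals(False, True, 0.97, True)))  # False: injected feed
```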

Jodi Daniels 19:55

I know a particular large company that will remain nameless. I had MFA on an account, and their MFA broke; it wouldn't work, and I just went in a circle. I could reset the password, but I couldn't get into the account. It is such a large company that there is no human to talk to, and I basically had to shut the whole account down. With this sort of lovely technology, I could have avoided that huge hassle. But at the same time, Aaron, I really like how you're listening to your customers and hearing the real-world problems you're seeing, to be able to tackle them. I think that's incredibly smart, and I can only imagine what new problems we're going to run into next.

Justin Daniels 20:37

It’s funny you say that because I view Aaron’s story a little differently.

Jodi Daniels 20:44

Of course you do. That's why I'm the privacy person. You are the, you said...

Justin Daniels 20:47

She said? No, that's why you're the she said, and I'm the he said.

Jodi Daniels 20:50

Well, there’s that too.

Justin Daniels 20:51

What Aaron said, I guess. Aaron, what I found interesting about your story is you have a company, you think you have a problem that you're trying to solve, and it might make sense, but it may not be the biggest pain point that your prospective customers are trying to solve. They're like, that's interesting, but this is really what my problem is. It's almost as if your customers did the product development for you and just said, hey, can you help solve this? And you're like, well, we do this, but if you want us to solve that, okay, we'll do that.

Aaron Painter 21:24

100%. Everything that we have found commercial traction in, and humbly, it's become very significant commercial traction, has been because we've listened to our customers on what to go build. And the fun part is, I'm traveling today; we're doing kind of a product summit with the team, and we're getting so much great feedback on all the other areas where identity verification, or the lack of it, is causing this absence of trust, and this, oh, can I use it here? Can I use it there? And we're using that input and that feedback to say, hey, let's co-create it together. Let's go build other out-of-the-box ways for you to be able to use it in new scenarios, to use identity verification to stop fraud in other places. And so for us, privacy is a whole thing we can talk about, but there is an element of doing the right thing and doing good in protecting people's accounts. Fundamentally, if our accounts are where so much of our data lives, protecting people's accounts, we feel, is also a fundamental way of protecting their privacy, let alone the way that we actually go through our process, which is itself privacy-preserving.

Justin Daniels 22:22

So one of the things you remarked on was how you're using the camera for the face, but you also talked about using a passport or a driver's license, which, you know, can potentially be fraudulently manipulated. So my question is, what do you see as the vision going forward? Will people continue to use passports or driver's licenses? Do you think we're going to have a situation where it'll be some kind of digital token or something else that you'll pivot toward for the identity verification?

Aaron Painter 23:01

Yeah, I think we found some remarkably unique ways to be able to trust the documents that are issued today. And that was kind of an important criterion, because there are often interesting initiatives. There were for a while in crypto with decentralized identity, and there are some states, for example, in the US doing digital identity programs. All of that is super interesting, and we follow it extremely closely and work with many of them. The challenge, though, is that you need an element of universality. You can't call that customer support line, have them send you a link, and have it only work with Android, or only work with iOS, or only work if you have the decentralized crypto credential. It's got to work with something that people have. And so we had to find a way to use modern technologies to work with that thing that many people have in their pocket. So for us, the underlying technology is what we, sort of catchily, call Deepfake Defense. Deepfake Defense has a few components. One is using the cryptography in the device, and it turns out the ability to prevent injection attacks is now very useful against deepfakes. But prior to that, you're right, many people do make fake ID documents, but the main way they made them was digital. They would modify, let's say, a PDF of a passport, and in KYC tools there's literally an upload-PDF button. So if you have made a doctored PDF, you save the PDF, you literally just upload your fake thing, and you're all set. And so 80 to 90% of the way people make fake documents is actually just on screen. Then you get to the next level of, hey, I'm going to make a physical fake and print it out in a really high-fidelity way; that exists. But then we analyze that, and we get to look at a whole bunch of advanced data from that phone, in three-dimensional space. We call it adaptive document verification. We're getting a lot of data and very high-fidelity photos when we're looking at those images, right? So if someone has an injected, manipulated PDF, we're getting very high fidelity on the document itself, and then we get these really advanced selfies, like you might get if the person was in person. So that's kind of how we use technology to solve it. There are programs that are super cool. I mean, India is kind of a role model. India has a nationwide digital identity program, called Aadhaar, that is a gold standard for many emerging markets around the world. We have some really, really cool integrations with Aadhaar, because we're able to trust what the government sends us digitally from that user. But unfortunately, again, that's universal mostly in India, not globally.

Justin Daniels 25:23

Well, I wish I had a brunette wig, because I'm gonna put on my Jodi hat. Maybe I'll have to have that at some point this year; that would be fun, I can only imagine. But I guess, Aaron, I wanted to ask you a follow-up, and like I said, I'm going into Jodi's part about privacy. In order for you to do your verification, you're collecting a driver's license, which has some highly sensitive personal information on it. So I guess that means you have to have a pretty robust privacy program. Or are there ways that you sandbox or otherwise minimize what you need from that driver's license?

Aaron Painter 26:00

There are many elements of privacy. And, you know, I was joking before we came on, because I was telling some of my colleagues I was coming on the show, and they were so excited, because all we think about is privacy and security; this is the intersection of what we do every day. Privacy for us is a fundamentally important concept, and it's fundamental for the end user themselves as they're going through that process. How can they trust? How can they explicitly consent? How can they self-revoke? We are not an opt-out model; we are a proactive opt-in model. One of the practical examples might be, you know, when you go to a bar, and there's a bouncer at the door, and in the US they need to know you're over 21, right? They're trying to verify, are you over 21? They don't need your home address. They don't even need your name, right? But yet we give them all the information at once. It turns out they might look at it; they also, increasingly, scan it. That's a whole other story on where all that data goes and how it's used, but it's way more than they need. That is oversharing. And so one of the fundamental constructs that we had from day one we called privacy masking, which is this ability to limit what you share. Even though you have scanned your ID, because we are checking to make sure it is valid, it doesn't mean that you need to share all that information with the company. In fact, you can share very limited elements of it, and in some cases, the company doesn't ask for any of it. And so that's a core element for the end user: multiple layers of consent. They get to choose if they want to do this. They get to consent to specifically what they're sharing with a specific company, and for how long. Privacy is critical there. It turns out, though, as we entered the workforce space very fully, we learned that part of what it meant to be workforce-grade was that enterprises have a whole bunch of different privacy requirements and a spectrum of choices they often want to make as a company. And so we offer an enormous amount of granularity in who holds the data, where the data lives, how long you hold the data, and whether the employer governs it for the employee or the employee governs it for themselves, all these bits of nuance. We have defaults that are sort of industry best practices, but we like to give enterprises the ability to adjust those to what's best for their demographic or user base.
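
The bouncer example is essentially selective disclosure: the verifier inspects the full ID but releases only the derived claim the business actually asked for. A minimal sketch of that privacy-masking idea, with hypothetical field and claim names rather than Nametag's actual implementation:

```python
from datetime import date

# Full record extracted from the scanned ID (held by the verifier, not shared).
id_record = {
    "name": "Jane Example",
    "dob": date(1990, 6, 15),
    "address": "123 Any Street",
    "id_number": "D1234567",
}

def is_over(dob: date, years: int) -> bool:
    """True if the person is at least `years` old today."""
    today = date.today()
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= years

def masked_claims(record: dict, requested: list[str]) -> dict:
    """Release only derived claims the business asked for, never the raw record."""
    derivations = {"over_21": lambda r: is_over(r["dob"], 21)}
    return {name: fn(record) for name, fn in derivations.items() if name in requested}

print(masked_claims(id_record, ["over_21"]))  # {'over_21': True} -- no name, no address
```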

Jodi Daniels 28:06

Aaron, with everything that you know, I can only imagine your cocktail party conversations and people may be asking, Well, what can I do to protect myself? So what is your best personal privacy tip?

Aaron Painter 28:20

You know, I take advantage of knowing that there is a trust deficit with many companies today. Most companies don't know who the person is behind the account, right? Even in good scenarios, like I'm locked out and I want to get back in, they don't know who I am. And so one of the things I do is I don't necessarily trust that my username or my email address or the name on that account should really be who I am. If you think of the olden days, you might have made a reservation and given a different name, maybe in case you didn't show up, or because you wanted to be anonymous. I take that to an extreme, personally. I use different names, or not my name, on many, many accounts: my rideshares, anywhere where I know the system isn't going to actually ask for who I really am. Maybe they don't need to know who I really am, because I don't necessarily trust them with that data or the connections they're going to make based on it. And so, in a very practical sense, something I do a lot is use alternative names in many aspects of my personal life.

Justin Daniels 29:20

I've heard many others say the same. Indeed. So, Aaron, when you're not helping people protect their security and their privacy, what do you like to do for fun?

Aaron Painter 29:34

Startup life initially, and now kind of scaling-company life, is pretty consuming for me. So I love that, but I also really love travel, and traveling and living and working in different parts of the world has been a big part of who I am and my personality for as long as I can remember. So I'm very frequently in new places, new countries, exploring new cultures and ways of doing business. It was one of the things I loved most about my time at Microsoft, and it's just a big part of my life.

Jodi Daniels 30:05

Is there a big next destination, or maybe a few destinations that are on your horizon?

Aaron Painter 30:11

You know, I like less frequently traveled places. I had a goal, which is kind of crazy, to visit 100 countries earlier in my life, and I kind of passed that, and then I sort of stopped counting. I'm very interested in some parts of Eastern Europe and Central Asia, sort of between East and West, like the Stans: Uzbekistan, Turkmenistan, for example. I'm fascinated with starting to look at a trip to Uzbekistan in particular. And, you know, I've only been to, I don't know, 20 to 25% of Africa; there are many more places in Africa to go visit. Those are the kinds of things that I just love exploring.

Jodi Daniels 30:53

Well, thank you for sharing. We look forward to maybe seeing some pictures of some of your favorite places. Where could people go to learn more?

Aaron Painter 31:01

You know, LinkedIn is great. It's probably my most frequent platform. I post a lot of content, and as a company we try to post a lot of relevant content to keep people current on what we're learning in the market, what we're seeing. It's a great place to engage, and I love feedback. So when you're finishing your commute or mowing the lawn or doing chores around the house, and you get back to your computer, I'd love it if you reached out.

Jodi Daniels 31:24

Amazing. Well, Aaron, thank you so much for coming and sharing. We really appreciate it.

Aaron Painter 31:31

Thanks for having me. This was kind of fun.

Outro 31:37

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.