
Intro  0:01 

Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and a Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.

Justin Daniels  0:36

Hi, Justin Daniels here, the admonished husband. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:55 

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. Mr. Justin, who do we have with us today?

Justin Daniels  1:35

I look forward to introducing our guest so we can take the focus off of my mistakes or perceived foibles and put it into a great conversation with today’s guest.

Jodi Daniels  1:46  

That sounds fabulous.

Justin Daniels  1:49  

So, as a successful female tech startup founder, Lisa Thee built a safety-focused software startup and helped scale a leading cybersecurity AI company to a $40 million valuation. And now, less than two years later, she’s leading Launch Consulting’s Data for Good practice. Keep an eye out for her TED talk on the topic of using technology to disrupt human trafficking in the digital age. Welcome, Lisa.

Lisa Thee  2:19  

Thank you, I’m so happy to be here.

Jodi Daniels  2:21  

It is a pleasure to have you. As a fellow TED talker, you and Justin will have lots of TED talk comparison notes to chat about.

Lisa Thee  2:31

I’m looking forward to it. I’m fresh off the experience, so I’ll be interested to see how his experience ages over time, and whether the stories get better or worse as they get older.

Jodi Daniels  2:43  

Well, that might be better saved for cocktail hour. So to get us started: we always like to understand how people found their path to what they’re doing today. Walk us through a little bit about what you’ve done career-wise.

Lisa Thee  2:58  

Sure. So I came out of college with an engineering background, and I moved to California right before the dot-com bust to go try my luck in technology. I grew up in Michigan, near Detroit, so the expected path was to get a degree from the University of Michigan and get a job in the automotive industry, one day get married, have 2.5 kids, and maybe, if you’re lucky, have a lake house up north. And that was the definition of…

Jodi Daniels  3:24  

The white picket fence and the dog running around. Yes, yes.

Lisa Thee  3:27

So with that in mind, technology was very new to me. I had the opportunity to start with a large multinational tech company right out of college and try on a lot of different hats there, which was wonderful. Fast forward 18 years, and I was the director of our AI Solutions Group at Intel Corporation, working on our AI for Good practice, helping to disrupt the problem of human trafficking in partnership with some of our tech partners, like Google and Microsoft, as well as with nonprofit partners like Thorn and the National Center for Missing and Exploited Children. After having that experience, it was hard for me to go back to selling technology for technology’s sake. So I decided to try my luck as an entrepreneur after 40, which is probably not the most traditional path. But with that in mind, I was able to raise a seed round to focus on improving outcomes for kids in online safety, specifically around the area of child sexual abuse material. In working with those partners at my day job, we learned that 40% of all the images reported to law enforcement by large tech companies that need to be followed up for prosecution are actually generated by children themselves. And we wanted to make sure that law enforcement had the time to focus on the prolific cases involving predators abusing hundreds of children at a time, versus getting into adolescent curiosity: silly things we probably would have done if we had cell phones that could take those kinds of images when we were growing up. There’s a big difference between adolescent curiosity and the coordinated criminal behavior of adults who prey on children, so we wanted to take the signal out of the noise and reduce the caseload. Last year, there were 65 million images and videos reported to law enforcement for follow-up.
That’s an over 4,000% increase in the number of reports since 2013, and we needed to make sure we started helping law enforcement be more successful in following up on this work and getting the signal out of the noise. So I launched my company, Minor Guard, with a friend of mine who was a leader at Apple. We both ditched our day jobs, decided to leverage some of the technology and innovation in the iPhone X, which has an AI accelerator chip directly on the phone, and looked at how we could use that to make sure a kid couldn’t actually save their bad-choice picture down to their device and into the cloud. Because we knew that once it gets on the internet, you effectively can never get it back. So that’s what we focused on. In 2018 we exited into a larger player in the market with our technologies, and after that experience, they took our roadmap back into Apple. And from there, I am now, accidentally, a consultant.

Jodi Daniels  6:14  

That was a really fascinating story. I say this every time, because I’ve always been genuinely curious about how people find their path and how they start companies. I started my company, not too dissimilar from you, four years ago. So it’s always really fun and interesting to meet fellow entrepreneurs, and welcome to the consulting world.

Lisa Thee  6:35 

Thank you. Yeah, it’s been really fun to be the bridge between the large enterprises I worked in for almost 20 years and the entrepreneurship space, because oftentimes they don’t really know how to talk to each other, but they have goals that are mutually beneficial. And so it’s fun to be able to create the magic at the seams, where they can both innovate together and be highly successful in addressing mutual challenges. I do that with some of the largest tech companies in the world, as well as some of the largest research hospitals in the world, on how to implement and bring AI into real-world applications that drive positive social change.

Jodi Daniels  7:08

I think we want to dive right into that.

Justin Daniels  7:11 

Why don’t we start with this: how does AI help protect children online?

Lisa Thee  7:18

So there are a lot of ways that AI is used in digital safety. I think the primary way I see it being used most effectively is not to replace the humans who work on the frontlines of protecting kids; it’s to accelerate some of the mundane and tactical work that’s required for them to get their job done. So let me give you an example of that. Say I’m a detective at a local law enforcement agency, and I have a report that a child is missing, and there’s a suspicion this child is likely being trafficked. 75% of human trafficking victims are sold online, so the first thing I’m going to do is go to the online ad sites. It used to be that backpage.com was the primary place you’d go; they’ve been shut down through the AG’s office, but that’s a whole other story. So now there are a bunch of others to look through. What are machines great at? Tons of data, pattern recognition, finding the signal in the noise. So one of the ways we’ve implemented AI to help with this problem is to help them use more modern techniques for facial recognition matching. An example of this is, you know, if I’m scrolling through ad after ad after ad, a lot of it’s just dead time looking at things that are clearly not going to be a match. But what if a computer could be used to do nearest neighbor searches, so that as you’re looking at photos of who you’re looking for on the left side, and of who it might be on the right side, you could organize all the photos on the right side? And what is that, 100,000, 200,000, 300,000 photos a day uploaded? It’s literally that many for these escort ads. Organize them so the closest matches are most likely going to be in the top row of my search. So now I’m spending my time recovering that child, building the case against the trafficker, and moving on to getting more victims recovered, versus scrolling page after page after page, trying to find a needle in a haystack.
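The nearest-neighbor ranking Lisa describes can be sketched in a few lines. This is a simplified, hypothetical illustration, not any vendor’s actual system: real deployments use learned face-embedding models and approximate-nearest-neighbor indexes over hundreds of thousands of photos. Here, each photo is assumed to already be reduced to a small embedding vector, and candidates are ranked by cosine similarity to the query face:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_candidates(query, candidates):
    """Return candidate indices ordered from most to least similar,
    so the likeliest matches land in the top row of the review queue."""
    scored = [(cosine_similarity(query, c), i) for i, c in enumerate(candidates)]
    return [i for _score, i in sorted(scored, reverse=True)]

# Toy 4-dimensional vectors standing in for real face embeddings.
query = [1.0, 0.0, 0.0, 0.0]
candidates = [
    [0.1, 0.9, 0.1, 0.0],  # poor match
    [0.9, 0.1, 0.0, 0.1],  # close match
    [0.0, 0.0, 1.0, 0.0],  # unrelated
]
print(rank_candidates(query, candidates))  # closest candidate (index 1) first
```

In a production system, the brute-force sort would be replaced by an approximate-nearest-neighbor index so that hundreds of thousands of new ads a day stay searchable, but the ranking idea is the same.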

Jodi Daniels  9:24 

What are some of the challenges you’re seeing from a privacy and security perspective? As a parent, I hope I never need this service; it’s terrifying to hear this story. However, I would want every tool in my tool belt to help solve a case like this, and I would want that for any parent. At the same time, you have this challenge, this undercurrent of privacy and security issues around where some of those photos are coming from and how you’re able to utilize the technologies to sort through all of that. Can you walk us through a little bit about what some of those privacy and security challenges are, and how they intersect with AI?

Lisa Thee  10:11

So the case that we just talked about is all searches on publicly available information. Typically, those posters are going to be missing children’s posters that used to be on the back of milk cartons but are now on missingkids.org through the National Center for Missing and Exploited Children: publicly available information. And the escort ads are, frankly, advertising. You can go to any website and see them; they’re all publicly available today. So you’re not doing anything that’s private; you’re just indexing publicly available information to find a match, so that you could set up a fake date with that escort and actually recover that child. So I don’t think you have a lot of privacy issues there. Some of the places I’ve seen AI used on the parent side of the house, for preventing your child from being in that situation in the first place, are kind of interesting. One of the companies I had the chance to work for for a while was Bark Technologies. They are an online social media monitoring company. Being a business-to-consumer product, they have to follow the laws of COPPA, the Children’s Online Privacy Protection Act, and they have to follow FERPA, because they have some school-related products. So they’re thinking through this a lot. And from that perspective, what I’ve seen as industry best practice is companies that have a direct relationship with the parent, where they have the ability as a third party to monitor and then give the information to the parent to take action. That’s a great way to deal with some of those privacy concerns, because when you do AI the right way, you actually can afford your child more privacy, not less. And the reason for that is you’re no longer grabbing the phone to do the spot checks, which, let’s be honest, half of us don’t even know what the tween slang is anymore at any given time.
So even if you’re reading it, you might not even understand what they’re saying. Take “Netflix and chill”: that one might have a different meaning for you and me than it does for tweens and teens. To kids, it means having sex. To me, it means sitting on the couch, having a glass of wine, and watching a movie. So even though you’re reading the words, you might not understand the context anyway. Having well-trained AI that understands current terminology about how tweens and teens talk is really a very positive step forward, in terms of only looking at the things you need to know about, in context, in your children’s online social direct messages, not everything they do. They can earn as much privacy as you can give them, while you prevent them from going from the affordable mistakes that are normal kid-development stuff into the life-altering mistakes like suicidal ideation, child sexual abuse material, and violent extremism, the kinds of things you don’t recover from if you make a bad decision.

Jodi Daniels  13:04 

You mentioned earlier the AI piece as it related to the actual physical device, the iPhone. Can you share a little bit more about that?

Lisa Thee  13:13

Yeah, so obviously, I don’t work for Apple, nor can I speak for Apple. Some of our innovation inspired some of the things that are happening today, but I don’t want to take credit for it. With that in mind, having the AI accelerator chip on the phone is really so that you can unlock the phone with facial recognition, right? So in order to take advantage of that, what we could do at the camera level is understand that a phone is registered to a child on a family iOS account: you know this is a child using this phone, and if an image is explicit, do not save it. It’s really as simple as that.
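The camera-level check Lisa describes boils down to a simple gate: block the save only when the device is registered to a child and an on-device model flags the image as explicit. A minimal sketch, with a placeholder classifier standing in for the real on-device neural network (all names here are illustrative, not Apple’s APIs):

```python
from dataclasses import dataclass

@dataclass
class Device:
    registered_to_child: bool  # set via family account configuration

def looks_explicit(image_bytes: bytes) -> bool:
    """Placeholder for an on-device image classifier.

    Here we pretend images carrying a marker prefix are explicit; a real
    system would run a neural network on the device's AI accelerator.
    """
    return image_bytes.startswith(b"EXPLICIT")

def allow_save(device: Device, image_bytes: bytes) -> bool:
    """Permit saving unless the device belongs to a child AND the
    classifier flags the image."""
    return not (device.registered_to_child and looks_explicit(image_bytes))

print(allow_save(Device(registered_to_child=True), b"EXPLICIT data"))   # blocked
print(allow_save(Device(registered_to_child=False), b"EXPLICIT data"))  # allowed
```

The point of putting the check on the device, before the photo ever reaches storage or the cloud, is that an image which never persists can never leak.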

Justin Daniels  13:46

Okay, very helpful. Thank you for sharing. Now I want to switch gears and talk about your TED talk. I’m wondering if you could tell us a little bit about what it was about, and share what that experience was like, because it’s really the Top Gun of public speaking.

Lisa Thee  14:09

Thank you. Well, I had the opportunity to speak at TEDx Roseville on August 7 of this year. It was a long time coming, because I was selected to speak in 2020, and obviously a few things happened in the world that delayed in-person gatherings. So I had a lot of time to think about it. To be honest with you, my speech would have been very different if it had happened in 2020. My topic was bringing light to dark places online: disrupting human trafficking using artificial intelligence. It’s basically a capstone speech about the transition from corporate America into entrepreneurship and into consulting in digital safety, in order to make sure that we all rise to the occasion and do better for this generation of kids, because they are dealing with a lot of things that we just didn’t have to manage at that young an age. With that in mind, my speech in 2020 would have been really centered around educating parents about what they could do to better protect their kids online, and helping them remember that they are the gatekeepers of their children’s lives. Their phone is not a diary; it should not be expected to be private from parents, and a few more of those kinds of conversations. But with what’s happened in the media, with all the political drama out there, there are a lot of bills on the floor right now, both domestically and internationally, around regulation of technology companies. And so, as a result of the talk getting deferred, it gave people the chance to catch up on understanding the concepts of content moderation and the role that government should be playing in regulating it. So I really wanted to use this speech to start a movement and activate change.
And the change that I want to see out there in the world is the implementation of the Department of Justice task force’s digital safety standards recommendations. The DOJ convened industry leaders, victims of these types of crimes, nonprofit experts in this area, law enforcement, everybody, looking at it holistically, and came up with a set of recommendations for modifications to Section 230 to protect the digital safety of children. My goal is that any modifications to Section 230 that ultimately happen (and there are over 10 different perspectives from different bills about what should change there) include those Department of Justice recommendations. So we’ve started a petition on change.org requesting that that happen, so our Congresspeople understand that this is our expectation. Because at the end of the day, companies are not going to be able to make a meaningful impact on this problem unless there’s regulation involved. It’s the same thing we saw with privacy and cybersecurity: it takes some kind of regulatory body to help companies understand what the bar is, so that everybody is playing on an even playing field.

Justin Daniels  17:12 

So I want to talk about that for a second. On this show, we talk often about a federal privacy law, a federal cybersecurity law. And, you know, Section 230 has been in the news a lot. So from your personal perspective, what do you think the chances are that some meaningful reform of Section 230 will be discussed and passed by Congress?

Lisa Thee  17:41  

I know what I hope; unfortunately, I’m not sure that aligns with what I expect. I think right now this has become a very polarized political issue, about topics that are very personal to people and almost identity-driven. What makes me really, really sad in this is that the changes that need to happen to protect children online are agreed upon across both parties. Nobody wants rape videos of eight-year-olds circulating online with no consequences. It doesn’t matter if you’re Republican or Democrat; nobody wants this. But we haven’t put the regulations in place to empower the government to make sure that technology companies are taking this down in a timely fashion. I am very concerned we’re going to get stuck in the rhetoric of political pandering and power moves between parties, and not focus on what we can all agree on, which is that we should not be leveraging our future so that we can have liberties online. The things we want to regulate in digital safety online are things that are clearly illegal in the real world; nobody would let this stuff happen there. And so I’m concerned that it’s going to be used as a distraction, a tool, a political movement. Frankly, all that stuff in the news I don’t care about all that much. Of course I care about disinformation and misinformation and the division of the social fabric, but I’m not sure tech companies are really going to be able to solve that, because that’s been proliferating for a long time. What I do care about is that when we do make those modifications, we pass the things we all agree on and make sure there is at least some positive movement forward. Because at the end of the day, nothing’s ever going to be perfect, including the internet, nor do we expect it to be. But we can do a heck of a lot better than 65 million images and videos circulating every year of the abuse of children.

Justin Daniels  19:55 

So Lisa, is it fair to say I could summarize one of your concerns this way: there are things that all of us can agree on, but for political or personal agendas, Congress would rather argue about things at the fringes to help raise money or score points than actually solve something we all agree needs to be dealt with?

Lisa Thee  20:19

That’s a perfect way to say it, with one exception: I don’t think Congress members are even educated enough about this to know that it’s something we all agree on. And that’s what I’m hoping to accomplish with the petition. Everybody’s focusing on the extremes; let’s solve the really aligned, data-driven things that we can do to improve outcomes for everyone.

Justin Daniels  20:43 

So, one last question on this topic: how can we collaborate better as a society to protect children online?

Lisa Thee  20:52  

Yeah, so I think that happens in three different ways. The first way: if you happen to be a guardian of a young person, be really thoughtful about when to give first phones, when to give social media access, and how you monitor that once they have it. Now that we are generation after generation into just handing over devices, it’s pretty shocking to realize that most kids have an unlocked smartphone in their hand by 10 years old. Ten, with access to everything on the internet, and most people aren’t locking those down. So with that in mind, I think it’s important to consider purchasing a phone that’s actually intended for children, not a phone that’s intended for adults. I think there are some great options on the market now, and I’ve got some listed on my website, things I’ve given to my own family. Obviously, that’s not the only thing that’s out there, but being an expert, I see what’s in the industry, and this is what I feel most comfortable putting my name on. Secondarily, I think it’s really important to remember that, as the adults in this world, we are the trusted safe harbor for children. It is incredibly important that you help them feel comfortable, so that when they see something online, or have something shared with them online, that makes them feel uncomfortable, they have a safe and trusted adult to talk to. Kids are super resilient; they can come back from almost anything, but they need the guidance of trusted adults in their families and in their communities. And if they’re scared you’re going to freak out and take their phone away, they’re never going to tell you when things are going wrong. And trust me, if they want context for something they’ve seen and you’re not willing to be a safe person to give it to them, there are plenty of people on the internet who are more than happy to explain all the things you feel uncomfortable talking about with your child.
And it’s not to the benefit of your child that they’re willing to do that. Last but not least, and most important, I think, is all of us collectively rising together to expect more from our government’s regulation of tech. I don’t think anybody in tech wants to have their platforms abused in this way. But in order to get the right focus from the executive suite and the boards, to get the right resources in place to manage this properly, and to have standards to align with, we need better regulation in this industry. It’s been 20 years. There weren’t seatbelts in the first car either, right? There had to be enough vehicles on the road, and enough accidents, before it made sense to mandate safety regulations for cars. I have to wear a seatbelt even if I’m a good driver; it’s just part of the requirement. We’re getting to that place of maturity with Section 230 and all of this third-party content: we’re having enough abuses of the system that we need to take the next click up, to be a little bit better for our kids.

Jodi Daniels  23:46 

So on the notion of the tech companies and their involvement: you mentioned we need regulation to help pressure the executive suite. I was wondering if you can share where they are in the collaboration as it relates to this particular issue?

Lisa Thee  24:06  

Well, I can’t speak on behalf of the executive suite of the largest tech companies in the world, but I can speak from running my own company, which is this: as you’re running your own business, you have to look at a few different things. Typically, you have investors who are expecting a strong return on their investment. They want you to not get sued for something. And they want you to create a product that people love and want to keep using. On the first front, I am not incentivized to de-risk my platform on something that’s not regulated. I can go make more money by not stopping bad behavior, not looking for bad behavior, not removing bad behavior, because all of that costs people, time, tools, and policy. It’s overhead. Secondly, on the de-risking-the-platform front: if I’m spending money to develop AI tools and content moderation and all the other things that need to happen to police it all, so to speak, and my competitors aren’t doing that, I’m not going to look as advantageous to my shareholders, who frankly don’t understand this level of detail, right? This is a really, really niche thing that I work in. I don’t know how I accidentally came into this field to begin with; frankly, I just wanted to help more kids and marginalized women not be abused, after traveling internationally in business hotels for a lot of my early career. So unless people are willing to spend the next five years in the, you know, garbage can of the internet like I have (which, frankly, I don’t recommend; I’m not nearly as fun at dinner parties as I used to be), it’s just not intuitive to understand the implications of all these decisions. And last but not least, the percentage of the population doing bad things online is not high. So it’s easy, when you have a list of 100 things on your agenda that you have to accomplish, for this to be number 52, and you never get past number 26.
It’s not that companies don’t care about this. I just don’t think that, without any kind of regulation or financial incentive to act on it, it gets above the line of things that actually get accomplished.

Jodi Daniels  26:22 

I can appreciate all those business points, and as a business owner advising companies, I see that all the time. At the same time, companies often have particular missions; they have reputations and risks and things that they stand for, and community building, and elements like that. So, you know, as a stockholder of many of these companies, as a user and customer of many of these companies, and knowing where some of their platform issues are, I’m hopeful that they are going to be, if not already, a part of the conversation. And so I suppose I would encourage anyone who’s listening here, and those that I speak to, to communicate and share their voice up to them. Because if we only rely on regulation, we have no idea how long that can take. Often there’s movement at a grassroots level that can push up. And I do believe that many of these companies care not only about profits but about their constituent base. You know, that’s how you have an entire ESG movement of environmental, social, and governance responsibility.

Lisa Thee  27:28 

I completely agree, Jodi, and I probably just lean back on the things I know how to get done a little bit easier. I’m more of a B2B person. I have seen the insides of this industry; I see the people who work tirelessly to try to buffer the impact of the unintended consequences that third-party user-generated content is having on their companies. I mean, that’s everyone from the technology teams, the policy teams, and the nonprofit folks to the people on the frontlines of law enforcement and the lawyers prosecuting these cases. These are my personal heroes, because they are experiencing trauma on a daily basis to protect the world’s children. Trust me, there are much easier ways to make money, and there are much easier ways to get promoted in a tech company, than doing this. These are the people who really, really care. And unfortunately, the resourcing they have to do their jobs never seems to be at a level that feels appropriate to the evolving threat landscape and how challenging this work really is. Because, as you know from cybersecurity, criminals constantly evolve, and these criminals are no different.

Jodi Daniels  28:52 

Well, you’ve shared some great tips for how parents can help protect their kids. Knowing all that you do, just in your general day-to-day, what might you offer as a best personal cybersecurity tip?

Lisa Thee  29:06 

Okay, so if you’re a business owner, I recommend a couple of products. If you are looking for something that’s a full soup-to-nuts solution that you can implement and pay a little bit of money for, like a malware-type solution for this space, I really recommend the Safer product from Thorn, which is a nonprofit in this area that was started by Ashton Kutcher and Demi Moore. They do amazing tooling for this space, and frankly, the people who designed this tool have been mentors of mine for a long time and came from very big, fancy companies. It’s well-designed stuff. Another place I recommend, if you are maybe a small or medium-sized business starting out and need some support in this space, is the tool from Microsoft called PhotoDNA, available through Azure. It’s something they open up to the public. It is very lightweight software you can put in place that will identify when these types of illegal images are circulating on your business platforms. I think it’s a really great way to make sure that, as a business owner, you’re identifying and getting rid of this stuff, and also getting rid of risky employees who are putting your company in peril without you knowing it. And that’s free; you just go to their website and grab it. Last but not least, on a personal level: the things I hear about at work are never things I accidentally stumble upon in my social media feeds. But I’m not spending a lot of time on the dark web, and I’m not spending a lot of time on porn sites, so it’s not getting served up to me. I think as users, it’s really important to set your expectations with these companies, especially with anybody hosting user-generated content, whether that be gaming, dating, or social media: you expect them to do more to keep this from bleeding into your day-in, day-out life, and to make sure that they take action. And if you’re an advertiser, please put pressure on them to do more in this space.
Typically, businesses are incentivized by customer churn, loss of advertising revenue, and loss of payer revenue. Those are the kinds of things I see that really change this kind of behavior.
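Hash-matching tools like the ones Lisa mentions work by reducing known illegal images to compact fingerprints and flagging uploads whose fingerprints are close to one on the list. PhotoDNA’s actual algorithm is proprietary; the sketch below substitutes a basic average hash with a Hamming-distance threshold purely to show the shape of the approach, using tiny 8x8 grayscale grids in place of real images:

```python
def average_hash(pixels):
    """64-bit average hash of an 8x8 grayscale image (8 rows of 8 ints, 0-255)."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(pixels, blocklist, max_distance=5):
    """Flag an image whose hash lands within max_distance bits of a known hash."""
    h = average_hash(pixels)
    return any(hamming(h, bad) <= max_distance for bad in blocklist)

# A known-bad image: bright left half, dark right half.
known = [[255] * 4 + [0] * 4 for _ in range(8)]
blocklist = {average_hash(known)}

altered = [row[:] for row in known]
altered[0][3] = 0  # one pixel changed: the hash moves by only one bit

unrelated = [[128] * 8 for _ in range(8)]  # flat gray: a very different hash

print(matches_blocklist(altered, blocklist))    # True: near-duplicate caught
print(matches_blocklist(unrelated, blocklist))  # False
```

The reason these systems use a robust perceptual hash rather than a cryptographic one is exactly this tolerance: a resized or slightly edited copy still lands within the match threshold instead of producing a completely different digest.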

Justin Daniels  31:10 

And the last question: when you’re not out talking about all these interesting issues, what do you like to do for fun?

Lisa Thee  31:17

Oh, thank you for asking. I swear I don’t talk about this all day long with my friends; nobody would stay friends with me. So, for fun: I have a little Yorkie, and he is so much fun. I really enjoy walking him every day and making sure that I see the tops of the trees and the blues of the sky. He helps me be accountable to getting a little bit of nature back in my life and getting off these computers. I also really enjoy Zumba; I’m not talented, but I’m exuberant about it. And I have elementary-school-aged children, so most of the rest of the time I’m playing Uber driver in my life.

Jodi Daniels  31:55  

Yep, I feel that. I’m the carpool taxi driver. So I feel like what I do is a little bit of exercise, work, mom, and Justin, in some order, on repeat. Oh, and cook. That’s what I do.

Lisa Thee  32:07 

In fact, you might find this interesting: I’m working with another female founder friend of mine who also did an AI startup focused on human trafficking prevention. Her name is Emily Kennedy, and the company she founded is called Marinus Analytics; they just finished in the top three for the IBM XPRIZE globally in June. We collaborated over the summer to do some courses for women who are interested in entrepreneurship, and we’re partnering with Women in Data to launch our coursework, called Spark Passion. A lot of what we focus on in our coursework is, frankly, wellness and community, all the things that buffer the stress of being a first-time founder, especially a first-time female founder, to make sure that you have the resilience to keep working through it. Because, as we know from running our own companies, your best day and your worst day are usually spread out by about three hours, and that might happen multiple times a day. So I think it’s important that we all come together. We’re really excited about partnering with Women in Data, which is the largest community of women data scientists globally, to bring that kind of information. I want to see many more women stepping out of the corporate umbrella and into their own leadership, bringing their visions to the world, because I think that’s how social justice and social impact are going to happen. It’s going to happen through business innovation, and it’s probably going to be led by diverse people.

Jodi Daniels  33:26  

Well, thank you so much for sharing all of these great insights. Where can people find you if they want to learn more?

Lisa Thee  33:33  

Sure, I’m active on LinkedIn, so Lisa Thee on LinkedIn is an easy place to find me. Or you can go to my website at www.Lisathee.com, all one word. That has more about entrepreneurship courses and keynote speaking topics, as well as links to my TED talk, if you want to learn more about how to improve digital safety online.

Jodi Daniels  33:57  

Well, excellent. Thank you again for sharing so much wisdom. We hope that all of our parents can continue to keep their kids safe online, and that as citizens we all keep children safe online.

Lisa Thee  34:10 

Thank you so much. I really appreciate the time today.

Outro  34:16  

Thanks for listening to the She Said Privacy/He Said Security podcast. If you haven’t already, be sure to click subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.