Roy Dagan

Roy Dagan is the CEO and Co-founder of SecuriThings, the provider of the first IoTOps solution designed to help organizations maximize their devices’ operational efficiency and security. He started the company after many years of building cybersecurity, risk management, and intelligence systems.

Prior to SecuriThings, Roy led product and management teams at a range of companies, including RSA, Capital Cadence, and NICE Systems.


Here’s a glimpse of what you’ll learn:

  • Roy Dagan shares the path that led to creating SecuriThings
  • Why Roy decided to focus on IoT cybersecurity
  • How did working in Israeli intelligence and spending time in the country’s military defense forces impact Roy’s career?
  • The biggest misconceptions about IoT security
  • Closing the security gap when managing multiple devices
  • How is SecuriThings changing device management and security?
  • Creating a proactive security solution instead of scrambling during a breach
  • Roy’s best tip for physical security teams

In this episode…

If you’re a large organization, chances are you have multiple IoT devices. How can you ensure those devices are always running and healthy?

There’s no one-size-fits-all solution. Your options depend on the category: enterprise, consumer, wearables, automotive, or something else entirely. It also depends on the type of device and its purpose. How can you make sure each different device is communicating flawlessly without any gap in security? Is there a way to find an option specifically tailored to your company? Enter: SecuriThings’ IoTOps solution.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Roy Dagan, CEO and Co-founder of SecuriThings, to discuss how the company is changing device management and security for the better. Roy talks about the biggest misconceptions about IoT security, why your company needs a proactive cybersecurity plan, and his advice for physical security teams.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Intro  0:01 

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22 

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and certified information privacy professional, and I provide practical privacy advice to overwhelmed companies.

Justin Daniels  0:38 

Hi, Justin Daniels here. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:54 

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business together, creating a future where there's greater trust between companies and consumers. To learn more, visit our website. Now, we're gathered here today to have a podcast. We were just talking about a Genesis concert, so I just want to make sure you're not going to break into song.

Justin Daniels  1:40 

I can neither confirm nor deny whether that will happen. Mm hmm.

Jodi Daniels  1:45 

Well, get ready. You might be hearing some Phil Collins tunes. Justin, do you want to kick us off and say who is with us today?

Justin Daniels  1:56 

I'd be happy to do that. So today, as we talk about cyber technology, we're going to be talking to Roy Dagan, who is the CEO and Co-founder of SecuriThings, the provider of the first IoTOps solution designed to help organizations maximize their devices’ operational efficiency and security. He started the company after many years of building cybersecurity, risk management, and intelligence systems. Prior to SecuriThings, Roy held multiple roles leading product and management teams at a range of companies, including RSA, the Security Division of EMC, and NICE Systems.

Roy Dagan  2:41 

Good morning, how are you guys? How's it going?

Jodi Daniels  2:47 

It is good. We're so glad that you are here with us today. And we hear that you also might like Genesis, so if you want to have a fun tune, you are welcome to.

Roy Dagan  2:58 

Yeah, I'm not sure I'll be doing any singing today.

Jodi Daniels  3:01 

All right, then let's dive in to the fun of IoTOps. So Roy, tell us a little bit about yourself; we always like to start these conversations with how you got to where you are.

Roy Dagan  3:16 

Sure thing. So I've been in the tech industry for quite a while, almost 20 years at this point. I started in one of the intelligence units in the Israel Defense Forces, then, like every other Israeli, traveled a bit for a few months, came back to Tel Aviv, and studied computer science. But I always worked as a product manager, so I held multiple product management roles, mostly, as you said, in companies focused on cyber, risk detection, and analytics, companies such as RSA, which used to be the security division of EMC but is now part of Dell and is even bigger now, and also NICE Systems and other companies. And I think it was always kind of clear to me that I wanted to start my own thing; the main question, like for many founders, I guess, was when. I was working for a while in different corporates just to get to that level of experience and confidence, and I was learning a lot. RSA was just great, an amazing experience, and I wanted to learn more before starting something from scratch. Then I called Ron. Ron is our CTO and my co-founder; we worked together at RSA. I told him about this idea of starting something, initially in IoT cybersecurity, and the concept got him excited. He liked the space and had some experience in it, and obviously he liked the area of cybersecurity, where he also had experience from RSA. He was delighted to join the ride, and then we just started working on this thing together.

Jodi Daniels  4:55 

Well, can you share a little bit about how you formulated this particular niche? We were talking about how there are so many different flavors in the cybersecurity field. Why did you narrow in here?

Roy Dagan   5:11  

So, initially, we were trying to focus really on IoT cybersecurity, which is kind of a big topic, right? It's almost like saying IT security: what does that mean? It means so many things. So we started investigating all kinds of verticals, initially around the smart home, manufacturing, and different types of organizations. And then what happened is that, after initially trying to focus on manufacturers, we decided to focus on where the demand really is, which is enterprises, large organizations. We saw there was just a huge demand there, and today we're serving some of the largest tech companies in the world, financial institutions, universities, municipalities, and healthcare organizations, pretty much across the board. So that's one thing that happened. The second thing is that we saw that cybersecurity was definitely a big concern, but the people within the organizations we were working with, the ones responsible for the operations of these devices and for making sure they're always up, running, and healthy, were concerned about cybersecurity just like their counterparts from IT. They also had more of a compliance concern, which became a big thing in the last few years, and then also the operational management aspects of these devices. And then we saw, okay, when it comes to the enterprise, it's actually bigger than just IoT cybersecurity; it also includes compliance and operational management. So we decided to pull these multiple categories together into one solution, and coined the term IoTOps as an umbrella term for this category that we've been working on creating and putting in the hands of our customers and partners.

Justin Daniels  6:49 

Roy, I'd actually like to ask you a follow-up question. Having been to Israel and visited the country, could you share a little bit with our audience about how working in Israeli intelligence and your time in the Israel Defense Forces had such an impact not only on you, but also why Israel has been so successful at creating so many great companies in the field of cybersecurity?

Roy Dagan  7:17 

So I think you just get a lot of experience. It's a combination, probably, of experience, of the things you learn, and of the attitude of getting things done. You get into that mindset that everything is possible. People come out of the army after doing some pretty amazing things, and then they start a company, and they have a lot of that attitude of, okay, we can solve big problems. Many of these companies are huge now, solving cybersecurity and other challenges. So I think it's really a combination of experience and the attitude that comes with that.

Jodi Daniels  7:54 

Yeah, from this side of the world, it is truly fascinating to watch, and I think other parts of the world could benefit from that mindset, for sure.

Justin Daniels  8:07 

Why don't we change gears a little bit and talk about: what is the biggest misconception around IoT security? At least from my standpoint, is there any IoT security?

Roy Dagan  8:18  

That's a good question. Really good question. So I think one of the main misconceptions is to think about IoT security as one thing, as a whole. And at the same time, there's probably not one solution that can solve IoT security, right? Again, it's almost like saying IT security: there are so many different categories solving different IT security challenges, so it's clear today that many solutions also need to be available in this space. And if we think about IoT security, I believe we need to ask ourselves, what are we talking about? Are we talking about enterprises? Consumers? Maybe wearables? Automotive? Then, what type of devices are we talking about? There are more and more questions we really need to ask ourselves to focus on the challenge at hand and how we want to solve it, because each one of these different areas I just mentioned has a unique set of challenges, very different from the others. So I think the main misconception is that term itself: there are multiple categories within this thing that is called IoT security, in my opinion.

Jodi Daniels  9:32 

People are thinking, I have multiple different devices, and I need to manage all of those devices. How do you link the idea of managing multiple devices with security?

Roy Dagan  9:46

So I think it actually has a lot to do with security. As you see, organizations are deploying all these devices, and they're really lacking any visibility and control when it comes to them. It's very hard to know which vendors you have, which firmware versions they're running, whether they have any vulnerabilities, when passwords were last rotated on these devices, or even how to rotate passwords. There's just a huge gap, and that huge gap is obviously impacting the security of these devices. A lot of the questions I just went through really relate to the cybersecurity aspect of the devices. That's also, by the way, when we as a company decided that we need this new type of solution. Sometimes I like to say that this is actually an equivalent of an IT solution, but one really tailored for IoT devices. And that's exactly what we do. When we speak with our customers, and also with their IT counterparts, that's exactly how they put it: okay, it's an equivalent of an IT solution, really catered to IoT devices. That's also when, going back to where we started, we decided to coin that term IoTOps, because it's a new category. The customers are different teams within the organization, and they're kind of underserved. Nobody has ever built a solution that is really for them. They've basically been tasked with dealing with all these devices, huge amounts of devices, but without a solution. So what you're seeing is that they're doing a lot of things manually and reactively, and that's a big pain for them. And obviously, that then has an impact on the security of the devices, because they can't deal with the security of those devices without the right tools in place.

Jodi Daniels  11:28 

I think it would be helpful for everyone to hear: when we're talking about devices, what kind of devices are we talking about in this context?

Roy Dagan  11:41   

Sure. So a lot of the devices we focus on today are around physical security, but then also building management devices. A lot of the devices we deal with are video surveillance: cameras and the systems that manage the cameras, access control panels, and everything that has to do with physical security. But then there are also other devices that are part of the operation, because to know whether a physical security device is working, and to tell the team that is responsible for the devices and needs to make sure each device is always up and running, there are also the switches, PDUs, UPSes, and other types of devices. So the focus is making sure the physical security and building management devices are working, along with all the other devices that are out there. So it's kind of broader than that.

Justin Daniels  12:26 

Thank you, I think that's really helpful. So how do you go about specifically addressing and disrupting IoT device management and security?

Roy Dagan  12:41  

So I wouldn't say we are exactly disrupting IoT device management. I'd actually say we identified those teams within the organization that nobody ever built a solution for. What's also interesting at present is that there's kind of a new generation there who actually know that things can be better. They know that the way they're doing things is suboptimal, and they're craving a solution, because that's their day-to-day job: making sure the devices are always operational. And we went ahead and built that solution for them. It's interesting to see that in the market they already know that things at the moment are very reactive, and when it comes to managing these devices, it's becoming very, very costly. You're putting in more manpower, your system integrators are rolling out more trucks, just because there's no better way of dealing with these situations today. They just want to move from that reactive mode to a proactive mode. At the end of the day, the way we see it, we're really helping these teams achieve five business outcomes. First is improving system availability, obviously, because that's what they're chartered to do and that's their main focus: making sure that a camera, when there's an issue, an incident, or an accident, is always up, recording, and working properly, or that the access control panel will always let in the person, or the truck, or whatever needs to enter the premises.
So that's improving system availability. Then there's reducing costs, because what you're seeing is that it's really costly, not just because of the truck rollouts and the costs around those, but also the back and forth between teams: when there's an issue with a device, it's very hard to know where the issue is. Is it actually the device, or is it maybe the network? So a lot of it has to do with cost reduction around that. Then ensuring compliance, or giving them a picture: okay, this is what you have, these are the versions, this is when passwords were rotated, and more information around that. Obviously, there's protection from cyber threats and identifying vulnerabilities in real time. And there's also what we call the concept of visibility for future planning: telling them, as an example, hey guys, this set of devices is about to be end of life, so you should think about replacing them now, not at the last minute when they're already out of support and not functioning properly anymore. Obviously that has an impact on system availability, on cost, and on everything else I just mentioned.

Jodi Daniels  15:06 

You mentioned the idea of reactive, and I'm curious, when do people tend to implement these solutions? And, connected to that, what do you think is the biggest objection that companies have when they're implementing an IoT solution?

Roy Dagan  15:26 

So I'd say that typically, if you're managing a dozen or a couple of dozen devices, if you're a very, very small organization, you're probably fine. You'll do some things manually, you'll do some things reactively, but you'll probably be okay. Once you get past certain thresholds, and it can be in the hundreds of devices, or thousands, or tens of thousands of devices or more (and we have customer organizations of different sizes and types), then it becomes a challenge, and then you really need to start implementing such a solution, because it's no longer scalable and it just becomes a liability. In terms of objections, I'd say it's not really an objection that we're seeing. But typically, when we walk customers through the presentation, the deck, and then the demo, what happens is interesting: they ask, how much time does it take to deploy? How many months does it take, and how much time until it's really in action and in production? It's interesting, because they're used to deploying things that take months and sometimes years. You go through a big project, you have a new building, you need to put in all the devices, and it takes a lot of time. And typically our answer surprises them: hey guys, it takes less than a day, and it's up and running and fully in production. It's a good surprise for them.
The second thing is that, because these are different departments within the organization, sometimes there is a concern that IT may try to block such initiatives. And what we learned as a company is that we always insist that we want IT to be part of the discussion, because we know that once the IT counterparts are in the process, and once we show them the solution, they're actually delighted with what the solution can provide. For them, on a personal level, it provides all the cyber insights and additional insights around that area; and for their counterparts, who are our main end users, it provides all the compliance, operational management, and the rest of the capabilities. So I'd say those are the two types of objections we run into, but typically, involving IT and just showing them how simple the deployment is solves that.

Jodi Daniels  17:40 

Some companies may not be aware of the risks of not implementing. So in this space, with these types of devices, what are the common incidents that you see take place when companies don't have one of these kinds of solutions?

Roy Dagan  17:57 

Obviously, cyber incidents: having devices that have been deployed for years with known vulnerabilities. The vendors have already notified them that there are vulnerabilities, but it's so hard to upgrade the devices that they just leave them as is, because they're working, and they'd rather not start dealing with firmware versions. At the same time, there are passwords that have never been rotated on these devices. Then there's verifying the status of the devices. You paid hundreds of thousands or millions of dollars to deploy these devices; now, are they actually working properly or not? The last thing you want is to pay all that good money to deploy these devices and then, when there's an incident and you go looking for footage, because you know there were cameras in that area, find out, oh, they weren't recording, or they weren't working for months, because you didn't know. Then there's incident handling: how much time does it take to handle incidents? Because there are so many people involved in understanding where the issue is: is it the device, or is it something in the network? It takes a lot of time to handle an incident, and many times what we're seeing is that they just automatically roll out a truck, again because there's no system today to do it in a better way. Then there's also the ongoing maintenance: just routinely rotating passwords and upgrading firmware is pretty much impossible today. And the last thing I would say is that knowing what your compliance status is, is really, really hard. Sometimes it's actually a project: they'll say, okay, let's go through this project, let's understand what we have out there. But the interesting thing is that the next day it's no longer relevant, because these environments are alive; they change all the time.
So you no longer know what you have out there, and you need something that is real-time and constantly updating to solve all these challenges.

Jodi Daniels  19:52 

Well, thanks for sharing. I think people are aware of what they're used to thinking of as a digital challenge, but when you move it to a device, people think, well, that's a physical device, and they kind of forget that they're all interconnected. Tied to my earlier question about which devices we're talking about and what the threats are, we really have to extend it: it's not just the digital world, but the physical world meeting the digital world. Exactly.

Justin Daniels  20:21 

What is your best security tip for these physical security teams?

Roy Dagan  20:27 

So I think it's similar to how it is in the IT world, right? It's all about visibility, and then control. You need to get to that level of visibility where you know what you have, how it's working, and which vulnerabilities you have. Then, once you have that visibility and a clearer picture of the entire working environment, you need to get to that level of control. And control also means the ability to automate actions, such as rotating passwords, automatically upgrading firmware versions, and restarting devices. For those of us coming from IT, that sounds simple, right? But in some spaces you find out it's not trivial. And it's not trivial because it is actually hard to do: when you go through those operations, you need to make sure you're doing them in a proper way. Otherwise, again, we talked about it, this is a physical device, and it's out there. It's not in the next office; if I make a mistake on something sitting in a terminal on the other side of the world, I now really need to roll out a truck to make sure it's working properly again. So again, I think getting to that level of visibility and control, which has been around in the IT space for years, is really key for this industry.

Jodi Daniels  21:44 

When you're not growing a company and trying to protect physical devices around the world, what do you like to do for fun? Gaming?

Roy Dagan  21:57 

So I've been climbing, rock climbing, since I was a kid. I think after doing it for a while, climbing kind of becomes part of who you are. It's also the place where I can put everything aside, the phone, the laptop, everything, and be in the zone and just focus on having fun or getting another route done. Yeah, I've been doing it for years, and it still gets me excited every single time.

Jodi Daniels  22:27 

So do you travel, and do you have any special places where you've climbed?

Roy Dagan  22:31 

So actually, typically when I'm traveling, when I'm not injured from climbing, I travel with a pair of shoes and a chalk bag. When I have some time off, sometimes in the evening, I'll go to a local gym or somewhere, depending on where I am. Sometimes I'll take a weekend and go out into nature to get some climbing done. But yeah, typically I'll travel with my climbing shoes and that chalk bag, which is essential for my downtime.

Justin Daniels  23:01 

So you always have your stuff ready to hit the mountain?

Jodi Daniels  23:05 

Always prepared for all things. Yeah, that's Justin's favorite comment. His gym bag looks like a piece of luggage, and people will ask, why are you bringing luggage to the gym? It's because he has to be prepared for every sport that you could possibly want to do. Right? That makes sense.

Jodi Daniels  23:24 

But I'm glad I found you a kindred spirit here. Well, Roy, it's been such a delight talking to you. If people want to learn more, where is the best place for them to do so and connect with you?

Roy Dagan  23:37 

They can either go to the website or, obviously, send us an email, and we'll be happy to answer any questions.

Jodi Daniels  23:47 

Wonderful. Well, thank you again for enlightening us and helping the audience understand more the world of IoTOps. It's been a really fascinating discussion.

Roy Dagan  23:58 

Likewise, thanks a lot. I appreciate it. Thanks for inviting me.

Outro 24:05 

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Daniel Barber

Daniel Barber is the Co-founder and CEO of DataGrail. DataGrail helps people gain control of their privacy and identity. They’ve developed a privacy platform that modern brands rely on to build customer trust and transparency.

Daniel is a Contributing Writer for the Forbes Technology Council. His insights have been distributed in security and privacy publications such as IAPP, CPO Magazine, Consumer Affairs, CIO Dive, and Dark Reading. Additionally, he is the CEO of GTM Orchestration and is on the Advisory Board for SignOnSite, among others.


Here’s a glimpse of what you’ll learn:

  • Daniel Barber shares how he landed in the privacy and security landscape
  • What is the current state of privacy programs?
  • Spreading awareness of consumer data usage
  • Why transparency is essential for gaining consumer trust
  • How DataGrail’s software can help you locate where your consumers’ data is stored
  • The evolution of data privacy laws and their impact on businesses
  • What major privacy challenges do companies face today?

In this episode…

According to a recent survey by DataGrail, 83% of Americans want control over their information. How can businesses deliver that transparency?

It’s not easy. Most businesses only provide information that’s in two or three systems that they own, like Zoom, Slack, or Salesforce. But the truth is, there are hundreds of systems processing consumer information. How can they locate where each consumer’s information is stored?

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Daniel Barber, Co-founder and CEO of DataGrail, to discuss how DataGrail’s software can build transparency by giving consumers control of their data. Daniel talks about the importance of knowing where data is stored, how to build trust through transparency, and the evolving landscape of privacy laws.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Prologue  0:01  

Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:38  

Hello, Justin Daniels here. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:55  

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media, and professional and financial services. In short, we use data privacy to transform the way companies do business together, creating a future where there's greater trust between companies and consumers. To learn more, visit our website.

Jodi Daniels 1:33  

You're very quiet today. You're always very chatty after I do our whole intro.

Justin Daniels  1:38  

Well, maybe you should try to start with the chattiness today.

Jodi Daniels  1:40  

No, that's your job.

Justin Daniels  1:43  

I see. Well, you have a nice haircut.

Jodi Daniels  1:44  

I do, I do have a nice haircut. For all those who are watching the video, I have a nice haircut. And for all of you listening, you should go check it out, lovely hair day. But let's chat about privacy, shall we? We have a really awesome guest today. All right, I'll let you do the honors. So we have Daniel Barber, who is the CEO and Co-founder of DataGrail. In the new age of privacy, DataGrail is the only purpose-built privacy management platform that ensures sustained compliance with GDPR, CCPA, and whatever other alphabet privacy laws we're going to come up with. He is also an advisor to several high-growth startups. It is very much an alphabet soup. Daniel, welcome to the show. We're so glad to have you.

Daniel Barber  2:37  

Yeah, no, thank you, Justin. Thank you, Jodi for the invite. And yeah, excited to chat with you guys today.

Jodi Daniels  2:44  

Well, so let's dive in. Our first question we always ask people is how they got to where they are in the privacy and security landscape. So fill us in a little bit on how you came to found DataGrail.

Daniel Barber  2:58  

Yes, so as some of you may pick up pretty quickly, I grew up in Australia, on the south coast, and moved to the US; my father's side of the family is from here. I did my MBA in Japan and was fortunate to work for a Japanese company. They were very kind to transfer me and move me to San Francisco, so to the Bay Area. And after 14 trips to Japan, in 2011 I decided perhaps I should work for a local company, and started working for a company called Responsys, which at the time was really the leading provider of email marketing solutions. And so when you think about email marketing at this stage, right, Responsys was working with some of the largest consumer brands on the planet. So think, you know, Whole Foods, Southwest Airlines, Lufthansa, JCPenney and others. And what I observed there was that companies at that stage, and even today, obviously, were using enormous amounts of personal information to drive these campaigns through email, and collecting and sort of aggregating lots of personal information about us. Right. And so what am I actually talking about? They would have your email address, they would buy demographic information from different data sources, they would buy local weather information to serve, you know, targeted email campaigns. So if you're located in Minneapolis, and it happened to be snowing that day, when you would open your email, they would look at the IP address of that location and dynamically populate the content. Based on, you know, Jodi, you being a woman in age bracket X, they would say, okay, great, maybe you need a snow jacket and some ski boots, because it's snowing today. And so this was really cool. So in 2012, right, I was pretty unaware of what marketers were doing. But as we think about that outcome today, right, obviously it was not really done with privacy in mind. Right. And so this is pre-GDPR, pre-California Consumer Privacy Act.
Certainly pre-Cambridge Analytica. As we fast forward, you know, I joined a smaller company. I knew I wanted to found a business and eventually solve a large problem. So I joined a company called ToutApp, where we were actually doing email for salespeople. So really kind of leading the way in what is now described as sort of this sales engagement category. And so I partnered with the founder early on and had an amazing experience there. I kind of describe it as my startup MBA. So just over two years at ToutApp, learned an enormous amount. But the second kind of point of resonance around this problem with data privacy was that, you know, in 2015, I signed 200 data processing agreements. So when you do something 200 times, you start to wonder why you're doing it. And again, this is like pre-GDPR, pre-California Consumer Privacy Act, obviously pre-Cambridge Analytica. And so I came to the end of the year sort of asking myself, like, why do businesses care about this? Like, there's no fines in this data privacy agreement that I'm reading, right, or data processing agreement? So, you know, why is this important to businesses? And so, you know, what became clear was that there's an enormous amount of information being transferred across all of these applications. And man, I saw that in my last experience working at a company called Datanyze, where we looked at 40 million websites and looked at the technologies that people would use to run their business. And the last kind of realization that I came to was that every business runs on technology, including the one that we're actually engaged in right now. Red Clover, we're running on Zoom, right? You have other services to improve this recording. And what those services are, I have no idea. And so we think about that, right? If there are many applications running in a business, but the consumer has the expectation that they're in control of their information, there's a disconnect there.
And that's why we founded DataGrail.

Jodi Daniels  7:01  

Now, we're always saying that companies are in the data business. So you were kind of talking about how everyone's using technology, and everyone's utilizing data. And I remember those really early days of the dynamic content and email marketing. I got my start in privacy by stalking people for cars, using data from Autotrader. And anyone who bought a car, you're welcome. It's kind of the same idea. Thanks for sharing. You're smirky over there.

Justin Daniels  7:29  

Oh, why not? So from your view, how would you describe the current state of privacy programs in companies?

Daniel Barber  7:39  

Kind of in flux would be my first response, right? I think many businesses that are internationally focused, right, were leading up to the GDPR trying to figure out what they need to do. This was a long walk, right, from 2016 through '17, through '18. And now we're talking about, that was three years ago, right. And so they might have implemented something to stand up the program. For domestically focused businesses, the CCPA was probably the first regulatory requirement that they had to jump over. Right. And so I would say, like, across the board, most companies have no idea what applications they've actually purchased. And this has accelerated in the last, you know, call it 18 months, because now we are doing work from home, right. And so, you know, people are in a dynamic work environment as well. So they might be working from home one day, they might be on the road or at the beach another day, they might be, you know, who knows, right, where they are. And so the application use, and as a result the number of locations where personal information could be, has expanded dramatically. And so I think, like, flux would be my one word to describe where we are today.

Jodi Daniels  8:52  

If you think about those 200 data processing agreements, which was actually very impressive, because I find today it's really hard for companies to execute those things when they're required, and you had companies doing it when it was just a lovely nice-to-have. If you think about where that was compared to what you see today, on a scale of one to 10, how far have we moved?

Daniel Barber  9:18  

Good question. I think we have made some progress, right? I think now it's very common that people at least understand that they need to do a data processing agreement. Right. It is generally understood what a DPA is. You know, completing 200 of them, there was a reason why we were doing it. It's because ToutApp, the solution, the provider I was working for at the time, we were tracking people's emails, right. So the service basically allowed salespeople, and anyone really using ToutApp, if they would send an email, they could see whether you opened that email, whether you clicked a piece of content, whether you forwarded it, where you opened it, the location. Pretty sensitive information, right? These systems and solutions exist today and in fact have proliferated everywhere. But that's why companies were very sensitive about what we were doing with their information, because it was fairly new. But I think now it is generally understood. A DPA is something that any legal professional understands. Even if you're buying technology as a member of the workforce, right, if you're in the marketing team, or customer success team, or wherever you are in the organization, you know that you probably need to do a DPA. So I think there's generally more awareness. I think the consumer is substantially more aware than they were before. Right? This is a result of, you know, many somewhat unfortunate circumstances that have happened with data breaches, and generally just understanding of one's information and where it's going. And so I think consumer awareness has increased, and consumer expectations have increased too. But the challenges for businesses remain. And so I think, yeah, that's where there is opportunity to, you know, close that gap.

Jodi Daniels  11:08  

And I also see from a B2B standpoint that the awareness has increased. So we're talking about consumers, but also the B2B environment. Many times vendors are requiring this of customers, and customers are like, ah, I've got to figure this out. It might start with a DPA, it might start with a question. So we're certainly moving in that direction.

Justin Daniels  11:30  

Great. So with individual rights being a big part of privacy regulations, what does a company need to know about honoring an individual's data requests?

Daniel Barber  11:41  

Yeah. So, this is kind of interesting, if you think about it, right. Last year, we surveyed 2,000 Americans. I spoke on PBS about our survey, and 83% of the people who responded said they want control over their information. So I think, regardless of the regulations, which of course we can talk about, the alphabet soup, Jodi, to your point, there's lots of them, consumers want control, right? That's what they're actually looking for here. And there's like an emotional connection too; they want to understand what a business is doing with their information. And so I think we see the leading brands really leading with transparency, in terms of, hey, you can get access to and control the information that we've collected about you, you can restrict the sale of that information if you don't feel comfortable with that. The challenge is that most companies are starting from square one, right? They have purchased software, right? It might be Zoom, it might be Slack, it might be Salesforce, it might be, you know, name 200 other software tools. And they've purchased those over the last five years, or a decade, or multiple decades. Now, they have no idea where that information is, or what those systems even do. The number of CISOs and Chief Privacy Officers that I speak to that know all of the applications that they have, and what's in them, I could count on one hand, right? And I've been doing this for a while. And so I think, like, starting with understanding what you've purchased is the only way to satisfy that data rights component, right? Otherwise you can't actually satisfy it properly. And we see a lot of businesses shortcut here, right? So they'll just provide the information that's available in maybe two or three of the systems that they own. But in reality, they've got 100 systems processing your information. So is that an accurate representation of what they hold about you? Definitely not.

Jodi Daniels  13:42  

With it being so hard for companies or people to know all the different systems, where does software come in to be able to help solve this problem?

Daniel Barber  13:54  

Yeah, that's a good question, too. I think it's really important to think about, like, first centralizing the applications that you use, right. And so there are solutions in the market that drive the security of those applications, right, so things like Okta and OneLogin and ForgeRock and Ping Identity and others, that will allow a business to say, okay, any tool that we purchase needs to go through this secure process. Meaning, if you join a company, you get access to the 22 tools that you need in the marketing department. If you join that same company in HR, you get access to the 12 tools in HR that you would need to operate your job. That's a good starting point. And certainly the mature organizations that we see have implemented, you know, single sign-on, as that's sort of commonly described. That's a requirement really at this stage, especially with a dynamic workforce. But then there's also the realization that not all applications will make their way into those applications, right. So even with the tool that's meant to control the other tools, the tools may not actually get in there. And so acknowledging that as well and saying, okay, we probably need to find a tool that finds the tools that are not connected to the tool that's meant to manage the tools. So you know, that's an area that we help with, and help businesses with, because it is a scary thing, right? This kind of concept of shadow IT, of your employees using applications that process personal information, is hard. And so the combination of, usually, a single sign-on provider and something like DataGrail to help with that is a good starting point.

Jodi Daniels  15:39  

The other thing I was going to say is, so we have like your tools for tools for tools, right, and software to help be able to find tools. A lot of times people also think, if I just find a software, I'm done. And I'd love for you to share a little bit about how it's software plus people who know how to use the software that makes it all work together.

Daniel Barber  16:03  

Yeah, so that's an interesting area. So what we commonly see legal professionals, and I imagine many of the folks that may listen to the podcast are security professionals, right, struggling with is: there is no way you could understand every application in marketing, right? You just can't. And more importantly, you can't understand the relationships between the applications, which is actually probably the more important part. So just, like, double-clicking on that to expand on it a little bit: if you bought Salesforce, right, then Salesforce becomes a central application to manage your customer data, as an example, right? That application has an ecosystem of things that can connect to it. And so there might be 20 other things downstream that have been installed into your Salesforce environment that are also collecting the same personal information, or more. And so just acknowledging that, like, it's very unlikely that you're going to figure out every application there. Unless you really intend to, you know, go around and survey, right, which is the option: go around and survey every department and try to keep that up to date, which is a monumental task. That, you know, is really why we founded the company.

Justin Daniels  17:37  

Makes sense. So with privacy laws changing, how do you think about that evolution in terms of companies preparing to comply with multi-state and this global privacy patchwork system that we seem to be going towards?

Daniel Barber  17:52  

Yeah, so this is not going to work. Like, the path that we're going down right now is not gonna work, right. I think, generally, the informed class of folks that are in this, like we are, acknowledge, even in the way that you described that question, that this is not gonna work. So there need to be standards, right, like, global standards. And so there are industry groups that are working on this area. I wrote a post in TechCrunch a few weeks ago specifically on this topic around standards. So Consumer Reports is working with a consortium of different vendors, DataGrail included, that are trying to understand how data rights can be standardized, right? Because even just, like, the simplest thing: have you asked for your information from Kohl's? Right? What is Kohl's collecting on you? Does Kohl's provide the same information back in the same format that, I don't know, Sears does? I can guarantee you one thing: they don't.

Justin Daniels  18:59  

Is Sears still around? I think so. I wasn't sure.

Daniel Barber  19:05  

And as I said that out loud, I actually had the same question. But you get where I'm going there, right: every business is operating differently, and the regulations are forcing them to adopt different standards based on where people are located. This is not going to work. And so actually there need to be technology standards that are in partnership with legal standards. Right. So what do I mean by that? Well, you know, California has this do-not-sell provision. That's great. I think that makes sense. I don't want businesses selling my information in certain circumstances. But that is slightly different than the GDPR, and, as we know, also different than other state regulatory requirements. So if we just went down 14 different paths here, that doesn't work. And so we need some technology standards in order to actually pragmatically think about how we solve this problem.

Jodi Daniels  19:58  

Yeah, it's a great point. If you think about the three states, California and the upcoming Virginia and Colorado, even their required links on the homepage are all supposed to be a little bit different. Which one are you going to pick? I thought they were close, but not exactly. So it is certainly a big problem.

Justin Daniels  20:21  

So Daniel, I'm just curious, as a follow-up question: these state laws are proliferating, at least in the United States, because on a federal level there's no consensus, there's no agreement over what to do about this problem. From what you see in your business, do you see that changing? Or is it going to take some kind of black swan event to precipitate a change? Because what we're really talking about here is having a GDPR-like federal law that covers all 50 states.

Daniel Barber  21:00  

Yes. But even if that happens, though, Justin, so like, you know, just simplifying it down to the domestic level: yes. But if you're a global business, so you operate under LGPD, you operate under some national law, you operate under GDPR, that doesn't work either, right? Because if you have 15 countries that you're operating in, all with different frameworks, like, that doesn't work. Now we're just talking about making it a little easier in the US for the domestic folks. But if you're international, you couldn't afford to operate your program that way either. So I think it actually goes further than just, like, a national bill, which certainly wouldn't hurt. But, you know, I think really what we're talking about is some technology standards to make this a little easier for businesses. Because, yes, you can't have four different links on your homepage. Like, if you talk to a marketer and propose that solution, they're going to leave the building. Right. So, like, you know, let's talk about practical business here. I think, though, there's enough support, at the local level, for some form of national bill. But, like, when are we going to see that? I don't know; I'm not giving a date at this point. And I think it's going to take some more pain before we see real change on that.

Jodi Daniels  22:21  

We've talked about a lot of different challenges. One of the main ones, obviously, has been just: how do you create a program to comply with all of these different laws? What are some of the other big challenges that you're seeing companies face today?

Daniel Barber  22:39  

One of them is related to California's requirement around do-not-sell, right? Because the definition is quite wide. And, you know, obviously, the sale of information if you're a data broker, I mean, that's pretty obvious, right? I sold my email address to you, Jodi, you bought it for, you know, 25 cents. Okay, that's selling information. Right. But, you know, loyalty programs? Okay, I think those probably extend to sale of information, if you're extending that information to other service providers. Got it. Ad tech? What do you do with that? Right? So, you know, publishers are now selling your information indirectly, on the open market, to other people in real time. That creates some challenges for folks, of like how they're going to interpret that, and how far and how much risk they want to take to do that. So I think that's one area that's quite challenging for folks, especially if they don't have legal support, right, of like how to interpret the requirements. I think another area that, you know, you're seeing advance quite quickly, right, is Apple's new requirements around deletion for apps. So meaning, you know, as with privacy regulations, it is now a requirement for folks that do have apps to be able to delete that information. That's difficult, right? People may not have actually set up their infrastructure in a way to do that. And, you know, how many people have a mobile app on the App Store? A lot. So, you know, that's really hard for folks. And they probably weren't ready for that, right. Like, it's not like there was a memo that went out two and a half years ago, like the GDPR, to kind of figure out what to do there. So I think that's quite hard for folks that are operating, you know, with mobile apps. And then the other piece that I described, right, is just that dynamic workplace. That's really hard.
How do you try to empower your employees to share the applications they're using, or find some vehicle to be able to capture that information? That needs to happen, because the concept of a VPN, right, is nice in theory, until you have employees not using it. And we should expect that they're not going to use it, because if you expect that they do use it every time, you know, you're going to be disappointed with the outcome.

Justin Daniels  25:16  

So, if we were at a cocktail party and we were having this kind of conversation as privacy pros, what is your best personal tip that you might give to our audience?

Daniel Barber  25:29  

Yeah, it's funny when I read that, because I was like, oh, that's kind of cool. You know, I think what I find interesting, and it may not be necessarily a personal tip, it's more just validation that we, the three of us, are going to be solving this problem for the next decade. So Rick Arney is the co-author of the CCPA. And, you know, I was having a conversation with him a few months ago, and he shared a little bit about his path to passing that bill, and what that looked like, and Proposition 24, with Alastair Mactaggart and the group. And something that stood out to me was just, you know, the legislature in California has come to terms with the fact that for any bill that passes, the watermark standard is 90%. Right. So 90% is the highest amount you can get in terms of support for a bill. And what is that watermark based on? It's based on human trafficking, right. So when laws were passed around that area, support for restricting human trafficking was 90%. So there's 10% of people that, of course, just don't support the process, and so as a result will, by default, select no under all circumstances, right. So basically, the highest you're going to get is 90%. But what was interesting was that support for the bill was at 88%. So there has literally not been a similar bill passed that received the same level of support. So the legislature in California is supporting, and the electorate is supporting, privacy reform at a level only comparable to human trafficking. And neither of those topics are funny. But the point is, we're going through a change. And so I think, just as an area of excitement, there is excitement ahead around trying to solve this problem. And we're going to be doing this for the next decade.

Jodi Daniels  27:43  

Well, Justin, that's good job security. So now we're all going to be trying to help companies solve this problem for the next decade. What do you like to do for fun that is not privacy and security?

Justin Daniels  27:58  

You go buy consumer products and read the privacy policies?

Daniel Barber  28:02  

I may do that. No, you know, at this point, I have traveled to a number of places, right? I'm sort of, by definition, you know, a global citizen, if you will. I lived in seven different countries to get to here. And so I do like traveling a lot. My fiancée is also from New Zealand, and so the two of us kind of have fun wherever we can go. So yeah, traveling as much as we can. Hiking around the Bay Area is pretty popular, and so I do that a lot. With all the work, it's nice to have some outdoor time. And so the combination of those two things is probably my short list.

Jodi Daniels  28:41  

We must attract similar people, because many times people say that they love hiking and they love the outdoors, which is very similar to what we really enjoy here too. Lovely. Well, it was such a lovely discussion. If people want to connect with you and learn more about DataGrail, where should they go?

Daniel Barber  29:03  

Yep. So you can find me on the white pages of the internet, otherwise known as LinkedIn. So if you just search Daniel Barber, you'll probably find me there. You can also find me on Twitter. My handle is a little strange: it's @gaijindan. That is Japanese for foreign person. So, foreign person Dan. You can find me there. Those are probably the best two places.

Jodi Daniels  29:29

Awesome. Justin, any closing thoughts?

Justin Daniels  29:34  

Data privacy is job security. That could be a bumper sticker. Thank you, Daniel.

Jodi Daniels  29:40  

Thank you so much for joining us today.

Justin Daniels  29:44

Yeah, had fun.

Prologue  29:49  

Thanks for listening to the She Said Privacy/He Said Security podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Adi Elliott

Adi Elliott is the Chief Revenue Officer at Canopy, an industry-leading data privacy and cybersecurity software company. Adi has over a decade of leadership experience in the software and services industries. He’s led multiple marketing, sales, product, and strategy teams recognized for innovation.

Previously, Adi founded and led Relativity’s marketing and enterprise sales teams. He later led product and marketing at Iris Data Services, a leading IT service management company. When the company was acquired by Epiq, Adi began leading strategy for Epiq’s global eDiscovery business.


Here’s a glimpse of what you’ll learn:

  • Adi Elliott shares how his roots in the eDiscovery space led to his passion for privacy, security, and data breach response
  • How does AI impact the data breach response process?
  • Adi explains how Canopy helps companies form a better response to data breaches
  • How Canopy is reducing the cost of breach response by using AI
  • Adi walks through Canopy’s sale process, customer education, and how they align their tool with each customer’s company policies
  • Adi’s top security tip for Apple users

In this episode…

No one likes to think about getting hacked. But how can you plan ahead in case your company’s data is breached?

How about a team of professionals paired with the top AI software platform for data breaches? Canopy’s AI software can perform the initial data mining. Then, it evaluates: What’s the impact? Is it an incident? Is it a breach? Do we need to review it? If there is a breach with PII, the software can also help send out notices to affected clientele much quicker than any human response. So, how can you create a response plan to help your company bounce back quicker from a data breach?

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Adi Elliott, the Chief Revenue Officer of Canopy, to discuss the best practices for data breach response. Adi talks about how Canopy aligns their software with each client’s company goals, how they’re using AI to reduce costs, and why their software is so effective.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.


Episode Transcript

Prologue  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I'm Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:37  

Hi, Justin Daniels here. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:56  

And this episode is brought to you by... that was a really weak and then interesting drum roll; we've got to work on that. We're gonna have drum roll lessons for you. Red Clover Advisors. We help companies to comply with privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, visit

Justin Daniels  1:42  

We want to share our big announcement with our viewers about what happened on Friday night. Go on ahead. No, I want to hear you

Jodi Daniels  1:50  

do it. No, you're doing it. I don't want to take away from your thunder. Why not?

Justin Daniels  1:53  

Alright, if I embarrass you, it will be my fault. Well, what we'd like to share with our viewers is that in February, Jodi and I will be taking a road trip to San Francisco, because our version of a fun weekend includes speaking at RSA on a really cool topic, which will be autonomous vehicles and drones: privacy and security versus surveillance. So if you get to RSA in 2022, on February 7, come see us.

Jodi Daniels  2:27  

The Jodi and Justin show is on the road again. Well, to get started, we're so excited for today's guest. We have Adi Elliott, who is the Chief Revenue Officer at Canopy and brings more than a decade of leadership experience in the software and services industries to his role leading Canopy's global revenue operations. He has extensive experience building and leading high-performance marketing, sales, product, and strategy teams. Welcome to the show. Thank you so much. Well, we're so excited that you are here to join us; you did not realize that we were going to be chatting all about the fun of going to San Francisco, with the big bridge behind you.

Adi Elliott  3:09  

Yeah, that's actually very timely considering my background.

Jodi Daniels  3:13  

It's my favorite city. I love San Francisco. Anytime I used to answer one of those quizzes about which city you're supposed to live in, I always got San Francisco.

Adi Elliott  3:23  

I don't begrudge that. It's a great city.

Jodi Daniels  3:25  

Indeed. Well, to get us started, we always like to ask a little bit about everyone's career journey and how they got to where they are today in this privacy and security world. So please, take us all the way back and fill us in.

Adi Elliott  3:40  

Yeah, it was circuitous and accidental, like many. I came in through, essentially, the eDiscovery door, in that I'm a marketing and kind of product person by trade. And there was a time, this was in '08 or so, when I was living in Chicago and wanted to stay in Chicago, and all the product and marketing jobs were either in the San Francisco Bay Area or in Seattle, at least with big companies. And there was a small consulting company that wanted to become a software company, and they needed a head of marketing. So I said, well, if this company implodes in a year, then I'll just take a job in Mountain View or something. So I took the job there, leading marketing for an eDiscovery company that became a much bigger eDiscovery company and was pretty successful. The name of it now is Relativity, and they're a big player in the eDiscovery space. Essentially, that kicked off a good 15 years in eDiscovery or so. And then privacy and security and data breach response came along when I was, I guess you could say, burned out from helping companies sue each other, because that's kind of what eDiscovery is. But data breach response is still working with data. So a friend put me and the CEO of Canopy together and said, hey, you two should have a conversation; they're doing pretty interesting stuff in the data breach response space. It's somewhat adjacent to eDiscovery, but it's still working with data. And honestly, I had some friends who had gotten into privacy, and obviously there's a ton of momentum in the software world there. So from a software perspective it was really interesting to me, and from a problems-to-be-solved perspective it was super interesting. And as soon as I really saw what was happening in data breach, I couldn't not be a part of it.
So pretty quickly, me and the CEO of Canopy were like, let's figure this out.

Jodi Daniels  5:47  

Well, thank you for sharing. eDiscovery is certainly morphing a little bit; I find a lot of people coming from the eDiscovery world into the privacy and security space, right? There's definitely some intersection and overlap. For me, my favorite phrase is that it's all about the data, which is the underlying theme for both of those worlds.

Adi Elliott  6:06  

Absolutely. And it's one of those spaces where, at 30,000 feet, data breach response looks very similar to eDiscovery. But when you get into the weeds, it's actually incredibly different.

Jodi Daniels  6:18  

Oh, absolutely. But the fundamental connector is data.

Adi Elliott  6:21  

Absolutely. Interesting, yeah. We live in data all the time, and being able to work with and understand structured and unstructured data is super interesting. What do you do with it? How do you get your arms around it? These are interesting problems that humanity faces these days.

Justin Daniels  6:38  

So why don't we dive in a little bit about your current role, and talk a little bit about how AI impacts the data breach response process?

Adi Elliott  6:48  

Yeah, it's really transformed it in a lot of ways. And it gets at a bit of what I was saying a second ago: data breach response is a really interesting space in that it was created kind of by fiat, by a bunch of very sensible regulations. It wasn't a space that developed over 50 years, with a slow tick-tock between solutions and problems. It was all of a sudden created when, in Europe, North America, Australia, and Asia, every country started realizing that there are risks to all the data that's being held, and that for the citizens of each of these countries and states, there's real impact from all of the hacks and incidents and breaches, call it what you will. So all of a sudden, worldwide, there was a need to do data breach response because of the regulation. But there was no software space for that at the time, so everyone had to just use the best available tech, because the GDPR, the CCPA, or the state of Virginia didn't say, hey, politely wait until a piece of software comes along that specifically solves this problem. They said, hey, right here, right now, notify all the people who are impacted by this. So what people were using was just cobbled-together solutions. At the time, search terms and regular expressions were what they would use. Say, okay, I have a business email compromise. It's not even that big; 20 gigabytes is pretty typical. That might be a few hundred thousand records, though. So how do you figure out, across a few hundred thousand records, whether there's PII in there, whom to notify, and what names are in there? In the beginning, that part was search terms and regular expressions, which is not super efficient.
If you use that on any data set at all, it's going to tell you that 70% of the dataset needs to be looked at, which, in reality, is way over-inclusive. So right from the jump, one of the ways AI is transforming this is by building models to specifically address every element of PII, instead of relying on the search terms and regular expressions people were showing up with, because that's all they had. That's one part of it. And that gets you closer to the true PII level within a dataset, because if you're using AI and machine learning and algorithms to solve this problem, what you find is that the true amount of PII in any given data set (unless you're talking about a healthcare case, which is crazy town and it's all PII) is usually around 15%; between 10 and 20 is pretty typical. So reviewing the 85% of the dataset that doesn't contain any PII at all is kind of wasteful. There's money to be made for some folks there, but it's not super cost-effective. So right from the jump, just identifying PII is the perfect kind of problem for data science and AI to solve. And then, even once you get into it, connecting the dots (hey, is this Alice McNeil, is that her Social Security number, is that her address) and letting reviewers make those elections really fast once they're looking at data is definitely an AI problem. And then there's the entity consolidation. If you think about it, take Jodi's data, which I'm sure has appeared, as all of our data has, in many incidents that became breaches. I bet when they're identifying your data, they have your name several different ways. They probably have a maiden name and a married name; they probably have Jodi spelled wrong; they might have a middle name or something.

So there might be several different addresses that you've lived at. But if you're a lawyer overseeing this whole process, you're like, yeah, just tell me who we send this thing to. I get it, her data is compromised. What's her last name? What's her address? How do you spell Jodi? What are we doing here? And that's another AI problem: getting ten instances of various spellings of a person's name down to the one person we send the letter to in the end. So it's an endless AI problem.
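
The entity-consolidation step Adi describes (collapsing many spellings of one person into a single notification-list entry) can be illustrated with a toy sketch. This is not Canopy's actual approach, which is model-driven; it is a minimal stand-in using Python's standard-library fuzzy matching, with invented records and an assumed 0.8 similarity cutoff.

```python
import difflib

# Toy records extracted from a compromised mailbox: the same person
# appears under several spellings and addresses (all names invented).
records = [
    {"name": "Jodi Daniels", "address": "12 Oak St"},
    {"name": "Jody Daniels", "address": "12 Oak Street"},
    {"name": "Daniels, Jodi", "address": "98 Elm Ave"},
    {"name": "Alice McNeil", "address": "5 Pine Rd"},
]

def normalize(name):
    """Turn 'Last, First' into 'first last' for comparison."""
    if "," in name:
        last, first = [p.strip() for p in name.split(",", 1)]
        name = f"{first} {last}"
    return name.lower()

def consolidate(records, cutoff=0.8):
    """Group records whose normalized names are close fuzzy matches,
    so the notification list contains each person once."""
    entities = []
    for rec in records:
        key = normalize(rec["name"])
        match = difflib.get_close_matches(key, [e["key"] for e in entities],
                                          n=1, cutoff=cutoff)
        if match:
            entity = next(e for e in entities if e["key"] == match[0])
            entity["variants"].append(rec)
        else:
            entities.append({"key": key, "variants": [rec]})
    return entities

entities = consolidate(records)
print(len(entities))  # prints 2: four records, two actual people
```

With these sample records, the three "Jodi Daniels" variants collapse into one entity, so the letter goes out to two people instead of four.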

Jodi Daniels  10:52  

And I see Justin smirking over here. Maybe they even have ours wrong, because sometimes we're called all different kinds of things. A lot of times, especially for you, you're called Daniel,

Justin Daniels  11:03  

or Jason or Tom.

Adi Elliott  11:06  

Or Elliott, all the time. My name is Adi, but people don't know what to do with Adi, so they're just like, okay, Elliott. And I'm positive that my name has just turned into Elliott because one day Elliott was all they knew what to do with, so they just rolled with Elliott.

Jodi Daniels  11:22  

Oh, well. So fill us in a little bit about Canopy. How does it help companies better respond to data breaches? Where does it fit into this puzzle?

Adi Elliott  11:31  

Sure. So we're the only company in the world that's focused end to end on the data breach response problem. There's a lot of adjacent, cobbled-together, "this is kind of on the margins of what we do" out there. What happens before us is the incident response process, where some digital forensics or incident response organization will verify that an incident has occurred. And then they'll say, this is the data, right here. Maybe it's a file share that was the subject of ransomware. Maybe it's the PST that was compromised because of a phishing attack. But they'll say, here's the data. It's right here; it's a PST; it's 30 gigs, it's 80 gigs, it's these three PSTs, whatever the data is. From then on is where our software comes into the mix. So what people do is upload the data into our software, and we run a ton of AI models against it. Another thing that's interesting about us is that on the data breach response side, we're pure channel. We sell to people like incident response groups, law firms, litigation service providers on the review side, and sometimes insurance companies if they're particularly entrepreneurial. We sell to the people who solve the problem. So for the CISO, all the people around the table are using our software to initially do that data mining. That's the first thing that happens in our software: they upload the data, we run all that AI against it, and then they data mine it and say, okay, we've got this 80-gig PST; what's the impact here? Is it an incident? Is it a breach? Do we need to review it? Does it need to go any further? Everyone's praying there's no PII in it. Most of the time there is, unfortunately, and enough that it probably is notifiable a lot of the time, more often than people think.

So what comes into our software is that raw data, usually unstructured data. Then a whole bunch of stuff happens that we can go into if you want, with our clients doing the work in the software. And what comes out is a CSV that is a list of names and addresses of the people who are going to be notified. So the input is the data, and the output is the CSV with names and addresses. What happens in the middle, the data mining step, the review step, and then the entity consolidation step, all happens inside our software.

Jodi Daniels  13:48  

It makes sense, and a very needed piece of the puzzle for sure. Yeah.

Justin Daniels  13:54  

So how do you see companies paying for breach response, and how is that going to evolve? Because premiums have gone through the roof, and coverage is now starting to be significantly curtailed.

Adi Elliott  14:09  

Yep. So, in a way, and this is kind of a really quirky place that we play in, the two biggest beneficiaries of our software are, number one, insurance companies, though they're usually not our direct client. Let me slice that a little. If you think about it, there are two main costs that make these incidents so expensive. One is paying the ransom, if there's ransomware; that's thing number one that makes it expensive. And thing number two is human beings reviewing data. That is actually extremely expensive, and a really easy shorthand is a dollar a document. So if you have 50 gigs in a PST, which is real standard, not a crazy PST size, we're talking several hundred thousand documents, maybe even a million, in just a 50-gig PST. Depending on the data, you can see anywhere between 5,000 documents per gigabyte and up to 12, 13, or 14,000 documents per gigabyte. So even at 50 gigs, the math gets crazy fast. The difference between reviewing 15% of that collection and reviewing 70 to 100% of it is enormous. And the folks who are on the hook for that are usually the cyber insurers. So how companies pay for it is usually a cyber insurance company; they have a cyber breach plan. But you're right: because of the combo platter of, holy cow, this is expensive to review, and these ransoms often are going to get paid and are expensive as well, you put those two together and people are maxing out their policy every single time. What we're trying to do is essentially reduce at least that review cost. And what insurance companies are realizing is that, up until now, everything was the same, until Canopy came along.

The entire solution for data mining was search terms and regular expressions; there wasn't really AI involved. And with that, you're going to review 70 to 100%. So there's a whole ecosystem, really transparently, of review companies that have built their whole business around reviewing 70 to 100% of the collection. It was incredibly lucrative; it is a heck of a business. But Canopy comes along and says, what if you only reviewed 15% of that? For an insurance company, that's fantastic, once they wrap their minds around the fact that the data mining phase controls the review phase: save the money at data mining, and then review fewer documents. But all those review companies that are killing it reviewing 70 to 100% at a dollar a document find it pretty lucrative, so they're going to fight back hard and try to keep a technology like ours out of the game. So that's thing number one: saving the cyber insurers money. And thing number two is companies, because what we're ultimately trying to do is make the cost of doing business less crazy here, so we can all still get cyber insurance and it doesn't break the cyber insurance market. If you think about it, hackers have created a tax on business via all these cyber insurance policies that we all have to have now. That's really what it is. Canopy's ultimate end game is to reduce that tax as much as possible, to mitigate it. Data breach response is the reactive side of that, and we also have a proactive product to help people mitigate as well.

But essentially, that's how they pay for it: via insurance. Insurance premiums go up, insurance companies say, whoa boy, I don't know if I want to do this anymore, and we're trying to change the dynamic there.
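
The review-cost arithmetic above can be sketched quickly. The numbers are just the rough figures quoted in the conversation (a dollar a document, 5,000 to 14,000 documents per gigabyte); the 10,000 docs-per-gigabyte midpoint is an assumption for illustration.

```python
# Back-of-the-envelope review-cost math, using the rough figures quoted
# in the episode. DOCS_PER_GB is an assumed midpoint of the 5,000-14,000
# range; COST_PER_DOC is the "dollar a document" shorthand.
GIGABYTES = 50
DOCS_PER_GB = 10_000
COST_PER_DOC = 1.00  # dollars

total_docs = GIGABYTES * DOCS_PER_GB              # 500,000 documents

cost_broad = total_docs * 0.85 * COST_PER_DOC     # review 85% of the set
cost_targeted = total_docs * 0.15 * COST_PER_DOC  # review the ~15% with PII

print(f"{total_docs:,} documents in the PST")
print(f"broad review:    ${cost_broad:,.0f}")     # $425,000
print(f"targeted review: ${cost_targeted:,.0f}")  # $75,000
```

At these assumed rates, narrowing review from 85% to 15% of a single 50-gig PST is the difference between a six-figure and a five-figure review bill, which is the dynamic Adi describes insurers reacting to.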

Justin Daniels  17:55  

Well, thank you for that. One thing I wanted to talk about specifically, given your role as Chief Revenue Officer, is how the sales process works: aligning your software sale with educating the customer about the data your tool might collect, and the company's privacy and security policies around that data. I find that too often this is omitted from the sales process, and then when I, the lawyer, get the contract and it's one-sided, it results in deals either being delayed,

Adi Elliott  18:26  

Or they don't close. Yeah, it's kind of tricky for us. I mean, one, we obviously take security extremely seriously. We operate worldwide; we're in North America, Europe, and Australia, and the software's used all over the world. The security questions and questionnaires and all that stuff are standard operating procedure for us and usually get handled before any specific project. So usually we're working with the law firm or the digital forensics and incident response folks, and they make sure, yep, your security bona fides make sense. But the quirkiness of the whole situation is, one, the data is not going to live in our software for very long; the only reason it's in there is to get the folks notified as fast as possible. And two, it's breached data. This isn't corporate data in the traditional sense; or rather, it's data that has been compromised, let's not say breached, because that's a legal definition. But essentially, like I said, we take it super seriously, and the endeavor is for the data to be in our software for, frankly, as short a time as possible. We don't want the data in our software long. The data comes in, it's data mined so they reduce that dataset, then they review it, then they consolidate the entities and get the list of human beings who need letters, and then they just delete it from our system and it's gone, never to be heard from again. It's wiped, and then all they have is a CSV with a list of names and addresses. That's the output, plus some metadata, like how many Social Security numbers, how many addresses, or if it's in the UK, how many NHS numbers. But the actual PII goes away, the data goes away, et cetera.

Jodi Daniels  20:14  

You had shared before a little bit about how you also have some mitigation measures. You know, imagine a company goes through this whole process and learns how messy all their data is: oh my gosh, look at how much personal information we have. So, kind of to the conversation we just had and what Justin was talking about, how do you educate companies? Hopefully you don't have one of these breaches again, but if you do, you might want to take these kinds of steps to have a smaller pool of cleaner data. I'd love to hear a little bit more about those mitigation and proactive measures.

Adi Elliott  20:47  

Sure. So this is kind of a funny story, because our clients, and a shout out to Gartner here, helped us invent a whole product. Gartner explained to us that nobody does this, and then some clients asked, can we pay you for this? So we launched a product, after data breach response, built on how good a job our software does at PII identification and data mining. We had clients where a company would use us in the data breach response process, and then their CISO or chief privacy officer would hear about our software from the data breach response and say, what if we used something like that before a data breach occurs, just to see what we have? Essentially, here's our take on it: a lot of the market is all about remediation, and what our privacy audit product is about is changing human behavior. We know from data breach response that almost every time these compromises occur, what the company finds, once they run the data through our software, is that folks were out of compliance with policies. Now, they were clicking through all the surveys, they had done the tabletop exercises, they had the right incident response and digital forensics folks on hand; they were taking all the right steps. And yet the HR team, the marketing team, the sales team, whoever it was, still had a ton of PII that nobody knew about, and there was just no vector in to find out. So what privacy audit does is take all those algorithms we've developed across billions of elements of PII identified on the data breach side, and instead of trying to remediate every bit of PII in your enterprise, which is a fool's errand (that's literally proposing running a data breach response project on your entire company, which nobody is going to do),

it just takes a couple of samples. Say, let's take two random HR people who you think represent the way HR uses data, and run their data through the software to see what kind of PII is there. Then, instead of anecdotes, or a tabletop exercise where people are trying to pull out of their heads what they think they do today, it's tangible. Hey, if we had a breach today, or an incident, or a compromise, or somebody got phished from this department, you can drop the report on the table and say, here's what we'd have. Now, we understand that you have a job to do, and that you didn't realize you were exporting this Excel spreadsheet out of this tool over here, emailing it to this other person, who would manipulate it and drop it into a file share. We didn't know that was happening; you didn't think about it happening; but once it's in your email, technically, it's in your email. So what do we need to do here, not about this specific spreadsheet, but about the workflow, the behavior? It allows training and policies to be updated to be specific to the people in that line of business, and it allows benchmarking, so you really know whether people are in compliance or not. So that's what the product is, and that's the way we think about it. Like I said, it's about changing human behavior and changing policies and training, not so much about, hey, let's crawl your whole enterprise and look at every bit and byte. That dog never hunted for me, conceptually, because that was always a pitch to do data breach response on an entire company.

And we know from actually doing data breach response that there's no way, because you can barely get a cyber insurance company to pay for data breach response on just the data right here, let alone the whole company.
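
As a rough illustration of the sampling idea, and of the "search terms and regular expressions" baseline Adi contrasts with model-based detection, here is a hedged sketch that scans a few sample documents for PII-shaped patterns. The patterns and documents are invented; notice that an order number gets flagged as a Social Security number, which is exactly the over-inclusion problem described earlier.

```python
import re

# Crude pattern-matching baseline: it flags candidates but cannot tell a
# Social Security number from any other 3-2-4 digit string, so it
# over-includes. Model-based detection exists precisely to do better.
PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def flag_documents(docs):
    """Return (document, kinds-of-hits) pairs for docs any pattern hits."""
    flagged = []
    for doc in docs:
        hits = {kind for kind, pat in PATTERNS.items() if pat.search(doc)}
        if hits:
            flagged.append((doc, hits))
    return flagged

# A sampled custodian's documents (invented).
sample = [
    "Order #123-45-6789 shipped Tuesday",          # false positive, not an SSN
    "Reach Alice at alice@example.com",
    "Quarterly totals attached, no action needed",
]
for doc, kinds in flag_documents(sample):
    print(sorted(kinds), "->", doc)
```

Running this flags two of the three documents, including the harmless order number, which is why a regex-only audit of a sampled mailbox overstates the PII it finds.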

Jodi Daniels  24:23  

It makes sense. Thank you for sharing. And I like the idea of the proactive piece, because there's data hiding in a variety of different nooks and crannies inside companies.

Adi Elliott  24:36  

And I guess I'd say this: it's only hiding because no one has a way to look at it. It's just sitting there in PSTs and file shares. It's hiding in plain sight. There's no vector into it; there's nothing that says, here it is, here are 30,000 Social Security numbers just sitting in someone's PST. There's nothing that says that.

Jodi Daniels  24:57  

Right. Hopefully they won't keep having 30,000 Social Security numbers sitting in someone's PST.

Adi Elliott  25:02  

You'd hope not.

Jodi Daniels  25:04  

Yeah, but I know it happens in some companies,

Justin Daniels  25:09  

More than we like to think about, indeed. Well, kind of changing the game a little bit: in your experience, what is your favorite privacy or security tip you'd like to share with our audience?

Adi Elliott  25:22  

Oh, absolutely. This is something that, it's wild to me, most people don't know about. The caveat here is that this applies if you live in the Apple world. So if you have an iPhone, and it really super-duper applies if you're using a Mac. One of the most unheralded teams, I would say, because they don't get a ton of the advertising, is the team working on authentication and passwords inside of Apple, and they are killing it. The person who manages that, Ricky Mondello, shout out to Ricky Mondello, is phenomenal. That whole team is crushing it. They now have multi-factor authentication built straight into pretty much the lowest level of Apple devices. Previously, I'd been using Google Authenticator in my personal life for multi-factor. And the annoying thing about an authenticator app is that, one, you have to copy the codes across, and two, if you get a new phone, you have to do this dance of transferring your multi-factor from one phone to the next. It's super frustrating. The Apple solution is so good: it's built straight into password autofill. If you go into Settings, then Passwords, and then into any specific site, say Twitter, and you want to set up multi-factor via Apple on Twitter, you can just go into the password section of your Apple device (if you're on the latest and greatest of their software), go into the website in question, and set up a verification code. It lets you do it via a QR code: you scan it, and now it works across all your devices. So literally, when you get a new iPhone or iPad, your multi-factor is ready to go. It's just right there, and it autofills.

So when you get the multi-factor dialog, it autofills the code from the keychain. This is the way normal people can implement multi-factor, because I think multi-factor is really hard to understand for your average person. For any civilian, I don't know what we call people who don't live in the privacy and security space, any regular person, I would say this is the route. And even if you're one of us, I would still say it's a phenomenal implementation of multi-factor. I highly recommend it: if you live in the Apple world, if you have an iPhone, if you have a Mac, this is my recommendation. It's just so easy and so user-friendly.
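
For the curious, the six-digit verification codes Adi is describing (whether from Apple's built-in feature or Google Authenticator) are standard TOTP, specified in RFC 6238. Here is a minimal standard-library sketch, checked against the RFC's published test vector; nothing in it is Apple-specific.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 time-based one-time password: the same kind of code an
    authenticator derives from the shared secret in the setup QR code."""
    pad = "=" * (-len(secret_b32) % 8)
    key = base64.b32decode(secret_b32.upper() + pad)
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: the ASCII secret "12345678901234567890",
# at t = 59 seconds, yields the 8-digit SHA-1 code 94287082.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, t=59, digits=8))  # prints 94287082
```

Because both ends derive the code from the same secret and the current time, any device holding the secret (a new iPhone restored from the keychain, for instance) produces the same codes, which is why the transfer "dance" disappears.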

Jodi Daniels  28:00  

I'm just curious: if you have, let's say, a PC, a Windows-based computer, and an iPhone, does it still work? Or does it get messed up if you want to be on Twitter on the PC and Twitter on your phone?

Adi Elliott  28:12  

No, it still works fine, because you just get the code; the code is always there in that Passwords location on your phone. So you'd still just be able to look at the code and type in the six-digit code or whatever. It's still right there. But it's still amazing, because if you think about it, when you get a new iPhone, which every two years or so a lot of us do, it just transfers. There's no multi-factor dance. And it's the worst if you use your current phone as the trade-in for the upgrade; then you're really toast if you don't think about it ahead of time, because you can be in multi-factor purgatory. This solves all of that. It's all right there. And considering Apple is like the privacy company, it floors me that they don't talk about this more.

Jodi Daniels  28:53  

Well, we will do our best to highlight that tip for everyone, and maybe even give a little shout out to Ricky. So, thank you. Now, when you're not giving out awesome tips and being the privacy professional that you are, what do you like to do for fun?

Adi Elliott  29:07  

So, one: the company Canopy was named that way because one of our CEO's favorite things is hiking with his wife, and coincidentally, for a lot of us at the company, that's our favorite thing too. It's my favorite thing in the world, hiking with my wife, in the Pacific Northwest or wherever there's beautiful nature. I love hiking. I love watching NBA basketball; that's thing number two. And thing number three, somehow, thanks to my kids, is Pokémon Go. It's just become the background noise of my life somehow.

Jodi Daniels  29:40  

I didn't know if that was still around, because I remember years ago it was the thing. Like, shopping centers had to have blockades. And then I just don't hear about it anymore.

Adi Elliott  29:48  

It's still a thing. I somehow don't even get how I got sucked into this world, but I did. And so now I play Pokémon Go on some sort of regular basis. And my oldest kid, the one who got me into it, laughs at the fact that I play more than he does at this point. But it's still there. And it's kind of the jumping-off point for Niantic, the company behind all those technologies; it's still their flagship product, and they're still chasing the next thing to do with their geolocation and AR, all the technologies behind Pokémon Go. They're always trying to launch new products, and I'm always watching what else they do with it, but Pokémon Go was like the perfect mix of intellectual property and technology. It's a great game, it's really interesting, and it gets you walking around every city you go to.

Justin Daniels  30:42  

Have you purchased your first NBA NFT?

Adi Elliott  30:46  

So, I have not. I have not. And I guess I'm taking my time on the NFT thing. I get it; I get why people like it. I collected basketball cards as a kid, and these are like digital basketball cards. I'm just of the belief that I would probably buy the one that would immediately go down in value, and mine wouldn't be worth a lot, and I'd just say, yeah, I'll just go to games and watch on League Pass and enjoy it that way. But it's cool. I love that they're doing it, and it's one of my favorite implementations of blockchain technology.

Jodi Daniels  31:20  

One of Justin's other favorite pastimes. Well, Adi, it's been so much fun. If people want to connect with you and/or learn more about Canopy, where should they go?

Adi Elliott  31:29  

Our website has all the information, and that's the easiest way to find all of us online.

Jodi Daniels  31:41  

Well, excellent. Thank you so much again for joining us today. We really enjoyed the conversation.

Adi Elliott  31:47  

My pleasure. Thank you so much. That was awesome.

Outro  31:53  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Art Poghosyan
Art Poghosyan is a serial entrepreneur with over 20 years of cybersecurity experience. Art’s entrepreneurial journey started with Advancive Technology Solutions, a leading identity management consulting and systems integration firm. He led the company's exponential growth and eventual acquisition by Optiv Security in 2016. Now, as the CEO and Co-founder of Britive, he is solving the cloud's most challenging security problem: privileged access security.

Prior to his foray into entrepreneurship, Art served as the Manager of Advisory Services for EY (Ernst & Young) and as a Consultant for both Protiviti and Arthur Andersen.


Here’s a glimpse of what you’ll learn:

  • Art Poghosyan describes his security career and entrepreneurial journey
  • The biggest misconceptions about security in the cloud
  • Why do businesses assume their data is automatically secure if it’s in the cloud?
  • How Britive is addressing identity and access management (IAM) security for the cloud
  • The types of companies that should implement IAM security systems
  • How Britive’s security tools can help your company protect itself
  • Art’s top security tip: take time every day to research security trends

In this episode…

My company’s data is stored in the cloud, so it’s completely secure. Right?

Wrong. Unfortunately, storage in the cloud isn't enough to keep your data secure. Cloud technologies are innovating faster than security can keep up. Plus, they can't be protected with a firewall like traditional networks either. So what can you do to protect your data? The key is identity and access management security. With these systems, users can be authorized to receive access on demand, just for the time they need it. And their access expires automatically when the session is over. This ensures that there is no 24/7 exposure of access that attackers love to exploit.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Art Poghosyan, the Co-founder and CEO of Britive, to discuss the power of identity and access management security. Art talks about the biggest misconceptions about cloud security, the best strategies for securing your data in the cloud, and how Britive can strengthen your company’s security systems.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Prologue 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:21

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and a certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:38

Alright, Justin Daniels here, I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback helping clients design and implement cyber plans as well as help them manage and recover from data breaches.

Jodi Daniels 0:57

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, visit

Justin Daniels 1:36

And today, what, is this gonna mark the end of our kitchen renovation?

Jodi Daniels 1:41

Almost, almost. A really big, important one: we're going to get a counter. This is very exciting. But then there's, there's like a decorative element that comes after that; we need a new light. I'll let you

Justin Daniels 1:51

handle that. But I'm just curious. We have a contractor coming today, and how will you know, when you answer the door, that that person is the contractor?

Jodi Daniels 2:00

They'll have a really big counter with them.

Justin Daniels 2:03

So they'll have something that they're using to indicate to you that this person must be the contractor and isn't impersonating the contractor.

Jodi Daniels 2:11

I think you're giving a hint for what we're going to be talking about today.

Justin Daniels 2:14

Exactly. We're going to be talking a little bit about identity access management for

Jodi Daniels 2:20

the cloud. Well, since you're so excited, I think you should introduce our special guest

Justin Daniels 2:26

Art Poghosyan is a serial entrepreneur with 20-plus years of cybersecurity experience. His entrepreneurial journey started with Advancive, a leading identity management consulting and solutions implementation company, where he led the company's exponential growth and eventual acquisition by Optiv Security in 2016. Now, as Co-founder of Britive, he is solving the cloud's most challenging security problem: privileged access security. Prior to his foray into entrepreneurship, Art's security career began as a consultant with a Big Four firm, where he spent eight years working with global enterprises across various industries. Welcome, Art.

Art Poghosyan 3:08

Thank you. Thank you. Glad to be here.

Jodi Daniels 3:10

Well, we're so excited to talk to you today. And we always like to get started by understanding how people got to where they are today. So, you have founded a new company, and we learned a little bit about how you started at a Big Four firm, but help us fill in the gaps. Sound good?

Art Poghosyan 3:29

Yeah. So I spent about 20 years in InfoSec. The first half of that was with the big four firms, and Ernst & Young was the main one; I was an information security consultant working with major enterprise accounts across industry verticals. The second half was the beginning of my entrepreneurial journey, and as you mentioned, Advancive was the first business that I founded. A little story there: when I was starting my first business in 2009, we were right in the middle of, you remember, one of the largest economic and financial crises this country has ever seen. So it was kind of an interesting time to start that sort of business. But I took a contrarian bet that security consulting services were primed for disruption, specifically identity and access management. This was because at the time, the market for these services was primarily dominated by the Big Four players, where the cost and the price tag were very high, but customers were also increasingly getting dissatisfied with the outcomes. So that was the problem, and I decided to take a boutique approach to expert consulting services. That's what Advancive did. The business did really well; we grew for about six, almost six years actually, before Optiv acquired us. And after that, I started the Britive journey, which is a cloud privileged access and identity management solution.

Jodi Daniels 5:20

Well, as an entrepreneur myself, I always relish hearing different entrepreneurial journeys, and a lot of people will say that a downturn is actually still a great time to build something, and you took advantage of that. So kudos to you. Thank you. Especially with cybersecurity, where there is no downturn; the bad people always find an opportunity. They do.

Justin Daniels 5:44

Well, one of the reasons we're excited to have Art here today is we're going to talk about the cloud. Because, as I like to say, "You know what, my data is in the cloud, so of course that means it's very secure." And so, teeing up our question: what is the biggest misconception around security in the cloud?

Art Poghosyan 6:07

Great question. So from the enterprise business point of view, I think the biggest misconception is that you can actually secure the cloud with tools that were built for the data center. And this is one that I see almost every day. I think one big reason why is because most businesses today, when they think about their cloud journey, think about this lift-and-shift approach. It's very common: they take what they have in their data centers, the physical infrastructure and network, move it to the cloud, put it into VMware, and now they have cloud, right? And they tend to think about the security of that world like it's still the data center. What happens over time is their needs evolve, and they start adopting and using more cloud-native technologies, which are much more sophisticated and advanced, like the serverless technologies and microservices that big cloud providers like Amazon and Google and so on are offering. But the reality of these new cloud-native technologies is that they are very different. Architecturally, they're very different. And so really, it's a whole different level of challenge to secure these environments. So that's the common recurring theme that I see: the misconception that you can secure the cloud with the tools that you already had in the data center.

Jodi Daniels 7:44

Well, we often hear, and Justin, you just kind of joked about it, "Your data is in the cloud, so it's safe." And truly, though, I'll talk to companies and they'll say, "Well, my data is in AWS," or "My data's in XYZ cloud space, and therefore I don't need to do anything. It's all secure." Why is it that businesses say or think, I put it in the cloud, so it's secure?

Art Poghosyan 8:12

Yeah, so on this topic, there was a completely different point of view maybe 15 years ago, when cloud was new, and it was more like, "Oh, nothing in the cloud is secure." So we've almost taken a whole 180-degree shift from that perspective to, "OK, if it's in the cloud, it must be secure." I think there's a combination of a couple of different things here that, to me at least, explains it. Cloud has definitely matured and evolved over the past 15 years, and public cloud technologies have become popular. I think a lot of it is because the cloud technologies and cloud vendors have evolved in their security. One thing that is common is that the cloud-native service providers, SaaS, PaaS, anything, always try to show customers that they have the right level of security, that they are protecting the data properly. And they're not necessarily lying about that. But the reality of these businesses, especially the very large cloud providers like the major SaaS platforms, is that their technologies are evolving so fast that it's difficult to always maintain security at the right level, equally, across all the new features and functionality. Add to that the consumer-facing businesses especially, which are building a lot of new applications and modern tools for consumers, especially when it involves data; they're using what cloud has to offer, the cloud-native capabilities like big data and analytics technologies.
So these business functions, in a way, kind of run ahead of the security. They're trying to be agile, they're trying to be very customer-centric and offer new features and functionality. The cloud offers the technologies, but the security features and controls don't always exist in these new features and functionality. So what I'm seeing more frequently today is security teams really trying to play catch-up, trying to balance the business agility and the business needs with the security requirements and protecting the important data, like consumer data.

Justin Daniels 11:07

Turning to your company now, Britive: how are you addressing and disrupting identity and access management security for the cloud?

Art Poghosyan 11:17

Yeah, listen, this is definitely near and dear to my heart. I've spent a long time in the identity and access management solution space. And I believe that identity and access management plays a much bigger role in the modern cloud-native world than it did in the data center world. This is because cloud technologies cannot be protected with a firewall like the traditional networks were. So identity and access has effectively become the perimeter, the front-line security that the business has to protect. Unfortunately, though, many businesses are still thinking about securing cloud identities and access like it's a data center. Britive introduces several innovative concepts for identity and access management in the cloud. One of them is this concept of zero standing access, zero standing privileges. What this means is that users can be authorized and receive access on demand, just in time, when they need it, but it always expires automatically when their session is over, when they're done with their activities. This ensures that there is no 24/7 exposure of access or privileges that attackers love to exploit. Another key concept is to ensure that we are constantly analyzing and monitoring the activities of users in the cloud technologies, the identities and the privileges and access that they are authorized to have. Our machine learning engine makes sure that we detect behavioral patterns that differ from the normal pattern and quickly take remediation actions, for example alerting the security teams, as well as revoking access on the fly if that's what's needed to prevent a breach.
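The zero-standing-privileges idea Art describes can be pictured in a few lines of code. This is a minimal, hypothetical sketch, not Britive's actual implementation; all names here are illustrative. It shows a just-in-time access broker where every grant carries a time-to-live and expires on its own, so no privilege stands open around the clock:

```python
import time

class JITAccessBroker:
    """Illustrative just-in-time access broker: grants expire automatically,
    so no privilege is exposed 24/7 for attackers to exploit."""

    def __init__(self):
        self._grants = {}  # (user, privilege) -> expiry timestamp

    def grant(self, user, privilege, ttl_seconds):
        """Authorize a user for a privilege, but only for ttl_seconds."""
        self._grants[(user, privilege)] = time.monotonic() + ttl_seconds

    def revoke(self, user, privilege):
        """Pull access immediately, e.g. when anomalous behavior is detected."""
        self._grants.pop((user, privilege), None)

    def is_authorized(self, user, privilege):
        """Check access; an expired grant is treated as if it never existed."""
        expiry = self._grants.get((user, privilege))
        if expiry is None or time.monotonic() >= expiry:
            self._grants.pop((user, privilege), None)  # lazy cleanup
            return False
        return True

broker = JITAccessBroker()
broker.grant("alice", "admin:prod-db", ttl_seconds=0.1)
print(broker.is_authorized("alice", "admin:prod-db"))  # True while the grant is live
time.sleep(0.2)
print(broker.is_authorized("alice", "admin:prod-db"))  # False once it expires
```

The key design point is that `is_authorized` treats an expired grant exactly like one that was never made, so a forgotten revocation cannot leave a standing privilege behind, which is the exposure attackers look for.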

Jodi Daniels 13:35

You know, some companies might say, "I'm too small to need these types of tools." So, thinking about some of the objections that I'm sure you get when you're talking to different companies, and we've talked about some of them already with the need for security in the cloud, can you share a little bit about what kinds of companies, or what types of data they're protecting, need to have this type of tool?

Art Poghosyan 14:02

You know, this one is an interesting one, because there are two camps out there. The security teams: we rarely hear any objections from them. The security teams always understand the benefits and value of a solution like Britive. The common objections come from the cloud development teams, the folks who are building the applications, the folks who are building data analytics platforms and so on. And the approach they like to take is to say that they can build a solution like Britive, they can do it in house. There are two big problems with this approach. The first problem is that while the cloud developers are more than competent to build solutions, building security solutions in house is a whole different beast, and often the time and the resources and the cost it takes to build a solution like Britive in house are vastly underestimated. It's usually one or two years after they start that they look at the situation and say, this is not sustainable, and they start looking for a commercial solution. The second scenario is that multi-cloud is reality for the enterprise. Even if they could build in house for one of the major platforms, let's say AWS, as soon as they start expanding to other cloud providers like Azure and GCP, the cost goes up exponentially, and it becomes a much bigger undertaking to do in house. And to your question of what size and what complexity of organization needs a solution like Britive: our belief is that every organization that has cloud assets to protect needs a solution like Britive. If you're in business, you have at least employee data; if you're in business, you at least have some customer data, and so on. Right? And your proprietary assets, your crown jewels, all of these things need to be protected. So there really isn't a question of which businesses do or do not need a solution like Britive.

Jodi Daniels 16:24

We hear the build-your-own-tool idea all the time, especially in the privacy space. I always say there are experts in the field; that's why you work with the experts, they've done it all for you.

Justin Daniels 16:35

Well, you know, when I think about your business in the cloud and the security challenges of today, I think of Kaseya; I think of the vendor ecosystem that provides services as part of the cloud offering. And so my thought is, with that in mind: how does your tool help the people who use it better protect themselves, when there's nothing you can do as a customer if your cloud vendor has its own vendor get hacked, because attackers want access to all of the customers of that cloud vendor?

Art Poghosyan 17:12

Now, that is a very big problem, you're correct, Justin, because these days even small businesses are very open and have a lot of business partners, and they have third parties that they're doing business with. So essentially their security extends to all these third parties that in some way have control of, or access to, their data, their environment, their infrastructure. How Britive protects that: this is actually one of the biggest value-adds to the customers. As I said before, identity and access in organizations that have embraced cloud becomes very, very important for securing the environment. And that goes for any third parties that need to be granted access to their environment or their data. That's where Britive helps make that very efficient, while at the same time facilitating the business need without making security any kind of burden or barrier for doing business with these entities. So it becomes a very natural and agile method of granting access, and that access always expires, so there's no standing risk left to be attacked from the outside.

Justin Daniels 18:48

Well, thank you for that. So one thing that we love to ask all of our guests, particularly someone such as yourself with so much experience in the security field, is, on a personal level for our audience: what is your best security tip that people could benefit from?

Art Poghosyan 19:07

Yeah. So this may sound very simple, but really, to the broader audience, my best tip is to actually try to learn security. Security and privacy teams these days are the kind of modern-day heroes defending against all kinds of sophisticated attacks every day. And security really has to become everyone's business if we're going to protect our companies and our businesses. Fortunately, there's a lot of educational content and material out there about security these days. Every employee, every team member can take half an hour to 45 minutes a day to read up about something like ransomware or privileged credential attacks or phishing attacks, and just really understand how attackers succeed and what each and every one of us can do to stop them from carrying out those attacks.

Jodi Daniels 20:06

And when you're not building a company trying to help fend off these types of attacks, what do you like to do in your free time?

Art Poghosyan 20:18

Which is very, very little, but I still do get some free time in the end. Thank you for that question. I love the outdoors, so I try to do some hiking whenever I can. Fortunately, I live close to Angeles National Forest, a big wilderness area, and there are a lot of hiking trails, so I spend some time there. Other than that, I like reading books, and ironically, these days most of them are about how to grow your business. That's my other hobby.

Jodi Daniels 20:53

So do you have a favorite book that you've read maybe in the last little while that you want to recommend?

Art Poghosyan 21:00

Ah, sure. The one that I just finished, the title was The Qualified Sales Leader by John McMahon, and it was a really good book, I thought, and very timely for where my business is and what challenges we need to tackle.

Jodi Daniels 21:21

Excellent. Well, if someone would like to learn more about Britive and connect with you, where should they go?

Art Poghosyan 21:29

I'm fairly accessible. LinkedIn: I respond to messages. Artyom Poghosyan. If you search for Britive, just look for the Founder and CEO; that's easier. But otherwise, email is

Jodi Daniels 21:44

Excellent. Well, thank you so much for sharing all of this information with our audience. We really appreciate it. Justin, anything you want to add?

Justin Daniels 21:51

No, it's been great. Everyone thinks everything's secure in the cloud, and Art's here to say: not so much.

Jodi Daniels 22:00

Excellent. Well, thank you again, Art.

Art Poghosyan 22:04

We're getting there, with Britive and others that are really helping secure the cloud. We'll get there.

Jodi Daniels 22:13

Absolutely. Thank you again.

Prologue 22:19

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Ted Harrington

Ted Harrington is the #1 best-selling Author of Hackable: How to do Application Security Right. He is also the Executive Partner at Independent Security Evaluators, a company of ethical hackers famous for hacking cars, medical devices, web applications, and password managers. Ted has helped hundreds of companies — including Google, Amazon, and Netflix — fix security vulnerabilities. He also hosts the Tech Done Different podcast.

In addition to this, Ted is a professional keynote speaker and the Co-founder of IoT Village, a traveling hacking event series. Previously, he was the Chief Executive Officer at NMG Technologies and the Director at Wolfpack.


Here’s a glimpse of what you’ll learn:

  • Ted Harrington defines ethical hacking
  • Why is security lagging behind technology developments?
  • How a team of good hackers can actually strengthen your cybersecurity strategy
  • The importance of recognizing security risks and taking steps to reduce them
  • What are the differences between vulnerability scans, vulnerability assessments, and penetration testing?
  • How to build a strong security perimeter around your company’s technology
  • Ted’s top security tip: use a password manager

In this episode…

Hackers are evil people trying to destroy companies and wreak havoc on the world of privacy and security. Right?

Not necessarily. The word hacking and the term hacker have become grossly abused. Hackers are neither good nor bad — they are simply problem solvers. They see a system and say, “It’s supposed to do one thing. Can it do this other thing instead?” As Ted Harrington explains, the differentiating factor is the hacker’s motivation: are they after personal gain or trying to harm an organization? Those are attackers. On the other hand, ethical hackers find vulnerabilities in order to fix them and make the technology stronger. By identifying all the holes in your security perimeter, a team of ethical hackers can show you how to make your defense almost impenetrable.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Ted Harrington, Executive Partner at Independent Security Evaluators, to discuss how ethical hackers can improve your company’s cybersecurity. Ted talks about why many companies' security is lagging behind technology developments, the benefits of ethical hacking, and his tips for keeping your passwords secure.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and a certified informational privacy professional, and I provide practical privacy advice to overwhelmed companies.

Justin Daniels  0:37  

Hi, Justin Daniels here. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I'm the cyber quarterback helping clients design and implement cyber plans as well as help them manage and recover from data breaches.

Jodi Daniels  0:53  

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, visit Hi, you're staring at me? Yes, as I should. You know, it's like an evil stare. What's going on?

Justin Daniels  1:37  

No, 'cause I'm looking forward to finding out how I can ethically hack all of your devices today.

Jodi Daniels  1:43  

I'm sure you are. We did learn earlier a trick on how to teach children to ethically hack Mom's text messages. Let's see what we're gonna learn today. All right, well,

Justin Daniels  1:54  

I'm excited to introduce our guest. It is Ted Harrington, the number one best-selling author of Hackable: How to Do Application Security Right, and the Executive Partner of Independent Security Evaluators, a company of ethical hackers famous for hacking cars, medical devices, web applications, password managers, and soon to be Jodi's devices. He's helped hundreds of companies, including Google, Amazon, and Netflix, fix tens of thousands of security vulnerabilities. He hosts the Tech Done Different podcast. Hello, Ted.

Ted Harrington  2:28  

Hey, guys, thanks for having me. Excited to be here.

Jodi Daniels  2:30

Absolutely. Well, one has to ask the question: how do you get into a career where you hack everything and try to find all of the problems? So tell us a little bit about your career journey.

Ted Harrington  2:48  

Sure, yeah. So, for anyone who's listening who doesn't know what ethical hacking is, maybe I can just define that really quickly, so we know what we're talking about, because the word hacking and the term hacker have become just so grossly abused. If you were to believe the media, or however they portray hackers, you'd think a hacker is a bad person doing evil. And that's really not true, because a hacker is neither good nor bad. A hacker is a problem solver. A hacker is someone who's creative, someone who looks at a system and says, hey, it's supposed to do this thing; can it do this other thing instead? And the fork in the road comes after that point, when it comes down to what the motivation is. Is it to harm an organization or achieve some sort of personal gain? Well, that would be what attackers do. Or is it to try to find vulnerabilities in order to fix them, in order to make the tech stronger? That's what ethical hackers, people from my corner of the world, do. And so I've always been really drawn to the idea of creative problem solving, things that are really difficult to do, how do you serve others, that kind of stuff. And throughout the course of my career, all the different steps along the way, I was sort of looking for that combination of principles. It was about 10 years ago when I met the guy who would become, and is now, my business partner, and we started talking about this idea of ethical hacking. At the time, I wasn't even in security; I was running a tech company that was in water conservation. And I started hearing about this idea of, like, wait a minute, you get to do the bad guy stuff, but you're a good guy, you get paid for it, you don't go to jail, and you help people? I'm in, let's do it. And once you go down that rabbit hole, you will never look at the world the same way again.
I mean, I never look at a line that everyone else is waiting in and think, I should get in the back of that line. I'm like, how would I get to the front of this line? How do I do that?

Jodi Daniels  4:49  

I love it. It's so interesting. I say this all the time, but I always find careers, paths, journeys, and experiences super fascinating. So thank you for sharing, and now, next time I look at a line, I'm gonna say, "Well, Ted told me I can go to the front of the line."

Ted Harrington  5:05  

Well, the question we have to ask is about the system, right? This system that is a line is built to do a certain thing in a certain way. And the question we have to ask is, well, what if I did something different? I'm not advocating that people do things unethically, or break rules, or steal, or anything like that. But I think lines are something we can all relate to; no one likes waiting in line, right? But you'd be shocked how many people will literally stand in a line just because it's there. We had this funny thing happen at, I forget which conference it was, a conference recently. I wasn't personally able to attend, so I heard this story secondhand, and if it hadn't come from people that I trusted, I mean, it sounds like there's no way this story is true. I had signed a whole bunch of books and sent them with my team, who were at this conference. They had a table, and they were giving these signed copies of my book away. So there's this long line of people waiting to try to get their hands on a signed copy of the book. Eventually we run out of books. So the next person gets to the front of the line, we're out of books, and the person on our team who's been handing them out says, "I'm so sorry, we're out of signed books." And this person, who'd been waiting in line for like 40 minutes, says, "Oh, cool. I was wondering what this line was for." I'm like, you stood in line for 40 minutes not knowing why you were standing in it? What are you doing? But that's the way people think about systems all the time. A normal person looks at a system and says, how do I comply with the system? What are the rules of the system? I'll follow the system. And hackers, both the good guys and the bad guys, say, well, what are the rules of the system? How can I defeat the system within those rules?

Justin Daniels  6:46  

Basically, ethical hacker is another word for entrepreneur: they see a system and they want to break it or do something different.

Jodi Daniels  6:55  

There you go. Exactly.

Justin Daniels  6:57  

Well, speaking of technology: as I talk to you today, I'm now working with NFTs and doing things on the Ethereum blockchain. All of this has evolved, but why does security seem to be stuck back in 1991?

Jodi Daniels  7:15  

Why do you seriously think that? Yeah, I figured, why not.

Ted Harrington  7:20  

I was good with it, you know? So why is security lagging? I've grappled with that question a lot, and the best answer I can come up with is a very human one, which is that one of two things is true: people genuinely don't understand it, or the business pressure, which also doesn't understand it, is too heavy. And when the business pressures you to do a thing in a certain way that's counter to the good security decision, you have to follow where the pressure is pushing you. So if you've got an executive or a leader who's pushing a business priority and doesn't necessarily understand the security implications, and that's not a value judgment; I mean, there's tons of things I don't know anything about. But if someone is a leader, and maybe they came up through sales and that's how they eventually became leader of the whole organization, of course they're not going to understand security. They shouldn't; that's not their background. And so when they're making decisions that are counter to their best interest in security, all the people who report into that organization kind of have to follow that. So I think that's really where it comes down to. And that's why I spend so much of my time and energy trying to educate: writing books, appearing on great podcasts like what you guys have, delivering keynotes, stuff like that, to try to really drive that education. Because until we can first understand the challenges, and then make sure the business prioritizes how to solve them correctly, I don't think it's going to change very fast.

Justin Daniels  8:51  

I kind of want to drill into this concept a little more, especially given my line of work. What I see is, for example, after a ransomware event, we like to do an after-action report and think about what went wrong, like in the military. When you do it well, you want to think like your enemy, so you know what they might be thinking, much like an ethical hacker might. So what do you think stops businesses from taking the same type of mindset with the not-so-good hacker?

Ted Harrington  9:24  

Well, I think there's some good news in there, which is that on the pioneering end, companies are doing exactly that. I mean, when I look at our consulting practice, anyone who's hiring our company wants to think like that; that's why they're hiring us. It's like, we need someone who can help us think like a hacker. We want to understand how we might be attacked or exploited so we can defend against it. So there definitely are companies doing that, and that's a really good sign. But they're probably not the majority. If you look at any sort of adoption bell curve, they're at one of the ends of it; they're not in the middle majority or whatever. And so for those other companies who aren't quite there yet, I think it's one of two things: either they don't know they should do that, or they don't know how to do it, because it is a really different way of thinking. I mean, even just that story I told about the line a second ago: most people don't think that way. And most people shouldn't think that way, to be honest, because once you start thinking that way, you can't unthink it. I mean, there's literally nothing that I do where I'm not like, if I wanted to, how would I break this system? Whatever the system is. And that's everything from analog systems, like a line, or even just a parking meter: how could someone get free parking if they wanted to? This thing is supposed to collect payment; how could they do it without payment? That's a weird way to think, and I don't think most people should think that way. But it is critical that there is a person or team who is thinking that way. And it's usually, ideally, beneficial to have that be someone outside the organization, who is independent from the bias that might exist in any company. That's certainly what I advocate.

Justin Daniels  11:04  

So one of the things I wanted to get your take on, just thinking about the conversation we're having now: you mentioned the person in sales who becomes the CEO of the corporation, because sales are the lifeblood of a company. And most business people in sales, or executives, want to be positive; they always think the best is going to happen. And when we turn to security, or talk about privacy, especially on our show, most people think, well, that's not going to happen, or, oh, I don't want to think about risks. I'm here to sell a tool or make this drone work. How do you think that mindset plays into it, where people only want to think about the positive as opposed to thinking, yeah, there's some stuff behind door number two that could be a problem?

Ted Harrington  11:52  

Well, first of all, I think that's a wonderful mindset. I think it's one of the most important ingredients for entrepreneurial success: you have to believe that you're going to succeed, and if you don't believe that, then right out of the gate, you're not going to. So I would never want to stamp down someone's optimism. But one of the things we do have to realize is that there's a lot of cognitive bias that exists, especially amongst driven executives, entrepreneurs, leaders, people who are trying to change the world. There's this bias that's very much like, well, I'll find my way through it, or, I'm going to be the lucky one, or, it hasn't happened yet, so that means it won't happen. And when we think about really any type of risk, security of course being one of those types, when you think about building an organization, we always have to be thinking about all types of risks. How does a company think about things like their competition, new regulation, changes in the marketplace, competitors poaching their top talent? All of these are risks that leaders are thinking about all the time. And how do they deal with those risks? They don't drown in their sorrows that, hey, some competitor might come and disrupt the industry and put me out of business. They're aware of it, and they make sure their plan includes: how are we going to deal with that? How are we going to identify the risk? How are we going to measure it? How are we going to reduce it? And really, security is the same thing. It's not the thing that should stop you from making progress and solving the problem you're trying to solve, but it should be seen as one of those things you need to consider. And one of the things that I really strongly advocate for, and I don't hear a lot of people making this case, is that what most people think about security is that it's the removal of a bad thing. Right?
Like, if you invest in security, you won't get hacked; that's how most people try to think about it. And that's a good way to think about it, for sure; I'm not advocating against that. What most people don't do is think about how it can be the pursuit of a good thing. So most people think it's the avoidance of a bad thing: don't get hacked. And I say, in addition to that, we should think about how it can be the pursuit of a good thing. Because the fundamental truth is that when a company can demonstrate security, when it can actually invest in security, do it right, and prove it, that resonates with their current and prospective customers, because current and prospective customers want to work with companies who are secure. So it's this enormously differentiating competitive advantage that companies who are doing security right have over pretty much everybody else.

Jodi Daniels  14:26  

That is very similar to the privacy side of the equation, where a lot of companies are focused on: I have to comply with XYZ law because there's a fine, and if I don't, I might have a huge cost. And actually, a significant driver that is often overlooked is that your customers are expecting it: 52% of people won't buy from a company over privacy and security concerns, and more than 80% of people are concerned about them. So I really like how you recognize that it can be a competitive edge, that it can be a differentiator. Because when you're comparing company A and company B, if they're all the same, which one are you going to pick? Well, you're gonna pick the one that has some edge. And if one of those edges is privacy and security, that's really important. So thinking about how a company can make sure it has that edge, maybe turning to tactics a little bit: what does a company need to be doing? We talk about training, good measures, vulnerability scans, assessments; we've been talking about ethical hacking so that we can identify the problems in advance. You helped explain what ethical hacking is; can you help us understand the difference between a vulnerability scan and an assessment?

Ted Harrington  15:45  

Yeah. So let's use a metaphor; let's talk about cars to illustrate this. Because the question you're asking touches on an astronomical problem in security right now, which is that terms are used inappropriately, to mean different things than what they actually mean. This is a very, very common situation. And there's one other term you didn't ask about that we should mention, because people will be familiar with it and it's relevant to the two terms you asked about: penetration testing. What's happening in most organizations right now is that people are asking for penetration testing. The reason is that a lot of regulatory frameworks literally require it. It's become the term where everyone's like, I don't know what it is, but that's the security testing I've heard of, so I'm gonna ask for that. So people ask for penetration testing. But the problem is that they're usually sold something else; they're usually sold vulnerability scanning. And what they actually need is usually a third thing: a vulnerability assessment. So let me illustrate what these three things are using cars as the metaphor. Penetration testing is kind of like when the carmakers are building a vehicle and they want to know, how will it perform in a specific crash scenario? For example, what happens in a head-on collision? Will the passenger survive? So what do they do? They literally crash it into a wall and measure what happened. That's what penetration testing is like: you take a completely built system, something that's gone through all kinds of testing and what's called hardening to make it better and more secure, and then you simulate a real-world exercise that has a really narrowly defined scope and a very binary outcome, like, did the passengers survive or not? That's really what it's looking at. So a penetration test is kind of similar, right?
You're looking at, hey, could an attacker escalate privileges within whatever parameters we're looking at? That's what penetration testing really is. But if you were to Google that term right now, most of the results you get back are going to be for vulnerability scans. And that's a problem, because they're two really different things. A vulnerability scan, in our car metaphor, is more like when the check-engine light comes on: you go to the oil-change place or your mechanic, they stick that little device into the dash, it interrogates the computer, spits back some codes, and says, here's how you turn off the check-engine light. It's very inexpensive and very quick, but it can only look for known issues. Think about how different that is from penetration testing: scanning the onboard computer is pretty different from crash-testing the car. But what people really want is a third thing altogether, which is a vulnerability assessment. A vulnerability assessment is a comprehensive evaluation of how all the different systems might work together: how might someone attack this system, in all the different ways it could be attacked? To use the metaphor again, that's like the automotive safety engineering department. What do they do? They look at things like, how do the side-impact beams work with the lane-departure technology, with the airbags, with the seatbelts, with the roll cage, all these different systems? How does it all work together to maximize the likelihood the passenger survives an incident? It's that holistic view that people are looking for: the really specific, custom ways that the system might be defeated. That's really what people are after, not just something cursory, like a scan would be, or something very narrowly scoped, like what a pen test might be.
And by now understanding the difference, hopefully people can walk away and say, okay, first of all, I recognize there is a difference. Second of all, I recognize the difference is significant. But most importantly, they're able to walk away saying, what should I do with this information? And what you should do with it is not necessarily just memorize the terms. As much as I want people to use the right term to describe the right thing, I recognize the horse has left the barn, right? We're not going to get the whole world to start using the right terms. So instead, what do we do? We start with a goal. As organizations think about how to measure how they're doing on security, start with your goal. Is your goal to have this real-world exercise, a very narrowly scoped situation with a binary outcome? That sounds like you want a penetration test. Do you want just a quick, inexpensive look at the most common issues, knowing that you're excluding anything that would require even a slight degree of sophistication? That's where you want a scan. Or are you looking for a more holistic view? You want to understand custom severity, you want to understand how all the components of the system work together and how to fix them? That's where you want your vulnerability assessment.

Jodi Daniels  20:31  

I think that was really well done. I love metaphors and analogies, so thank you for breaking it down. And I think it's helpful, because not every company will always be able to jump to one of those, but at least they'll be able to understand what they're getting, hopefully, when they're talking to someone about it. Right. Next up.

Justin Daniels  20:53  

I think, to Ted's point, most companies ask for things, and they don't know what the terms they're using mean. So they end up getting something that doesn't do what they think it does, or they put over-reliance on something that isn't as comprehensive as they think it might be.

Ted Harrington  21:10  

And I have a lot of empathy for that; my heart goes out to them. Because let's put ourselves in the shoes of the person who's buying that service. They're buying that service because they don't do it themselves; it's something you should have done independently anyway, and you're probably not the security expert to begin with. Or if you are a security expert, your expertise isn't necessarily in ethical hacking. So you go find somebody and say, I need this thing, sell me this thing, and you're trusting that that person can give you what you need. And unfortunately, that's not necessarily what's happening. So that's not cool. That was a big motivator for why I wanted to write my book, because I saw stuff like that happening all the time, and I was like, I cannot stand to allow that to happen anymore. So it's a practical reality that everybody faces, even if they don't realize it.

Justin Daniels  21:56  

So, kind of building on what you're talking about: one of the things that I've seen multiple times in the last 12 months, with the quick pivot to the remote workforce, is that your security perimeter is now basically your employees' assets that don't belong to the company. And I've had multiple ransomware incidents whose root cause was something that happened on an employee's computer that moved onto the network. So my question is, how do you think about security, and what you do, in light of a security perimeter that is now basically your employees' own computer systems? It's not well defined anymore.

Ted Harrington  22:40  

Well, here's the hidden secret within the question: the perimeter didn't exist already, even before COVID. What COVID did was reveal that reality. Before, a lot of companies had this very misplaced reliance on the idea of a perimeter, the idea that the bad guys are on the outside and the good guys are on the inside. I remember really vividly, I was at a conference after I had delivered a keynote, and I was at the happy-hour thing that happens afterwards, talking to somebody about this idea. I was asking them what they do at their organization, and they said, no, we're pretty secure. Yeah, I mean, we've got everything locked down. You know, we have Norton Antivirus, so I think, what more could we need? And I'm like, oh boy, here's a tech leader who thinks that a tool like that is the only thing they need. They literally said, oh, we've got these really great firewalls. And what it reveals is that a lot of organizations don't understand that attacks originate from the inside, whether the insider is themselves an attacker or attackers escalate privileges to get insider status. So the perimeter has long been gone, and the companies who are doing well with security have already recognized that, and they're implementing tactics known as defense in depth. Think of defense in depth like a castle. You don't have to have been to Europe; if you've seen Game of Thrones, or if you went to high school, you're familiar with the idea of a castle. Castles have the moat and the drawbridge, the archers on the turrets, the guys pouring hot oil down, and fortified compartments within the castle itself. These are all defenses that layer on top of each other in order to do two things.
Number one, make it harder for the attacker to get in. And number two, make it harder for the attacker to succeed once they do get in and, in the case of a cyber attack, extract whatever assets they're looking for. So when we think about the move to a remote workforce, we have to realize it's the same principle. We still have to think about how any amount of access or privilege provisioned to anybody changes the way an organization might be attacked. And I actually think, from a security standpoint, and I don't mean this as celebrating a pandemic that so many people have suffered from, but one silver lining that came out of the pandemic is that it has actually forced companies to rethink their philosophy on security. And that is a very good thing.

Justin Daniels  25:24  

That was pretty good. So yeah, it's funny you say that, because when I think of the castle, I take it from The Lord of the Rings and the Battle of Helm's Deep, and add the relief army for good measure.

Jodi Daniels  25:43  

I think they're all great analogies. So, Ted, as a security pro, I imagine that you're able to share all kinds of really interesting stories, and people always ask you, well, what should I do? What are a couple of good quick tips that I should implement, either personally or at my company? We always like to ask all of our guests: what is the best privacy or security tip that you would offer?

Ted Harrington  26:09  

Yeah, I love that. I get that question all the time.

Jodi Daniels  26:12  

And I have a feeling it might be: have you read Ted's book yet? Yeah. Well, that's the next thing; we need you to try to help me.

Ted Harrington  26:24  

It's in the book. Actually, that is a real struggle that I have, because who I really serve are the people at companies who have resources to solve these problems. And I have a lot of empathy for the individual who's like, okay, I'm now aware that security is scary, but I'm just a person; what do I do? Obviously, you can't do the same things a company can do; you don't have the same money or access to people or certain skills. But there are some basic things. And my favorite recommendation, because it's easy to do, it's really effective, and it makes people's lives easier, is a combination of two things. Number one, use a password manager. And number two, use it in a way that actually modifies the passwords that are in the password manager. I'm happy to explain that second part if you want; I'm in the process of writing a blog about it, so I can give people step-by-step advice on how to do that. What that second part does is make it so that if the password manager ever gets hacked, the passwords themselves don't actually get hacked. But if we focus on just one piece of advice, it would be the first one: use a password manager. The reason I think they're wonderful is that the most important thing an individual person can do is use long, unique passwords for every service. And the unique part is important, and here's why. Attackers know that people are lazy. And attackers are just like you and me: they're efficient. So when any company gets hacked, the credential pairs, the usernames and passwords, usually wind up in a database available on the internet or on the dark web. So what does an attacker do? They go get that database, and they make this assumption: well, the credential pairs in this database from whatever XYZ web app got hacked, I bet a lot of these people use those same usernames and passwords on other services.
So they're going to take those and try them on other services, and a huge percentage of them are going to work. That's a really big problem if you're reusing passwords. What a password manager does is enable you to make sure you're using a unique password for every service. You don't have to memorize them anymore, you don't have to have them written down, and you don't have to have some weird scheme in your head where you start with your dog's name, then a year, then modify it by the month. No: you don't have to know what the password is; the password manager memorizes it for you. It makes your life so much easier, and you have significantly improved security. Because you're using unique passwords, you can use the most complex password each site will allow, and you only have to remember the master password to log into the password manager itself. It auto-populates for you. It's such an incredible improvement for the individual user. And for people who rightly ask, well, what about the password manager getting hacked? That's where the second piece of advice comes in: how to use it slightly differently. But even in that case, using a password manager is going to be better than not using one.
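Ted's core advice here, a long, unique, random password for every service, can be sketched in a few lines of Python. This is a hypothetical illustration of the principle, not how any particular password manager is implemented; real managers also handle encrypted storage and autofill.

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Return a long random password drawn from a cryptographically secure source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per service, so a breach of one site's database
# cannot be replayed against the others (credential stuffing).
vault = {site: generate_password() for site in ("email", "bank", "shop")}
```

Because every entry is independently random, leaking the "email" credential tells an attacker nothing about the "bank" one, which is exactly the property reuse destroys.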

Jodi Daniels  29:16  

That sounds good. It's a very common tip on our show.

Justin Daniels  29:24  

It sounds like, if I'm going to use the password manager, I would put a password in there that is almost written in a form of cryptography. It might have symbols or things that only I know how to translate, so even if they hack into it, they see something that, unless they know the code that's in my head, they couldn't use as the password.

Ted Harrington  29:45  

You're mostly right, yeah. You'd modify it by adding something to the password. So what's in the password manager is part of the password, and then you add something to it; you manually type that in when you actually log into the website or whatever you're logging into. So if the password manager ever gets hacked, they only have part of the password; they don't have the whole thing. And the modifier, you just remember.
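Ted's modifier trick can be sketched like this. The function names and the example modifier are my own, chosen just to illustrate the idea of storing only part of each password in the manager:

```python
import secrets
import string

# The modifier lives only in your head; it is never written down or stored.
MEMORIZED_MODIFIER = "!north7"  # hypothetical example; pick your own

def stored_part(length: int = 20) -> str:
    """The random portion that is saved in the password manager."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def full_password(stored: str) -> str:
    """What you actually type into the site: stored portion plus the memorized modifier."""
    return stored + MEMORIZED_MODIFIER

stored = stored_part()
# A breach of the manager exposes only `stored`, never the full password.
```

If the vault ever leaks, an attacker who tries the stored string by itself fails, because every real password ends with the modifier that exists only in your memory.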

Justin Daniels  30:05  

Really awesome tip. I'm gonna have to try that.

Ted Harrington  30:09  

Yeah, it's a little painful to get it set up, because it requires a bit of a change in behavior; most people don't set it up this way. But once you get it set up, your life is just so much easier.

Justin Daniels  30:21  

No, it's almost like you've got your own version of multi-factor on the password manager. That's a great idea. Thank you for sharing that.

Jodi Daniels  30:31  

We're your new password manager fans over here. The other reason why password managers are so good is the ability, if you ever have to share across the family, instead of emailing a password, to grant access without them even seeing it. So those are some good benefits.

Justin Daniels  30:46  

Well, we've spent so much time talking about security, ethical hacking, and your book. When you're not busy writing a best-selling book or finding security flaws in companies, what do you like to do for fun?

Ted Harrington  31:00  

Man, I love to travel. I read this thing this morning in a book I'm reading; it was talking about peak performance, basically, and I forget what the exact sentence was, but I literally stopped and put the book down when I read it, because, wow, that just hit me in the heart bone. It said something like: when you're doing what you truly love, you don't have to work on being present, because you wouldn't rather be anywhere else. And that is exactly how I feel when I'm in another country having a new experience. My eyes are seeing scenery I've never seen before, I'm having food I've never tried before, I'm hearing accents I've never heard before. And that, to me, is just so stimulating and rewarding. As soon as I get home from whatever trip and adjust to the jet lag, I'm immediately figuring out what the next trip is gonna be.

Jodi Daniels  31:48  

So what's the next trip?

Ted Harrington  31:50  

Oh, well, I just booked a trip to Mexico City; I'm going down there for a few days. I haven't been there yet. And then probably Costa Rica right after that.

Jodi Daniels  31:59  

I hear Costa Rica is beautiful. It's definitely on my list, for sure. Well, Ted, thank you so much for joining us today. How can people learn more about you, grab a copy of your book, and connect?

Ted Harrington  32:11  

Yeah, it's super easy. Just go to my website; everything's there. Where to follow me on social media, information about my book, if you need help with security testing, all that stuff is there. Yeah.

Jodi Daniels  32:24  

Well, wonderful. Well, thank you again, we appreciate all the great advice that you've shared.

Ted Harrington  32:30  

My pleasure. Thanks for having me.

Outro  32:35  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Andrea Amico

Andrea Amico is the Founder of Privacy4Cars, the first company focused on solving the growing data privacy and security issues posed by vehicles. Through its unique platform, Privacy4Cars is increasingly convincing auto finance fleets and dealerships to provide sensible protections for consumers. Privacy4Cars also offers free help to consumers who want their data deleted and their privacy respected by asserting their legal rights.

Andrea is also an Adjunct Professor of Engineering Ethics at Kennesaw State University. Previously, he was the President of Jack Cooper Logistics and the Managing Director of Strategic Initiatives and Analysis at NBC Universal.


Here’s a glimpse of what you’ll learn:

  • Andrea Amico explains how he became interested in privacy for car owners
  • How much data do cars contain?
  • Andrea describes the services that Privacy4Cars offers
  • Potential laws to protect the data your car collects
  • Best practices for limiting the data you share with your vehicle
  • Where does your information end up after it’s collected?
  • Tips for regaining control of your data
  • Ways car designs could change to improve privacy and security

In this episode…

You probably know a lot about your car. But do you realize how much your car knows about you?

Think about it. You let it know your location every time you open the navigation app. It knows all your friends' contact information when you sync your contacts. It hears all your conversations with the Bluetooth functions and can gather text messages, social media interactions, browser histories, calendar entries, and more. Once you realize the frightening amount of information your car holds on the other side of the steering wheel, you’re likely thinking, “How do I make sure my information stays secure and private?”

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Andrea Amico, Founder of Privacy4Cars, to discuss how you can regain control of your car’s data collection. Andrea talks about the types of data your car collects, protecting your privacy, and how Privacy4Cars services can help.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Intro  0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I'm a privacy consultant and certified information privacy professional, and I provide practical privacy advice to overwhelmed companies.

Justin Daniels  0:37  

Hello, Justin Daniels here I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback helping clients design and implement cyber plans as well as help them manage and recover from data breaches.

Jodi Daniels  0:57  

And this episode is brought to you by Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, visit our website. So today is going to be super fun. We're going to combine what I used to watch as a kid, like the Jetsons, with your favorite topic of autonomous vehicles, because today we have a very special guest. We have Andrea Amico, who is the Founder of Privacy4Cars, the first company focused on solving the growing data privacy and security issues posed by vehicles. Through its unique platform, Privacy4Cars is increasingly convincing auto finance companies, fleets, and dealerships to put in place sensible protections for consumers. Privacy4Cars also offers free help to consumers who want to get their data deleted and their privacy respected by asserting their legal rights. Welcome to the show.

Andrea Amico  2:28  

Thank you, Jodi. And thank you, Justin, I'm excited to be here.

Jodi Daniels  2:32  

Wow. Would you like to kick us off?

Justin Daniels  2:34  

I would first like to know what you guys think about the changing leaves. At least where we live, the leaves are finally changing to a nice golden brown.

Jodi Daniels  2:44  

Why, I might need some automated car help, because I'm so focused on the beautiful colors that I might be slightly distracted while driving my car. So I need all of those bells and whistles in my car that we're going to talk about to help keep me safe, so I can enjoy all the beautiful orange, yellow, and red leaves.

Justin Daniels  3:00  

So you already want a new car that has semi-autonomous features. Okay, are we ready?

Jodi Daniels  3:07  

All right, well, are you going to really get started now?

Justin Daniels  3:11  

Let's get started. Andrea, welcome to the fun. Why don't we start with: how did your career evolve to your current position, hanging out with us and talking about privacy and automobiles?

Andrea Amico  3:29  

Well, that's really the pinnacle of my career, to be able to hang out with you guys. The fact that you make it fun, and that you talk about privacy and security, is absolutely awesome. So I'm super elated to be here. About me: I've been in the automotive industry for more than a decade. One of the companies I used to run was actually a used car inspection company, and that's how I caught the bug of what data is left in cars, because we were doing audits. We started to realize, years ago, that, hold on a second, people leave their home addresses and garage door codes in cars that have been resold. Is that a good idea? And when we asked people in the industry, nobody had a clue, meaning we all had the anecdotes of people saying, yeah, I found this and that in a car for sale, or all of us had the experience of renting a car and there's 10 other phones synced into it. But nobody really had any data to say what data is collected by cars. What's happening with this data? Who has rights to it? How often does it happen? Is it the same if it's a rental, or if it's a fleet, or if it's a lease, or if your vehicle has been repossessed, or it's a total loss because you got into an accident? So that's how I got started. And, you know, there we go. Now we're having fun with Privacy4Cars.

Jodi Daniels  4:41  

You know, it's so interesting. I actually got my start in privacy when I was working at Cox Automotive, which is a conglomerate of everything automotive, from, as they call it, cradle to grave. And you're talking about the data that's being left in cars. Well, at the auction, the type of information or stuff that was left in cars was fascinating. Literally, when cars would get turned in, they'd uncover, I mean, strollers, wallets, all kinds of personal information. And just like you shared, home addresses and all types of information. It's really, really interesting. Which kind of leads us to help educate, because I don't think everyone appreciates and realizes how much data is created by cars today. So help us understand a little bit: what kind of data are cars creating, how much data is in these cars, and where do you think it's going to go in the next couple of years?

Andrea Amico  5:41  

Well, I can definitely answer the last part first, because we do audits every year, and the audits get worse, meaning a higher percentage of vehicles are resold containing personal information, and the average amount of personal information is growing over time, for two reasons. One, technology is evolving in cars at a very fast pace, so all these new sensors capture extra data. But also, people use those systems more. I'm sure that if you were one of those people who had a GPS navigation system when the first ones came out, I think, 23 years ago, back then they were not real pretty. Nowadays, you know, they are basically a glorified iPad, and they're really fun to interact with, and they give you all sorts of information. People use them, and as a consequence, they leave a lot of information behind. Bluetooth is also 20 years old. People don't realize that's really when the problem of data in cars started, because when you sync your phone, most people don't realize that the way systems have been designed, your phone will dump data into the car, and then the car reads it from the car, not from your phone. And so there's always a copy of what's in your phone that is resident in the car. And so, you know, the thing that I think people probably intuitively think about is, okay, maybe there's a copy of my contacts. But there's a lot more. In all vehicles, it's very common to find the contacts, a copy of the text messages, like the actual text of the text messages, of course your call logs, all sorts of identifiers that make it easy to know who was driving this car, was it Justin, you know, because now I have your unique identifier. Nowadays, a track record of your Twitter and Facebook accounts may be in your car, your browser history may be in your car, your calendar entries may be in your car. If you've taken photos with your phone, there's a track record in your car of that too.
So again, all sorts of things that people really don't realize, and I don't think they're properly disclosed to consumers, which is part of the reason why we exist.

Justin Daniels  7:50  

It sounds to me like we could have a day soon where you have a lawyer subpoenaing records to figure out where the husband was going, if he was hanging out with somebody other than his wife.

Andrea Amico  8:01  

Well, that's already happening. So, fun story, because you're the lawyer here. There's one special computer in cars called the event data recorder. People nickname it, you know, the black box, because it's similar to the black box of planes, and it's used for accident reconstruction. So, you know, you get into an accident, and for the last few seconds before and the last few seconds after the impact, all sorts of data is recorded. It's typically, you know, pretty boring data, meaning your acceleration, what gear you were in, were your seatbelts on, did the airbags fire and when, all these kinds of things. Well, that data, which is highly technical and used for accident reconstruction, is actually protected by a law called the Driver Privacy Act. That is the only computer that actually has some legal protections, meaning that the owner has the right to say no, and you need a warrant to be able to access the data. But if you want to know who somebody called, how long they were on the phone, and what they were texting that day, that data sits in the infotainment system, so it's not covered. So you have no legal rights there, you have no legal protections. Kind of interesting, right?

Jodi Daniels  9:13  

It is very interesting. It reminds me of a conversation we were having as a family with our youngest. She didn't like the idea that the car knew every place we were going; she didn't like the idea of the tracking that was taking place in the car and the GPS. And so we had an entire conversation of, well, she started asking, whose data is it? It's a really fascinating question. Is it the driver's data? Is it the people who sold us the car, the dealer? Is it the manufacturer? And we had to explain all these different people. It's actually a really complex question, because you have some data that wants to get back to a manufacturer for recall purposes, dealers want to have that information to be able to contact you, and then there's this kind of data that we're talking about that's actually in the car. It's really very fascinating, and a very complex answer, and she didn't really like any of our answers. She felt like it should be her data. So the young-generation focus group of one feels like it is her data, and that she should be able to be in control of it.

Andrea Amico  10:19  

How old is she?

Jodi Daniels  10:21  

She is eight.

Andrea Amico  10:22  

She's a bright young lady. I wish more people would think about this. Yeah. So today, fun fact, we track over 500 companies that actually collect, share, and sell data that originated in your car. So it's a huge ecosystem, and I bet I'm just scratching the surface. I'm sure if we were to do this podcast a year from now, it'll be well over 1,000. I mean, every day we find new companies that buy and sell this data, and consumers, by and large, have no idea.

Justin Daniels  10:52  

So on that note, why don't you tell us a little bit more about Privacy4Cars? What services do you provide?

Andrea Amico  10:59  

So again, we started with a very simple idea; this was really not meant to be a business. You know, I stumbled upon the fact that people left data, and I thought that that wasn't right. And so the idea was always, how do we help consumers take care of that? And when we started to talk to companies, the first reaction they had was, well, the problem is that there are so many different systems. And there are literally tens of thousands of variations of systems out there that collect data from cars. But since they said, well, it's too complex, then we don't have to do anything about it, right? That didn't seem like the right answer. And if there's something that motivates me, it's when somebody tells me that something is impossible to do. And so we started to literally boil the ocean and tried to figure out what all the systems out there are, how they work, what data they collect, and how to get rid of it. And today, we give, you know, services for free to consumers. As you mentioned in the intro, if you don't know how to delete the data from your car, download our app. It's free, and you can figure it out by yourself. It's really easy; you can typically do it in less than a minute. If you are worried about which companies have access to your data, and you want to tell them to stop, you can place a request with us, and we'll take care of it, again free of charge. It's a service we're beta testing now. But we mainly work with companies, some of which you actually mentioned, Jodi, at the beginning of the call. We work with banks, fleets, and increasingly dealerships nowadays. We help them first of all raise awareness of the fact that their vehicles contain data of their customers, and we help them create processes around that: how to secure the data, how to delete it, how to properly dispose of it, and how to build compliance records around that so that they can prove to consumers that they're doing the right thing.

Jodi Daniels  12:56  

With so many of the different laws that are currently coming on, right, we have California, we have Virginia, Colorado, and you mentioned that the kind of black box of the vehicle has a particular law, the Driver Privacy Act. Where do you see any of the laws or protections coming for this type of data? I mean, you would think some of it might be covered by these other privacy laws, but it might be kind of different. I know, Justin, you've worked with autonomous vehicles, and there are all kinds of other constituents that are part of it. But I'm just kind of curious for your thoughts on where and how the law might catch up with the type of data that is being collected, and where we're going.

Andrea Amico  13:40  

Yeah, so we have a very unique perspective on this, and the reason is because cars are very special. Cars are, first of all, an IoT device, right? So they collect a lot of data. So beyond the usual privacy laws that everybody talks about, that you just mentioned, there is actually a ton of laws out there that regulate the security of data stored in electronic devices. Again, those laws were not written for cars, but they surely apply to cars. There are laws about data breaches, there are laws about biometrics; the most litigated law in the land right now is BIPA, in Illinois, and guess what, your car's collecting, you know, biometrics. So the reality is, right now we track over 200 laws at the state level that apply to vehicles. None of those laws were specifically written for vehicles, but that's not uncommon, right? For laws written about electronics and electronic devices, I think it will be really hard to say that they do not apply to vehicles. And then there are some specific sectors that have special regulations. So for instance, insurance companies have a set of laws that are written by the National Association of Insurance Commissioners. Well, there's a data privacy law that applies in 11 states, and there's a data security law that applies in 33 states. Most companies have no idea. So, you know, companies themselves have no idea how broad the legal landscape is, and that's problematic per se, because if the companies are not thinking about it, you can bet that there are no protections for consumers.

Unknown Speaker  15:23  

You're thinking,

Jodi Daniels  15:24  

Yes, I'm thinking. You're thinking; you have your thinking look on. I know you want to talk about autonomous vehicles. I set the stage for you. You're so excited for your autonomous vehicles.

Justin Daniels  15:35  

I think, as important as autonomous vehicles are, I'm stunned by the amount of data that gets collected. And I guess a follow-up question I have is, you talked earlier about navigation systems. So now, not only does the vehicle come with a navigation system, but you have Apple CarPlay, which means if you want to go to, you know, Google Maps, what have you, you can do that. And I learned the hard way, and I'd love to get your thoughts on this, that even if I put the privacy settings on my phone so Google Maps wouldn't be on unless I wanted directions, it really didn't matter, because the navigation system built into my car had all of the data anyway. So no matter what I do, if I care about privacy, Andrea, what can I do? I feel like I'm kind of stuck because of that navigation system in my car.

Andrea Amico  16:31  

You are kind of stuck. But, you know, let's talk about a couple of things. First of all, a myth I want to dispel: I often talk to consumers who say, well, I don't use the navigation in the car, just like I heard; I use my phone to navigate. Well, tough luck, because the sensors are still on. They're still dumping data into your system, typically every one to three seconds, and they store this data for a very long time. So really, everywhere you go in your car is still logged locally in there, and if your car is connected through telematics, it's pushed out to a bunch of companies. On Apple CarPlay, an interesting factoid that you may like: we know that when you connect your phone, when you plug your phone into a USB port, a couple of things happen. One is that, again, data will migrate from your phone into the car, and so now your car knows stuff about your phone. But if you have Android Auto or Apple CarPlay, the opposite is also true. Your phone will suck out some data that is generated by the car and transmit it to Android and to Apple. And so the question then becomes, okay, what data do they have? So we reached out to Apple, for instance, right? A very privacy-forward company. And the answer we got is, well, you can log into the dashboard of your Apple products, right? You can all do it, and you can see all the apps and you can query your data, et cetera. There's one problem: there is no Apple CarPlay on the list. There is no way for consumers to know what Apple has collected from the car through Apple CarPlay. And then the other thing they say is, well, just check our privacy policy. Other problem: there is no section in the privacy policy that says what data is collected by Apple CarPlay. So even with the most privacy-forward companies like Apple, this is extremely murky.
And that's the very best case. Everything else you're dealing with is, you know, much more in the darker shades of gray, as you can imagine. Back to your question: what can you do? Well, you have very limited tools, but you can use your local laws to try to ask companies to apply those laws to you, and that, for instance, is what we're offering, you know, for free to consumers. So you can place a data request, and we'll try to figure out what laws apply, what systems or what companies are collecting data from your car, and then we'll place a request. We act as your agent. We will say, hey, Jodi requested that, you know, her data not be sold. Tell us if you're selling the data; what categories of data do you have? And by and large, most companies don't even know how to respond, because they've never received a request before. But I think it's very healthy for consumers to be involved in these kinds of questions, because otherwise, how do you get the change that you want to see in society?

Jodi Daniels  19:31  

You shared that it sounds like there are hundreds of companies, kind of on the back end, collecting, using, and sharing all this information. What would be some of the examples that people would find very surprising?

Andrea Amico  19:44  

So I think most people understand, okay, if I punch in a destination in my navigation, probably my manufacturer will know where I am and where I'm going, because they need to provide directions, right? That's really hard to do if you don't know where someone is and where they're going. So the answer is yes, of course your manufacturer knows. But then, we call them the manufacturer, but they really are assemblers; they put together stuff made by other companies. So there's going to be a company that provides the mapping system. Well, they also get to collect where you are and where you're going. If you get traffic notifications, they may come from a different company altogether, so they also need to know where you are and where you're going. You may get weather notifications, you may get, you know, points of interest, and nowadays you may even be able to buy your favorite drink from your car. And so all these companies have rights to this information, and many of them have the ability to further share it, sell it, broker it. And then there are giant data aggregators that collect all this data from all the sources, package it nicely, and sell it. So, you know, I can probably go and buy your car's data for just a few dollars.

Justin Daniels  20:57  

So I guess a follow-up question to ask you is, in the last two years, Jodi and I bought new cars; we have his-and-hers Kia Tellurides. And so we went through the process, and, you know, we bought the car from Kia. And from what you're saying, I don't have a direct contract with the people who designed the mapping system. I don't have a direct contract with the people who put so much of the technology in the car. I just have an agreement with Kia that I bought the car. And so basically, what you're saying is, there are all these agreements behind the scenes between Kia and all the suppliers of the technology that goes in the car as to what they're doing with the data. And as a consumer, it wasn't explained to me in the sales process. I know nothing about it, yet it's going on behind the scenes. And what you're saying is, obviously, it's big business. Is my understanding accurate?

Andrea Amico  21:53  

It is very accurate. So we have reviewed the privacy policies of 40 different makes, and by and large, they all say the same thing. When you go to the dealership and you sign on that dotted line, somewhere in that contract it says that you are agreeing to the privacy policy, right? This is no different from when you download an app and you click OK. It's the same thing; people don't really get to read what's happening, but that's essentially what's happening. So when you sign the contract, you're agreeing to the privacy policy of the manufacturer. What those typically say is that any data collected by the car, they have the rights to it, they have the rights to share it, they have the rights to use it. And typically, you know, they explain the uses of the data as, we use this data to improve the safety of your vehicle, we use it for research, we use it to deliver services to you, or for any other purposes, which essentially means, you know, we can do whatever we want with it. And actually, most manufacturers have a 20-year retention policy. So 20 years from now, if you don't remember where you were today, call your manufacturer, because chances are there's a blue dot somewhere on a map that says exactly where you were.

Jodi Daniels  23:08  

I can imagine you're very popular at a cocktail party.

Andrea Amico  23:12  

The most popular.

Jodi Daniels  23:15  

People probably ask you all kinds of questions. And if you were there, what would be the couple of tips you would give someone to help make sure that they're aware of what's happening and what control they might be able to have?

Andrea Amico  23:31  

Well, I'm a very pragmatic person, right? So let's start with the simple stuff. The data that is stored in the vehicle is the easiest place where you can make some significant progress, because it's stored locally. It's also a place you should really think about, because if you're selling that vehicle, that data goes to the new owner, and actually to anybody in that chain, right? As you know, many people will touch that car before it actually finds a new owner. And since there's no PIN, there's no face recognition, there's no fingerprinting in cars, your authentication factor is the keys. So anybody with the keys has access to all this data, and it's very easy to extract a lot of data. So make sure that if you're selling your car, trading in a vehicle, or returning a rental, delete your data. It's just really good hygiene, like washing your hands; you should do it every single time. The next thing you should do is ask the companies that actually touch your vehicle to do the same, because I don't think it's fundamentally fair that consumers are burdened with something that is, quite frankly, pretty technical and pretty complex, and just, you know, difficult. In fact, we know that today in the United States, more than four out of five cars are resold while still containing the personal information of the previous owner. So, you know, that clearly tells you people don't do it. Dealerships don't do it, auto finance companies don't do it, insurance companies don't do it. And I think it's important for consumers to ask for it. Now, if you buy a used car, you should be concerned about finding people's data too. And you may think, why do I care?
Well, you know, just this past week, two incidents were reported to me of two different vehicles in which the old owner had tethered their phone to the car, because nowadays cars come with an app, which sounds like a lot of fun. You know, in the middle of the winter, to say, I'm going to turn on the engine from my room so that the car warms up five minutes before I get into it, when it's really cold, that's awesome. But if you sold your car, or if you bought a car used, and somebody still has their hands on the remote, they can locate it, unlock it, and start it. Again, two instances in just one week were reported to me. That's problematic. And that's where the slope, from a privacy problem, to a security problem, to a safety problem, becomes really steep really fast. And so again, I really hope that your audience will think about these kinds of things and will start to ask, you know, business leadership: are you taking care of my data? And I think they should.

Jodi Daniels  26:17  

I'm so curious about that example, because what comes to mind is that it feels like a design flaw with that app and the ability to transfer ownership. I know that you work with a lot of companies. Is that one of the topics that they're starting to address in the design process?

Andrea Amico  26:34  

Well, yes and no. So we did a benchmark of this. I discovered it by pure chance by renting a car, right? I see my app on the car, and then I realize, after the rental, whoa, hold on a second, I can still follow the car. So, for months, we watched it going, you know, from Connecticut originally all the way up to Maine, over, you know, two and a half months. We contacted the manufacturer, and they were actually really kind in that case; they had a workshop with their connected car team. They didn't follow all of our recommendations, but at least they tried to make some changes, which is encouraging. The rental car company never responded to us, so, you know, I cannot say as many good things in that case. But then we did the same for, you know, 15 or 16 manufacturers, I think, and essentially they all had the same problem. Typically, all it takes to be able to take over a vehicle is a little bit of ingenuity and at most a little bit of social engineering. It's really not that hard, and that hasn't changed. In fact, when we disclosed it, one manufacturer never gave us a response back. We disclosed through the Auto-ISAC, which is a wonderful organization, by the way; if you ever find something, please say something to them. But a couple of manufacturers got back to us and said, well, if somebody did that, it would be a breach of the terms. And I thought, well, you know, so now the physical safety of consumers is protected by this giant wall called the terms of service, and we're hoping that nobody ill-intentioned will actually breach them. That doesn't sound quite right. I see you snickering on the other side.

Jodi Daniels  28:17  

I'm a little bit louder snickering; he was a quiet snicker.

Andrea Amico  28:22  

No, but I mean, this is real stuff, right? Because this is becoming really common, because most cars coming out of factories nowadays have these systems, they have these services. And they are all awesome, right? They provide great convenience. But to say that privacy and security have been an afterthought, I don't think that's quite cutting it. It's just not a really good situation. And I think, until consumers start to ask, what are you doing, what are the incentives for companies to actually do something? Either there's going to be a mandate, or something ugly is going to happen. I hope that's not the case. I hope we can all have a little bit of, you know, prevention, rather than fixing bad things after they happen. But realize that right now there's a court case in San Francisco, a terrible case of domestic abuse, where an OEM is named in the lawsuit because of this kind of situation: the abuser allegedly used the connection to the car to physically harm his ex-wife. And I think we just need to be really thoughtful about these kinds of things.

Justin Daniels  29:38  

This is great, because you've hit on a theme of, I think, our entire show, when you say everyone likes the convenience, and privacy and security don't even rise to the level of an afterthought. Because I'm listening to what you're saying about the car situation: how is that any different from the hospital that's being sued because of a ransomware event, which the plaintiff is asserting led to someone having to be transported to another hospital, where they passed away en route, because the hospital had been hit by ransomware? And so now you're into a very life-threatening situation, similar to what you're saying. And so I guess, Andrea, I'm asking you, just taking what you see in the auto industry, what are your thoughts on why consumers are so seemingly unaware, or why privacy and security, from their Apple devices to their car to their Siri or Alexa, just isn't really registering?

Andrea Amico  30:39  

Okay, I don't really have data, but I can say, anecdotally, most people do not think of their car the way they think of their cell phone. Right? I have a special drawer in my house where I have all my old laptops and all my old devices, you know, smartphones, because even though in theory I could delete the data, I just don't trust it, and I'd rather, you know, keep them there or physically destroy them than do anything else. None of us has the luxury of having a special garage where we keep our old cars forever. It's just not realistic; cars are just too valuable. You can't do that, right? But at the same time, you know, if you go to Best Buy or, frankly, any retailer, and you bring back your laptop or you have it remanufactured, they decided years ago that they're going to clean up the device before they put it back in commerce, probably because they didn't used to do so, and then when stuff got back into commerce, bad things happened, right? Same thing with phones: if you bring your phone back to Verizon, AT&T, whoever you use, they will either delete the data or tell you how to do it. Why we don't do this for cars just baffles me, because it's the same thing. In fact, cars have more sensors than the average smartphone nowadays.

Jodi Daniels  32:05  

How many sensors are in a car these days?

Andrea Amico  32:08  

Oh, plenty. I mean, essentially, in a phone you have, you know, gyroscopes and a GPS unit, and, you know, you have cameras and light sensors, et cetera. All of this stuff you have in cars, plus a bunch of other things, right? And they also tend to be a lot more precise, because otherwise your car wouldn't run as efficiently or be able to deliver these advanced features that you have in cars nowadays. And so cars have outward cameras and inward cameras. People have no idea that cars have inward cameras; that's shocking to me. And realize that they're designed to be hidden behind the dashboard, so there's no pinhole you can see, and there's no red light flashing saying, hey, I'm recording you, right? So people drive, and they have no idea that anything from where their eyes are looking to, you know, their face being recognized is being captured. Nowadays, the leading manufacturers of these systems have acquired two companies that specialize in emotion detection. In other words, where this is going is that manufacturers want to figure out, are you hungry? Because I know what your favorite, you know, fast food place is, and I can recommend putting it on the route. Now, this creates a host of other problems, because say you get into an accident and the AI of the car determined you were angry. Maybe you were just a minority and the AI didn't work quite as well; we know that there are some systematic bias issues in all these systems. What's going to happen then? I think that's, you know, a question I really don't want to see the answer to. But probably we're going to have to subpoena OEMs and say, tell me what your AI does, again, before we send somebody to jail.

Justin Daniels  33:56  

So when you're not busy navigating privacy and connected cars, what do you like to do for fun? Maybe you like to race antique cars.

Andrea Amico  34:08  

I should pick up an antique car, so there wouldn't be any systems in it, but then where would be the fun? Now, you know, I have two little girls, and you guys were talking about the leaves turning. Just this past weekend we took a very long drive, and we went up in the mountains and we went fishing. Spending time with my girls is the best time outside of work that I can possibly spend, and if there are any parents in the audience, I'm sure they feel exactly the same way. And here's an amusing and great parenting tip: I taught my daughter when she was eight, so this is what, two years ago, how to hack into mom's car to extract her text messages. I think it's awesome parenting. So if anybody wants tips, send me a line and contact us through our website, and I'm happy to teach your kids as well.

Jodi Daniels  35:04  

On that theme, I was going to ask: how can people connect with you? And where should they go to learn more as a consumer, to make sure they know how to properly delete their data?

Andrea Amico  35:16  

Thank you. So, yeah, you can go to Privacy4Cars, spelled privacy, the number four, cars, just one word, dot com. And you can also type that same word, Privacy4Cars, into Google Play or the iTunes App Store to download our app for free. If you go on the website, the top right link, right now we are beginning beta testing a service to help you figure out who has your data and how to tell them to stop. So please file requests, we'd love to be able to help.

Jodi Daniels  35:50  

Wonderful. Well, I know I've learned a lot during our time here together, and I'm sure many others will as well. So thank you so much for sharing, and scaring all of us about what our cars are collecting and doing, but also empowering everyone here to take control.

Andrea Amico  36:08  

Well, thank you. I really don't want to scare anybody, but I think, as with everything, you know, I listen to your podcast, and you're doing a great job at educating the public about what some of the real risks out there are, and the reality is that raising awareness makes all of us safer. So again, thank you for giving me the opportunity to be here today.

Jodi Daniels  36:29  

It's our pleasure.

Justin Daniels  36:30  

Thank you. I enjoyed it. We didn't even talk about autonomous vehicles.

Jodi Daniels  36:34  

That's a hint for next time. I'll come back. Thank you so much.

Outro  36:42

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Alexandra Ross

Alexandra Ross is the Senior Director, Data Protection and Use & Ethics Counsel at Autodesk, where she provides legal, strategic, and governance support. She is also an Advisor to BreachRx and an Innovators Evangelist for The Rise of Privacy Tech (TROPT). Alexandra received the 2019 Bay Area Corporate Counsel Award for privacy and founded The Privacy Guru blog in 2014. She is also the author of the e-book, Privacy for Humans.

Previously, Alexandra was Senior Counsel at Paragon Legal and Associate General Counsel for Walmart Stores. She is a Certified Information Privacy Professional and practices in San Francisco, California. Alexandra earned her law degree from UC Hastings College of the Law and her bachelor’s degree in theater from Northwestern University.


Here’s a glimpse of what you’ll learn:

  • Alexandra Ross shares how she discovered her passion for privacy
  • How will privacy practices evolve as more companies move their data to the cloud?
  • Alexandra discusses the new privacy legislation that is currently under debate
  • Alexandra’s thoughts on the ethical code of conduct for collecting data
  • How ESG (Environmental, Social, and Governance) is impacting private equity and venture capital firms
  • The non-legal marketplaces that are influencing people to take privacy and security more seriously
  • How privacy professionals can help start-ups make privacy a priority
  • Alexandra recommends several resources to start learning about privacy

In this episode…

Technology is moving forward in unprecedented and exciting ways. However, it’s advancing faster than regulation can catch up — meaning consumers are typically unaware of the ways their data is being collected and stored. So, how can your business handle data in a way that builds trust?

Doing the right thing means not just complying with the law. There is legislation under debate for structured data regulation — but if you want to build consumer trust, you should hop on the bandwagon before the law finally rolls around. It’s important to think about the perceptions of consumers. Is the data you’re collecting providing value to your customers? Are you actually managing their expectations and maintaining their privacy?

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Alexandra Ross, the Founder of The Privacy Guru, to discuss how to create ethical privacy practices for your business. Alexandra talks about how privacy practices are changing as more businesses move their data to the cloud and the various ways ESG is impacting private equity and capital venture firms. She also shares some resources to deepen your awareness of the best privacy practices.


Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.

Justin Daniels  0:36  

Hi, Justin Daniels here without the mic near my face. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as help them manage and recover from data breaches.

Jodi Daniels  0:59  

And this episode is brought to you by, that was a really bad drum roll, Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, visit

Justin Daniels  1:38  

Well, how do you feel at the 12 o'clock hour as I watch these emails? Your inbox is just

Jodi Daniels  1:44  

overflowing? Yeah, you just don't look at that part. It's very private. You should not look at the number of emails that are there. Those are my emails. But that would

Justin Daniels  1:54  

just be aggregated data I'm

Jodi Daniels  1:55  

not looking at. You can't see any of the specifics, because our big screens are covering them. That is all true. So don't worry, clients, anyone listening, he can't see anything. He can just see the number keep going up and up and up, even though

Justin Daniels  2:07  

you have attorney-client privilege, but okay. Let's introduce our guest. We're joined today, and we're excited to have her, by Alexandra Ross. She is the Senior Director, Data Protection, Use and Ethics Counsel at Autodesk and an Advisor to BreachRx. She is a Certified Information Privacy Professional and practices in San Francisco, California. She is also the recipient of the 2019 Bay Area Corporate Counsel Award for privacy. She also launched The Privacy Guru blog in 2014 and published the e-book Privacy for Humans, available on Amazon and iTunes. Welcome, Alexandra.

Alexandra Ross  2:53  

Thank you. Thank you so much for having me. I'm happy to be here.

Jodi Daniels  2:58  

Well, welcome. We always like to get started with a little bit of the who are you? How did you find your way to privacy?

Alexandra Ross  3:06  

It's a really good question. Um, I went to UC Hastings, which is a law school in San Francisco in the Bay Area, and started my practice at a law firm in San Francisco practicing intellectual property. And fairly early in my career, I started working on privacy and data protection issues. It was something that I was drawn to, just had an interest in it. And this was before GDPR, so we're talking CAN-SPAM, security work with incident response, drafting privacy statements, you know, some foundational privacy work, but not a lot of the regulatory issues that we see today as privacy law has emerged. But I was really drawn to that confluence between technology and societal and legal concepts, and a lot of the creativity that a legal professional had to bring to applying existing laws, intellectual property laws, licensing. There weren't a lot of laws on the books yet around privacy, and I was working with a lot of technology companies and startups. So that was the very start of my career. Since then, I worked as an Associate General Counsel at Walmart, and I led their privacy program at the time. Then I took a few years where I was working in consulting with Paragon Legal, and most recently I've been at Autodesk, where I lead the legal team that supports our privacy, security, data use, and ethics programs worldwide. So I manage a team of attorneys, and we provide legal, strategic, and governance support for our global data protection, data use, and emerging data ethics programs.

Jodi Daniels  4:57  

So basically, you do nothing related to privacy is what you're really here to say.

Alexandra Ross  5:02  

I do everything related to privacy. And I like to say I sort of grew up with the privacy field. When I started doing privacy, there was no privacy field; there were a handful of us doing it. And now there's hundreds of us. There are organizations dedicated to privacy professionals and advocacy. So I think I was at the right place at the right time. Absolutely, I've really enjoyed it. It's been a good career path for me.

Jodi Daniels  5:33  

I was completely kidding, I hope it came across that way. I was really emphasizing how much you have accomplished and done in privacy, really setting the stage for a great conversation here today. You could have a field day. Oops, if it didn't come across,

Alexandra Ross  5:54  

No, I took it as positive. Excellent. Yeah. And I live and breathe privacy, so, yeah, I do enjoy it. And I do a lot of other sort of extracurricular activities around privacy as well. You mentioned the advisor work that I do with BreachRx, which is a privacy platform for incident response. So when I'm not working at Autodesk, I'm working on blog posts for The Privacy Guru website, or helping startups with privacy. So sometimes I feel like I need to disconnect a little bit, make sure that I have, you know, a diversity of interests. But yeah, I'm pretty deep in the privacy world.

Jodi Daniels  6:35  

Well, Justin, kick us off. Well, speaking of

Justin Daniels  6:40  

the evolving privacy landscape, talk to us a little bit about how privacy will continue to evolve when we talk about digital transformation, because what's key to digital transformation is putting everything on the cloud. And it would be great to get your perspective on how privacy will evolve as we put everything on the cloud.

Alexandra Ross  7:04  

As I was saying, I think there are three core aspects you can look at in terms of the evolution and whether privacy is keeping pace with new technology: regulation and enforcement, customer awareness, and then privacy technology itself, the privacy vendor space, or technology that's helping customers and companies manage privacy programs. So, you know, new technology has changed the prevalence and accessibility of information. There's just more data, right? We live our lives online. There's digital information, cloud computing, social media, AI, all these ways that data is collected and used for many services that actually benefit society and that we rely on. And this innovative technology can add a lot of value to our lives and advance social good. But then there are also the dangers to privacy, right? There are the companies that maybe aren't respecting privacy or security, the things we read about in the news. And in that respect, we see a general awareness in customers in terms of privacy. So we're seeing more regulation, like GDPR and CCPA. We're seeing regulators, I think, and society finally catching up to what's evolving in technology and putting appropriate regulations in place. So I think there's been a bit of a disconnect over the past maybe 10 or 20 years, where technology is advancing faster than the regulation can catch up, and faster, honestly, than consumers are even aware that their data is being collected in this way. So in recent years, we've seen more regulation, we've seen more enforcement, we've seen more customer awareness of how their data is being collected and used, in ways maybe that aren't positive, or maybe they want to set different preferences to be able to have more control in the digital space. So I think that's all positive. I think we're seeing more of a convergence where before maybe privacy wasn't keeping pace.
And now we're getting a little bit closer. We're seeing more well-drafted legislation, we're seeing more customer awareness of what's actually happening to their data, and I think that's all a good thing. And then this final thing that I think is really interesting, which has happened even more recently, is the impact of privacy technology. So these are privacy-centric solutions, data privacy technology that helps companies manage privacy. This is the privacy vendor space, right? For years and years, there have been security tech vendors that help us manage endpoints or MFA or whatever it is from a security perspective. Now we're seeing more vendors that are actually helping companies be compliant with regulations, help them innovate with their technology, but also help with cookie compliance or help manage opt-outs, remarketing, things like that. I think that's a really positive evolution of the whole ecosystem: we want to enable privacy, we want to enable innovation of technology, and how do those two meet up in the appropriate ways?

Jodi Daniels  10:33  

You had mentioned that legislation often lags behind technology, and I kind of think that might keep happening at the pace that technology keeps advancing. I think we have Moore's law, like, exponential growth going on here. It would be really fascinating to hear a little bit more about the legislation that we have now: where do you think it needs to go, and what might you expect to come on the horizon?

Alexandra Ross  11:00  

Yeah, and I don't want to be too optimistic and say that legislation is perfect, or that legislation has completely caught up with privacy, but I think it's getting a little bit better, and we're seeing more informed discussions about legislation. I think we're seeing a lot more people, such as Lina Khan or Ashkan Soltani, who are now placed in positions of leadership and power at the FTC and at the California privacy agency, who come from a background that's steeped in privacy experience and privacy awareness. So I think it's going to get better in terms of the way regulation is drafted and the way that it's going to keep pace with technology. So, I mean, the things that I'm keeping track of: what's happening in the United States with the CPRA, which is the new California CCPA 2.0, you know, whatever alphabet soup you want to call it, and the Colorado and Virginia laws. Those are things that we, as privacy professionals working on behalf of companies, need to make sure we're keeping track of, because those implementation dates are coming, you know, very soon. And then there's also what might happen with federal privacy legislation, which has always been this sort of, you know, will-they-or-won't-they: will Congress actually pass some sort of comprehensive privacy legislation, like we see with GDPR in Europe? And it's interesting to track that, because I think there's been, you know, bipartisan support for federal privacy legislation. There have been a lot of things happening with COVID and with the Facebook whistleblower that are bringing these topics back up in the news, and there have been some hearings recently, you know, talking about federal privacy legislation. So we'll see if that happens this year or next year. I think it would actually be beneficial in many respects to have one federal comprehensive privacy law rather than a patchwork of state privacy laws.
But it really remains to be seen if there's going to be any action this year from the US Congress. Those are the main things, at least, that are on my radar. There's also China, if we're talking about global privacy issues, you know, the new PIPL that was just enacted in China, which has a lot of similarities to GDPR but some differences in terms of data localization, consent, and notice. And, you know, companies like Autodesk are taking a very close look at that legislation and how we can be compliant, and we'll continue to track that as we get more information from the regulators in China about what they actually meant by some of the things that they published in August. So

Justin Daniels  14:02  

kind of changing gears just slightly. You know, we talk a lot about and read a lot about the legislation that relates to privacy, and we've been reading a lot in the news about some of the practices that were highlighted on 60 Minutes, which we could have a whole show on. But what are your thoughts around the kind of ethical code of conduct relating to the collection of data, especially when we start to overlay it with artificial intelligence? Any thoughts? Because ethics is kind of what we should do without needing to be prodded by the law. But as we also know, if you collect data now, it's an asset but also potentially a big liability if you don't manage it appropriately.

Alexandra Ross  14:44  

Yeah, it's a really interesting development. And I think, you know, in the conversations that I've been having, when we have governance bodies and we're reviewing compliance activities, or our compliance program with privacy or security, or use cases that the business would like to pursue in terms of collecting data and providing value to our customers, we've talked about ethics over the years; we just haven't labeled it as such. We've always talked about: how is this compliant with the law? How would this appear to customers? Sort of some of those ethical transparency issues. You know, when I talk to colleagues, we've always been having this ethical discussion. But what we're seeing now is companies actually publishing their ethical principles, publishing ethical policies, saying we're developing an ethics-by-design program. And I think it's in part because of this prevalence of data and new technologies like AI and machine learning, where the legislation might not yet give us enough direction about what the right answer is, and we want to have these ethical principles and programs to help us decide what makes sense for our business and what makes sense for our customers. So we're seeing these kinds of voluntary codes of conduct being adopted, and I do think that's a really positive thing, because it gives customers more trust in the companies that are using their data, and in the value of some of these AI and machine learning technologies that can actually benefit them, in terms of, you know, smart cities and city planning, and technology that is tracking them that might seem scary, or might seem a little Big Brother-ish. But if you actually take the ethical considerations into play, and actually provide the appropriate notice, and you've gone through sort of your checklist of, is there bias?
Is there a problem with this? And you're upfront with that, then I think that gives society a little bit more certainty that some of those considerations were properly taken into account. So doing the right thing means not just complying with the law, but actually thinking about what the public perception is going to be. Are you providing value to your customers? And are you actually managing their expectations and maintaining their trust? I think, you know, it's interesting to track the development of these ethical programs, because now a lot of companies are putting them in place proactively, anticipating that there's going to be some legislation. We're seeing some draft legislation coming out of the EU related to artificial intelligence that is actually going to require, in some cases, impact assessments, additional transparency, those sort of ethics-by-design programs that we have in place now for privacy and security. That's coming in a couple of years; Europe is leading the way, not surprisingly. So I think it's a good thing for companies now to start thinking about ethical decision making, leveraging their existing data protection programs, thinking about how they can incorporate some of this ethical decision making into the use cases that they're currently reviewing.

Jodi Daniels  18:21  

On that topic, and the theme that more and more companies are thinking and taking a proactive stance, that they're considering ethics: what are your thoughts on PE and VC firms, and how they're thinking about privacy and security? And especially, how do you think ESG is affecting their view on privacy and security?

Alexandra Ross  18:47  

Yeah, so ESG, just for the listeners that don't know what that relates to, is the acronym for environmental, social, and corporate governance. It's really about sustainability and being a good corporate citizen. So the E is environmental sustainability programs; the S is social, diversity programs, things like that; and the G is that kind of corporate governance, and that's where the security and privacy and ethical programs would fall. So if you're thinking about, you know, private equity and VC firms, and where they're making their investments and where they're focusing their attention, I do think they're looking at companies and whether they have even the basic acknowledgment that they're collecting data: where are they operating, what is the regulatory environment, what is their customer base and its expectations, are they managing sensitive data? Whether or not they have a formal ESG program in place, or even, you know, a legal department, much less a privacy counsel, at some of these startups, the founders of these companies are having discussions with VCs, so they understand that this is an important part of their development as a company, not just from a risk management perspective, but, again, going back to customer trust. I do think that VCs are seeing the value in startups and midsize companies that they may want to invest in having an understanding of the importance of privacy and data ethics. You know, there are a lot of studies that show that companies that have ESG programs in place are actually, you know, doing better in the marketplace. We're seeing that there's a business case for ESG, that those companies can be more profitable, and that can be because they're mitigating risk, because they're attracting more customers.
So I think you're going to be seeing more and more VC firms pursuing that investor base that has ESG, or at least privacy programs, in place, and also those companies themselves developing ESG programs. The VC firms have been lagging in some respects in adopting ESG within their own companies; if you look at the numbers, the VC firms themselves haven't always been adopting ESG for themselves. But you're starting to see those VC firms say, okay, it's really important that we look at that in the companies in which we're investing, and it's also actually important that we start developing these programs ourselves. So,

Justin Daniels  21:39  

kind of on that note about ESG: from your perspective, what other kinds of market forces like ESG are out there that are going to require better privacy and security practices, that aren't regulation? For example, I consider what's happened to the cyber insurers this year as one, and how you go through their underwriting requirements. But I'd love to get your perspective: what are some of the other non-legal market forces that you think are pushing people to take privacy and security more seriously?

Alexandra Ross  22:10  

Yeah, I think the insurance one is really interesting, just in terms of the additional level of scrutiny. The insurance companies are asking more and more detailed questions, which again prompts companies to do the right thing and have the security and privacy programs in place so that they can get the coverage that they need. The other thing, you know, I think we spoke a little bit about: evolving regulation, not just the legislation itself, but enforcement and funding of those regulatory agencies. So, you know, there's been some criticism of the Irish regulator, for example, that they've sort of been, you know, sitting back and letting tech companies go crazy and not enforcing GDPR as aggressively as they should. So we might see some changes in terms of the funding of those regulatory agencies, both in Europe and the United States, and more aggressive enforcement of the wrongs that they're seeing in terms of privacy and security compliance, or non-compliance with existing laws, or these new ethical considerations that companies aren't taking into consideration. So I would say increased regulatory enforcement is a market force that's out there, and you can track the cases and see who's leading the various agencies and try to read the tea leaves that way. The other thing, I think, is basic customer awareness. I mean, if you look at the number of news stories and journalists that are dedicated to privacy and security and technology, you're seeing just a lot more understanding and information about this ecosystem of privacy compliance. And I think that's going to drive some changes within companies, because they have more and more sophisticated investors, they have more and more sophisticated questions coming up on their investor days, there's more and more scrutiny in terms of the news coverage.
And there are more sophisticated customers that have higher expectations of companies and how they're respecting their privacy and security.

Jodi Daniels  24:27  

There are so many startups and emerging technology companies who are still not sold: the market forces haven't come to them yet, their PE and VC firms maybe haven't brought it to their attention, and they're firmly in the "Yeah, I don't really need to deal with this" camp. We have a community of privacy professionals who know how important this truly is. So how can privacy professionals leverage their expertise to help these startups and emerging technology companies really make sure that privacy is top of mind?

Alexandra Ross  24:58  

Yeah, that's a good question. So it's a passion of mine to evangelize about privacy and the importance for companies of taking it into consideration. So I think there are a couple of things a privacy professional can do. One is the IAPP, the International Association of Privacy Professionals. There's a lot of great content that they offer: webinars, seminars, conferences. You know, invite one of your business colleagues to one of those conferences. Talk about privacy and the IAPP materials when you're talking to colleagues of yours that work in startups that might not have any exposure to privacy or any resources related to privacy. There's a lot of really good content available through the IAPP that I think those companies can leverage. The other thing that I'm a part of is something called The Rise of Privacy Tech, which was founded by Lourdes M. Turrecha, who's a wonderful colleague and privacy attorney and advocate. That's a group that brings together privacy tech founders, investors, experts, and advocates, and they evangelize and fuel privacy innovation. It's a way for privacy experts, privacy founders, and privacy investors to get together and share information. There's collaboration there, sort of meetings that we have where we talk about various issues. And then there's a bit of a matching program, where privacy investors and privacy founders can get together, and privacy experts like myself and privacy startups like BreachRx can make introductions. That's, in fact, how I got introduced to Andy Lunsford, who's the founder of BreachRx, and I'm advising for that company. So I met Andy through The Rise of Privacy Tech, because he was working with them as a privacy tech founder and I was working with them as a privacy advisor; we got matched up and had a couple of conversations.
So I think that's a way for privacy professionals to kind of give back to that emerging privacy tech sphere and share their knowledge and expertise, because there is that gap of knowledge. There are a lot of founders who have really good ideas about privacy tech, but they need, you know, some input from people who've been in the industry, who understand what the market is like, understand what the buying patterns are like, understand what the pain points are for companies that want to bring in-house some sort of privacy tech solution. So I would highly recommend The Rise of Privacy Tech. It's a really great organization, with a lot of good people working and collaborating there.

Justin Daniels  27:56  

Well, thank you. I think the work that you do with other professionals to help startups is really important, although I will differ with Jodi as to why the startups and the VCs don't adopt these practices: because you're worried about sales and a minimum viable product. So a lot of times privacy and security become an afterthought, because it's just not part of the sales process. And as much as I want to see that change, I think, unfortunately, that's the reality.

Alexandra Ross  28:25  

Yeah, I mean, we can have a discussion about that. I think when you add privacy and security as foundational features of your product, you can make the sales pitch, but it's all about prioritization. And I think some companies don't think it's important until they get into trouble. Some other companies can be more proactive and can actually make it part of their product offering. And I do think that there's an investor community and a customer segment that wants that. Although I'm...

Justin Daniels  29:03  

I would expect that if they wanted to do business with Autodesk and go through their procurement process, privacy and security would definitely need to be a priority if they wish to get a contract. And that's one area where I'm hopeful there is another market force, like we talked about before, that can help change behavior, because at companies where they have people such as yourself, they do prioritize privacy and security, and those startups who covet those contracts aren't going to get them until you and others are satisfied that they are taking privacy and security seriously.

Alexandra Ross  29:37  

Yeah, I think that's right. I think companies like Autodesk, that have vendor management programs in place, that have, you know, the due diligence they require of vendors, security questionnaires, and specific contract terms that have to be in place, are working with a certain segment of the vendor population that's a bit more mature, and the expectation is that you have those foundational privacy and security provisions in place. I think that's only going to continue to grow, and those expectations are going to continue to strengthen.

Jodi Daniels  30:14  

I mean, I see it all the time with Company A not being able to close the sale with Company B, because they're not complying with XYZ law.

Alexandra Ross  30:22  

Yeah, well, it's a competitive differentiator. I mean, look at some of the companies that are making privacy part of their, you know, communication and PR platform, like Apple, right? I mean, they're intentionally making privacy part of their product offering. And you're right, if a company is deciding between two vendors at the same price with the same features and functionality, and one has better privacy and security provisions, that seems to me a no-brainer.

Jodi Daniels  30:53  

Right. So with all this privacy and security knowledge, what is your best personal privacy tip? You're at a cocktail party and someone asks, what do I need to know? What do you tell them?

Alexandra Ross  31:05  

Well, you know, I would say, take a really deep and profound look at the social media and apps that you're using and downloading. I think there's a lot of bad activity out there in terms of apps that we might download because we think they're super fun, or we want to share information with friends, and those companies are not respecting your privacy and security. So I would say: be mindful, read those pesky Terms of Use and Privacy statements, set your privacy settings on Venmo to private, for God's sakes. You know, some of those basic things, just in terms of the things we use in our everyday life: making sure that you're doing what you can to either not use certain products, or look at those privacy settings and do what you can to control how your information is collected.

Justin Daniels  32:08  

So when you're not out there being a privacy evangelist, what do you like to do for fun?

Alexandra Ross  32:16  

That's a really good question. I mean, coming out of two years of lockdown, where the ways we were able to have fun were fairly limited, I would say I'm really looking forward to being able to travel more extensively. I love to travel, especially internationally, and to take vacations where I can see a new part of the world, or practice my language skills, or, you know, see some friends and do some exciting travels. So that's definitely something that I like to do. I haven't been able to do that as much, and I'm looking forward to some travel plans. The other thing is music. I don't know what the music scene is like where you guys live, but in San Francisco there's so much great live music; there are so many festivals and concerts. I'm a big kind of music junkie, and I love to go hear live music.

Jodi Daniels  33:16  

Well, thank you so much for sharing all of the great information. If someone wanted to connect and learn more, what is the best way to find you?

Alexandra Ross  33:26  

Yeah, so you can check me out on LinkedIn; I have a profile there. You can also contact me via my website, The Privacy Guru. There's a lot of information there about the advocacy work that I do, and the advisory and speaker work that I do. And you can also connect with me via the website and my email address.

Jodi Daniels  33:46  

Well, again, thank you so much. I wish you much success on your travels and music concerts.

Alexandra Ross  33:53  

That's right. I have to find I have to find a way to put those two together.

Jodi Daniels  33:58  

Yeah, just have to go to a concert in a really great place.

Alexandra Ross  34:00  

That's right. Destination concert.

Jodi Daniels  34:03  

Oh, that would be cool. Yes. Well, thank you again, Alexandra.

Alexandra Ross  34:09  

Thank you for having me.

Outro  34:14  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Blake Brannon

Blake Brannon is the Chief Strategy Officer at OneTrust, the #1 platform to operationalize privacy, security, and data governance. In this role, Blake is responsible for strategy, partnerships, sales engineering teams, and defining the privacy, security, and governance market. He was the first Chief Technology Officer at OneTrust, building the technology platform of trust that has been awarded more than 150 patents.

Before OneTrust, Blake was one of the first employees at AirWatch, where he served as the Global Director of Sales Engineering and the Vice President of Product Marketing. He was also a research assistant at Georgia Tech, his alma mater.


Here’s a glimpse of what you’ll learn:

  • Blake Brannon shares the major privacy concerns that sparked the creation of OneTrust
  • What privacy challenges do companies face today?
  • Why tools alone won’t solve your privacy issues
  • Blake’s advice for start-ups: Don’t make privacy an afterthought — hop on current privacy trends now
  • How privacy programs can build client trust
  • Where is the privacy field headed in the next 10 years?
  • Blake’s top privacy tip: You can request a copy of your data from companies

In this episode…

Privacy used to be pretty straightforward for companies. All they had to do was write the terms of service policy or privacy statement at the end of a contract or on the bottom of a website. Now, there are many more aspects to consider if you don’t want to get sued. But besides avoiding a lawsuit, how can privacy benefit your company?

Privacy isn’t just about dodging the courtroom — it’s about building trust. For example, Apple released a new ad that says “Privacy. That’s iPhone.” Those three words speak volumes about the lengths Apple is willing to go to preserve data privacy — and consumers are eating it up. Users want to know how companies will handle their sensitive information and data. If you can prove that your employees, processes, and tools are dedicated to protecting consumer privacy, your customers will keep coming back for more.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Blake Brannon, the Chief Strategy Officer at OneTrust, to discuss how your company’s privacy policies can build client trust. Blake talks about the privacy challenges that companies face today, how to build programs that work in harmony with your privacy software, and the importance of hopping on current privacy trends.

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Prologue  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:37  

Hello, Justin Daniels here. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:59  

And this episode is brought to us by... that was a really weak call, we're gonna send you to travel class... Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media, and professional and financial services. In short, we use data privacy to transform the way companies do business together. We're creating a future where there's greater trust between companies and consumers. To learn more, visit our website. I'm gonna change our fancy Zoom angle here. I realized I'm totally asleep on this Monday morning recording.

Justin Daniels  1:46  

Backwards. That sounds like just a regular.

Jodi Daniels  1:50  

Oh, are you a barrel of fun this morning? Where's your brace that go great.

Justin Daniels  1:57  

You know that I can't root for Atlanta sports teams and you know,

Jodi Daniels  2:01  

Oh, well, we're gonna bring Blake's Go Red Sox hat. It

Blake Brannon  2:07

sounds like the perfect duo.

Jodi Daniels  2:09

Ah, exactly. Well, we're so excited today to bring on Blake Brannon, the Chief Strategy Officer at OneTrust. He is responsible for strategy, product, technology partnerships, sales engineering teams, and ultimately for defining the privacy, security, and governance market and OneTrust capabilities. He was OneTrust's first Chief Technology Officer, building the technology platform of trust that has been awarded more than 150 patents. And before OneTrust, Brannon was one of the first employees at AirWatch, where he served as Global Director of Sales Engineering and Vice President of Product Marketing. So Blake, so excited to have you here today.

Blake Brannon  2:57

Yes, and Jodi, great to be here. Thanks for having me.

Jodi Daniels  3:00

You are a barrel of giggles. What is going on with you? Justin's laughing at me. Why are you laughing at me?

Justin Daniels  3:05

Because I know how excited you were when we walked the dog yesterday about today's episode.

Jodi Daniels  3:09  

I am super excited. We did, we had a whole strategy session over today's episode while we were walking the dog in our beautiful Atlanta fall weather, because that's what everyone should do on a Sunday afternoon when they're not watching football. Exactly. Well, Blake, we shared a little bit: you were at VMware, and now, poof, magically, you're at OneTrust. But give us a little bit more history as to the career journey that led you to being the Chief Strategy Officer of OneTrust.

Blake Brannon  3:38  

Yeah, great, some context here. So I ended up in Atlanta going to Georgia Tech, and I've never escaped the city since then. But my background, coming out of Georgia Tech, is in engineering and technology. I started out in mobility at AirWatch, and, you know, ran consulting and pre-sales and sales engineering and all those things. And what led us ultimately to OneTrust was what we stumbled upon doing a lot of the mobile enablement that happened around the world. If you remember back in the early 2000s and 2010s, it was very common for businesses to have a BlackBerry for work that they carried around, and then their personal phone. And as that personal phone became smarter and smarter, everyone started saying, why am I carrying two phones around, two chargers? You know, people calling different numbers, this calendar not syncing with this calendar, I booked two meetings and I forget. So what we did at AirWatch was help people do bring-your-own-device programs, where you could take your personal phone and just put your work apps and information on it. And why that's interesting, and why it's relevant to OneTrust, is that one of the things that was very common in the early days of mobility was IT people saying, if you're going to bring a device into my corporate network and get on my Wi-Fi, my VPN, and get access to corporate apps and data, I need to make sure there's nothing malicious on that phone, and it needs to pass some inspection. And one of the common things IT would do is whitelist or blacklist common known malicious applications that could be installed. For example, there was this flashlight app that was really rooting your phone and, you know, was malware hidden inside, but it looked like a cool utility that just turned your camera into a flashlight.
Now, why that became a problem is, as we were rolling out these projects, we started seeing a lot of resistance to people wanting to do BYOD. And of course, we were technologists, like, why would anybody ever say no to this? This is amazing. Think about how efficient you could be, being able to approve something on your phone and just get it off your to-do list. It was rooted in the fact that there were all these privacy concerns. And everybody, of course, had big-picture fears, like, oh my gosh, IT can read all my email, listen to my phone calls, see my family photos, things that really weren't actually possible because of sandboxing on the device. But what they could do, going back to that blacklist example, is see a list of the application bundle IDs, which basically represent the names of the applications. And just by knowing that information, and having a help desk inside a company be able to see that on a personal phone, I could infer what religion you might be, I could infer if you have medical conditions, I could infer sexual orientation, or if you're cheating, you know, on your spouse. Just by having a list of the names of the apps, you can infer a lot of information. So it created a lot of this privacy awareness, and it was just getting bigger. And we realized that this was not just one small little, you know, update or software change that needed to happen in the operating system. This was a massive, society-level problem in the world: we've got all this data, we've got accessibility to it, we've got cheap, infinite storage. Privacy is a huge, huge challenge for the world. And that's what led us, and our CEO and founder, Kabir, to ultimately seeing that this wasn't just a project or a one-time thing.
This was a whole market of software that needed to exist to help companies solve these privacy challenges at that kind of size and scale. So, fast forward, to get to the answer to the question: I had stayed in touch with Kabir as he started the business. About 2017, I came on, and I've been running product and engineering and building out all the products that we have for the company. And then we matured to the point, earlier this year, where we said, you know what, we need strategy as a dedicated role, and a little bit of separation between the day-to-day operations and the strategy function of the company, to analyze new emerging markets, potential M&A, and product and business strategy. And that's what I've been chartered to do.

Jodi Daniels  8:02  

I love hearing back stories; I find them really, really interesting, especially the mobility piece and how it connects to today's privacy challenges. Really interesting.

Justin Daniels  8:12  

It's interesting when I hear him talk about the privacy part, because when we go from two phones to one, my security alarm bells start ringing, particularly as we've moved to a remote workforce. But anyway, bringing ourselves front and center: can you talk a little bit about the biggest privacy challenges you see companies facing today, in 2021?

Blake Brannon  8:34  

There is only so much time in this podcast, right? I can't...

Jodi Daniels  8:39 

That's true. We could have a whole workshop on just that question.

Blake Brannon  8:44 

Right. So I would say there are two; the two that come to mind for me the most are: one, the changing regulatory landscape. We're in a pretty volatile time right now where, you know, conceptually, what's okay to do and what's not okay to do is something the world is trying to figure out. And we're trying to figure that out through a combination of standards and technology and working bodies. We're trying to figure that out with regulations, and data protection laws, and residency laws. So you have all these states, you have the Schrems II decision that voided, you know, Privacy Shield and all of those things with transfers, you've got new data protection laws in China for companies that are multinational. So just keeping pace with what's going on there is a big challenge in itself. The second big thing, getting more to the operations of a company: the biggest challenge I think every company still struggles with is, what did we even do today? Like, what data is new that somebody inside the company is now processing or has added, and/or what new transfers or new uses of data are happening? And how do we get to a point where, just like, you know, Justin, intrusion detection or, you know, these types of security maturity systems that have gotten to the point where you can detect threats and detect new exploits that are happening? The privacy equivalent of that is: how do I detect new uses of the data? I have the same data, but I am now processing it based on a new purpose. And that new purpose might cause a privacy problem, it might cause a compliance problem. How do I get to the point where I can actually detect that there's something new in my environment, so I can proactively look at risk mitigation and/or the right disclosures or transparency around it?
So the the operational issues that are coming together our organizations struggle with, how do I detect what has changed in my environment, in my data, data ecosystem, my business application, that could cause a new privacy ethics or compliance sort of risk. And the difference in that than traditional just thought of as tools is, you know, privacy hasn't matured from a technology standpoint yet to the way security we see security today with Justin, things like intrusion detection, you know, early warning and prevention type systems where you can sort of detect that someone's trying to DDoS your service, or you've detected someone is now connected this server to this server. And that's a norm, you know, not usual, privacy is nowhere close to that same proactiveness, where you really need to understand, when has someone in the business started capturing new data, in an existing application that we didn't previously know is there, you know, some developer adding something in adding a new vendor into or an SDK into the application, we don't have systems that really detect that to the level we need, that the unique thing also about privacy is detection is not just, you know, on and off, it's this notion of processing the data for a new purpose, creates the risk or the compliance issue. So I can have it existing data that I have stored, I've been using it but if I start using it for a new purpose, and that new purpose is materially different, that in itself is what could require me to disclose get consent from users for the new purpose, various different activities and things. So the ability to stay on top of, you know, to restate at one the laws and regulations and what's okay to do and not okay to do is kind of one heroic feat in itself. 
The second is understanding what has changed in my business, in terms of: we've got a new app, a new vendor, a new use of data, new data inside an existing system, and having the detection tools to proactively detect that, so you can get ahead of risk remediation, disclosure, reconsent, whatever the remediation needs to be to reduce the risk of processing that new data.

Jodi Daniels  12:54 

And that lends itself to the idea of software: companies need some tools to be able to help them. At the same time, as Justin and I are always saying, it's all about people, process, and technology. What would you say is important for a company to consider when purchasing and using privacy tools like OneTrust, so that they're effectively used in the company? I think a lot of people think software equals done, and as we're talking about here, with everything you just shared, there's a lot more behind just implementing that software. We have to really understand how the business is operating and then connect it to the software.

Blake Brannon  13:35  

Yeah, 100%. And the first thing that everyone really needs to think about, and you're spot on, is that tools won't solve your problem, right? It's people, process, and tools. But the first thing everyone needs to make sure they crisply understand and write down is: what is the problem you are trying to solve? That answer may lie in needing software tools, it may lie in just needing training and process, it may lie in needing extended skill sets and people. So crisply defining that as an organization is key. And then the other key thing you want to think about, especially for larger organizations (and I don't mean just enterprises, but bigger than a startup, where there are different stakeholders): privacy is everywhere, right? You've got to have privacy in your marketing, your sales, your customer support, your data engineering, your data governance and analytics. It is like a piece of fabric that's woven throughout all the stuff you're doing. And that means you have multiple stakeholders that probably need to be on board with what you're trying to achieve and do. And you want to think about how you, as a privacy champion, need to educate and evangelize for those different stakeholders to get them on board and in agreement with what you're trying to achieve. We internally talk to our own sales team a lot about how you, as a privacy person, can't just go implement marketing solutions. You've got to get your Chief Marketing Officer on board with why you're trying to do what you're wanting to do, how you flip the paradigm to not just be a "because we have to" compliance thing, but actually part of a broader marketing initiative. We typically call that a trust initiative: being trustworthy. But getting that stakeholder buy-in inside the company is key. Otherwise, the project, you know, won't take off, won't have life, won't have legs, and things like that. So all that comes together.
And as you said, it's typically rarely just one of those three that you need, you really need all of them kind of working in conjunction or in harmony, if you will, to really achieve something.

Jodi Daniels  15:37  

I like how you mentioned that it needs to be woven, like a piece of fabric, through the company. One of the questions I always get is, where should privacy live? Who should own privacy? And it's a hard answer. It really is. I find it's whichever executive believes in privacy enough to be the executive sponsor, and then they have to get everybody else wrapped around them, just like you described.

Justin Daniels  16:10  

Well said. Well, thank you, I thought it was. So, you know, Blake, as you well know, OneTrust is, I think, the fastest growing company in the entire country. And I'm sure you remember back to the days when you were starting out, and Atlanta has a thriving ecosystem. A lot of times when I read about startups, or I have to vet them for my clients, it's at that point you find out that privacy and security have kind of been an afterthought. So for a company that's just starting out, what's the best advice you would give them when it comes to, hey, you need to be thinking about these privacy concepts?

Blake Brannon  16:47  

You know, my advice on that, I think, is: I'm a big fan of skipping steps. And I say it in this context: if you're new to a mobile phone today, don't start with the beeper, and then go to the handheld flip phone, to the smartphone, you know, to the smart device. Just go straight to where people have gotten. And I say that because there's been a lot of innovation in privacy and compliance in the past three years. If you're talking to someone, or you're looking at a program or a model that models the way things were done in 2015, 2017, 2018, there are probably some fundamental leaps that let you skip over steps to get to the end state. For example, since you talked, Justin, about the startup market: one of the big things is that the cost of cloud computing has been dramatically slashed. It's enabled all these B2B startup companies to exist where 10 years ago they couldn't. And you've got all these B2B startup companies that are small, building a very purpose-built product. But to sell that to a healthcare company, to sell that to a business, people today need assurances: why should we trust you with the data we're giving you? Why should we trust you with this? So they're putting pressure on these five-, 10-, 20-person companies to get a HIPAA certification, to get an ISO certification, to get a SOC 2 Type 2 report. And, you know, as the CEO or founder of a five-person company, you're just a technology guy that saw an opportunity to solve a problem. You're like, how in the world am I supposed to do that?
Well, that market has kind of gone from hiring an outside firm to do all this audit preparation to, the best analogy for the audience here is, you know, TurboTax. Like, I don't need to be a CPA to fill out my own personal taxes; I can just answer questions about whether I bought a house, you know, had a kid, things like that, and I can get to the point of just submitting it. There's software now that really walks you through those steps, step by step: what should your policy be, what do your controls need to be. It connects into your cloud account and your business applications, pulls that config back to say whether you are or are not compliant, and simplifies that process. So, you know, skip the steps. Go straight to that to start with; don't go through the traditional thing. That would be my advice to anyone getting started.

Jodi Daniels  19:23  

You mentioned before this concept of trust and trustworthiness. I talk a lot about that with clients, and also when educating: this whole idea that people need to trust that the company is not only going to deliver a great product or service, it's going to show up on time, it's going to be a fabulous experience, and then they can also trust you with the data. I'd love to hear from you how you believe privacy programs can help build customer trust.

Blake Brannon  19:50 

It's a huge, interesting concept, because privacy programs traditionally were pocketed. And if you think about it, managing privacy is not new. So who was managing privacy, and what was it? It was a function of the legal team that wrote the terms of service policy or the privacy statement that was bolted onto the end of a contract or the bottom of a website. Right? That was it. That's all you had to do. We were covered: somebody wrote some stuff down, and therefore we're compliant. That has shifted away from being this compliance function toward thinking about it as an enabling function. So it's not just reducing the downside risk of what if we get fined, some, you know, cover-our-butt type of thing. If you, for example, look at an Apple ad lately, it just says, "Privacy. That's iPhone." That's all the ad says. So it becomes this real enabler to the business and the company, where it's about the trust of the end users: why they should trust us as a company, trust giving us their sensitive, you know, information and data, and know that we're going to do good with it. Now, when you operationalize a program, what's great about a privacy initiative, for anyone undertaking it for the first time, is that I kind of call it cleaning up the junk drawer. Like when you go across a company and you do a data map, and you're trying to figure out, well, what data do we have? Why do we have it? Those two simple things cause a lot of cleanup that needs to happen. And, to your point about trust, trust is about consistency in the messaging to the customers and consistency in the choices that they see across different properties. And it is not uncommon today, and everybody on this podcast can probably relate to this, to go to an organization and interact with one group in a company and a different group.
And you see completely different experiences. You know, your choices over here weren't really respected; when you go into the retail store, they don't know what the online team did. And there are just all these weird things, and that erodes trust, right? That causes people to not be as confident about you and the expectations that they have. So what we're seeing is privacy going from this compliance thing into: how do we enable people to really trust us as an organization, see the consistency of choices and settings, see the consistency in how we convey that we're going to be using your information and your data? And then another key element of trust, I think, is how recoverable you are, the business resilience of an organization, knowing that we're all going to have a data breach, we're all going to, you know, make a mistake. Forget about breaches, just a mistake: we accidentally did something that, in retrospect, we probably shouldn't have. How do we respond to that as an organization? Which I think gets back to your point a little bit about who owns privacy, and whether they have the right executive reach in an organization to drive the right culture and attitude around it. Because all of that conveys whether your customers are willing to trust you, are able to trust you, or not.

Jodi Daniels  23:05  

I love where the trust conversation is going. It's really interesting to see more and more companies start to adopt it. And I know OneTrust has a Chief Trust Officer, and I'm starting to see some other companies do the same. It'll be really interesting, I think, to see how this continues to play out.

Justin Daniels  23:19 

I would just like to add that my company doesn't have to worry about data breaches, because our data is in the cloud. And why would anybody want to come after us?

Jodi Daniels  23:30 

That is a very common answer: it's in the cloud, I don't have to do anything, I don't have to worry.

Blake Brannon  23:35 

But as GDPR said, your vendors are not a scapegoat for your liability. And that's not even just the security side; obviously, that's the privacy side of it too. But yeah, it's very common today, and you cannot outsource your risk. You will be held liable, regardless of what processor, sub-processor, or sub-sub-processor may have been the victim of a breach or a specific incident.

Jodi Daniels  24:04  

I'm going to interject also. I think what's interesting, and what GDPR highlighted, is a lot of times people will say, oh, it's in the cloud, I don't have to worry about it, or I don't collect credit card information or health information, I don't have to worry about it. That is a very common US approach, thinking this privacy thing only matters if I collect certain kinds of information. And where the privacy landscape is going now is: nope, just name and email is enough. Just one of those is fine. You're doing any kind of fun things online? That counts too. So it's really shifting the conversation.

Justin Daniels  24:38  

So another thing we're talking about when it comes to trust, which has been a theme of today's show, is the next five to 10 years. I do a lot of work in the smart cities arena. Now I'm getting involved in projects where you're gonna see drones with cameras, and they're flying over for traffic patterns. You've probably already seen the facial recognition statutes that have come out. And then with CVS, when people come into the store, they're trying to prevent shoplifting. These types of technologies that are being deployed have profound ramifications for privacy. And so with some of these things in mind, where do you see privacy going in the next five to 10 years?

Blake Brannon  25:21

Yeah, great, great question. So several things are going to drive what's going to happen here. One, you're just going to see an increase in investment in technology, education, and, as I talked about, automation of detecting changes that are gonna help you get proactively engaged with getting ahead of a problem. So think of a CI/CD pipeline, with static code analysis and dynamic code analysis, to see if you have a new privacy problem, just like you'd see, oh, we've got a SQL injection in the code that we just created. So that kind of maturity, I think you're gonna have. You also have this emerging class of privacy-enhancing technologies that are still to be developed. But they're gonna enable exactly what you just said, where there are obviously uses of data that can have dramatic negative implications if they're not safeguarded in the right way. Facial recognition, AI usage in certain algorithms, fraud detection, the one Apple did a couple months ago, the child sexual abuse photography stuff, all those different things. And I don't want to say one's good or not good, but on one side of it, there is a benefit to society, and on the other side of it, it could massively go south. So how do we safeguard, saying, we still want to use data, we still want to do things like detect terrorism and the bad and all these types of things, but we need technologies and enclaves and secure compute that enable the computation of that data to happen in a privacy-centric way? So that no one company is incorrectly motivated to do something wrong, or no one company has the bulk of both sides of the equation of data. So these PETs are going to just explode.
And I think they're going to help us do some of the things that we honestly are slowing down on as a civilization because of all these challenges and concerns about the negative implications. These new technologies are gonna enable that. And it's not just about privacy. This is part of even the strategy we take about trust: it's not just privacy that makes you trustworthy. It's security, privacy, data governance, ethics and compliance. You've got the AI Act coming out in Europe, but also ethical use of AI, how do you detect bias in algorithms, those types of things, disclosures and whistleblowing. All of that contributes to a trustworthy organization. An emerging one that I think is also going to be very disruptive in the next five-plus years is the ESG space, where, just like today you might think about privacy and data usage as an important buying decision for, say, the iPhone, as I was saying, I think a carbon nutrition label is going to be a real thing that a company is required to disclose to consumers, or that consumers are really looking at as they make their own purchasing decisions based on your kind of broader ESG-type initiatives. So all of those, I think, are going to be what elevates the privacy role and the sort of trust trajectory inside a company.

Jodi Daniels  28:44  

Our young daughter might be part of leading the charge on that carbon nutrition label. She is very adamant about saving the Earth, and we know she's gonna do great things. That's awesome. Well, in that spirit, OneTrust has expanded beyond privacy software tools. Can you share a little bit more about those adjacent areas, what's available today, and how they intersect with a privacy program?

Blake Brannon  29:10  

Yeah, as we thought about this, going back to my point, a lot of our early customers were like, we don't want to just do these activities to prevent getting fined by the DPAs in Europe for GDPR or whatever. It was this broader, how do we up-level our overarching company governance and showcase it, so that we can use it as a differentiator in the market for why people should trust us? And when you think about it at that level, it's not just privacy from the traditional data mapping, privacy risk assessing type activities that you have to do. You start to think about, well, what about our corporate governance? And you start to expand into thinking about GRC and the traditional compliance-type activities that you have to do. You have third parties or vendors, like you were saying, providers. You have to do due diligence on those cloud providers and make sure you're not doing transfers that are inappropriate for the type of data that you're using, that you're not onboarding additional risk or resilience problems, because those will become your problems. So third-party due diligence and third-party governance is a key aspect of broader governance that you have to do. Ethics and compliance, I talked about that a little bit a moment ago, but moving from not just privacy analysis of, are we doing good or not with the data we're using, but are there ethical concerns with the data we're using, with the way we are processing certain categories of data, or enabling speak-up programs where, you know, things could be going south?
So how do you incorporate that into your program? And then ESG, as I talked about, everything from DEI to carbon accounting. One of the initiatives we're big on is helping create a carbon ledger for organizations, so that they can actually consume and document in an auditable way their carbon consumption from their utility companies and their cloud providers; you're inheriting some Scope 2 carbon from them, things like that. So that has broadened the spectrum of our platform. And we call it the trust platform because, again, all these things feed into the externalization of what I'm doing. And if I do that right, and I'm showcasing that trust, I'm showcasing the choice and the transparency, I can get more data, I can better understand my customers, I can move faster as a business because I can use that data confidently. If I have a well-oiled governance program internally, I can launch new projects, do new services, all these things at a much faster rate than my competition. So all this creates a competitive differentiator and a competitive advantage for me as a business.

Jodi Daniels  31:49

Well, if you need a young designer, we know someone who might be available for that carbon nutrition label design.

Justin Daniels  31:57  

It'll probably relate to animals too.

Jodi Daniels  31:59  

It might, but she loves to, she's been looking for, she'd really like a job where she can showcase her talents. And it blends her passion.

Justin Daniels  32:08  

Is she willing to work for Pez?

Blake Brannon  32:09 

Well, we will send her the job application.

Justin Daniels  32:15

How's that? So, you know, as a privacy pro, having seen so much, not only at OneTrust but in your prior jobs, what is the best personal tip you give at a cocktail party when people ask you, how do I navigate all this privacy and security stuff?

Blake Brannon  32:37 

Yeah, that's a fun one. There are probably two things. One, when I say, oh, I work at OneTrust, we're privacy software, of course it's, what is that? I have no idea. So the most tangible thing is to talk about the little pop-ups you see on websites with the cookies, saying, you know, do you want to accept cookies? And they're like, yes, I see those, can you make them go away? We're working on it. The most practical, or I would say interesting, thing that I think most people don't know, that is a fun exercise to go do, is that everybody has a right to the personal data that you give to another company, and you have a right to get a copy of that data. There are some state laws that require this; there's not one in Georgia today. But most of the time, big multinational companies will give it to you regardless, because they just say, hey, globally, we're gonna do the same thing for everyone. So an interesting exercise is to go into the privacy policy in the footer of a website and find the access request sort of section. Normally it's a web form or something on the site, but request a copy of your data. And it's kind of fascinating what you get back from these different companies, to see what they have. Apple, for example, will send you a copy of every device that's ever connected to your iCloud account, every app you've ever downloaded, the IP address it was downloaded from, the date-time stamp of all your purchases and syncs. All of that kind of personal information is yours; you have a right to access it. So it's fascinating to go ping the companies that you share your information with and see what you get back.

Jodi Daniels  34:14  

That is a very fun tip. Now when

Blake Brannon  34:18 

you're on a Saturday and you want to go like burning the flowers

Jodi Daniels  34:22 

that you put on it. That is a good story for another day, to share Saturday privacy and security fun. But when you're not building privacy and security programs at the fastest-growing company in America, what do you like to do for fun?

Blake Brannon  34:40  

Outside of slow jazz music and long walks on the

Jodi Daniels  34:42  

beach? That counts as fun.

Blake Brannon  34:46  

So I have three kids at home, young kids, and I spend a lot of time with them. And I love running as well. So that's my stress relief from the chaos of building the company: go get a good run in.

Jodi Daniels  35:02  

Well, we appreciate your time here today. If people want to learn more about OneTrust or privacy, or connect with you, what's the best way to do that?

Blake Brannon  35:14  

That's the best way to get to us, and you can learn about what we do. You can watch our product demonstration videos and things online. And if you'd love to have access to an account, you can sign up for a free account.

Jodi Daniels  35:29  

Well, thank you again, for joining us today. This is such a hot topic and we're delighted to be able to have you and learn more about what OneTrust is doing.

Blake Brannon  35:40  

Yeah, I really appreciate the opportunity to come on. It's a great conversation. We should do it again in a couple years and see how different the answers are.

Jodi Daniels  35:49  

I'd love that. And yeah, I wonder if it would even be a couple years.

Blake Brannon  35:54  

Six, yes, six months.

Jodi Daniels  35:57  

It is moving so quickly.

Blake Brannon 36:00  

Well, thank you.

Outro 36:04  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.


Odia Kagan is a Partner and Chair of GDPR Compliance and International Privacy at Fox Rothschild LLP. Odia advises clients on how to design and implement their products and services, consummate their M&A transactions, and engage third-party vendors in the US and abroad. More than 80 companies have benefited from her in-depth knowledge of privacy and data security regulations and emerging information technologies.

Odia holds multiple certifications, including Fellow of Information Privacy (FIP), Certified Information Privacy Manager (CIPM), and Certified Data Protection Officer (CDPO). She is also a Chapter Chair for OneTrust PrivacyConnect and a Member of the Business Law Section Executive Committee for the Philadelphia Bar Association. Previously, Odia was a Member of the Publications Advisory Board for IAPP.


Here’s a glimpse of what you’ll learn:

  • Odia Kagan shares how she discovered her passion for privacy as an attorney
  • How are data collection and privacy evolving for autonomous vehicles?
  • The importance of transparency about data usage and simplifying user opt-out options
  • Who really owns the data from your autonomous vehicle?
  • Odia explains zero-party data
  • How privacy laws are being enforced

In this episode…

It seems like vehicles, phones, and even refrigerators are getting smarter with every passing day. However, the convenience of smart technology comes at a price: your data. At the end of the day, who’s holding the information you’re freely giving up? 

Take autonomous vehicles as an example. Those amazing AI chauffeurs know a lot about you — and it’s hard to pinpoint exactly who owns that data after it’s been collected. Is it the manufacturer? The dealer? You? How can you protect your privacy from a world that’s consistently mining for more information?

 In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Odia Kagan, Partner at Fox Rothschild LLP, to discuss how to protect your privacy and security in a world that’s digging for data. Odia talks about the importance of transparency from autonomous vehicle companies, the rise of zero-party data, and how privacy laws are being enforced. Stay tuned. 

Resources mentioned in this episode: 

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies. Hello,

Justin Daniels  0:38  

Justin Daniels here. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback helping clients design and implement cyber plans as well as help them manage and recover from data breaches.

Jodi Daniels  0:55  

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, visit our website. And here we are. We're back together again. We just did this last night, right?

Justin Daniels  1:36  

Yes, I'm vaguely remembering that from eight short hours ago.

Jodi Daniels  1:41  

I just thought we'd have the mics set up again, because it was so much fun. But today, we're really excited because we have a very special guest. We do. And who is our special guest? Our special guest is Odia Kagan. She is a Partner and Chair of the GDPR Compliance and International Privacy Practice at Fox Rothschild, a US national law firm. She has advised more than 200 companies of varying industries and sizes on compliance with GDPR, CCPA, and other US data protection laws. And she holds not one, not two, but three law degrees. And five bar admissions, plus five privacy certifications. We have a lot of letters in this introduction already. It's a delight to have you today. Welcome.

Odia Kagan  2:32  

Thank you. Nice to be here.

Jodi Daniels  2:34  

Well, then, Justin, you're gonna help us get started. Sure. So, Odia,

Justin Daniels  2:42  

as we always start off with our guests, talk to us a little bit about how your career evolved into your current position.

Odia Kagan  2:49  

So, um, you know, I think it was in Lean In, right, that Sheryl Sandberg said it's a jungle gym, not a ladder. So it's been a bit of a jungle gym and not a ladder for me, in that, you know, I've always actually wanted to be an attorney. Legend has it that I decided I wanted to be an attorney when I was, like, three years old, and I actually never really wanted to be anything else. But it took me a little bit of figuring out that privacy was, you know, the spark-joy-in-my-heart kind of specialty. I started out doing corporate commercial stuff, I did some M&A, I did a lot of tech stuff back then. I'm Israeli, I started my career in Israel. And so I did a lot of tech, digital, international stuff, and then kind of gravitated through the tech back to the privacy. And so I've been focused on that for a good number of years. And I think that there is merit to the jungle gym. I think that a broad knowledge base is really helpful for attorneys to be good at what they do, because even if you don't know the answer, you know what you don't know, and you can feel it. Having said that, I think you don't need to set out to do a jungle gym. If you can say, oh, this is really what I want to do, then go for the ladder; you don't have to do it that way. But it's worked out well for me.

Jodi Daniels  4:16  

I think it's so interesting that privacy is now even a field that you can actually concentrate and major in. When I was in school, there was no privacy class, or even a topic or consideration. Now it can be an entire dedicated field with a degree, which I think is really neat. Now, let's move on to one of Justin's favorite topics. He's gonna jump up and down when we move on to autonomous vehicles.

Justin Daniels  4:46  

I think you should pose the next question.

Jodi Daniels  4:49  

Well, I will, but I just want you to curb the excitement. I know how excited you are about vehicles. But there is a big evolution of new technologies like autonomous vehicles, among others. And so how do you see data collection and privacy evolving in this area?

Odia Kagan  5:10  

Well, I think that, I mean, the obvious thing with autonomous vehicles and similar technologies is that there's a lot more data being collected and shared and processed. And some of it is not apparent, and some of it people aren't yet aware of, right? But it's there. Cars collect and process and share a lot of data. A chunk of that data is personal data; there are the definitions in Europe of personal data, and under CCPA of personal information, you know, tied to a device, etc. But at least a considerable amount of that data is personal, and therefore subject to data protection laws. So, basically, a lot more data, and that data needs to be used in a way that's compliant. So far that sounds simple, right? But, and I've mentioned this dichotomy before, the fact that it's simple doesn't make it easy. So you have this data, and you have transparency requirements. You need to make it known to people, in a way that people understand: What is the data? And where is it coming from? And what are you doing with it? And who are you sharing it with? And we've seen enforcement actions, most recently the WhatsApp case in Europe and the California Attorney General enforcement report, that put the emphasis on transparency and clarity, on saying things in ways that people understand. And that's not easy when you have, one, a lot of different systems in the car that are collecting data. Number two, a lot of stakeholders. So you have the manufacturer, but obviously you also have the driver, and you have a passenger, and you have another passenger, or maybe you have another driver, and you have, like, a pedestrian, and things like that.
And on top of that, you also have third-party providers, especially as cars are getting more complex: incorporating, you know, Android Auto and the Android environment, incorporating payment systems, incorporating third-party applications. So there are a lot of stakeholders. And you're layering: one, transparency, right, making it apparent what data is collected, and how, and why, and how it's shared; and then, on top of that, the data protection requirements for control or consent. Like in Europe, you need consent in connection with the car because it's been determined to be terminal equipment, so you need consent just like with cookies. In CCPA, you have the concept of sale for certain transfers. And how do you operationalize the opt-in or the opt-out? And how do you do the contracts? What are the relationships between them? Because, you know, are they service providers, and do you need agreements? So I think the short answer is: a lot more data, plus a lot of data protection law, and then you need to operationalize all that.

Jodi Daniels  8:20  

Well, that is a lot for an individual to digest, right? What am I opting into, and what am I opting out of? What am I consenting to? Are there any companies that you see that are doing this well?

Odia Kagan  8:34  

I think that a lot of companies are working on it. First of all, this concept of trying to translate all that into something specific and understandable is, I'm not gonna say a moving target, but I think the threshold for what meets the expectations of transparency is high, and it's clear that it's high. Let's go with that, right? I don't want to say, oh, we didn't know about this, because we've had the transparency guidelines from the Article 29 Working Party for a good number of years, but this wasn't, you know, common practice. One other thing that I can add to my answer is that, besides the extra effort that needs to go into it, the mapping of the data, understanding the functions and spelling them out, there's also, I think, importance to design: legal design, customer experience, user experience, trying to figure out new ways to present things, especially when you have small-screen interfaces in the car or voice-activated interfaces. I've seen a lot of good efforts, and I think that there is, you know, a lot of work that everybody still needs to do as this progresses.

Justin Daniels  10:04  

As a follow-up on your point about things being a bit of a moving target and how we operationalize all this: as we all know, we don't have a federal privacy law. But how do you think privacy will evolve when we have vehicle sensors, and then the wireless communication, which is really an FCC issue? The safety of the vehicles is the National Highway Traffic Safety Administration's. And then we've got CCPA, which is a state privacy law; we don't really have anything on the federal level. To your point, how do we start to integrate these varying regulations, where you have privacy and security as part of it, but not the only part, and you have to put all these puzzle pieces together?

Odia Kagan  10:48  

Yeah, and I'd probably add: if you have the financing of the vehicle piece, then maybe you have the CFPB, and if you have insurance companies, then maybe you have insurance-specific regulators. It really depends on where the data is going. The place where I hope it would go is, I really hope that the various authorities and agencies that are responsible for pieces of it will work together when putting forth regulation. We've seen similar issues on the interplay, for example in a different area, between AML and CFT regulations and the data protection laws in Europe. So you have a package of AML regulations proposed, and the data protection bodies, the European Data Protection Supervisor and the European Data Protection Board, have both said, hey, you know, this doesn't match; this and this and this are missing, you need to work on it. And I think that it would be a good idea to have cross-agency collaboration on these, not necessarily even in putting forth new regulation, but basically in guidance or clarification or whatever on how this needs to work together, and also in the operation of each agency. Like, if the agencies that are responsible for safety or security put forth things, they need to consult with the privacy regulators to make sure that what they're doing matches the privacy side. And when you have a privacy-specific regulation, or again, something like road safety that requires the collection or exchange of information, that needs to also match up with privacy and security considerations. So I think working together would be the ideal that I'm hoping we will have.

Jodi Daniels  12:52  

So, like all families, everyone talks about who owns the data while you're having a family drive. It was just a couple weeks ago that we bought a new car, and we started talking about location information, and the car knew where we were, and our phones knew where we were. And our daughter said, well, I don't really want them to know where we are. And we started on this whole fascinating discussion of who owns the data. So in your mind, when you're working with companies, who do you think owns the data? We have the manufacturer, we have the dealer, we have the individual. And then, as a driver, as an individual owner of a car, what should I be concerned about as it relates to my data?

Odia Kagan  13:39  

I just want to say that our conversations are always, who gets the Switch and who gets the iPad, and it's my turn, it's my turn. So they don't

Jodi Daniels  13:49  

get the Switch? They're just still stuck in the back. Yeah, they don't get the Switch. Maybe when they're older.

Odia Kagan  13:59  

So, um, first of all, ownership of the data is an interesting concept. I see it a lot, and I see it in contracts; I've seen it in jest, and you've probably seen it a lot in contracts. I think data protection laws are not as concerned with ownership, but rather more with rights and choice and control. I think all the data protection laws agree that it's the person's data; the person has rights in the data. Okay, so that's the beginning. Now the question is, who else gets a piece of the data, and how, and the rights to do what with the data, and the interplay between all that. I've seen this discussion in the cookie context, right, like zero-party data: it's my party, I'm the person, and that's my data. So the data is the person's. Now, what should the person be concerned with, and what should the other stakeholders be concerned with? Again, I've said this before, but I think the first thing, and this is maybe a more US-based approach, my first priority would be, as a user, and as a consumer-facing stakeholder in the mobility space, be it the OEM or the infotainment provider or payment provider or whomever: that the individual understands what data is being collected, and why, and where it is going afterward. Right? So I have the full picture. I think that's the first priority, because I think that, to a great extent, if I know what's going on, that will help me make an educated decision. Now, is that the full picture? I think even in the US, that's not the full picture.
Definitely in Europe, the approach is: okay, it's not enough for you to say yes or no. We need to have a layer before that that asks, is this just, or necessary, or ethical in the first place, for the company to even be collecting this information? And I think that there is room for that, and I think we will see it. Not only do I think there's room for that: CPRA, and the Virginia law, and the Colorado law now have this kind of necessary, fair, and proportionate, or necessary and proportionate, analysis that you need to make ex ante in processing information. So it's not a free-for-all; you can't just collect everything and kind of hope for the best. So I think there is a component in that. And that's where I think regulators will weigh in on what the parameters are, what the limits are for data collected. And then, of all the data collected, how much choice do I need to give individuals? There is a beginning, a foundation of that: the automotive innovators, right, the association. There are these privacy guidelines for the North American manufacturers that have these concepts of transparency and control, and the scope of control. And I think we hopefully will see clarity on that for the US, as well as for Europe. Getting more clarity would be helpful for everybody.

Jodi Daniels  17:30  

Indeed, I think there's a lot of confusion nowadays, everywhere.

Justin Daniels  17:35  

Yes, I'm sometimes confused in my own house about where to put the trash.

Jodi Daniels  17:39  

You're very confused over where to put the trash every week; it's the same thing. Well, switching gears a little bit, let's move to the marketing side of your specialties, and how companies are changing marketing tactics in reaction to what's happening in the universe right now. We have technology companies who are saying no to cookies. You mentioned zero party data, which I'd love for you to explain, because I think that's a new phrase that not everyone is using. So we'd just love your thoughts overall on where we are in the universe, how companies are reacting from a marketing standpoint, and maybe a little bit of how we got here.

Odia Kagan  18:26  

Well, the how-we-got-here is: Google had this plan to deprecate third party cookies. That plan was in the works, and companies were reacting to it. And that plan has now been postponed, giving a little bit of a respite to companies and more time to prepare. What does this mean? The third party cookie situation is part of a bigger concept that's been dubbed, by the people who don't like it, I guess, surveillance advertising: advertising based on very targeted, granular information that is gathered from activity across devices, and matching, etc. And there's a whole infrastructure in place that exchanges huge amounts of information in real time, in a fraction of a fraction of a second, between thousands of companies. So that's the backdrop, and there is ongoing enforcement and litigation in the EU about it; there's an investigation in the UK into this whole industry. And the issue is: third party cookies are a way for a company to leverage the tools of other companies in order to analyze and monetize their own data, generate leads, etc. So if third party cookies go away and you can't do that anymore, okay, then what can you do? What you can do is use, quote unquote, your own data. So what does your own data mean? It means data that you get directly from individuals. That has been called first party data, even though it's not really first party, right, because the first party is me, the person, and the sort of second party is the entity that I'm directly sending it to.
So that's why there are the terms zero party and first party, but basically it means you have a direct relationship with the individual. So if you have a direct relationship with the individual, great. By the way, that's "great" in parentheses, because you are still subject to all the privacy laws: you need transparency, and if you're sharing the information, you also need to think about the sharing. So that's one aspect, trying to move to this direct relationship, which, as I said, doesn't completely alleviate the privacy issues; it maybe simplifies them, but it's still an amount of work. The other challenge is that companies have turned to third party cookies and third party tools not only because they were available and they really liked the targeting concept, but also for economic reasons: usually smaller companies need to rely on third parties. So now, if you're a small company and you want to rely on your own audience, well, how do you develop that audience? You go to the channels you have: YouTube, Instagram, places where you try to generate leads and create your own audience. That's one option, but it's much more difficult for smaller companies, and very difficult for companies that aren't consumer facing by their nature. And then you supplement that by collaborating with your information. So I've got my little hundred followers or whatever, and you have your hundred, and we're going to collaborate, and then, again, you have the sharing, and the sharing needs to be accounted for, for privacy.
So what are companies doing? Companies are trying to figure out approaches that don't rely on third party cookies. Some of those still involve a lot of personal information, in both the collection and the sharing. And then the other direction is trying to do marketing and advertising that isn't based on personal information, for example contextual advertising. That's another direction.

Jodi Daniels  22:56  

I find it so interesting; it feels like we're going back to contextual advertising, because we started there. And then there were all these great tools to help you make it more tailored, and now we're going back in this other direction. So it'll be very interesting to see how all this shakes out.

Justin Daniels  23:14  

So a lot of what's motivating all of this interest in privacy compliance is enforcement. We'd love to get your thoughts about the initial enforcement efforts to date with CCPA, and what you think that might mean for CPRA, which is coming down the road.

Odia Kagan  23:32  

So, first of all, as I mentioned, there was a report issued by the California Attorney General for a year of enforcement, and the report is sort of an anonymized version of actual non-compliance proceedings that the Attorney General initiated with companies. The takeaways that I have from it: number one, I very much appreciated the transparency, and I think we need more of that and more guidance. Hopefully that's forthcoming from the California Privacy Protection Agency, which is forming and appointed its executive director a few days ago. The key points in there: first of all, it was very granular. There were very specific pieces; it wasn't just big-picture, egregious breaches. There were very specific, granular things, which is important to note. There was a big focus on do not sell: the do not sell link, the analysis of sale, the operationalizing of the opt-out, and consumer requests. So that's important. And the other piece that I saw that was important is that there was a focus on transparency: you didn't include the rights, you didn't include the right processes, you didn't disclose the third party sharing. It was basically, guys, how you draft your privacy notice is important. I've been telling clients that, and I think this reinforces that point here in the US. And I mentioned the WhatsApp case, which really highlights it: the whole case revolves around transparency and how to draft privacy notices, and the nine-digit-fine consequences of privacy notices not being clear.
So I think the combination of those things matters, especially since CPRA and the other laws, the Colorado and Virginia laws, borrow a lot of concepts from GDPR, sometimes verbatim: the definition of consent is literally copy-pasted, and terminology regarding data minimization, data retention, purpose limitation, and transparency is borrowed from GDPR. So I think both of those things together really highlight the importance of, you know, understanding what's going on: you need to understand what you're doing and be transparent about it. And I think that's a point of focus that I would anticipate. As for the other points of focus regarding CPRA, I am staying tuned to see, but one thing that would be interesting to me is to understand how these concepts that are borrowed from GDPR are going to be applied in practice: one, generally, and two, given where they've landed. It's like when you take cells from one place and you put them in the lab, in something else, and see how it grows. It will be interesting to see how these GDPR concepts develop in the CPRA and other law ecosystems, which are different, which have the concept of sale, which have the kind of historic US

Justin Daniels  26:56  

approach. Well, thank you, that was very interesting. But on more of a personal note, as we like to ask all of our guests: what is your best security tip?

Odia Kagan  27:11  

For consumers or for companies?

Justin Daniels  27:14  

We'll start with consumers.

Odia Kagan  27:19  

That's a really difficult one, yes. I would say try to be mindful about what you're doing, and understand that you're in a quest and rush for convenience: ooh, this looks cool; oh, this looks nice; oh, this is a good function. I was talking to my husband, I'm not going to name names, but he told me about some new product and explained what it does, like, oh, it's really cool. I'm like, are you serious? You understand that it's going to be doing this and this and this? And he's like, yeah, but still. When you talk it through, I'm like, really? So I think stopping and thinking about it, that's not the end-all and be-all, because we all err on the side of clicking "I accept," because I really need to do this now. But stopping and being mindful about it, maybe that is the tip that I can give.

Jodi Daniels  28:15  

Well, thank you. I very much appreciate your energy, and I can certainly see the passion when it comes to talking about privacy. When you're not talking about privacy, though, what do you like to do for fun?

Odia Kagan  28:34  

So, I have a kayak, and I kayak on the river near my house. I really like waterfronts and being on the water, and I do that while listening to podcasts or audiobooks, both of which I really love, so I have the multitasking of it. And I also really like makeup; you know, Sephora is like the happiest place that I can be. Once my husband actually called me after four hours to see where I was, and I was just getting to the checkout. And I like face painting, so I do, you know, different face painting things, like superheroes, with my kids. So I like doing that.

Jodi Daniels  29:21  

Super fun. My mom loves Sephora. And I'm sitting next to a fellow kayaker who would like more kayaking capabilities, but we don't have so many kayaking places here. Indeed. Well, Odia, it's been a pleasure talking to you today. If people would like to learn more or connect with you, where is a good place for them to do that?

Odia Kagan  29:43  

I'm definitely on LinkedIn. I do a lot on LinkedIn; I'm happy to hear from people, and I post a lot of content on this topic, if it's interesting to you. So LinkedIn, or, you know, my Fox Rothschild bio has my contact information.

Jodi Daniels  29:55  

Well, wonderful. Thank you again for sharing all this great information with us today. We really appreciate it.

Odia Kagan  30:02  

Thank you very much. Thank you for having me.

Outro  30:07  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Mike Snader is the Associate Director of Cyber Investigations at Kivu Consulting. Kivu helps companies prevent and manage cyber ransoms and theft. In this role, Mike negotiates with cyber-terrorists after they have locked a company’s data in order to avoid the worst outcomes.

Mike joined Kivu’s team in January of 2020 after 25 years at the Scottsdale Police Department. He spent most of his police career in investigations, including nearly a decade on the United States Secret Service Electronic Crimes Task Force. Mike is also a Resident Security Agent for Major League Baseball.


Here’s a glimpse of what you’ll learn:

  • Mike Snader explains the process of ransom negotiation
  • How cyber ransoms have changed in the last 12 months
  • Is your company a main target for ransomware?
  • The various services offered by Kivu
  • Mike’s top security advice: be wary of phishing emails and texts

In this episode…

You’ve walked into the office and none of the computers work. You call the IT team and they find an unnerving message on the screen: “We’ve stolen your data and you must pay for it to be unlocked.” You only have one question while the sudden dread settles in: What do I do now?

Thankfully, Kivu Consulting’s got your back. Their team of highly-trained investigators are ready to negotiate with cyber terrorists in any situation. It’s similar to sending the SWAT team to negotiate for hostages at a bank robbery — but with specialized training for invisible, online attackers.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Mike Snader, the Associate Director of Cyber Investigations at Kivu Consulting, to discuss ransomware negotiations. Mike explains the negotiation process, how Kivu assists clients, and his tips for avoiding future scams. Stay tuned.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to

Episode Transcript

Intro 0:01  

Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage we will debate, evaluate and sometimes quarrel about how privacy and security impact business in the 21st century.


Jodi Daniels  0:20  

Hi, Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and a Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.


Justin Daniels  0:36  

All right, Justin Daniels here, as we're having the plumbing done today. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.


Jodi Daniels  0:53  

And this episode is brought to you by Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media, and professional services. In short, we use data privacy to transform the way companies do business. To learn more, visit our website. So, Justin, who do we have with us today?


Justin Daniels  1:24  

Well, today we're going to be talking about a very specific part of incident response, and that is ransomware negotiation. Today we have with us Mike Snader, who has been with Kivu since January of 2020, working in incident response and ransom negotiation. He spent 25 years at the Scottsdale Police Department, with most of his career in investigations, including nearly a decade on the United States Secret Service Electronic Crimes Task Force. Welcome, Mike.


Mike Snader  1:52  

Thank you. Thanks for having me.


Jodi Daniels  1:53  

Well, it's a pleasure to have you Mike, can you share a little bit about your background? How did you get to where you are today?


Mike Snader  2:00  

Where am I today? Sitting at a desk at my house, trying not to infect the world with COVID, right? All right. So I retired from the police department in January of 2020. And a friend of mine went to work for Kivu Consulting and explained to me what he was doing; it was very similar to what we were doing at the police department, except it was a private company doing it. And I went to see him at work one day and saw what they did, which is incident response. And a couple days later, I was hired by Kivu and was working on the Incident Response Team.


Jodi Daniels  2:35  

So what does that mean? What is ransom negotiation?


Mike Snader  2:41  

There are different types of incident response, and ransom negotiation is just a component within the incident response. If you were to think of it like a crime show, in your average crime show I'm sure you've seen negotiators negotiating for hostages. We're doing the same thing. So we will open up communications with our bad guy, and then we start making offers based on their offer, and then we negotiate whatever we're choosing to. So if you're negotiating somebody out of the bank, let's say, we're having that conversation. And while that conversation is going on, the incident response people are putting up a perimeter, staging a SWAT team, and putting people around the building to fix whatever damage occurs during the incident itself. That's kind of my metaphor there.


Justin Daniels  3:29  

So can you just talk to us a little bit, in general, about how a ransom negotiation process works?


Mike Snader  3:34  

Kind of from start to finish: a client, aka victim, will be ransomed. They'll typically walk into their office in the morning and their computers don't work. They'll call their IT people and say, our computers don't work, and the IT people have a very bad moment right there when they realize that there's a ransom involved. They're staring at a screen that says, we have taken your data, and we're going to sell it and make lots of money if you don't pay us a ransom. And then it kind of spirals from there. So the client will usually call their insurance company, if they have cyber insurance. The cyber insurance brings in breach coaches, privacy counsel, and privacy counsel then brings us into the discussion, and we all play together nicely to do what's best for the client. At the end of the day, we're talking to the bad guys. Generally there are a couple of different angles the bad guys go at, and we deal with both of those angles to meet the continuity of whatever business or private person got ransomed. So the two angles that are normally attacked: one is actually locking up the data so a company can't get to their information. And the second part is the bad guys like to steal the data, and then they sell that data on the deep dark web (which sounds like a really deep, mysterious place), and there are plenty of markets that will buy that information.


Jodi Daniels  5:02  

Well, thank you for sharing. Over the last 12 months or so, can you share how ransom negotiation has changed?


Mike Snader  5:11  

Yeah, ransoms. I mean, ransom as a whole has stayed very similar, but some tactics have changed. In the last 12 months there have been some really large cases that have been very public, like Colonial Pipeline. That threat actor, that bad guy, DarkSide as a named group, is not operating anymore; the FBI and international authorities shut them down. But at the same time, they just opened their door back up with a different name attached to it, so we're pretty sure we know who the new group is. What we're seeing is more expensive ransoms at larger companies, and we're seeing a lot of offshoots, or what's called ransomware as a service. In ransomware as a service, these smaller splinter factions basically franchise out: they'll buy the software that causes the ransom, then they try to work it as they would a small mom-and-pop business, and they split the revenues with the big boys. It sounds very convoluted, but it's like franchising, and I don't want to pick on any franchise store, but it's like franchising a little convenience store.


Justin Daniels  6:19  

So, Mike, from your experience, are you most often working on these projects through legal counsel, or do you often work with the company directly, without outside legal counsel?


Mike Snader  6:33  

The majority include legal counsel. We actually strongly support that, because of the privacy side. If they're not so worried about privacy and they are just encrypted, then sometimes privacy counsel isn't necessary. But we actually like to work with privacy counsel.


Jodi Daniels  6:53

You had mentioned a trend of working with larger companies. Do smaller companies also have to worry about ransomware?


Mike Snader  6:59  

Great question. Yeah, we have a spike in business right now with really small businesses. Sole practitioner doctors and sole practitioner accountants are prime targets, and some smaller construction companies seem to be getting hit. Now, as I look at some stats while we're talking (these are just some of the tools that we use): professional businesses, so lawyers and doctors, are a big chunk of what we're seeing in the ransomware game now. But when I say big chunk, it's about 30%.


Jodi Daniels  7:35  

That's really interesting. I think so many smaller companies think, I'm so small, I'm boring, leave me alone.


Mike Snader  7:41  

Yes. And when they get locked up, or when they get ransomed, they usually have that moment where they're like, why are they going after me? At the end of the day, if you're a sole practitioner doctor, you have all the HIPAA compliance, and they know it's going to be an expensive route for you, just on the privacy side.


Jodi Daniels  7:57  

Really interesting. It really furthers the need for businesses of all sizes to protect themselves.


Justin Daniels  8:04  

So, Mike, knowing Kivu a little bit, but for the benefit of our audience: what other types of services does Kivu provide besides this particular ransomware service?


Mike Snader  8:15  

Ryan, is that something you want to do, go down the list? Because you'll give it much more credence than I will.


Ryan  8:20  

Sure. Thank you, Mike. My name is Ryan; I'm with Kivu Consulting. Kivu provides a full lifecycle of breach services. So Kivu supports organizations with pre-breach services, which folks typically associate with penetration tests, risk assessments, tabletop exercises, and other services in that category. We'll also help with business continuity and restoration plans. Kivu then does your traditional digital forensics and incident response services: doing log analysis, imaging devices, doing collections, and helping organizations contain incidents. We also have our post-breach remediation services, where we come in and help organizations with boots-on-the-ground services, people on site helping rebuild and restore the organization, helping them decrypt data and get it back into the environment when they are encrypted. And then we wrap that around with managed services, a managed detection and response (MDR) service, where we help contain, remediate, and continually monitor organizations for evolving and continuing threats. That's what Kivu does.


Justin Daniels  9:33  

Well, thank you for that. And so, Mike, changing topics just a little bit: from all of your years in this industry, and on the Electronic Crimes Task Force, do you have a best security tip for our audience?


Mike Snader  9:46  

Well, sarcastically, I would say don't plug anything into anything; that's my failsafe. In reality, if you keep your security updated, if you keep your operating systems and everything you're running at current best practices, you're going to be in better shape. If you're a larger company, you've got to have an onion approach, a defense-in-depth approach: layer after layer after layer. And if you're just the average person, be really careful what you click on. Those phishing emails that you read about, that you hear about, that you see, are very real. I get them sent to me every day, with somebody wanting to know if this is a phishing email. Now it's text messages too. So, not to disparage Amazon, but there are no free Amazon gift cards; don't ever click on that. It's the whole story of: if it's too good to be true. You get marketed all over the place, so I tell people to limit their exposure as much as they can. If you're the person who walks through the mall and fills out the win-a-free-car card, or the coupon, whatever the thing is where you put down your name, address, phone number, and email contact, you should expect to get more phishing email than the average person. Convoluted answer.


Jodi Daniels  11:03  

Those are excellent tips, excellent tips. I really like the one about walking through the mall and filling out


Mike Snader  11:08  

the form for the free car. There's nothing free. It just doesn't happen.


Jodi Daniels  11:12  

So, Mike, when you're not giving out security advice, or helping companies when they have these types of situations, what do you like to do for fun?


Mike Snader  11:23  

Outside of messing with the keyboard? I like to play golf, and I'm a huge baseball fan.


Justin Daniels  11:28  

I love it, a fellow baseball fan. Dude, we could have a whole separate discussion.


Mike Snader  11:33  

We can definitely have a separate discussion about that.


Jodi Daniels  11:34  

So thank you so much for sharing all this great information. If people want to learn more about Kivu and ransomware negotiation, where should we send our audience?


Mike Snader  11:45  

So I didn't want to mess that up. And I have Ryan looking at me with that smirk on his face.


Jodi Daniels  11:54  

Perfect. Well, Mike, thank you again for joining us. We really appreciate it.


Mike Snader  11:58  

You're very welcome. Thanks for having me.


Outro 12:03  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.