Tim Martin is the Chief Development Officer at Pillar Technology Partners, an information security company that helps businesses reduce cyber risk and optimize their processes and technology to ensure growth and security. Tim has more than 30 years of experience building and leading companies in every area of business, including executive leadership, IT, technology, and sales. He is also an expert at helping accounting and professional services firms drastically grow their businesses.

Skeet Spillane is the CEO and Chief Information Security Officer at Pillar Technology Partners. He has more than 25 years of experience in information security, business process improvement, and enterprise architecture consulting and has worked with some of the largest Fortune 50 companies in the world. In addition, Skeet is a Certified Information Systems Security Professional (CISSP), a Certified HIPAA Security Professional (CHSP), and a Six Sigma Master Black Belt.



Here’s a glimpse of what you’ll learn:

  • Tim Martin and Skeet Spillane talk about their decades-long backgrounds in cybersecurity
  • The most common cybersecurity threats that companies are facing every day: email attacks and third-party vendors
  • How the increase in remote work has opened up opportunities for unique—and dangerous—cyber risks
  • Tim and Skeet’s strategies for helping C-suite executives understand the immediate importance of cybersecurity
  • Do companies really know what customer data they’re collecting and how to protect their assets?
  • Tim and Skeet share practical tips to help you protect your personal information today

In this episode…

Does your company struggle to identify—and remedy—common cybersecurity threats? What about the threats you may be presenting to your clients and customers?

It is surprisingly common for companies of all shapes and sizes to be in the dark about the risks they experience—or present—every single day. From partnering with third-party vendors to amassing an unprecedented amount of customer data, cybersecurity risks are everywhere. So, how can you pinpoint and solve pressing threats to keep your company and clients safe? Cybersecurity experts Tim Martin and Skeet Spillane are here today to answer this question and many others!

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Tim Martin and Skeet Spillane, leaders at Pillar Technology Partners, to discuss how your company can mitigate pressing cyber risks today. Listen in as Tim and Skeet reveal the most ubiquitous cybersecurity risks companies are facing in 2021, how to help your company’s C-suite implement effective security/privacy measures, and the practical steps you can take to protect your personal data right now. Stay tuned!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.

Andrew Hepburn

Andy Hepburn is an expert in privacy, digital marketing, and technology law. He is the Founder of Neolaw Hepburn LLC, which provides innovative legal services to technology and tech-related companies.

Andy’s expansive legal expertise stems from decades of experience in executive roles with companies like Sizmek, Sony Mobile, EquaTerra, and more. In addition to this, he has been a Certified Information Privacy Professional (CIPP) since 2013.


Here’s a glimpse of what you’ll learn:

  • Andy Hepburn talks about his background as an in-house lawyer
  • How privacy and security issues have grown and evolved over the past few decades
  • Andy explains how regulation laws are affecting Big Tech in America
  • Will there be new legislation on the state and federal levels in the coming years?
  • The challenges of implementing effective, consumer-friendly regulation
  • Andy discusses the nutrition label approach to privacy notices
  • The importance of paying close attention to the privacy and security provisions in your company’s contracts
  • Andy’s urgent privacy/security tip: don’t allow location tracking!

In this episode…

Are you sick and tired of Big Tech corporations evaluating, recording, and predicting your every move? Do you wish that there was a way to regulate your right to privacy and security as a consumer?

Although the legislation around regulation is complicated, it essentially begs this question: is there truly an effective way to regulate Big Tech behaviour to ensure the integrity of our personal data? To help Justin and Jodi ponder the topic of regulation laws, Andy Hepburn, an expert in privacy, digital marketing, and technology law, is here to share his sage advice and actionable tips.

In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Andy Hepburn, the Founder of Neolaw Hepburn LLC, to discuss the impacts of regulation laws on corporations and consumers. Listen in as Andy shares how privacy/security concerns have shifted in recent years, what regulation really means for the tech industry, and why turning off your phone’s location services is vital to maintaining your personal privacy and security. Stay tuned!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.


Host (00:00):

Okay. Okay. We’re good.

Host 2 (00:07):

Hi everyone. Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and a Certified Information Privacy Professional, and I help provide practical privacy advice to overwhelmed companies.

Host (00:24):

Hello, Justin Daniels here. I am passionate about helping companies manage their cybersecurity risk. I am a cybersecurity subject matter expert and business attorney. I am also the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches. I also help companies with cyber business consulting services.

Host 2 (00:52):

And today’s episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business together. We are creating a future where there is greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. And today I’m so excited to introduce our guest, Andy Hepburn. Andy and I have had the great pleasure of working together with a number of different clients. Andy is a privacy and technology lawyer with decades of experience representing both buyers and sellers of technology. He has been a Certified Information Privacy Professional since 2013, including leading the global GDPR compliance program for Sizmek, where he was general counsel. So Andy, welcome to the show.

Andy Hepburn (02:01):

Happy to be here. Thank you.

Host (02:03):

We’re so excited. All right, I think, Justin, you’re going to kick us off.

Host (02:07):

I’m kicking us off after all my technical difficulties. Okay, technical difficulties with my mic, I see. Good morning, Andy. How are you?

Andy Hepburn (02:15):

Good morning, Justin. Good to see your fingers.

Host (02:18):

Nice to see you. So as always, I’m fascinated by your career arc where you’ve been both in-house and

Andy Hepburn (02:27):

Out-house. Yeah. Ha ha.

Host (02:28):

So I’d love to have you give us a sense of your career arc because it’s really an interesting one.

Andy Hepburn (02:33):

Well, yeah, so I have spent actually most of my career in-house, which means I’ve worked as an employee for companies as the lawyer who oversees a business division. I did that back in the nineties (I’m an old lawyer) at Georgia-Pacific, where I was the technology lawyer when the internet first came on the scene, and we were doing things there like trying to figure out if you could sell a sawmill on the internet. Nobody had ever thought about that before, so we had to figure out some interesting things. And then most recently I worked at a company called Sizmek, which actually serves digital ads onto your phone or into your browser, and it’s now owned by Amazon. Along the way, I learned a lot about data and privacy and confidentiality, how to protect those things, and what a company’s obligations are to protect those things.

Andy Hepburn (03:32):

So I’ve had a very interesting career in terms of really understanding the whole set of questions about people’s data and companies’ data and how to protect them, and that’s led me to kind of a natural interest in the privacy field. I joined the predecessor to Sizmek back in 2012; that company was called Digital Generation. They were the leading company in the area of delivering television ad spots to the servers that TV stations would use to slot their ads in and upload them onto the airwaves. And so DG was almost a monopoly in that space. They decided that they would try to replicate their success in the digital online advertising market, and so they started buying companies that did that, MediaMind and EyeWonder. And so I came on right as they had done those acquisitions, and almost immediately we started having privacy questions, and there were no experts in the company who knew anything about privacy. So I volunteered, not knowing that privacy was going to be big. I volunteered, got my CIPP (Certified Information Privacy Professional) certification, and have been doing that ever since.

Host 2 (04:55):

I’m curious, since you started with privacy and have seen it over the long haul: a lot of people are just now thinking, oh, there’s this privacy thing I should pay attention to. So they feel like it’s new, except it isn’t, actually. It’s more of an issue because we have so much data now, but it’s not exactly new. So can you share a little bit about some of the privacy issues or challenges that you worked on, I don’t know, 10 or 15 years ago, compared to some of the ones that we’re talking about today?

Andy Hepburn (05:27):

Yeah. I mean, I think the first thing is, privacy is not new, even in the US, where there’s a lot of conversation about how the US doesn’t protect people’s privacy, and that’s not at all true. I mean, we’ve had laws going back to the sixties and seventies that govern the privacy of your financial information, and HIPAA, the healthcare privacy law (I can’t remember when HIPAA was passed). And so we have lots of what are known as sectoral laws, meaning the US federal government and state governments pass laws that protect your financial information. So there’s a privacy-focused law for financial information, and there’s a privacy-focused law for healthcare information. And the US laws in those sectors are actually as restrictive as any laws in the world.

Andy Hepburn (06:22):

And in many cases more restrictive. But where we haven’t had privacy laws in the US until recently, in California, as one might expect, is in the area of consumer privacy for the data that you share or make accessible on the internet, and the US has been kind of the Wild West on that front. But I think it’s important to correct the record around “there are no US privacy laws.” There are lots of US privacy laws, but they’re focused on what I would describe as the most critical, important areas; there’s one around educational information privacy, for example. So it’s not that we don’t have privacy laws; we’ve just focused on where it matters. What I think has happened, though, is that since the internet came on the scene in the 1990s, companies, including startup companies that are very creative, have learned how to harvest data from people who are surfing the internet and to turn that data into gold. You know, they’re like the Rumpelstiltskin of the internet.

Andy Hepburn (07:31):

And so I think it’s been a really interesting development since the mid-nineties to see how entrepreneurial companies have learned to make use of cookies. Cookies aren’t bad; they save your shopping cart preferences. That’s a good thing: you don’t have to remember what you put in your Amazon cart, and a cookie is what enables it. But innovative, thoughtful, and creative technologists have also learned how to use cookies to track your behavior as you move across websites. And so, to be fair to the companies that are doing this kind of thing, their abilities to collect and harvest and use data have far outstripped the law’s ability to keep up. That’s not wrong in itself, but we’re at the stage now where that capability is so pervasive that it’s time, and a lot of people would argue past time, to set standards around what companies can do with individuals’ personal information. The regulations are happening because it’s time.

Host 2 (08:46):

I think there’s a lot to unpack there, and I know one of the big ones is regulation, a hot-button topic for you. I’m just curious: you were talking about regulation and the sector-based approach, but I’d also love to have you share your comments about some unintended consequences of regulation that has been outpaced by technology. What I mean by that is the Telecom Act and Section 230. So I’d love to get your thoughts around that. Yes, I did bring it up. You told me not to, but Andy set me up. So there’s lots of other regulations you can talk about.

Andy Hepburn (09:25):

Well, I mean, my former brother-in-law was actually the chief of staff for Senator Ron Wyden, who was one of the sponsors of the Internet Tax Freedom Act, which was, I think, a predecessor to Section 230. Legislation like that had a good intent when it was passed, which was to make sure that there was the freedom for this incredible new global information sharing platform to grow and spawn business activity. And so by providing the safe harbor shelters from litigation under 230, I think it achieved its purpose at the time. The question is, has it gotten to the point now where, well, think about the companies that are most benefited at this point by Section 230: they’re large companies, the telcos, Google, Facebook, Twitter. All of those companies are reaping an incredible benefit from 230.

Andy Hepburn (10:41):

And for anybody listening that doesn’t know what that is: it’s essentially a federal law that says that a website or web service like Facebook or Twitter, one that allows you to post user-generated content on its site, is not liable for what your content says. So if you post a fallacy or a fraudulent statement on Facebook, Facebook’s not liable for it, because Section 230 says they’re not liable. I think what we’re seeing at this point is a debate, and it’s a healthy debate in my opinion, about whether it’s time for these massive platforms that have benefited so much from insulation from liability to have some accountability and skin in the game.

Host (11:29):

Do you see his smirky smile popping up?

Andy Hepburn (11:29):

So, yeah, share your opinion as well. I mean, let’s have a conversation.

Host (11:38):

I think if I do that, I’ll be booted off of this podcast. I guess, Andy, I’d like to ask a follow-up question. It appears that we will be changing administrations as you and I talk today. The Department of Justice has sued Google, and it appears the FTC may have an action against Facebook. Do you have any thoughts around the regulatory environment and what we may see, given what happened in the election and a new incoming administration, on some of the regulations for the sectors that aren’t being regulated right now?

Andy Hepburn (12:16):

It’s not just around Section 230; there’s legislation being advanced on multiple fronts, on both the federal and state levels, to regulate these large players that dominate the internet space, which means there’s not a lot of other players. So I think we’re going to see more and more legislation. I think we’re going to see a tightening of accountability for companies that harvest data from consumers and use it for their own profits, and I think it’s time for more of that, time for more accountability. The interesting thing, Justin, is that a lot of free-market folks would say, well, good companies regulate themselves, and they ought to. And in fact, that’s happened in the digital advertising space.

Andy Hepburn (13:17):

Off the top of my head, I could name no less than four or five industry consortiums that are focused on self-regulation of digital advertising, really trying to make sure that it’s done the right way: that it’s based on consent, or at least that you’re informed and have a right to share your preferences as a consumer about whether or not a company can track you, that sort of thing. So I think we’re going to see more legislation, and I think eventually we’ll probably see federal legislation. I think Section 230 is probably the most pressing avenue, but there have been at least four or five federal bills written and presented for federal privacy legislation. So the awareness is growing among the regulators and the lawmakers that privacy and information security are not just important to you and me and our personal wellbeing and privacy, but also for the health of the country, because the internet is capable of being used by bad actors, and we’ve seen a lot of that; it’s been all over the news.

Andy Hepburn (14:39):

So how do you strike that balance between freedom, the free market, all that stuff, and also protecting consumers from bad actors and protecting governments from bad actors? There’s a path to walk there, and I don’t know the outcomes yet, but I’m sure we’re going to see change.

Host 2 (14:58):

I’m always asked: will there be a federal law? Will there be more state laws? You’ve hinted already a little bit about your thoughts on more regulation and a federal law. I mean, it feels like every week there’s a new federal law introduced, and still nothing has happened. What are your thoughts? Will it be a federal law? Will it be more state laws? Sort of Andy’s crystal ball.

Andy Hepburn (15:23):

Well, I mean, I think we know that we’re going to see more state laws, and probably before anything on the federal front. We already have legislation in California: we have the Consumer Privacy Act, and they just passed version two of that, the California Privacy Rights Act, I think, is what it’s called. That’s an interesting law, because it doesn’t take effect until, is it 2023, Jodi? I think 2023. And you go, why would they pass a law that doesn’t take effect for two to three years? The answer, and they’ve been explicit about it, is they hope that the federal government is going to come in and supersede it with a federal privacy law, which is kind of a weird way of a state trying to twist the arm of the federal government.

Andy Hepburn (16:10):

But I think it’s partly in response to what you said: there have been lawmakers in Congress that have promoted bills for consumer privacy. I’d say if we added them all up, it’s probably close to a dozen separate bills, and they’ve never been acted on. And you’ve got to ask why there has never been enough momentum to get one of those things passed. We could have a very interesting political discussion about why, but the answer is it has not. So the states are stepping in to fill a gap; that’s what’s happening. I think there’s legislation pending in the states of Washington, Illinois, and New Hampshire, and there have been others passed; I think Arizona passed a law.

Andy Hepburn (16:59):

So there are already states that have passed laws, and there’s more pending, and I think we’ll see those most likely before federal legislation is passed. It’s really interesting, because I had some serious debates with businesses that were part of these self-regulatory efforts by industry, essentially the digital advertising industry, where I said, yeah, we need a federal law. And people were like, no, no federal law. And it’s interesting to see them coming around right now, saying, well, actually, rather than a patchwork of state laws that are going to be so difficult to comply with that it’s going to be nearly impossible, a reasonable federal law is probably the right thing. So here’s what I’d say; this is a theory that I’ve developed over years now. The problem with privacy laws is, the best way for anybody

Andy Hepburn (17:57):

that’s not a privacy expert to understand them is all those annoying pop-ups you see on websites that say, hey, we use cookies and we’re going to track you unless you say otherwise. Every time you visit a website, you see the cookie pop-up, and it’s just annoying to consumers. So what do they do? They click agree and move on. So is it really reasonable? Let me ask you guys a question: is it reasonable for legislation to be passed that requires consumers to enter into a contract with every single website they visit? Because that’s what’s happening.

Host (18:41):

First, I guess I have a couple of thoughts. One, Andy, and this is just editorializing.

Andy Hepburn (18:49):

Well, that’s what a podcast is. So please.

Host (18:52):

I lived through 2008, and I watched the global economy almost melt down, in significant part because we eased regulations in the mortgage industry and had derivatives and other kinds of financial instruments that CEOs didn’t understand. And so when I think about what we’re talking about today, I have to physically go onto my phone and hit a button so that I’m not tracked. And I wonder if there aren’t ways to regulate so that the default for apps has to be that they’re automatically put on their most consumer-privacy-friendly settings, and then the consumer can choose to do something different. As you know, in Europe, and I’ve been to Europe and spoken there, they view privacy very differently than we do in the US. And I think it will probably take regulations that say the default setting on apps has to be pro-privacy.

Host (19:55):

When you go to a website, it might be that you aren’t tracked unless you want that to happen. And look, I’m not foolish. I know the industry will fight hard against all of that, because there’s a reason why Google and Facebook are among the most profitable companies in human history: they’ve unlocked how to Rumpelstiltskin data into gold. But I think there are ways to reasonably set forth guidelines for what reasonable privacy and reasonable security are and how we set the default settings for apps and websites. Do I think it’ll be difficult, because there’ll be different interests? Sure. But I personally think there are ways to get there that make sense. We have to have this debate and bring it up as an open issue, so that legislators realize we now need to act. Because if you’re in-house counsel and you’ve done it, think about it: yes, we could end up with 48 different state privacy laws to comply with. You already know we have 52 breach notification laws, including Guam and Puerto Rico. And in reality, that stifles innovation, and it’s just not sustainable, in my view.

Andy Hepburn (21:05):

Yeah. I mean, I think that’s an interesting view, and it’s not uncommon. My personal view is that, whether it’s at the state level or the federal level, legislation is coming that’s really aimed at setting that threshold standard, which is: people need to have a choice over whether or not they’re being tracked online, and whether or not their information is being harvested and used for companies to make profits. People need to have a choice about that, and I don’t think anybody seriously argues against that, at least not in a credible way in my view. But the challenge for businesses is how do you do it? How do you give that choice in a way that doesn’t just annoy customers and make them angry?

Andy Hepburn (21:56):

And what I’m thinking is, most privacy laws right now are focused on penalizing the business: penalize the company that harvests data, restrict them, compel them to do differently, that sort of thing. And the answer for businesses right now is to put up a contract in front of the consumer dozens of times a day. That’s just not sustainable, in my view. And we’re going to see it early next year: I think Apple has announced a change to their iOS platform where you’re going to start seeing pop-ups asking you to permit various information collection and uses, because Apple’s turning on the privacy protections by default. So that’s happening with a major manufacturer. Apple’s kind of the other wealthy company.

Andy Hepburn (22:54):

That’s kind of got a different view on things. But I think the problem is the way they’re going to do it: is it going to annoy consumers to death? And what have we seen when consumers get that annoyance? They just click through and agree. So that’s not the answer, because you’re going to just give permission because you’re tired; you’re not going to read a contract just to read an article, or whatever the case may be. So, a long story to get to a quick point: I think legislation needs to focus on the consumer, making it easy, through some legislatively mandated standards, for consumers to set their default privacy preference so that they don’t have to state it over and over again. Set it one time, and then it gets promulgated to any new service provider, website, or web service that you encounter as a consumer online.

Andy Hepburn (24:01):

And I think that would go a long way. And you think about, well, has anything like that ever been done? Well, how about the labels on food products? You have an easy way to read a food label and understand whether that food is healthy or not. So what if you said every website is going to have a digital privacy label, a very standardized approach to disclosing what your privacy stance is toward consumers? And then in my browser, or on my phone in the iOS or Android preferences, I can say only sites whose labels are above this standard can use my data without asking me about it. So I don’t know if that’s the answer, but I think legislators need to think creatively about giving the power to the consumer rather than annoying them to death. And ultimately, what we’re seeing with GDPR is that cookie preference banners are being ignored, so businesses are getting to the end result while just annoying the consumer along the way. So I think we need to be very creative; the legislators need to be very creative about the standards and the privacy approach that they create, so that it’s helpful to consumers, which is ultimately the goal in the first place.

Host 2 (25:34):

Yeah, I have seen that. It’s funny you mentioned the nutrition label approach, because there have been a couple of different attempts at nutrition-label privacy notices. There are companies that have created them; California even once tried to suggest it, and the mobile app industry has tried to suggest it. It just hasn’t been adopted yet. But I do think we’re at the cusp of a big shift and a change, and I do think we’ll get to something along the lines of a nutrition label, because the long notices that, let’s be honest, we help write are supposed to be concise yet complete, and it’s kind of impossible to do both at the moment. But I do like what some of the companies have created. It’s not quite a nutrition label, but we at least have summaries and a very visual approach to a privacy notice: you can go to a page, figure out what it is you’re looking for using icons, click, and then get more information. So it’s, I guess, the baby approach: sit up, then we’re going to crawl, then we’re going to walk, and then we’re going to run.

Andy Hepburn (26:45):

The evolution of man, of men and women.

Host 2 (26:48):

There you go. You used cakes earlier. Does that mean you’ll watch Planet of the Apes now? No.

Host (26:54):

But I want to switch gears, because, Andy, in our pre-show chat there’s something that I know you work on and have some thoughts about that I think would be really, really helpful for people to hear. And that is, oftentimes companies will enter into a contract and maybe not realize or appreciate the privacy and security provisions in it that they might’ve agreed to. I come across this a lot with some of the younger companies, or companies that need to have those provisions but don’t have them. So I’d love it if you can share why companies want to pay attention to the privacy and security provisions in these contracts, as it relates very specifically to these privacy laws. And then I’m sure Justin over here is going to have some fun with this same question. So it can be Jodi interviewing and the Justin-and-Andy conversation here. We’re here for Andy. So yes, Andy, please.

Andy Hepburn (28:03):

Well, this is a challenge for businesses that goes way beyond privacy: I need to keep my business running, and to keep my business running I need to sign that contract with a huge buyer. And the huge buyer has an army of lawyers behind them writing these very pro-buyer contracts. You have two choices as a business, right? One is, I’m going to negotiate the stuff I can’t comply with. Okay, well, be prepared to spend six months at that effort to get it through the procurement process of a large buyer organization. But what if you’re a business that can’t wait six months to sign deals? Then the alternative is to just sign it and hope for the best. So it’s an interesting dynamic, and it’s not just privacy where that dynamic occurs.

Andy Hepburn (29:05):

It’s all sorts of risks. But in that scenario of a powerful buyer or powerful seller, think Apple, Facebook, Amazon, and Google, they don’t yield very much on their contract terms. I know this from personal experience. So there’s always a question of who has the power in the negotiation, and that drives a lot of this. Smaller and medium-sized companies often just capitulate to keep the business flowing and revenue coming in the door, and I don’t really blame them for that. The one thing I would say is that it doesn’t have to be either/or, and that’s where really good advice can help companies. You don’t have to go into a six-month-long negotiation with a large counterparty to get some reasonable concessions that at least give you a fighting chance at complying with that contract and the law. The way you do that is to be very focused on the key critical terms of the contract where you need concessions so you don’t put your company at great risk, or great expense, for compliance.

Andy Hepburn (30:27):

What happens, though, is companies just throw up their hands. We’re not going to spend any time on this. They throw up their hands and move on, and I don’t advise that to anyone. I’m all for risk-taking. One of my in-house counsel learning experiences is that every company that’s successful has taken significant risk. So it’s not about not taking risks, but stupid risk is not the right way to go; intelligent risk-taking is. And you can’t take intelligent risk about a contract unless you’ve actually read the contract. So you’ve got to read them, and you’ve got to focus. If you’re in an imbalanced power negotiation with your counterparty, focus on the critical things. For example: you can come audit me, but how about we only do it once a year unless there’s been a problem, or how about it’s focused only on very narrow outcomes, so you can avoid the cost of audits. Or you can focus on outright limiting your liability and saying, we’ll accept liability for our privacy or data breach, but not the kind of liability that would sink our company and cause us to go out of business and file bankruptcy.

Andy Hepburn (31:45):

Those are arguments you can advance even to a large counterparty in a negotiation, and if you’re focused on the big critical things, sometimes you can be successful. But I have a lot of sympathy for companies in negotiations where there’s a real imbalance in negotiating power and in the number of lawyers on each side, that kind of thing. The end result, the foundational objective, should be informed risk-taking rather than throwing up your hands and not reading your contracts at all.

Host (32:25):

That’s reasonable. I’m going to flip this question back to Jodi, because Jodi, you’re a business owner. You’ve worked really hard to go through an RFP process, you’ve sold the company on hiring you and all of your expertise, and then you get the contract, and some of them have been pretty involved. From your perspective as a business owner and privacy professional, what are your thoughts on that contracting process? Are you able to go through all of those terms, or who are you relying on to make sure you’re covered?

Host 2 (32:59):

Well, I have a slight advantage, with my in-house attorney living in the same house as me. Not everyone has that. But I’m also someone who wants to understand what’s in the contract and what I’m agreeing to as a business owner. I mean, it’s me; whether or not there’s a corporation, it’s my brand, my livelihood, everything I’ve worked so hard for. I want to know what’s in that contract and what I’m agreeing to, and then I want it to be reasonable. So if I’ve been asked to do something that didn’t seem very reasonable, we pushed back. And there were some interesting ones; we’ll save those for the cocktail conversation, not the recorded one.

Host (33:48):

But I guess my point is, Jodi, we’ve been through this enough now. What are the issues that come up almost every single time that we can discuss on a recording? What are always the issues? Limitation of liability and indemnity. Those are the ones, to Andy’s point, to be strategic about if you want to get into a better position where you’re comfortable signing the contract. You’ve seen in your own business that those are the issues that come up in every single transaction. And Andy, I’m sure you can appreciate this: most clients go through this arduous process and the contract is one of the last things they do. It’s the one thing standing between them and the finish line. And the other thing is, how many people want to pay their lawyer to get into a protracted negotiation with a large company that can just keep saying no and wear you down while you watch your legal bill grow?

Host 2 (34:48):

I think it just needs to be a balance of making sure that I’m able to deliver the services, still be protected, and have it all be reasonable. That’s my business owner perspective.

Andy Hepburn (35:06):

Yeah. And one of the learnings I have from many years of doing this kind of work is that if you come back, even to a large buyer or large seller you’re doing a deal with, with just three or four very well-reasoned critical points that need to change to make it a fair deal for you at the price you’re offering, they’re a lot more inclined to engage than if you’ve completely shredded their contract in a redline and tried to get changes to stuff that’s maybe important but not critically important. So that’s a strategy I think a lot of companies can employ: what are the critical risk issues I need to address in my contracts?

Andy Hepburn (35:56):

It’s kind of interesting if you bring this back to privacy, because of when we started doing data protection agreements, which are essentially privacy agreements designed to help companies comply with GDPR requirements. The very first one I saw came from a customer; we were the vendor, I was representing the vendor, and the customer was saying, when you’re providing your service, if you mess up and have a data breach or whatever, we want your liability to be unlimited. And we said, well, the problem with that is, most of the time when the bad thing happens that causes us to be liable to you, it hasn’t just happened to you. It’s happened to you and maybe all of our other customers, or several of our other customers.

Andy Hepburn (36:49):

And we can’t take the risk that we lose our business, that we destroy our business, over one accident. They weren’t very open to that concept, and I understand why. But it’s interesting that as privacy negotiations have gone on over the years, you’re starting to see a more reasonable approach toward compromise on both sides. In the privacy space, that compromise often results in a higher cap on the damages you’re responsible for if you cause a breach than the normal cap for "we didn’t do what we said we were going to do in the contract as it relates to the product or service." So there’s a higher damages and liability cap, but it’s not unlimited, and I’m seeing more and more compromise in that space.

Andy Hepburn (37:44):

So it’s an example of where the initial reaction to a new risk, GDPR compliance, is that you have to take unlimited liability if this bad thing happens. And now people are starting to see that a lot of the time the regulators are reasonable; except with the likes of Google and Facebook, they don’t impose $20 million fines on medium-sized businesses. They’re more reasonable than that. So maybe we don’t need unlimited liability. It’s just interesting to see the standards of contract negotiation evolve over time as people become more comfortable with risk-taking and with what the risk actually is.

Host 2 (38:27):

And what I’d like to offer is that I’ve seen companies not even have a DPA. If you’re listening, the takeaway is that you need both: you need a DPA to accompany your main master service agreement, because if there’s personal data involved, you need it to comply with these privacy laws. One of the big pieces I see missing is that entire section.

Andy Hepburn (38:51):

Yeah, no addressing of cyber liability or privacy at all. Yeah, that’s true.

Host 2 (38:58):

Right. Well, the privacy discussion has been super, super fun. Since you’ve been in the privacy space, tell us: what is your best privacy tip? Maybe it’s a security tip or a privacy tip that you would offer either companies or individuals.

Andy Hepburn (39:16):

Yeah. For individuals, I would say don’t allow location tracking except by companies you really trust deeply. Turn it off, say no. And if you want evidence for why I’m recommending that, go check out the New York Times report on location tracking if you think I’m being silly. It’s astounding, it’s scary. It really shows the capabilities of technology to track where you go, and most of that stuff is not used for bad purposes, but it can be. So I think consumers need to be really careful about sharing their location. Do you really want your location being broadcast like you’re a taxi driving around? So that’s one recommendation I would make to consumers. Now, it’s true that if you turn off location tracking for Google Maps or Apple Maps, you’re not going to be able to use the map feature to get where you want to go.

Andy Hepburn (40:22):

So there are plenty of cases where it’s reasonable and the only way to have the convenience you want, but in general, don’t widely allow companies to track your location. So that’s one. And then a similar one, and Justin, you can probably talk about this more than I can, is to learn what a phishing attack is and how these attacks come into your email inbox. I’ve advised my own parents on this. I see more and more phishing attacks every day, and they’re more and more carefully crafted to fool you into giving up personal information. So learn what phishing is and how to spot it, so your risk antennae go off when those emails come into your inbox. There are some good resources; consumer.ftc.gov has a good guidance piece on how to spot phishing scams. So I would recommend consumers learn about phishing, what it is and how to take some precautions to avoid phishing attacks.

Host (41:34):

So, Andy, last question. Let’s get off the topic of privacy and security: what do you like to do for fun?

Andy Hepburn (41:42):

What do I like? Well, I like to have podcasts with good friends to talk about privacy.

Host 2 (41:47):

Well, thank you. That counts as the privacy piece. You have to pick something else.

Andy Hepburn (41:51):

Yeah, I can. Well, you can see one of them over my shoulder here, that guitar. I’m a huge music fan. I’m a lot better at listening to music than I am at playing or making it, but I try. So that’s one of my passions in life: I love music. I’m also an outdoors guy. I like to hike and boat and be out communing with nature; it brings a little peace to my soul. I’d recommend that for anybody looking for a little peace, and they’re COVID-friendly activities.

Host 2 (42:24):

Good points, good points. Well, Andy, thank you so much for joining us. Where can people connect with you and continue the conversation?

Andy Hepburn (42:31):

So the firm is called Neo Law, and you can reach me at neolaw.com. I’m happy to help anyone who’s struggling with privacy compliance or with how to write or negotiate a contract with a big company. That’s my bread and butter. So neolaw.com is where you can contact me and find out more about what I can do to help you.

Host 2 (42:55):

Wonderful. Well, thank you so much.

Host (42:58):

It’s a pleasure.

Andrew Richardson

Andrew Richardson is the Senior Vice President of Analytics & Marketing Science at Tinuiti, the largest independent performance agency across Google, Amazon, Facebook, and other top web-based platforms. Tinuiti has worked with a large number of high-profile clients, including Tommy Bahama, Nestlé, Etsy, and more.

Andrew is passionate about data, marketing, technology, and social media. His areas of expertise include Tableau, Google Analytics, Adobe Analytics, and many more.


Here’s a glimpse of what you’ll learn:

  • Andrew Richardson talks about his transition into the world of digital marketing and analytics
  • Andrew explains how advertising has evolved over the years—and why personal data is so valuable to advertisers
  • Common advertising challenges your brand may be facing due to new privacy regulations
  • What are cookies, and how do marketers use them?
  • Andrew discusses the give and take that consumers experience with privacy and personal data
  • Who is responsible for promoting privacy and security between a brand and a marketing agency?
  • Andrew’s best privacy tip for listeners: pay attention to your phone’s location settings

In this episode…

As a consumer, you probably already know that the top tech companies in the world—Google, Amazon, Facebook, and the like—record, track, and profit from your data. However, did you know that top marketing and advertising agencies have just as much to gain from collecting your personal data?

According to marketing and analytics expert Andrew Richardson, consumers experience a significant give and take when it comes to privacy and their personal data. As he says, marketers and their clients need to understand the impacts of their advertising methods—and learn how to promote improved privacy and security measures in 2021 and beyond.

In this episode of She Said Privacy, He Said Security, Justin and Jodi Daniels sit down with Andrew Richardson, the Senior Vice President of Analytics & Marketing Science at Tinuiti, to reveal what marketers and consumers need to know about data privacy and security. Listen in as Andrew discusses how advertising strategies have evolved over the years, the importance of cookies in data tracking, and where the burden of privacy really lands in agency/brand relationships. Stay tuned!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.


Host (00:00):

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and a Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.

Host (00:20):

Hi, Justin Daniels here. I am passionate about helping companies solve complex cyber and privacy challenges during the life cycle of their business. I do that by identifying the problem and coming up with practical, implementable solutions. I am a cybersecurity subject matter expert and business attorney.

Host (00:40):

And this episode is brought to you by Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, digital agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business together, creating a future where there is greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. And today I am so excited to welcome Andrew Richardson, who is the SVP of Analytics and Marketing Science at Tinuiti, the nation’s largest independent digital marketing agency across the triopoly of Amazon, Facebook, and Google. Andrew, welcome to the show.

Andrew Richardson (01:36):

Thank you so much. So happy to be here.

Host (01:38):

Well, we are delighted. Now, before we dive into all things privacy, security, and digital advertising, help us understand who Andrew is. Tell us a little bit about how you found your way to the world of digital and to Tinuiti.

Andrew Richardson (01:56):

I’ll go way back. I actually used to be a women’s volleyball coach at the University of Notre Dame and at UPenn. When I stopped doing that, I realized that the things I loved about coaching were actually the numbers and the people. So I transitioned over, worked at the College Board for a while doing educational analytics, and had a really great time there. I moved my way into digital marketing by way of a local user group in Philadelphia for the software I was using at the time, Tableau, which is a great partner of ours. I really just fell in love with the stream of information that was headed my way when I was an analyst, and the ability to take that information and do something with it that actually made sense, as opposed to just looking at numbers on a page.

Andrew Richardson (02:51):

That was kind of my first foray into digital marketing. I did a little bit of work when I was at the College Board with Google Analytics, which was relatively new at the time, and some of their web data. From there I made my way through the ad tech ecosystem, working at places like PointRoll, doing some pharmaceutical marketing at a place called CMI, and some more web design and tracking at a place called Delphic Digital, which is now owned by Hero. Now I’m at Tinuiti, which used to be Elite SEM, where we’re doing a whole slew of different things. You mentioned the triopoly, obviously, but our teams do a lot more with web analytics and data implementation, as well as a lot of work in marketing sciences, trying to take the money that our brands are spending with us and put it to its best possible use. So yeah, that’s kind of my background.

Host (03:52):

Really fascinating career, starting all the way back in volleyball. Who would have thought counting and measurement from sports would lead you to measuring Facebook and Google?

Andrew Richardson (04:05):

Exactly. I know, right? Never would have thought. I remember when I was at UPenn, Facebook had just come out, and you couldn’t log in unless you had a .edu email address. To see it go from that to what it is now is kind of nuts.

Host (04:21):

So on that theme, help us out. You’ve seen a lot over this period of time, and obviously we’re going to be talking a lot about privacy and security. Think about the evolution of the advertising ecosystem around personal data. Maybe help explain where we were and where we find ourselves today?

Andrew Richardson (04:44):

Yeah. Where we were: I vividly remember seeing those ads with the clapping monkey. Those were digital advertisements where someone was trying to get your attention. You’d also end up with the flashing-number advertisements. Usually none of those had anything to do with anything you actually cared about. If you were a 25-year-old recent college grad working a $30,000-a-year job, you didn’t really care about the mortgage rates that were flashing in your face trying to get you to refinance. So it was spray and pray, so to speak: let’s throw as much spaghetti against the wall and get as much reach as we can. That was kind of the mentality.

Andrew Richardson (05:36):

It was the mentality of old broadcast media, where we knew roughly what the demographic was in certain markets based on market research, and we would go out and advertise to as many people within those markets as possible. Now, with the rise of so much data collection, the pixel in general, its development and what you’ve been able to do with what is at its core a very simple piece of technology, display advertising has transformed into a lot more than clapping monkeys and flashing numbers on a page. We’ve moved to compelling creative that, as we say, hits someone at the right place at the right time with the right message. It’s been quite a transformation. And now we’re looking toward the future and the deprecation of cookies: how are we going to measure the things we’ve been able to measure for so long, and what does it mean? With privacy at the forefront of all that, it’s been a big shift from my early days of advertising.

Host (06:58):

This is the longest I’ve gone on a podcast without speaking first; it was an interesting experience. When I heard volleyball, I thought we should have a sign that says Wilson in the back. And given what we’re about to talk about, we should call this the cookie monster episode; chocolate chip cookies are my favorite food. But segueing into our next topic: in the last year, year and a half, especially with the advent of CCPA, privacy has really come to the forefront, and that has a potentially tremendous impact on how advertising is done, because advertising is so successful precisely because it can be so highly targeted. With that context in mind, could you talk a little bit about the challenges you’re seeing with brands advertising in this new privacy environment we find ourselves in?

Andrew Richardson (08:00):

Yeah. First and foremost, as I think about the changes that have been happening, and even some of the ones we’re on the precipice of, depending on when the IDFA changes happen on the Apple side, with Facebook, and everything within the app ecosystem, a few things come to mind. One is brands that have not invested in the data privacy components of their business: really caring about first-party data from their consumers and keeping it secure, but also using it to ensure that what consumers are used to doesn’t go back to clapping circus monkeys. I worry about the folks who are not investing in that first-party data infrastructure and architecture, and who aren’t really learning who their customers are, not in an opportunistic way, but as part of the give and take of learning who you are as a customer.

Andrew Richardson (09:02):

We’ve been pushing a lot of our brands in that direction: first-party data is king. In the marketing world, we’ve all relied on third-party data for a long time, on being able to track and pixel everything, and as those capabilities shrink, first-party data and what you know about your consumers becomes paramount. So that’s one thing. The other piece is measurement. We’ve gotten so used to being able to measure down, in some cases, to an individual, and that changes. It goes back to some of the ways we used to do things, which is much more model-enabled: looking at things like media mix modeling or marketing mix modeling and leveraging those types of tools and platforms. Attribution modeling is still important, but it now becomes much more modeled data as opposed to one-to-one data.

Andrew Richardson (09:58):

So those are some big changes I’ve seen in conversations we’ve been having with our clients about how they need to be prepared. I also think brands that are privacy-centric should let their consumers know it; that’s an important point. If we as marketers care about data as much as we do at Tinuiti, but care about privacy even more, we should be letting people know that. And if you’re a brand in that same boat, your customers want to know their data is secure. That’s important to them, so if I’m a brand, I want to make sure I’m communicating it to my customers.

Host (10:40):

So many interesting nuggets in everything you just shared. Let’s help explain the cookieless future, this new frontier we’re expecting. Can you explain what that is and when we think it might really make a splash and an impact, and then what should companies be doing? You talked about first-party data; maybe a company hasn’t been as focused on that. What might be some tactical suggestions for what they should be doing now?

Andrew Richardson (11:13):

Yeah, a couple of things I would highlight. One, just to orient people: what is a cookie? It’s really just a small text file that sits in the browser that websites can write data to; sometimes it’s specific to a device. Cookies have been around since the mid-nineties, and the initial goal was to improve the e-commerce experience. There are different types of cookies. First-party cookies are created, published, and controlled by the website you visit, for things like "I remember what was in your shopping cart," "I know what items you viewed," and different preferences that improve the user experience of working with that brand.

Andrew Richardson (12:05):

The site gets that behavioral data to help the website owner improve services, and it only goes back to the owner of that domain. So first-party cookies and the data they collect only go back to the brand you’re engaging with. Third-party cookies are set by third-party servers, usually ad technology: another piece of code placed on the web domain by the owner of that domain. But the data collected there is accessible on any website that loads that third party’s code, which allows advertisers to track users across the internet and target advertising wherever a user goes. Thinking of cookies that way, within the e-commerce landscape, is probably the most applicable way to think about this.
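In code terms, the first- versus third-party distinction Andrew describes comes down to whether the domain that set the cookie matches the site you're visiting. Here's a minimal sketch in JavaScript; the function name and the simplified matching rule are illustrative, not any browser's actual implementation (real browsers apply more nuanced rules, such as the public-suffix list):

```javascript
// Classify a cookie as first- or third-party by comparing the domain
// that set it against the host of the page the user is visiting.
function isThirdPartyCookie(settingDomain, pageHost) {
  const d = settingDomain.replace(/^\./, '').toLowerCase();
  const h = pageHost.toLowerCase();
  // First-party: the setting domain is the page host or a parent of it.
  return !(h === d || h.endsWith('.' + d));
}

// A cookie set by brand.com while you browse shop.brand.com is first-party...
console.log(isThirdPartyCookie('brand.com', 'shop.brand.com')); // false
// ...while one set by an ad server's code on that same page is third-party.
console.log(isThirdPartyCookie('adtech.example', 'shop.brand.com')); // true
```

The second case is exactly the cross-site tracking scenario: the same ad-server domain appears on many sites, so its cookie follows the user everywhere its code loads.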

Andrew Richardson (12:59):

We obviously use them for ad targeting. You visit brand.com, you look at a specific pair of shoes, and then hours or days later you’re getting retargeted with an advertisement for that brand or those shoes. That capability is going away, specifically as it pertains to third-party cookies and the ability to leverage that data. From an impact and timing perspective, we know it’s going to happen in 2021; the exact dates of certain things are still to be determined. Like I mentioned, there are things happening in the Apple ecosystem with Safari and the way they track apps, and Facebook is changing some of the ways it does marketing attribution as a result.

Andrew Richardson (13:56):

But there’s a lot of shakeup coming if you’re a marketer who’s used to relying on retargeting people who have come to your website by leveraging those third-party cookies. Some browsers, Brave, Firefox, and Safari, already block a lot of third-party cookies, and Chrome has also announced that it’s deprecating third-party cookies this year. That change on the Chrome side matters because Chromium, the underlying technology of Chrome, has the largest reach of any of the web browsers out there, so it will make a massive dent in the industry and in third-party cookies.

Host (14:42):

So you talked about all this being a by-product of a shakeup. When you say shakeup, I interpreted that to mean a lot of what’s going on in Washington, D.C., with the Democrats and Republicans, and what’s gone on with big tech. Is that what you’re referring to, or can you give us some more color?

Andrew Richardson (15:05):

For me, this goes back to the fallout of the Cambridge Analytica data controversy, and then, a little more recently, to GDPR in Europe and CCPA in California, the legislation that has started to come to the forefront. The antitrust hearings aside, I think this started before them, but they did bring it more and more to the forefront of everyday consumers’ minds. I think it’s good for consumers and for companies that this is happening. I am consumer-focused, even though I work for a marketing technology company; we know at Tinuiti how important consumer privacy is, and I think this is a good thing. It’s just a big shakeup for how the advertising ecosystem has operated since, effectively, the nineties, and some of the things folks in marketing are used to being able to do are changing as a result.

Host (16:13):

How do you see consumer habits changing? You mentioned that you work with a lot of consumer-privacy-focused brands. Among those consumer-focused brands, why do they put privacy first, and how does that connect with any changes in consumer habits that you’re starting to see?

Andrew Richardson (16:37):

The way that I think about this is that for many consumers, and there’s research out there that’s been done on this, the trade-off became visible when newspapers needed to start charging for digital subscriptions and people thought about the alternatives: well, would you give me some of your personal information so that I can give you this thing for free? There’s always that give and take with privacy and with personal data, how much you’re willing to give up to get something in return. Location is a very real example today: browsers are asking to know your location, apps on your phone are asking to know your location, and you decide how much you want to give up or not.

Andrew Richardson (17:25):

From a marketing lens, I think it really depends on what the give is that I’m giving back to consumers in order to take something from them, from what they may consider private data. So the top marketers out there are really trying to explain what the value of some of these things is. Even just look at cookie consent banners on websites: you see a lot more of them getting a lot more detailed. When GDPR first happened, they were pretty sparse: cookies, yes or no, here’s why, maybe a privacy control center if you were lucky. But now they’re getting a lot more detailed, because I think sites have seen a drop-off in a lot of that data. And they’re saying, well, we have some necessary cookies, are you okay if we just have the necessary cookies, yes or no? Advertising cookies, do you want those? They’re giving users a lot more choice and a lot more information to determine: is my personal information and data valuable enough to give away for what I get in return? Great marketers are making that a focal point. You give me this, I give you that. It’s not just, I expect you to give me everything and you get nothing in return.
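The more granular consent banners described here boil down to a category filter: the site tags each cookie with a category, and only categories the user opted into (plus strictly necessary ones) get set. A hypothetical Python sketch of that logic, with invented cookie names and a deliberately simplified category model:

```python
# Hypothetical cookie inventory; names and categories are illustrative.
COOKIE_CATEGORIES = {
    "session_id": "necessary",     # login/session state, always allowed
    "analytics_id": "analytics",   # usage measurement
    "ad_retarget": "advertising",  # cross-site ad targeting
}

def allowed_cookies(user_consent: set[str]) -> list[str]:
    """Return the cookies the site may set given the banner choices.

    'necessary' cookies are exempt from consent under GDPR-style rules;
    every other category requires an explicit opt-in."""
    granted = user_consent | {"necessary"}
    return [name for name, cat in COOKIE_CATEGORIES.items() if cat in granted]

print(allowed_cookies(set()))            # only the necessary session cookie
print(allowed_cookies({"advertising"}))  # session cookie + the ad cookie
```

The "yes or no per category" banners Andrew mentions are essentially a UI over this kind of filter, which is why declining everything still leaves the necessary cookies in place.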

Host (18:39):

Makes sense. You know, it’s interesting that we bring up these points. You talk about geolocation because Jodi convinced me to get a new car last year, a Telluride, and I actually turned off the function on my phone that geolocates me. So then my car was so kind to say, hey, you know, the Apple Maps function will do better if you turn this on. And I’m like, no, I think I’m good. But then of course Kia has a map in my car that shows exactly where I’m going because of the sensors. And, Jodi’s going to laugh in a minute, in San Diego they have drones that fly up in the sky monitoring where traffic and whatnot is going. I’m sharing this story because all of those different forms of geolocation impact our privacy. So I’d love to get your take on how you see companies in your space and your customers dealing with privacy, because it comes in so many different forms now, with all of the data that’s being collected by all these different devices.

Andrew Richardson (19:52):

Yeah. Just a quick aside: I was at a conference, this was obviously pre-COVID, so probably two years ago now. One of the speakers was the head of data for one of the top five car manufacturers in the country. They were collecting more data in a day, because of all of the different sensors and devices you just mentioned that are in automobiles, than they had collected as an entire company in the previous fifteen years. One day, fifteen years. I bring that up because you were talking about cars, and I think that oftentimes consumers don’t really understand how much is being tracked about them, or what the implications are of being tracked or not being tracked. Again, Apple Maps does better if it’s tracking you; if you’re looking for GPS directions and actually want to know where you’re going, you should turn it on. There’s a very clear give and take there. What may not be understood, before some of the changes that are happening within the app ecosystem, is whether additional data is being tracked as part of this: not just, you have my location so you can tell me where I need to go, but what additional information and data are you actually collecting, and what are you using it for? So, going back to my point, marketers need to be very clear about what they do with this data beyond just “it gives you a personalized web experience,” quote unquote. That’s a generic statement you see in a lot of cookie consent banners: in order to give you the most personalized experience, you should enable everything that I’m asking you to enable.

Andrew Richardson (21:37):

That can be fine for some brands, but not for others. So I think that in order for brands to really be considered privacy focused, it can’t just be about “our data centers are such-and-such and we care about consumer privacy.” They really need to explain what that means to them. What is my data privacy worth? What are you doing with that data? What are you not doing with that data? You can’t give away trade secrets, obviously, but the ability to still track users, with their consent, and to explain what they’re getting as a result of that tracking, is important. I also think, on the flip side, because of how used consumers are to being tracked, when some of this tracking goes away we’re going to end up with a worse experience as consumers.

Andrew Richardson (22:27):

And I don’t think we understand what that means either. Some people may not care, but one example that I can tell you is impactful for me happened just a couple of weeks ago. I was on a website browsing for some clothing items and added them to my cart without signing in. I use a browser that clears my cache anytime I leave a site. I went back to the website, and those items were no longer in my cart, and I was like, what the heck? These things aren’t in my cart anymore; I don’t understand. Well, of course that’s what would happen: the data was no longer there because my cache had been cleared. It wasn’t that a time window had expired, like “these items can only stay in your cart for two hours because they might go out of stock”; this was an entirely different thing. So I think we’re going to see some of that at the same time. I’m not going to get the same user experience, and if I’m a marketer, my conversion rates will probably go down as a result of not being able to make frictionless conversion paths, so to speak, for consumers. It’s going to be an interesting thing to navigate along the way.

Host (23:35):

So it’s funny you said that. In another way, you’re saying: as we rebalance towards more privacy and respecting it, how much inefficiency are we willing to put up with? Because in my experience, when it comes to having the best experience or the most efficiency with technology, it beats the pants off of privacy and security every time. And now you’re really talking about rebalancing that seesaw.

Andrew Richardson (24:02):

Yeah, you’re exactly correct. And how far it rebalances, that’s the one thing: there’s no relative measure of how far out of balance we are right now. It’s just been a constant thing where we’ve gone up and up and up, and what’s the diminishing-return point we actually see? We don’t know. So, a lot of companies, I’ll use this as an example, rely on abandoned-cart emails. That doesn’t necessarily go away, because that’s a logged-in experience, so if you’re a logged-in consumer, you still have that. But to your point, if I don’t want to be inconvenienced, I may want to say, yep, I’m going to go ahead and turn that on, you can track me for this stuff, because the convenience is actually worth it for me. That’s part of the give and take I was talking about.

Host (24:55):

There are some brands out there that have figured out how to do abandoned-cart emails without your being logged in, because I know they’re relying on my cookies: I was not logged in, and yet one hour later I heard about the XYZ item I did not buy. I’m going to be kind to this brand and not say who they are, a big, well-known brand, but they’re always retargeting me to keep buying whatever it is I chose not to buy at that time. You mentioned something really interesting about how marketers need to consider these privacy items when they’re explaining to consumers. One of the questions I’m always asked is: where does privacy sit? Who owns privacy? Who’s responsible for it? When you’re working with clients, who do you tend to discuss these privacy items with? Is it marketers? Do their privacy teams get involved, or lawyers, or business people? I think it would be interesting for people to understand where privacy sits within companies.

Andrew Richardson (25:57):

For us at Tinuiti, it varies, and I say that because we work with clients that are small and medium businesses as well as clients that are Fortune 100 companies, so I’ll give that caveat. The Fortune 100 companies have privacy teams focused on thinking about marketing and privacy, and those teams are very closely tied with, or include, legal folks who talk through the implications of how they collect data, their website operations, and how they stay in compliance with GDPR and CCPA regulations. So it can vary there. The majority of the conversations we have are with marketers, and the way I think about this is actually a little different than in the pre-GDPR world. Pre-GDPR, you would go talk to a client about privacy and you would kind of get the eyes-glazed-over look, because there was no regulation that said they had to do something here, with the exception of email, right?

Andrew Richardson (27:11):

CAN-SPAM and all of those types of things that came along. Now I see more and more companies either building or outsourcing privacy groups within their org. Right now, I think a lot of that responsibility sits within marketing, because that’s where the data is being collected; it’s not IT or InfoSec that has a hand in what we’re talking about. But marketing privacy, especially at large brands, is in many ways very different from how their InfoSec or IT teams think; those teams won’t have a working understanding of the way the technologies work or how the data is being collected. And oftentimes, frankly, the marketers don’t know that either. They’ve bought a tool because it solves a problem, but they don’t understand all of the things that are happening.

Andrew Richardson (28:02):

There was a great article recently from Ghostery, which does its “Tracking the Trackers” report every single year. I encourage everyone to go take a look at it, because it’s a really fascinating look at which brands, from a third-party perspective, are tracking the most: where do you see the most cookies and pixels? And it’s the big three, right? Amazon and Facebook and Google, but then you end up with Twitter, and you see Microsoft and Comscore and Cloudflare and Adobe and Criteo. Some people might look at that and say, what is a Criteo? I don’t even know what that is. Comscore, I feel like I may have heard of it, but I don’t know what it is either. So we rely on the marketers to have this understanding, but oftentimes they don’t; they’re just trying to solve a problem with a tool or a solution or a cookie, whatever the case may be.

Andrew Richardson (28:59):

So I think there’s a lot of responsibility that falls on them. I will also say, working at a technology-enabled marketing services company, or agency, like we are, it’s our responsibility to do a lot of that education. If you go to our website and search for “privacy” or “cookie,” you’ll see a lot of educational materials that we’ve put out for our brands and for the market in general, because we think privacy is hugely important to the success of our brands, and, like I said earlier, we care about consumers and their privacy. But there’s still a gap, definitely still a gap, in a lot of brands’ understanding of data privacy, what they need to do about it, and who’s ultimately responsible for making sure they’re compliant.

Host (29:51):

I love the reference to the Ghostery report. And if you haven’t already downloaded it, Ghostery has an amazing plugin you can put on your browser: you can set it up to auto-block or allow, but you can also choose, and you can see all the trackers running on a site at that time. It’s a free, really great plugin, so I highly recommend you grab it. Well, I think we could talk for hours on all kinds of topics. We always ask everyone: given everything you’ve learned, from a privacy and security sense, what’s the best privacy tip you would offer our listeners?

Andrew Richardson (30:34):

Best privacy tip to offer your listeners? The number one thing I would probably say, and I recently did this: go through your phone and take a look at the apps you have on there, and go through each of them to see whether you’ve got certain data-collection components turned on or not. Location is the big one for me; I turn off every single location permission I possibly can, because I don’t feel like I need that to happen. For the apps where I need location, I have it enabled; where I don’t, I don’t. That’s probably one of the biggest ones for me, because, again, you don’t know what you don’t know. We make assumptions that just because we have an app, only certain data is going to be passed back, and it could be more than what you think. So I would definitely take a look at that.

Host (31:25):

Great tip. Good reminder.

Host (31:27):

So our last question, but not least: what do you like to do for fun, besides coach sports?

Andrew Richardson (31:35):

So I don’t coach sports anymore, which is a whole different conversation for another time. I still love sports; I live in Philadelphia, and I’m a big Eagles, Sixers, and Phillies fan. I’ve got two kids, one in high school and one getting ready to go into middle school next year, and my wife, and we spend a lot of time together, especially during COVID. We played a whole heck of a lot of UNO last year, literally every single day from about March to June; we would be playing UNO after dinner every single day. That was a lot of fun. We have not picked up UNO in quite a while because it got to be a little bit too much. But we love board games and games in general, so we spend a lot of time together doing that. It’s an interesting question to ask in the midst of this pandemic, because the pre-pandemic answers, I feel like, are going to be completely different for anyone who answers it than where we currently sit.

Host (32:32):

Well, you know, some people got dogs in COVID; we started a podcast. So I guess when we get out of the pandemic, it’ll be interesting to compare the answers; we’ll come to you for some analytical help. I love it. Well, Andrew, thank you so much for your time. Where can people connect with you to learn more and stay in touch?

Andrew Richardson (32:50):

Yeah, so LinkedIn, obviously: Andrew Richardson. I think on there it’s Analytic Andrew, which is also my Twitter handle. And then through our website, check us out at tinuiti.com, T-I-N-U-I-T-I. We’d love to have conversations with you all if you’re a brand looking for some help, because we know this is not an easy thing to navigate.

Host (33:16):

Well, wonderful. Thank you so much for sharing your expertise with us today. It was a really fascinating discussion, and we appreciate it.

Andrew Richardson (33:22):

Thank you guys. Appreciate the time.

Paul Caiazzo

Paul Caiazzo is an entrepreneur, strategist, and cybersecurity expert with more than 20 years of experience. He is the Senior Vice President of Security & Compliance at Avertium, a company that delivers every facet of cybersecurity services to more than 2,500 esteemed organizations. In this position, Paul oversees technology alliances, guides clients through tough security issues, and leads internal security and compliance initiatives.

Before his work at Avertium, Paul was the Co-founder and CEO of TruShield Security Solutions, one of the fastest growing companies in the cybersecurity industry—which was recently merged with two other companies to create Avertium.


Here’s a glimpse of what you’ll learn:

  • Paul Caiazzo talks about his 23 years of experience in the cybersecurity industry
  • Paul shares the ins and outs of Avertium’s wide range of cybersecurity services
  • How the cybersecurity market has evolved in recent years—and where it’s headed in the near future
  • What is the “zero trust” security model?
  • Paul discusses how increased privacy regulations in California will impact businesses across America
  • Common privacy and security issues that companies struggle with: threat detection, information governance, and user error
  • What to do if you get caught in a situation involving a ransom note and stolen data

In this episode…

Have you ever worried about experiencing a security breach in your business? Do you wonder if you’re doing enough to protect yourself and your customers? If so, you’re not alone—and there are some tried-and-true privacy and security tactics to help you keep your company safe.

Despite your best efforts, there might be glaring cybersecurity issues that are actively putting your business at risk. With ever-advancing technology, it can be difficult to keep up with the latest updates in the security and privacy industries. Unfortunately, this lack of awareness and know-how can increase your risk of cybersecurity breaches, ransomware attacks, and loss of private data. So, what steps can you take today to actively protect your business?

In this episode of She Said Privacy, He Said Security, Jodi and Justin Daniels sit down with Paul Caiazzo, the Senior Vice President of Security & Compliance at Avertium, to discuss how to identify and remedy cybersecurity issues in your business. Listen in as Paul reveals how Avertium locates weaknesses in security systems, the implications of recent privacy regulations in California, and his strategies for overcoming worst-case-scenario security breaches. Stay tuned!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.



Hi, it’s Jodi Daniels, and I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and a Certified Information Privacy Professional, and I help provide practical privacy advice to overwhelmed companies.


Hi, Justin Daniels here. I am a cybersecurity subject matter expert and business attorney. I am the cyber quarterback, helping clients design and implement cyber plans as well as manage and recover from data breaches. Additionally, I provide cyber business consulting services to companies.


This episode is brought to you by Red Clover Advisors. Red Clover Advisors helps companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there is greater trust between companies and consumers. To learn more, visit redcloveradvisors.com or email info@redcloveradvisors.com.


All right, and today we’re really excited to have Avertium’s Senior Vice President of Security and Compliance, Paul Caiazzo. He focuses on corporate development, technology alliances, and strategic initiatives, guiding Avertium’s clients through challenging security problems. Paul also leads Avertium’s internal security and compliance initiatives and works to reduce risk across the organization and its customers. Paul has 23 years of experience, with an extensive background in the federal government and financial sectors. And I’m also going to add that he is a successful entrepreneur as well, because he had his own business for many years before becoming part of Avertium. Paul, what’s up?!


Justin and Jodi, thanks. I’m delighted to be here today. Excited to talk.


Yeah, this’ll be fun. It’ll be like a continuation of the fun we had earlier. Was that really in the spring? It feels like it was yesterday and really long ago, all at the same time.


Remember that? That’s when Jodi was in the closet.


Doing her part. I think I’ve upgraded; now I have a blank wall. I really was hiding in the closet. That was fun. That’s the way to do it, in the closet. I was in the closet. Oh, a different one.


It was in a hotel room, so, you know, it all worked out. I haven’t been in a hotel room in quite some time now, for some strange reason, but those sessions were great. We had a very lovely conversation around incident response in the work-from-anywhere sort of paradigm.


Yeah. Well, I think to get started, you know, Justin, you touched on Paul being an entrepreneur like yourself, and I think it would really help us to understand how you got to Avertium, rolling back to the company that you started and maybe even how you found your way to security overall.


Sure. Boy, that’s a long story; I’ll condense it. As you noted in that introduction, Justin, I have about 23 years of experience in cybersecurity. I started as a systems administrator, a very technical, hands-on person, and I think having exposure to a really broad variety of technologies allowed me to become successful at security, because you have to understand how things work together to really be able to secure them. I live in the DC area, and I worked for some defense contractors on various military bases here, supporting missions for the Navy, the Marine Corps, etc. Then I moved to a different program supporting foreign military sales for a North African country. They had a really significant counterterrorism initiative going on, because Al Qaeda was really prolific in that country at the time.


And so we were working on building out a network that supported a national 911-type system so that people could call in emergencies, and I was able to take the lead for all the security initiatives on that: implementing the technologies, evaluating, selecting, et cetera. That single program really instilled in me the self-belief required to go out and do my own thing, which led me to start my company, TruShield, in 2008. We did a lot of federal government contracting, a lot of financial institution contracting and consulting, and things along those lines, and ended up starting a managed security services program about three years in. So in 2011 we became an MSSP: 24/7 monitoring and threat detection and things like that. That’s what sparked the interest of the private equity group that ultimately purchased TruShield in 2018, Sunstone Partners, which bought my company as well as the three other companies that have since been purchased and rolled into what became Avertium.


I’m still around because I’m super excited about what we’re doing at Avertium. I think we make a difference, and that’s something I’ve held near and dear to my heart for a long time: in cybersecurity, at the end of the day, you are making a difference. You help companies protect themselves and protect their customers, and similarly, Jodi, from your perspective with privacy, you help build the trust of our customers’ customers so that they can operate securely and protect everybody. That’s the key thing for me: we strive to make a difference.


I love it.


Well, Paul, on the Avertium front, would you like to talk a little bit more, to educate our audience, about the kinds of services Avertium provides? Since it is the merger of four companies, it provides a pretty vast array of different cybersecurity services.


Yeah, sure. So we consider ourselves to be a full-scope cybersecurity company. We split the business into two practice verticals: one is professional consulting services, and the other is managed security services. On the professional services side, we really focus on compliance, risk, governance, things like that. So a lot of PCI work, we’re a PCI QSA; quite a lot of HIPAA work; and HITRUST, where we’re a HITRUST certified assessing organization, so there are a lot of privacy impacts there as well. There are also various bespoke offerings: when a customer has some complex security problem they’ve got to solve, we can parachute in some really seasoned experts. Incident response falls under that practice as well, so when a company gets breached, or a major ransomware attack hits, which happens frighteningly often, our experts can help get the company through it. On the managed services side of the house, we’re a 24/7 shop. We do threat detection and response using a variety of technologies, like a SIEM or EDR tools. I mean, you probably are hearing my dog in the background, and I apologize.


No worries there. And speaking of threats, as we’re talking about threat detection: we have a very sophisticated security system here called the loud dog.


Yeah, mine’s a tiny little Westie, so I don’t know that he’s going to really scare anybody off, but he certainly thinks he’s bigger than he is. Anyway, apologies for that. But yeah, within the MSSP space: quite a lot of endpoint detection and response, or managed detection and response, SIEM-based monitoring and threat detection, vulnerability scanning, etc. So again, a full-scope cybersecurity shop; we try to do a lot for our customers with all those different services.


I just did a video yesterday, actually, about how we’re not going to apologize for our dogs. This is a podcast, so everyone listening, just enjoy the dogs; they’re part of the fun. In all seriousness: thinking about how you started your company, where we are today now that you’re a combination of other companies, and the threat landscape that exists, how have you seen the market evolve? A little bit of where we came from and where we are, and then a little crystal ball: where do you think we’re going? What are you seeing happening with the evolution over the next two to three years?


So I tend to think that business problems don’t change anywhere near as fast as technology does. People struggle with some of the same stuff and will continue to struggle with a lot of the same stuff, mostly around information governance and threat detection, which are really the problems we try to solve. But the strategies companies take to overcome those challenges do tend to change as technology evolves, and a lot of what we’ve seen growing interest in, really a thirst or hunger for information about, revolves around zero trust networking as a strategy to help secure data assets and also to assist with threat detection and response. We’ve also seen the emergence of a category of tools called XDR, or extended detection and response, and there’s been a huge amount of interest in that this year.


Traditionally, people use things like a SIEM that’s focused on monitoring the network edge for threats coming inbound, but that doesn’t really work anymore, because the perimeter sort of no longer exists. Everybody’s working from home, and data is no longer in a data center; it’s in one of many cloud providers, and often in multiple cloud providers. So trying to maintain consistent data governance, information governance, and even identity governance across that diverse environment is a challenge. I think that’s where organizations are investing their time, effort, and technology spend: on solutions that provide secure connectivity and that real manifestation of need-to-know and least privilege. To me, that all boils down to zero trust. I think in 2021 and beyond, you’re going to see much more adoption of zero trust strategies.


Zero trust is not really a tool, necessarily; it’s more of a discipline, or a philosophy. How do you ensure, first off, that the people accessing a data resource are the right people? You contextually identify the human and then give them access to only the things they need, in a dynamic manner, which supports a lot of the orchestration and automation that rapid response really requires. So that’s where I think the market’s going, and what our customers have been telling us.


Paul, just for the benefit of our audience: you talked about zero trust being a change of philosophy. If I’m a person in the C-suite and I’m not as familiar with a term like zero trust, explain the difference between the zero trust philosophy and what you might typically see now.


Yeah, sure. Traditionally, the paradigm is that if a user is on your network, physically located on your network, they are trusted by default; a sort of de facto trust is granted to that individual. People outside the network are not trusted, so there are extra steps of authentication required, or there may be different levels of access granted to them because they’re not on the network. The idea behind zero trust is that we want to treat all of those types of users the same: we don’t trust any of them, whether they’re on the network or not. Location is not really a factor in whether you should trust a user; it all boils down to the identity context.


So, where is that user logging in from? Is the device they’re logging in from secure? Has that user logged in at that particular time in the past? There’s a wide variety of things you can use to contextually identify the user, and then you grant them access only to the stuff they actually need. Now, the problem is that most organizations have not gone through the comprehensive workflow analysis and data classification process that has to happen to support that, and that’s one of the key barriers that has to be overcome for an organization to start drifting toward the zero trust model. But it really boils down to: don’t trust anybody until they’ve given you a reason to be trusted.
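The contextual checks Paul describes can be pictured as a small policy function over identity context rather than network location. This is a minimal, hypothetical sketch (the signal names are invented for illustration), not any particular zero trust product:

```python
from dataclasses import dataclass

@dataclass
class AccessContext:
    user_id: str
    mfa_passed: bool           # second factor verified this session
    device_managed: bool       # device enrolled, patched, and healthy
    location_usual: bool       # login location matches the user's history
    resource_sensitivity: str  # "low" or "high"

def decide(ctx: AccessContext) -> str:
    """Return 'deny', 'step_up' (require re-authentication), or 'allow'.

    Note that being on the corporate network is deliberately NOT a signal:
    under zero trust, every request is evaluated on identity context alone.
    """
    if not ctx.mfa_passed:
        return "deny"
    if ctx.resource_sensitivity == "high" and not ctx.device_managed:
        return "deny"
    if not ctx.location_usual:
        return "step_up"
    return "allow"
```

Real zero trust platforms evaluate far more signals and do so continuously, but the shape of the decision, deny by default and grant only on verified context, is the same.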


So what does that look like, the data classification piece and the work that has to get done? What does a company need to do to get to that foundation you just described?


First off, what sort of business are you in? You can get some good insight into the types of data you’re going to have depending on what industry you’re in. For instance, a financial institution is going to have a different type of data than a healthcare institution, but at the end of the day, it probably needs to treat that data pretty similarly. So you really need to map your business processes. You’re consuming data from your customers, from your third-party vendors, from your employees, and all of that is sensitive data in some way, shape, or form. There’s also going to be intellectual property that an organization needs to take care of, and for some organizations that might be the most important thing. If you’re a heavy development shop with some interesting new technology, or you’re a startup, that IP is probably the most valuable thing on the network, and you have to protect it. But the other thing, and this is what I always think is the most challenging component for our customers, is not just what data you have, but where it is. Very often, people have sensitive files on their laptops, for instance, or in a Dropbox account or something like that, or in their email. It can be difficult for an organization to map all that out. There are some tools that assist with that, but it really boils down to understanding your business very well: the processes your users use to interact with data and resources, and ultimately how you’re interacting with third parties, like your customers or business partners.
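The "what data do you have, and where is it" problem is what the data-discovery and classification tools Paul mentions automate. As a toy illustration of the idea only (these patterns are simplistic placeholders; real tools use far richer detection logic), a scanner might flag text containing sensitive-looking patterns:

```python
import re

# Toy detection patterns for illustration; production data-discovery
# tools combine many patterns with validation and context analysis.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitive-data labels found in a blob of text."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}
```

A discovery tool would run something like this across file shares, laptops, and cloud storage, then map the hits back to business processes, which is the inventory work zero trust depends on.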


Well, thank you. I think that’s really helpful.


I think we want to change gears just slightly, and I’m going to take over Jodi’s role for a minute. Yes, well, now that Paul just taught me about zero trust: if I text you in the house, you’re not going to trust me, whether I texted you in the house, outside the house, or in another country. Yes. Well, you do that, sometimes anyway. Paul, the question I had is: what role do you think the increased privacy regulations, especially what we’re seeing in California, will play, and how will they impact your business from a security standpoint?


Well, there are a couple of ways. Whenever a new regulation comes out, and this is really the business development side of me speaking here, we view that as opportunity, right? Because a company is going to struggle trying to understand how to comply with the new set of regulations, and we generally can help with that sort of thing. So we view it as opportunity, but there’s also a challenge there for us. If you think about what I described earlier around our MSSP services, our job is to monitor our customers’ networks, and so we’re in the process of collecting a lot of identifiable information. We’ll geolocate a user, we’ll profile that user’s behaviors, we’ll track their email addresses and all those sorts of things, which could constitute PII in one way, shape, or form.


So we have to be careful about how we protect that data. In the normal course of operations, we’re not collecting things like Social Security numbers or PHI, but since we’re capturing log data, and in some cases can do packet captures of networks, there’s a chance we’re going to collect something that is protected data, or really needs to be protected. So we had to work pretty hard to, first off, build out our systems in a way that we have that zero trust architecture built into our platform, so that the only people who can access that type of data are the ones who need to use it. Users outside of our cyber ops centers of excellence, which are our security operations centers, can’t access any of that customer data at all.


Only the analysts can, and that’s really part and parcel of doing the job correctly. But we’ve also had to prove that we’re able to do that, by going through the various third-party audits we go through, which fall to me as our CISO. I think that gives our customers some confidence that we know what we’re doing, which we do. So that’s a good thing. But it’s been a challenge, simply because, just like anybody else, when a regulation changes, you have to react to it, and preferably try to get in front of it by doing it correctly to begin with. There are just so many different jurisdictions for privacy at this point that I think that’s going to be the big challenge until there’s a single national privacy standard: people are going to struggle to figure out which jurisdictions they have to comply with.


So have you seen companies... I’m not sure anyone loves regulation and signs up for it and says this is great, but are you seeing companies being more willing to really review their security measures now because there’s a regulation, because there are fines and penalties associated with it, whereas before it was, "Well, I probably should, but maybe I won’t"? Are you seeing that, or not so much? And are you seeing only the companies who have to deal with the California regulations? What about a company here in Georgia: maybe I’m a local company, I really don’t have California customers, I’m pretty regional. Am I paying attention because it might come, or am I still ignoring it?


So I think that anytime there’s a "thou shalt" versus a "you should," people tend to move more quickly in response. So certainly the potential for fines and things like that has moved the needle for some of the customers we talk to. I don’t think I’ve seen too many organizations that really have no current potential exposure to CCPA worry about it too much. To your example, Jodi, though: if you’re a Georgia-based e-commerce company, for instance, there’s a very good chance you’ve got California-citizen customers, and therefore you do have something to worry about. I certainly think a lot more people are paying attention to security right now, but I would actually say in a lot of cases that is more incident-driven.


And what I mean by that is you don’t need to look through too many Google search pages to find really scary ransomware stories. I think that’s really driving a lot of people to take it more seriously than ever before. Again, from a business development standpoint, the business guy in me sees one of those reports and says, well, there’s opportunity there. But the security guy in me sort of shakes his head, because there was probably some fundamental problem that should have been solved a long time ago that created that particular incident. I always hate to see somebody have that really bad day of being hit by a serious security incident.


Paul, I want to walk you through a scenario, and we’ll get to a question. So we talked about increased regulation. Companies get hit with ransomware, or my other favorite, phishing that leads to wire fraud, and then they realize, hmm, we need more security, but that’s not the business we’re in. So we’re going to go to a third-party provider like Verdean and have Verdean help us with our security. But now what happens as Verdean gets bigger and has hundreds of clients? If I’m the threat actor, I’m thinking, I don’t need to go after those clients; I’m going to go after the common point, which is the MSSP. So I’d love for you to talk a little bit about your thoughts on what may keep you up at night: the security of the MSSP that’s helping with the security of all of its customers.


Yeah, absolutely, and that does keep me up at night. As I mentioned, I’m our CISO, and in effect I view myself as the de facto CISO for all of our customers; the million-plus devices we’re monitoring effectively roll up to me for accountability. So it certainly does keep me up at night that a potential impact to us could compromise the security of our customers. Now, the way we’ve architected our platforms really mitigates that risk down to basically zero, because there isn’t the ability for a threat actor to jump from environment to environment. However, I do agree that as our profile grows, or as any service provider’s profile grows, you become a bigger and bigger target. And I know there are threat actors out there; APT10 is a great example.


They focus specifically on service providers, and to great effect. I actually worked on an incident where the bad guys compromised a managed services provider, not a managed security services provider but more of a managed IT provider, and because their networks were not architected very well, they were able to compromise a large array of enterprise-class companies that were customers of this MSP. That happens pretty frequently. And given that those threat actors are there to find the quickest way to monetize that illicit access, sure: if I can compromise one company and by proxy also compromise their dozens or hundreds of customers, certainly that’s going to be an attractive proposition to a bad guy, and I guarantee you they’re going after it. So we take it seriously.


We not only have architected our systems to provide the defensive measures, but we use the same detection and response technologies to monitor ourselves as we use to monitor our customers. The same threat intelligence, the same detection strategies, the same sort of rules that we use to monitor customer networks, we’re also using on ourselves to give ourselves some advance warning. That also helps us curate additional threat intelligence, which we can provide to customers and also publish out to the market; if you follow us on LinkedIn, you’ll read some of our threat advisory reports. Yeah, it’s a serious issue, and it’s happened recently. In fact, I think there were some MSSPs recently that were attacked by a ransomware-as-a-service threat actor, which popped one of our competitors, which I’m not going to name. That certainly opened my eyes.


Yeah, it’s a big risk. So we talked a little bit about privacy and security and new regulations and how companies might be struggling with that. Where do you think companies are still struggling in general, with the different threats that are out there? What are some of the top areas?


Yeah, there’s a handful. Threat detection is challenging: being able to detect a sophisticated threat actor on a company’s network is not an easy thing to do, and if a company is trying to do that themselves, chances are pretty good they’re not doing it well. The tools that support that sort of motion are tricky to make work and to integrate well together. So that’s one of the key things: just being able to detect the bad guys, and then actually responding when something does happen. The second big thing is information governance. Like I mentioned earlier, just knowing what data you’ve got, where it exists, and how to protect and secure it in a manner that actually works for the business.


That’s a big challenge. And the third and last one I’ll mention is really the users themselves. Unfortunately, security awareness is just not as high as it needs to be. People get phished all the time, day in and day out, and I don’t think that’s going to change. You can try to implement all sorts of email gateways and things like that, but one good phishing email lands in the wrong inbox, the person clicks it, and either pays the fraudulent wire, to your point, Justin, or downloads some malware. Because the network is very open and flat, that creates a really significant propagation event where malware gets dispersed throughout the network very quickly, through that one simple phishing attack. So that is something to keep focused on, and it’s not going to taper off.


I love the comment about data. I feel like that’s what I do all day. People will come and ask for a privacy notice or something, and I say, well, we have to go back to the data. The single point of truth is always understanding the data that you have and what you’re collecting. On my side, I’m always asking: what are you doing with it, who are you sharing it with, and how are you using it? And then on the security side, we need to know where it’s being stored, in all those different places, just like you described.


Yeah. And the other thing, sorry, Justin, this speaks back to the ransomware situation, and I’ll be real quick on this. Ransomware threat actors now are moving toward double extortion, which you’ve probably heard of, but for the sake of the audience: this is where the ransomware threat actor not only encrypts the systems for impact, but also steals data before actually encrypting all the systems. That data theft is what’s being used as leverage to get the victim to pay the ransom. What will often happen is that the company that’s been ransomed does not know exactly what data has been stolen. Very often, I ask this question during most events where I give a talk on ransomware: how many of you would be willing to negotiate with a threat actor?


Very few people ever raise their hands and say yes, but the problem is that because of this double extortion, you have to. You have to know what data they took, and it’s incumbent upon you to understand what data has been stolen from you, because you may have breach notification guidelines you must comply with. CCPA, HIPAA, they all have breach notification regulations that you must comply with. So if you don’t do the job of understanding what the threat actor has taken from you, you’re sort of negligent as it relates to the stewardship of that data. That’s a key thing. And Justin, I know I cut you off. No, that’s all right. Jodi does it all the time.


So thankfully this won’t air before Thursday, because I’m working on a tabletop exercise, and that’s exactly what it is. But the other interesting point I’d love to have you talk about is: how often, if they’ve paid the ransom, do they find out that the extraction of the data was not as widespread as they thought? That’s part of what the threat actor is counting on, because if you don’t know, it really puts you in a position where they’re incentivizing you to pay.


So I think the first thing you’ve got to do if you’re in a situation like that, if you find a ransom note on your laptop, for instance, or on a server, is call a guy like Justin, because you need counsel in a situation like that. You need outside counsel; it’s really important.


And something I always say, and not just to fluff you up there, Justin, is that it’s really important to engage outside counsel very quickly. Anyway, what you then do, if you start negotiating with the threat actor and you’re concerned about that data theft, is ask for what’s called a proof of life, which is effectively going to be screenshots of folders or directories, or in some cases the actual files that have been taken. A threat actor will be able to provide that to you. And if they can’t, if they’re saying, well, we stole a terabyte of data, but they’re unwilling to show you what data they have stolen, I wouldn’t trust them. I would think they’re probably just using that as leverage and probably haven’t actually purloined the data they purport to have.


So I would be looking for the proof of life. Then, if you ultimately do pay the ransom, what I’ve seen happen is that the threat actors, especially the more sophisticated ones, do destroy the data, or at least don’t leak it, because it’s in their interest from a business perspective to do so. If they get the reputation that people pay and the data still leaks, then people are going to be less likely to pay, because the incentive is diminished; they’d be defeating their own interest in monetization. What I have seen happen is less on the data theft side and more on the decryptor side. When you get hit with ransomware, it’s going to encrypt your systems, and what the threat actor is going to do if you pay is provide you with what’s called a decryptor.


Those are generally pretty sloppy, because they’re not enterprise-class software built to encrypt and decrypt consistently, so you often wind up with corrupted files even after having paid the ransom. The decryption process itself can be very cumbersome, depending on whether you get what’s called a universal decryptor, which is basically one decryption key that will work for all the systems that have been encrypted, or whether it’s unique to each individual system. Imagine a situation where you’ve got, say, 10,000 machines in your network and they all get encrypted, and I’ve seen that happen, and then you’ve got to have the individual decryption key for each individual system. For recovery from that, you might as well just start over, and we have guided customers to do that as well.


I’m thinking we should have the sequel to that Russell Crowe, Meg Ryan movie, Proof of Life. I’m thinking screenplay, Paul. Well, Paul, this has been a really fun conversation, with Proof of Life screenplays, zero trust, and just overall good practices that companies should be thinking about. What we ask everyone is if you can share: since you do this all day, every day, you probably have some favorite personal cybersecurity tips or things that you do. So what would be a favorite personal cybersecurity tip?


Multifactor authentication, all day long. That’s the most important thing anybody can do, because you should just assume that at some point your credentials are going to be compromised; if you have that second factor, you’re doing a better job of protecting yourself. From a personal standpoint, whenever I’m on the internet, even right now, I go through a personal VPN. There are platforms out there that can encrypt every single bit of traffic you’re ever engaged with on the internet. The platform I use, I have on my laptops and my mobile devices, and it’s consistent across all of them, so the user experience is pretty transparent, and I use it for everything. And since, at least in the old times, I traveled pretty much constantly, I did not want to use an unsecured wifi in an airport or a hotel without some added level of protection. So that personal VPN is important for me. Those are really the two things I would recommend everybody do: multifactor authentication and a personal VPN.


So do you have any particular brands or services that you use? We’d love to include them in the show notes so people can go and grab them.


Yeah, sure, happy to. I have no personal relationship with any of these companies, but the VPN I use is called Private Internet Access, or PIA. It’s a subscription; you pay an annual fee, it’s not terribly expensive, and it works very well. For multifactor authentication, at Verdean we use Okta, which actually is a strategic partner of ours, but that’s more for corporate use. For personal use, generally the platforms you interact with, whether it be your bank or Gmail, are going to have a multifactor option built in, and you can simply enable it.


Great. Well, thank you.


Yes. And now we’re going to divert completely from security. What do you like to do when you’re not in the office, not securing everyone’s networks, for fun? I’m a musician, so I like to play guitar and piano; that’s a nice way to relax a little bit. I like cooking, so last week was fun for me, being Thanksgiving. Even though the crowd was rather small, I still cooked a large meal. Then I think probably the other big hobby I’ve always had is working on cars. I’ve got some old cars that I take apart and put back together, sort of in perpetuity, so I’m always working on some car someplace.


Well, my dad used to enjoy that when he was young. I personally don’t get it; I just bring the car to the car people, and they do that. But I like the cooking one. I cook; it’s very relaxing and therapeutic. I like it.


Great. So for me, it’s chopping an onion, right?


Chopping an onion is good. I like baking: mixing, and then you put it in the oven, and then magic happens.


Baking I’m not good at, and I think this is the difference between baking and cooking: one’s an art, one’s a science, right? Because with baking, you’ve got to get the recipe right, and I’m not a recipe guy.


Yeah. And see, I’m very methodical. I want my recipe; then I can follow it, and the magic happens. With the whole art thing, I’d have to know how to magically put it all together.


Well, Paul, thank you so much. How can people stay in touch with you? Well, I’m pretty active on LinkedIn, so you might want to drop my LinkedIn profile link; people can contact me there. There’s always some content we’re producing around cybersecurity threats and the nature of the risks we’re all facing. Other than that, if you follow Verdean’s website, I write a lot of the white papers and content there with the rest of my team, so there are always good resources there too.


Awesome. Well, thank you very much. And I think Paul and I are going to get to work on that screenplay. Big visions: now cybersecurity can be cool. There you go, a ransomware screenplay, Proof of Life. So is Russell Crowe going to play you or me? I’m going with me; I’ll have to grow the beard out, though. Well, thank you again.


Jodi, Justin, this has been great. Thanks very much. Happy to do it, and I look forward to doing it again sometime soon.

Jodi Daniels

Jodi Daniels is the Founder and CEO of Red Clover Advisors, a boutique data privacy consultancy and one of the few certified Women’s Business Enterprises focused solely on privacy. Since its launch, Red Clover Advisors has helped hundreds of companies create privacy programs, achieve GDPR, CCPA, and US privacy law compliance, and establish a secure online data strategy that their customers can count on.

Jodi is a Certified Informational Privacy Professional (CIPP/US) with over 20 years of experience helping a range of businesses in privacy, marketing, strategy, and finance roles. She has worked with numerous companies throughout her corporate career, including Deloitte, The Home Depot, Cox Enterprises, Bank of America, and many more. Jodi is also a national keynote speaker, a member of the Forbes Business Council, and the co-host of the She Said Privacy, He Said Security podcast.

Justin Daniels

Justin Daniels is a cybersecurity subject matter expert and business attorney who helps his clients implement strategies to better manage and recover from data breaches. As outsourced general counsel for Baker Donelson, Justin advises executives on how to successfully navigate cyber business and legal concerns related to operations, M&A, incident response, and more.

In 2017, Justin founded and led the inaugural Atlanta Cyber Week, where multiple organizations held events that attracted more than 1,000 attendees. Justin is also a TEDx and keynote speaker and the co-host of the She Said Privacy, He Said Security podcast with his wife, Jodi.


Here’s a glimpse of what you’ll learn:

  • What is a smart city, and how does it differ from a typical city?
  • The major privacy concern about smart cities: less control over personal data
  • Justin Daniels shares the real-life issues he has seen with smart cities around the US
  • How a city’s funding impacts its ability to prioritize citizens’ privacy/security needs
  • Jodi Daniels discusses the importance of building trust with your citizens as a city manager, administrator, or regulator

In this episode…

Have you ever thought about what the future of technology, urbanization, and transportation would look like? Perhaps images of flying cars, holograms, or teleportation come to mind. While flying cars aren’t yet a staple in the 2020s, other technology has advanced our cities to a startling degree.

Today, the normalization of “smart cities” is on the horizon. On paper, a smart city is simply an urban area that utilizes advanced technology to make day-to-day life more efficient and convenient. However, the reality of smart cities can be a bit more complicated. When technology is being utilized to monitor the behavior of citizens, there is bound to be a plethora of privacy and security issues. So, what do you need to know about the data and privacy concerns of smart cities—before they become a normal part of our lives?

In this episode of She Said Privacy, He Said Security, Rise25 Co-founder John Corcoran sits down with Justin and Jodi Daniels to discuss the pros and cons of smart cities. Listen in as Justin and Jodi reveal what smart cities are, the privacy and security concerns they present, and how local government officials can better protect their citizens’ personal data in an age of technological advancement. Stay tuned!

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.


Intro (00:01):

Welcome to the She Said Privacy, He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Host (00:21):

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s business enterprise focused on privacy. I’m a privacy consultant and a Certified Information Privacy Professional, and I help provide practical privacy advice to overwhelmed companies. Hi, I’m Justin Daniels. I am a cybersecurity subject matter expert and business attorney. I am the cyber quarterback, helping clients design and implement cyber plans. I also help them manage and recover from data breaches, and I provide cyber business consulting services to companies. We have John Corcoran here, who has done thousands of interviews; we have flipped the script, and he’ll be interviewing us today. John, take it away.

John Corcoran (01:05):

All right, Justin and Jodi, it’s a pleasure to be with both of you here today. In this episode, we’re going to be talking about smart cities. What are they? How do you deal with them? What are some of the privacy concerns we have about them? We’re going to talk all about that. But first, this episode is brought to you by Red Clover Advisors, which helps companies comply with data privacy laws and establish customer trust so they can grow and nurture integrity. Red Clover Advisors works with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, professional services, and financial services. In short, they use data privacy to transform the way companies do business together, creating a future where there is greater trust between companies and consumers. To learn more, you can go to redcloveradvisors.com, or you can email info@redcloveradvisors.com. All right, guys, let’s launch into this topic of smart cities. It’s a term many of us have heard of, but probably a lot of people aren’t sure exactly where the line between an existing city and a smart city starts. So Justin, I want to start with you. Can you define for us what this term smart city means?

Host (02:17):

Well, when I think about a smart city, I think about the use of technology to make a city more productive, more efficient, and easier for its citizens to access. Think about transportation and how technology could be used to manage traffic better and how traffic lights might be done differently, or how you might use things such as drones to make transporting products and services within the community more efficient.

John Corcoran (02:49):

Those sound like all really positive things, but I imagine there is a flip side. So Jodi, let me turn to you: what are some of the privacy concerns we should all be aware of when it comes to the development of these smart cities?

Host (03:04):

Well, it all goes back to what I’m always talking about, which is what data is being collected and how it is being used. If I’m standing in a public area and there’s some cool gadget that’s going to monitor and collect information: does it have a picture of me? Does it have a picture of my car? Does it know what I’ve done? Can it connect to my phone? What kind of information will it have? And then, okay, let’s say it has my license plate. Great. Well, what does it do with it? Does it just keep it in a nice little filing cabinet for posterity? Is it going to match it up with something else? Or did it even get that information by accident, because what it was really trying to do was figure out how many cars were going through, but the only way to do that was to capture the license plate, which it doesn’t really even want? Does that make sense? So it really all goes back to understanding data collection: what data are we collecting, and how is it being used?

John Corcoran (04:04):

And what sorts of categories of data are you seeing being collected these days? It sounds like photographs, video, personal information.

Host (04:15):

Yeah, and some of the other newer ones are going to be biometric information: eye scan, fingerprint scan, facial recognition, things like that. And Justin, I’m sure you have some others.

Justin (04:29):

I think the other kinds of data would be what route you traveled to work, or where you are at a certain time of day during your workday. So you can use one piece or multiple pieces of data to really construct where each one of us goes and what we're doing all day. And we may not even realize at the time that all of this data is being collected and can be pieced together to create a kind of roadmap of where we are, where we're going, and what we're doing all day.
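Justin's point about piecing observations together can be illustrated with a toy example. Assuming a handful of hypothetical sensor records that share a device identifier (every sensor name and record here is made up), simply merging and time-sorting them is enough to reconstruct a route:

```python
from collections import defaultdict

# Hypothetical records from separate smart-city sensors. Each sensor knows
# only (location, timestamp, device identifier), yet joining on the
# identifier reconstructs a person's day.
observations = [
    ("toll_camera_5th_ave", "07:42", "device-A"),
    ("transit_reader_midtown", "08:05", "device-A"),
    ("wifi_beacon_office_district", "08:31", "device-A"),
    ("transit_reader_midtown", "17:58", "device-A"),
]

# Group by device, then sort each group by timestamp to recover the trail.
trails = defaultdict(list)
for place, time, device in observations:
    trails[device].append((time, place))

for device, stops in trails.items():
    route = " -> ".join(place for _, place in sorted(stops))
    print(device, route)
# prints: device-A toll_camera_5th_ave -> transit_reader_midtown
#         -> wifi_beacon_office_district -> transit_reader_midtown
```

No single sensor in this sketch "knows" the commute; the linkage only appears once the records are joined, which is exactly why the collection question matters before deployment.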

John Corcoran (05:00):

So, for example, to use your example of the route we drive to work: let's say there was a very contentious custody battle between two parents, and that information was public. The other spouse could find out what route they're driving to get to work. Is that the sort of thing that might arise from this type of situation?

Justin (05:18):

Or think about the route they may be driving to go see someone their spouse isn't aware of. Or you could have another situation, and I've seen this happen already with attorneys that I've talked to: if you have smart devices all over your house and you're living with someone and you're mad at them, you might use the smart thermostat when it's really hot and turn the air not down but up. There are cases where people have done that because they're mad at somebody. Why get mad when you can get even by doing something like that, utilizing smart technology?

John Corcoran (05:54):

And so we need to think about what data is being collected, and then the next piece is what should be shared and what shouldn't be shared. So Jodi, turning back to you, share with us your thoughts on what should be shared, what shouldn't be shared, and what our expectations are here.

Jodi (06:12):

Well, that's a key question around expectations. If I'm in a public area, is the expectation that it's public data or that it's private data? That's kind of the quintessential question. You go to an event and someone's taking a picture: was that a public picture with you in it, or was it a private picture? And the other piece to sharing is who it's going to. Are these going to vendors or service providers that might be doing something else with the data? Maybe they're going to help us analyze it, and so they need that information. How that company is going to use it is going to be very different, and our concern is going to be making sure they only use it for what we sent it to them for, with a strong contract in place. But do we share it with some other party? Maybe we're going to participate in a research study, and who else is going to have that information, and are they going to take it and match it with something else?

Jodi (07:12):

So it really goes back to that use piece, and connected to all of this is how you inform somebody about what's going on. We have the idea of smart devices that a city, a public entity, might have. So where's the obligation? If I'm driving, you can't just pop up notices all the time if there's something on a traffic light. We've had cameras up before, and for those you kind of just had to know, or you'd go to a central place to learn what was being captured. So it's an interesting challenge: how many devices there are, and how someone learns about what's actually happening, in language they can understand. Justin, I know you work on this a lot, so why don't you share some of your real-life examples?

Justin (08:12):

So to Jodi's point, in some of the work that I've been doing for the Curiosity Lab, I've researched what has happened in some other cities. In San Diego, they put up video cameras as part of smart streetlights, and their contract allowed the manufacturer of the streetlight to get the data and sell it to third parties. They didn't go through their procurement process and negotiate that, so when the citizens found out, they were understandably upset. But then the other thing that happened in San Diego with those streetlights was the Black Lives Matter movement, and all of a sudden the streetlights that were supposed to be for innovation and technology started to get the perception that they were being used for surveillance. To me, when you start talking about smart cities, one of the biggest issues you have to grapple with is the difference between innovative technology for smart cities and enabling government to conduct more surveillance.

Justin (09:21):

And so what Jodi and I are really talking about is how you proactively think about privacy at the inception of designing, implementing, or deploying all of this technology. Because what happens in too many places, since the public sector is no different than the private sector, is they deploy all this, and then the privacy crap hits the fan and they worry about it later, with a lot of negative results. One of the other examples I looked at was Sidewalk Labs in Toronto: once they lost the support of the public, there was no getting it back. It wasn't like you could run another campaign to get people to say, okay, no harm, no foul, we're going to believe in this. It completely torpedoed the whole project. That's why I think, in this whole public arena, privacy is a critical issue to think about before you deploy the technology. Because as we've alluded to on this call, when Jodi and I are in the public right-of-way, we really don't have an expectation of privacy. But what are you doing if you are collecting and using that data? I didn't get to consent to how you're using it, and yet it really impacts me, as in several of the examples we talked about just a few minutes ago.

John Corcoran (10:38):

Would you say that cities are doing a good job so far of anticipating the different privacy concerns that come up, or are they not? It sounds like there are plenty of examples where cities haven't anticipated the ways privacy concerns arise. Jodi, I'll turn it over to you.

Jodi (10:58):

Yeah. I mean, I think there are certainly some cities that are doing a really nice job of realizing that this is a modern-day problem, and they're taking the steps that an organization of any kind, whether a company or a public entity, should be taking. At the same time, there are a lot of cities that are not even realizing all the different areas they need to be paying attention to. So I would say it's probably fairly similar to what you would find in the private sector. There's always room for improvement.

John Corcoran (11:34):

Following up on that question, Jodi: how should cities that want to take advantage of new technology, but don't want to lose trust with their citizenry or infringe on privacy, be anticipating these issues? Is there a public process? What would you recommend?

Jodi (11:54):

So I'm actually going to send that over to Justin, because he's worked on some things like this and has some good firsthand notes.

Justin:

I guess the first thing I'd like to say is that I'm going to quarrel a little bit with Jodi and say that most cities are not doing a good job with privacy. And there's a reason why, and it's one word: funding. Think of private corporations who have lots of money, and they still struggle with privacy and security. You've read about it in the paper.

John Corcoran (12:24):

To that point, there are companies that have billions of dollars, and they still have big, unanticipated privacy issues that come up. So now think of a small city that has a much smaller budget.

Justin (12:36):

Yeah. So what I'm going to say is most cities aren't even thinking about this or doing anything about it, because one, there's a lack of awareness, and two, even when there is awareness, where do you find the funding as a public entity to do a lot of this? Doing it requires a lot of time, money, and effort with smart people. So I would argue that on funding alone, it's an even tougher problem in the public sector. But to answer your question about what cities can do: one thing, which Jodi does a lot, is privacy notices on websites, and if you have smart cameras in the right-of-way, you may put up notices in the right-of-way. The other thing, and I think this is going to be more of a societal conversation we'll have to have, is that if you do collect data in the right-of-way, maybe it's only used for something like a research project.

Justin (13:33):

It's not used to commercialize the data by selling it to third parties. Maybe as a city you hold an open city council meeting and say, hey, we want to implement this. You hear from citizens, and you put in place a set of privacy principles that talk about transparency and other things. And then, as Jodi will tell you, you can't just have a policy, because it doesn't matter what you say; it matters what you do, and what you say has to be consistent with what you do. So that's really where it comes into play: you've got to decide what you're going to do and do it, and then you put things in writing, in terms of privacy notices and whatnot, that are consistent with what you actually do. And you do all of this prior to the deployment of the technology. That way, when the unanticipated thing inevitably happens, you're in a much better place, because you've thought about these issues and can address them, as opposed to not thinking about them at all, having no idea what to do, and losing public confidence, which is very difficult to get back.

John Corcoran (14:41):

Jodi, are cities creating city committees or subcommittees with stakeholders involved? Do you see anything like that happening right now?

Jodi (14:52):

Not really. Maybe there's a person who's aware of it, thinking, oh, I should pay attention to privacy, putting their privacy hat on, or recognizing, I need to get more information on this or speak to experts and bring in the right people. But a committee? I haven't really seen a whole lot of committees taking place. What I would offer is that for anyone embarking on this, the stakes to me are extremely high when you have taxpayers. No one wants to lose a customer, but if a company loses me as a customer, I can go find another company to get my item, and the company can try to lure me back. Here, you're going to have really angry taxpayers who might try to vote out of office whoever was responsible. This is the place that they live.

Jodi (15:46):

This is their society. This is their home. This is very different information than whatever I might have given to a company. So with the public trust that Justin was talking about, to me the stakes are extremely high when it comes to my home, my area, where my tax dollars are going. I can't control where tax dollars go; my vote is the only way I can control that. Whereas with a company, I can absolutely control it: if you do something I don't like, I'm just not going to use you as a company anymore.

John Corcoran (16:16):

Yeah. Justin, you mentioned a second ago privacy practices, like a kind of template. I know you've been involved in these sorts of things, but are there templates being developed, especially for smaller cities that don't have the resources, so they can put in place a set of best practices? So that at least on the guideline side, as opposed to the committee oversight side, they have something in place? Or are smart cities so dynamic and changing that it's hard to create something templated rather than customized as a set of best practices?

Justin (17:01):

So I know that NIST, the National Institute of Standards and Technology, which is a government organization, is trying to come out with some standards that apply to smart cities. You can put together some principles like transparency and some other things. But where we're at, because we don't have any kind of overarching privacy or security laws, you do get into communities that may feel differently about privacy versus commercialization. And I think that's where you have to take into consideration that a city in Georgia might feel one way about how they interact with law enforcement, as opposed to a city in California or New York that may feel a different way. So I think there are ways to have a template approach with overarching principles, but for the implementation, communities aren't homogenous. Even in Georgia, I can go to Atlanta or to a city in South Georgia or Northeast Georgia, and they'll be very different communities. So you have to take that into consideration as you develop this. But I think the key takeaway, and this is what I see time and time again in both the private and public sector, is that you need to be thinking about these issues as you are developing your technology, prior to deployment. Because the default has just been: we'll deploy this, the efficiency is great, the benefit is great, and that privacy and security stuff is an afterthought. Then, when you have an incident or something else, that's when it comes to the forefront. That's the mindset that has been difficult to shift; unless you've had a breach or you're subject to regulation, that's the way it's gone.

John Corcoran (18:50):

We're running a little short on time, but Jodi, any final thoughts on smart cities and privacy concerns that city regulators, managers, or administrators who may be listening should be thinking about?

Jodi (19:04):

I think, to me, privacy is always about building trust. And especially when you have taxpayers, when you have the public, it's essential to be able to establish that sense of trust. What they're doing from a smart city perspective is just one slice of everything else that's happening. So it's about making sure they really understand the capabilities of these technologies and think through all aspects of the data. For example, the situation that happened in San Diego wouldn't happen, because you would have thought through: wait, I have this data, it's going to this vendor, then what happens? You're just following the data trail, all with a basic set of principles.

Jodi (19:47):

It's not only transparency, as Justin just said, but this foundation of trust, because at the end of the day, that's what it's about. The city is trying to do something to make it a better community; that's really what they're trying to accomplish through technology. So if we remember that these are humans, that they're people, and we keep that sense of trust with them, then I think that will start to move privacy more to the top for the cities that haven't already considered it.

John Corcoran:

And if I'm a citizen or a city manager and I want to learn more about Red Clover Advisors and the work that you do, Jodi, where can people go?

Jodi:

Absolutely. Visit us at redcloveradvisors.com or send us an email at info@redcloveradvisors.com, and find us on social media. Find us on LinkedIn.

New Speaker (20:37):

Justin, Jodi, thanks so much. Thanks for listening to the She Said Privacy, He Said Security podcast. If you haven't already, be sure to click subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Jodi Daniels is the Founder and CEO of Red Clover Advisors, a boutique data privacy consultancy and one of the few certified Women’s Business Enterprises focused solely on privacy. Since its launch, Red Clover Advisors has helped hundreds of companies create privacy programs, achieve GDPR, CCPA, and US privacy law compliance, and establish a secure online data strategy that their customers can count on.

Jodi is a Certified Information Privacy Professional (CIPP/US) with over 20 years of experience helping a range of businesses in privacy, marketing, strategy, and finance roles. She has worked with numerous companies throughout her corporate career, including Deloitte, The Home Depot, Cox Enterprises, Bank of America, and many more. Jodi is also a national keynote speaker, a member of the Forbes Business Council, and the co-host of the She Said Privacy, He Said Security podcast.

Justin Daniels

Justin Daniels is a cybersecurity subject matter expert and business attorney who helps his clients implement strategies to better manage and recover from data breaches. As outsourced general counsel for Baker Donelson, Justin advises executives on how to successfully navigate cyber business and legal concerns related to operations, M&A, incident response, and more.

In 2017, Justin founded and led the inaugural Atlanta Cyber Week, where multiple organizations held events that attracted more than 1,000 attendees. Justin is also a TEDx and keynote speaker and the co-host of the She Said Privacy, He Said Security podcast with his wife, Jodi.


Here’s a glimpse of what you’ll learn:

  • Why are documentaries like The Great Hack and The Social Dilemma making such an explosive impact on society?
  • Jodi and Justin Daniels share what parents should take away from these documentaries: addiction needs accountability
  • Is it possible to safely use Facebook, TikTok, and other social media platforms—both as a child and as an adult?
  • Jodi and Justin discuss potential legislative solutions to our current data crisis
  • The key takeaway: what kind of society do we want to create?

In this episode…

How frequently do you visit platforms like Facebook, TikTok, or Google? Probably at least once a day, if not more. But, did you know that every time you visit one of these platforms, your personal data is being collected, stored, and sold in the hopes of altering your behavior, purchasing habits, and voter profile?

Many articles, podcasts, and documentaries—including recent Netflix hits, The Social Dilemma and The Great Hack—have detailed the total lack of privacy and security on some of the most frequented platforms in the world. Though it’s easy to think that we are safe and secure when using sites like Facebook, TikTok, or even Google, this unfortunately isn’t the case. So, what can we do as parents, security/privacy professionals, and frequent digital consumers in order to protect ourselves and our loved ones?

In this episode of She Said Privacy, He Said Security, Rise25 Co-founder John Corcoran sits down with Justin and Jodi Daniels to discuss practical takeaways from the recent documentaries, The Social Dilemma and The Great Hack. Listen in as Justin and Jodi talk about the reality of social media addiction, strategies for protecting your children’s online profiles, and potential legislative solutions to personal data breaches. Stay tuned for more!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.


Jodi (00:27):

Hi, I’m Jodi Daniels, and I’m a Certified Information Privacy Professional. I provide practical advice to overwhelmed companies.

Justin (00:38):

Hi, Justin here. I’m a cybersecurity subject matter expert and business attorney. I’m the cyber quarterback, helping clients design and implement cyber plans and manage and recover from the inevitable data breach. I also provide cyber business consulting services to companies. We have John Corcoran here today, and we have flipped the script: he’ll be interviewing us.

John Corcoran:

All right, you guys, I’m excited to dive into this topic with you, because it’s an interesting one. Two really monumental documentaries have come out recently, The Social Dilemma and The Great Hack, and they both have a lot to say about privacy and security issues, and you two are privacy and security experts. So we’re going to dive into some of the issues raised by those two documentaries. But first, before we get into that, this episode is brought to you by Red Clover Advisors, which helps companies comply with data privacy laws and establish customer trust.

John Corcoran (02:07):

So they can grow and nurture integrity. Red Clover works with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services. In short, they use data privacy to transform the way companies do business together, creating a future where there is greater trust between companies and consumers. To learn more, go to redcloveradvisors.com, or you can email info@redcloveradvisors.com. All right, Justin, I’m going to start with you. Two monumental movies have come out, The Great Hack and The Social Dilemma. First of all, why are they making so much of a stir today?

Justin (02:49):

I think they’re making such a stir today because they really lay bare the business model of big tech. There’s a reason why Google and Facebook, among others, are the richest companies in human history, and it’s for one reason only; in a word, it’s data. I think what both documentaries make so clear is that people don’t truly understand what is being done with their data. They’re just having fun on Facebook, being part of the community, and they don’t understand how, without their real knowledge or consent, all that data can be used and monetized on an unprecedented scale. That has some really significant societal impacts.

John Corcoran (03:41):

Jodi, I want to turn to you. The Great Hack is about Cambridge Analytica. For those who don’t recall, tell us about what that company did.

Jodi (03:51):

Yeah, so Cambridge Analytica was essentially a data company, and it helped other organizations use data in digital campaigns. But where the data came from is kind of the question at hand. Specifically, Cambridge Analytica really helped political campaigns and political messaging around the world, so not just here in the United States.

John Corcoran (04:18):

And you’re both parents. What do these movies say to us as parents? Or, another way of putting it: what should parents be thinking about these big tech companies right now? What should they be aware of?

Jodi (04:31):

Yeah, so I’m going to start with this one. As parents, you see that once you give someone a device, these kids, and adults too, are addicted, and they’re addicted for a reason. It’s because the device and the software are literally designed to keep us hooked. There is no end on Facebook. There’s no end on any of the social media platforms. You can just keep scrolling and scrolling and scrolling, literally forever. It’s like infinity. And so then you have to wonder, what am I scrolling through? What am I seeing? The content that I’m going to get is going to be different from the content he’s going to get, and different from what you’re going to get. Then the question is, what content am I getting, and how different is it? These companies are utilizing data, as Justin just described; they are data companies, and they use it to customize that experience. But then you have all the ads. So what type of ad message is also coming in, and from whom? We’re all just kind of commoditized people here: in my little screen and my little space, someone’s going to buy the opportunity to deliver their message to me, and whether or not that message is accurate or real, we have no idea. So there’s a lot, but that’s my starting point.

John Corcoran (05:51):

Justin, your thoughts on what parents should be paying attention to?

Justin (05:55):

So I think the most poignant part of the entire Social Dilemma documentary was one scene. All of us on the podcast, and everyone listening, we’ve all been in a relationship that ended badly and upset us. Someone broke up with us, and that stinks. Now, if you recall, social media has an algorithm: if you haven’t been on Facebook, they want to get you to come back. Haven’t you ever gotten a picture saying, hey, this is what John was doing a year ago or five years ago? There’s a picture in it, usually something that’s a nice memory, so you look and maybe you stay. But what if it’s a picture of you with an ex-boyfriend or ex-girlfriend you were struggling to get over, and you’re finally over them, and now you see that picture? The algorithm is just numbers. It doesn’t think about the emotional impact on someone who had a tough breakup and now has to see a picture of their ex, and how that might affect them emotionally. So when you see all these statistics about the increase in teen suicide and some of these other bad societal consequences, it’s because you have this algorithm that doesn’t think about people’s emotions. It just wants to get you back to looking at Facebook; whether it’s something that’s going to make you happy or sad is irrelevant. But to us as parents, the idea that my daughter or son may go on social media and see something that makes them really upset, to the point where they’re not talking to me, or however some people can react to things, that’s a really big issue.

John Corcoran (07:39):

Yeah. So what can parents do? What’s the answer for them? Is it no digital devices? Is there maybe software that can help with this sort of thing? Or is it regulating which apps they’re on, or the number of hours they’re on those apps?

Justin (07:56):

I think you should tell him the story about our daughter and TikTok.

Jodi (07:59):

Oh yeah, I’ll tell the TikTok story. First, I think it’s also a combination of everything you’ve just described. Another part of the addictive nature that The Social Dilemma emphasizes is the likes and the comments. People are all about, well, I have to go on, because did you like my photo? It’s literally psychologically driven: we want to keep going back to figure out how many likes and comments and shares we got. So then people are measuring: oh, that picture only got three likes, it wasn’t a great picture, I need to go and do something different to myself, which is also highlighted in The Social Dilemma, so that it gets more likes. And when the girl in the film makes the change and the picture is better received, she internalizes: oh, I have to look this certain way. That also starts to get at some of the issues of teen suicide and depression and anxiety, because people are no longer able to think for themselves; we’re just dependent on how many likes and comments we get. I think it also depends on the age of the child. Our ten-year-old thought TikTok should be fine, and we don’t agree; we think there are a lot of security and privacy challenges on TikTok, so we’ve just banned it. There is no TikTok here. At the same time, you might have some older teens for whom banning TikTok would be quite a challenge; they might be independent enough to put it on their phones themselves. So I think there it’s about education and really explaining: what is TikTok? What is Facebook doing?

Jodi (09:34):

What is Instagram? What are any of these social platforms doing? It’s about teaching people to limit their use and to know how to find their own value outside of the social media world. Because if you just take it away, it doesn’t actually teach the lesson, and we still have to work with technology. This is today’s issue; tomorrow it might be a different issue, so we need to be able to teach to that. And there are definitely tools. Justin, you can talk about the VPN and the things you’ve put on our devices for some of the kids in between, to limit what they do.

Justin (10:09):

We have software on our daughter’s iPad called Qustodio, so we know where she’s going and what she’s doing. But the challenge we have as parents, think about parenting nowadays: you have to spend extra time installing the software and monitoring the use of the devices. And then, as we talked about in another podcast we did, what happens when they go to somebody else’s house and there are different rules? It’s kind of like what TV was, but on steroids. In other words, it really takes a village; it takes parents being on the same page about the devices and whatnot. And I think Jodi’s example of TikTok is one where there’s still a lot of unawareness of these apps and what some of them mean, because our daughter says all these other kids are doing it, and it may be because their parents aren’t as attuned to these issues as Jodi and I might be, since we’re immersed in them.

John Corcoran (11:08):

Certainly, yeah. And Jodi, you mentioned a moment ago that TikTok has a number of privacy and security issues. This question goes to either of you, but what are some of those issues that people should be aware of?

Justin (11:21):

I like to joke: do you really want the Ministry of State Security spying on you?

Jodi (11:27):

I mean, it’s a Chinese-owned company with a US arm, but numerous times it would say it was doing one thing and then do something else, and it would extract way more information than, one, it disclosed and, two, it needed. So you think you’re just putting a dance video on and you’ve downloaded the app to your phone, and instead the app is actually taking a lot of other information that it doesn’t need for you to put a dance on the app. And then it’s owned by the Chinese government, so you don’t own or have any control over that data.

John Corcoran (12:01):

Let me ask: it seems like, as users of digital devices yourselves, there has to be a degree of mixed emotions toward some of these technologies. In some ways there are advantages; people always point out that Facebook reconnects you with maybe someone you went to high school with who you haven’t been in touch with in a while. How do you feel about that? Where’s the line between the good that comes from these platforms and the point where it crosses over and the bad parts outweigh the benefits?

Jodi (12:42):

So Justin always jokes that I’m a prolific Facebook user. I do use Facebook, and I firmly believe in what you just described: I’ve reconnected with many people I would never have been able to keep in touch with otherwise, and I like it for those reasons. I find a lot of value in the groups; I’m in a number of different business groups and personal groups and local groups, and I find a lot of information there. I also restrict and know who I’m sharing the data with. You can have different friends lists to share with, and I’m aware of what’s happening. I understand the different advertising that I get: if I click on one shoe ad, I get twenty shoe ads for the next three days. I kind of understand that. At the same time, it’s about minimizing and understanding what’s happening, and I’m not looking for validation from everyone else.

Jodi Daniels (13:37):

And I'm aware that my newsfeed is slanted. I get that I'm only seeing certain types of articles and certain content on my feed, and if I want more information, I'm going to have to work harder to go and find it. So I certainly think there's value, but like many other things, it's a seesaw; it's not all wonderful, and it's not perfection at all. You have to balance it out, and it's not just in one direction. Do I like that they sell and use my data in that capacity? Absolutely not. So I'm careful about what I put out and communicate,

Jodi Daniels (14:18):

And still live life, and want to be able to connect with people. So it's about being educated and balanced.

John Corcoran (14:23):

Justin, your thoughts on that?

Justin Daniels (14:26):

So this is where we're going to have some fun. My view of Facebook is: how do you define a friend? I define a friend as somebody I can call at two in the morning and they'll come and help me. Facebook friends are not, for the most part, your true friends. But where I really think differently is that, in my personal opinion, social media has in the last 10 years been weaponized to divide us. And I feel that the big technology companies, with the way their business model works, have zero incentive to police themselves or to really start having some profound conversations about the type of things that should or should not be put on social media. I recognize, as an attorney, that we have First Amendment issues, but I'd also point out that you're not hearing a whole lot about foreign interference in our elections because the media companies kind of banded together, and US Cyber Command took some offensive actions against known actors to preemptively hamper their capabilities. And so when I see things like that, I think there are opportunities to start coming up with ground rules. I have passionately come to the belief that Section 230 of the Telecommunications Act of 1996, which basically insulates a platform from liability for the content on it, either has to be eliminated or we have to find a better way, because the current state of affairs, in my view, cannot continue. I can't tell you how many people I talk to who, when you go behind what they're telling you, got it off of Facebook. And I'm like, well, how do you know that's true? And they're like, well, it was on my social media feed. That's not really any way to verify it. That's my biggest concern, and to me it's one of the biggest negative consequences of Facebook that I'm very concerned about for us as a society.

John Corcoran (16:34):

As you look at the social media landscape, there are a lot of different players out there: we've mentioned Facebook, Instagram, which is owned by Facebook, there's LinkedIn, there's Pinterest. If we have a legislative solution like that, will that solve it? Will that make these platforms more palatable, or are we beyond that point? Do you think it would be impossible for these social media platforms to exist in a more heavily regulated world?

Justin Daniels (17:07):

Sure. So I've had this discussion, and I'm going to throw something out for you and the audience's consideration. When television came along in the fifties, you had the FCC regulating what you could watch on television. You recall the Super Bowl where Janet Jackson had a wardrobe malfunction; there are certain words you can't say on television, right? More importantly, there are certain rules about what you cannot sell to children who are watching cartoons on television, like cigarettes. Magically, when you took the cartoons and stuck them on YouTube, you don't have those regulations. And so, John, I'm not naive enough to think there's some simple solution; there certainly isn't. But to me it's no different than what happened in 2008, when you took away all the regulations and basically had unfettered capitalism, and you saw what happened. By the same token, there can be common-sense regulation around prohibiting things that are patently false from being posted on social media, and some level of responsibility so that companies that don't police it can be held accountable, because there's precedent for other types of technology we've used. I think the current situation is untenable and will continue to divide our society, which I think inevitably is going to make this a national issue. It's just a question of how quickly we can build the awareness to where people say, enough, we've got to deal with this, or else we're putting our children and our children's children at serious risk.

John Corcoran (18:48):

It is interesting, because maybe an analogy would be the music download industry. There was a period of time, the Napster era in the late nineties, when it was almost unfettered; it was the Wild West, and people were downloading music like crazy. And the pendulum kind of swung back the other way, and it actually became a very productive industry; in many ways, the music industry transformed itself. So turning to you, Jodi: what are your thoughts on that? With all the different social media platforms out there, is there a legislative solution that would make it more palatable for everyone?

Jodi Daniels (19:25):

I think the wild, wild West doesn't work; we're seeing what's happened now. The amount of data that was shared with Cambridge Analytica, quite honestly, was quite a privacy story, and it was very big news. Legislation won't cure it; it is a first step. And if you tie in what The Social Dilemma highlights, which is the curated newsfeed and the algorithm: maybe I can police the news and make sure it's, quote unquote, whatever we believe is accurate, but that doesn't necessarily change how people are being encouraged to stay on a platform, and the algorithm continues to serve me a certain slice of the content. So take the division in the US that Justin was mentioning: if everything I see is really one slice of the pie, it might actually all be accurate, and you wouldn't need legislation for that part, but it's still one slice of the pie. And then if you have someone who's paid to serve ads, because that's how they make money, I'm going to get that message pushed even further. The current design is going to continue to foster that singular message, and I'm encouraged to spend hours on it. So underneath any type of legislation that happens, the companies need to gain some data ethics as well. You have to have all the different pieces collectively combined or it won't work.

Justin Daniels (21:09):

John, I think I've got a really good analogy that kind of brings this into focus. So let me ask you a question: how often nowadays, when you get in your car or you're a passenger, do you not use your seatbelt?

John Corcoran (21:23):

Yeah, for me, never.

Justin Daniels (21:27):

So let's wind the clock back to the 1960s. Cars didn't have seatbelts. This guy came along, Ralph Nader, we may remember him, who said we should have seatbelts in cars for safety, because people get killed in cars. The auto industry fought it, but we finally got seatbelts. But you know what? Even through the nineties, people weren't wearing them. So in comes Mothers Against Drunk Driving, and they start to change the perception of why you need to buckle up. And then that led to the start of some legislation, where in most states now, if you don't have your seatbelt on, you can get fined, though they'd have to pull you over. But it was the combination of activism, saying, hey, this isn't right, and here's why people's lives are at risk, plus legislation, that got us to a much better place. Now, we haven't gotten rid of car accidents, but we've sure gotten adoption of seatbelts to skyrocket, to where it's a normal thing for my kids; they don't even bat an eyelash, we just put on our seatbelts. When I was growing up, my dad really wasn't that way. And I just point that out because, if we could do it for seatbelts in a generation, why don't we have the political will to do it for something like this, which is just as pervasive as the seatbelt?

John Corcoran (22:51):

Yeah, it's an interesting analogy. Any final thoughts before we wrap things up on this topic, or issues we haven't discussed that parents in particular should be aware of?

Jodi Daniels (23:02):

I think parents need to be active. They can't just assume their kids are fine. They need to know who their kids are connected to and what platforms they're on, and make sure they have the right tools to monitor and prevent. A local shout-out to a company here called Bark, which can help monitor what's happening on children's accounts. So taking an active role, I think, is going to be the most important part.

Justin Daniels (23:31):

Okay. I guess my final thought is that the key takeaway from today's podcast is: what kind of society do we want to be going forward? How do we want to think more reflectively about how we engage with social media? Because as Jodi and I sit here today, we're in Atlanta, Georgia, so the epicenter of the United States politically sits in our state for the next two months, and we're going to be inundated with all kinds of ads and other things that, if you fact-check them, many are just going to be outright false. And a lot of this will be driven on social media. To me, as we're about to go through this whole period, there's got to be a better way, and all of us need to come together, because all of this divisiveness is just corroding us as a country. And if you've been outside the United States, you know we really have it good here, and we should be doing all we can to preserve this wonderful resource we have called democracy in the United States.

John Corcoran (24:43):

Absolutely. Well, Red Clover Advisors, redcloveradvisors.com. Anywhere else people should go to learn more about you, Jodi, and the work that you do?

Jodi Daniels (24:52):

Social media, ironically for our discussion. Go find us on Facebook or LinkedIn, responsible users only, haha, like on the beer can. Not on TikTok.

Craig Petronella

Craig Petronella is the CEO, vCIO, CTO, and IT Cybersecurity Expert at Petronella, the premier provider of cybersecurity, digital forensics, compliance, blockchain, and AI services since 2002. As a jack-of-all-trades, Craig is also currently the President of Cloud Computing Raleigh and InfusionSearch, as well as an Independent Distributor at SendOutCards and an Independent Associate at LegalShield.

In addition to this, Craig is the author of eight books, including his Amazon #1 bestseller, How HIPAA Can Crush Your Medical Practice: HIPAA Compliance Kit & Manual For 2019.


Here’s a glimpse of what you’ll learn:

  • Craig Petronella talks about his recent focus on the CMMC (Cybersecurity Maturity Model Certification)
  • How modern technology has impacted laws such as HIPAA and the Telecommunications Act, which were passed in the 1990s
  • What is the CMMC, and why should your company care about it?
  • Craig’s strategies for finding a better balance between convenience/profit and security/privacy in business
  • The benefits of working with an IT provider that has a CMMC accreditation
  • Craig reveals his top privacy and security tips

In this episode…

Are you looking for valuable insights to help your company navigate the confusing process of the CMMC (Cybersecurity Maturity Model Certification)? Or, are you worried about how the legislation surrounding technology, security, and privacy is impacting your organization? If so, you’re in the right place!

Let's face it: you can't successfully protect your company if outdated legislation is inhibiting you. But many of the laws surrounding security and privacy were created in the 1990s, before technology as we know it was even invented! Thankfully, Craig Petronella, a cybersecurity expert, is here to explain the current state of security and privacy while providing actionable tips for creating a safe and secure future for your organization.

In this episode of She Said Privacy, He Said Security, Justin and Jodi Daniels sit down with Craig Petronella, the CEO, vCIO, CTO, and IT Cybersecurity Expert at Petronella, to explore the Cybersecurity Maturity Model Certification (CMMC). Listen in as Craig talks about what the CMMC is, the impact of new technology on cybersecurity laws, and how to establish a balance between convenience and privacy/security in your business. Stay tuned for more!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.


Jodi Daniels (00:00):

Hi, Jodi Daniels here. I'm a Certified Information Privacy Professional, and I help provide practical privacy advice to overwhelmed companies. I've worked with companies like Deloitte, The Home Depot, Cox Enterprises, Bank of America, and a long list of other companies over my career. And I'm joined today by my husband, Justin.

Justin Daniels (00:24):

Hello, Jodi Daniels' husband here. It's great to be here. I am a cybersecurity subject matter expert and business attorney. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from the inevitable data breaches we have these days. Additionally, I provide cyber business consulting services to companies.

Jodi Daniels (00:52):

This episode is brought to you by Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media, professional services, and financial services. In short, we use data privacy to transform the way companies do business together. We're creating a future where there's greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. So today I'm really excited for our guest. Justin, who do we have here today?

Justin Daniels (01:33):

Today our guest is Craig Petronella, who's an IT cybersecurity compliance expert and Amazon bestselling author. His latest book is the CMMC accredited, registered... oh, I'm sorry, it's the Ultimate Guide to CMMC, and he is also a CMMC accredited Registered Practitioner. Yes, my wife is frowning at me because I used a lot of words and not correctly, but since I know Craig, I may get a pass. Welcome, Craig.

Craig Petronella:

Thank you. Thank you so much for having me.

Jodi Daniels (02:07):

Yeah, this is fun. So, you know, there are a lot of letters here. Help us understand: how has your career progressed so that you're now an expert helping companies with CMMC accreditation?

Craig Petronella (02:25):

Sure. Yeah. So I started many moons ago, in April of 2002. In 2006 we really focused on managed security services and compliance for various regulated industries: HIPAA for healthcare, for example, and DFARS compliance, or NIST SP 800-171 compliance, for federal defense contractors. Last year the CMMC, or Cybersecurity Maturity Model Certification, came about; it was developed in a joint effort with Carnegie Mellon, the original creators of the CMMI. It was in beta form last year, and it finally came out on the 31st of January of this year, 2020. So I've really focused heavily on the CMMC because I believe it's the new ISO standard for cybersecurity, and I think it's really going to transform other regulations in our country, such as HIPAA for healthcare, since that was enacted in 1996 under Bill Clinton. That was ages ago, and there's really a lot of gray area there.

Craig Petronella (03:32):

So I think it's due for an overhaul. And what I like most about the CMMC is that it really helps increase or improve the cybersecurity maturity level of organizations, so they're not low-hanging fruit that gets hacked easily, because now there's no faking it anymore. There are no more self-assessments; you have to get a third-party assessment done by an accredited organization, what's called a C3PAO, and they have to come on premises to actually audit all of your cybersecurity controls.

Justin Daniels:

You know, Craig, I just thought about something. You mentioned that HIPAA was passed in 1996. I think that's the same year they passed the Telecommunications Act. And so it brings up an interesting question that I just thought of, and I'd like your perspective. Those laws were passed in 1996, before the internet, before a lot of the technology we have today.

Justin Daniels (04:28):

And so I'd love to get your perspective on how you feel technology has impacted the laws we have now, because the CMMC is kind of the latest in the evolution of cybersecurity law. Maybe you can help us understand how cybersecurity has evolved from the laws of '96 to now, given your expertise in the CMMC.

Craig Petronella (04:53):

Sure. Well, even just five years ago, when the National Institute of Standards and Technology, or NIST, came out with NIST SP 800-171 and the NIST SP 800-53 security controls for controlled unclassified information, or CUI, a lot of the spreadsheets and materials provided for free by nist.gov were a bit dated already too. And what I mean by dated is that they're referencing on-premises servers and on-premises equipment, and now, in 2020, a lot of organizations, especially small startups and small organizations, are in the cloud, so they don't have a lot of equipment on their premises anymore. So way back in 1996, like you said, before the internet and before mainstream Amazon and Zoom and Facebook and social media and all that stuff, we're due for some change; there's just so much difference compared to what was done back then. And back then, I don't know if you remember, Justin, but phone hacking and phreaking were common. I don't know if you've heard of 2600 magazine, for example, but they posted about all that stuff for years and published articles on how kids were breaking into AT&T and things like that. Those were the days in and around the 1990s, 1996.

Craig Petronella (06:28):

And obviously before that too, with the phone system. But now we've evolved so much with the internet age that the regulations are severely behind. What I like most about the CMMC is that it really helps strengthen pretty much any organization. Any organization can go online now and do what's called a self-assessment, and score their practice or their firm on its cybersecurity maturity level. And they should, even if they're not in a quote-unquote regulated space, to make sure that they're not a mark and that they're not low-hanging fruit.

Jodi Daniels (07:04):

So for those who might not be as familiar with the CMMC, can you help us understand how the framework looks? Can we break it down into its different parts? If someone was to go take that free assessment, what are the blocks of questions or the pieces they're going to be asked about?

Craig Petronella (07:26):

Sure. So the CMMC is really an overhaul of the NIST framework. The NIST framework that came out, NIST SP 800-171 and the 800-53 controls, consists of policies, procedures, and security controls. So take an organization that, for example, still uses thumb drives; I don't use them anymore, but maybe some organizations do. You have to have a policy around the proper usage of thumb drives, or maybe you just prohibit thumb drives and don't allow them in your organization. That would be an example of a policy. And then maybe you have a security control so that if an employee sticks a thumb drive into the endpoint, it doesn't work. Something happens, a policing activity: okay, this is detected, this is not authorized, I'm going to make it stop.

Craig Petronella (08:22):

So that's a security control for that particular layer. There are 110 security controls in NIST SP 800-171, and the CMMC, depending on the level (there are five different levels of the CMMC), looks at and adds to the existing NIST policies, procedures, and security-control requirements. At the basic Level 1, for example, you don't have to have all 110 controls, just a small subset of them, and you don't even have to have all your policies documented. But it's the most basic cyber hygiene, and I think most organizations should still adopt at least Level 1. Then you build upon Level 1 as you mature to Level 2, Level 3, and so on. When you get to Level 3, that's the closest equivalent to NIST SP 800-171, with extra controls, procedures, and policies added on top.

Jodi Daniels (09:21):

Well, thanks for sharing. My follow-on question would be: you mentioned any organization should go do that. Is there a thought on the size of organization that generally picks up and applies these types of frameworks? While we would say all organizations should pay attention to privacy and security, is there a certain size, whether employee count, revenue, or amount of data, some metric, where a company should say, oh, I should probably start really thinking about implementing a framework like this?

Craig Petronella (09:53):

Through my lens, in my experience and my perspective, I think all organizations should try to hit CMMC Level 3. I know that's a huge undertaking, especially for a small organization, but hackers don't care; they're looking for low-hanging fruit. So the harder we can make it for them to penetrate your network, the better for everyone. Especially if you're dealing with anything sensitive, personally identifiable information or patient health information or anything you don't want to be public, you should strive for that 800-171 or CMMC Level 3 equivalent. Now, that's a huge job, a huge undertaking. A lot of these small organizations, maybe five or ten people, may be using Microsoft 365, for example. But did you know that the commercial versions of Microsoft 365 are not compliant? Your data could be anywhere in the world, not necessarily in the US, and it's not managed by US personnel who are background-checked. So anyone dealing with sensitive information should not be using commercial versions of Microsoft 365. They should be securing their data, ideally with an end-to-end encrypted solution, and trying hard to find and select vendors that take cybersecurity seriously.

Justin Daniels (11:19):

You mean the end-to-end encryption solution that Zoom claimed they were using?

Craig Petronella:

Or Microsoft. Microsoft still doesn't have end-to-end encryption. So I think that we as consumers need to put more pressure on our vendors and companies to adopt stronger cybersecurity.

Jodi Daniels (11:46):

You missed my little side signal; I wanted to get in on that one. A lot of people use Microsoft 365. It seems like there's Microsoft 365 and there's Google, like those are my two choices for email. So for the small companies, or even large companies, using Microsoft 365, what is the recommendation? Is Google a whole lot better? Is it Microsoft plus something else? Or is this a risk you're just taking?

Craig Petronella (12:18):

The short answer is that Microsoft has a different product suite called GCC High, and GCC High for DoD. These are special versions for working with sensitive data: it's a separate environment, it's inside the US, and it's staffed only by background-checked US people. That GCC High ecosystem is typically three times more expensive than the commercial versions of Microsoft 365. And if you're currently using a Microsoft 365 commercial version, there's no easy button, no migration path; a company essentially has to architect a brand-new environment in GCC High, start over, and migrate their data manually. There's no quick way out. In the Google ecosystem, Google has G Suite and premium products you can use; the free services, by the way, are not compliant. In the HIPAA world, the healthcare world, Google will sign a BAA if you properly configure all of the security controls and policies around their G Suite premium environment. It's still not end-to-end encrypted, though; it checks some of the boxes, but it's not compliant with the latest CMMC. So it's all about balance. You can use certain things, but you can't store your data in the cloud with Microsoft 365 commercial; you have to use the GCC High solution in the Microsoft world.

Justin Daniels (14:00):

Thank you. Now it's your turn. It's my turn? You sure? Okay. Craig, I wanted to ask you about the tail end of your last answer. You talked about how consumers need to be putting more pressure on companies to take privacy and security more seriously. And again, I'm going to use Zoom as an example.

Justin Daniels (14:22):

The primary reason Zoom has grown exponentially is that it's so convenient to use; having used Teams and Zoom and WebEx, it's by far the easiest. Having said that, as the FTC reinforced this week, that convenience comes with a cost to security and privacy. So I'd love to get your perspective on how we find a better balance, because if I'm a company and being more convenient leads to better profit, convenience is going to beat the pants off of security and privacy every time.

Craig Petronella (14:58):

Agreed. You know, security is a balance. However, with modern technologies and solutions, security features such as end-to-end encryption can be done in the background and won't affect the end user's experience of the system, the way we use Zoom now, for example. So it doesn't have to be super complicated. In this context of video conferencing and chat, Microsoft could easily, or should have from the beginning, made Teams end-to-end encrypted, and they didn't. These are things that we as consumers need to put more pressure on and demand from our companies: they have to have these encryption mechanisms in place in the background. And I fully agree with you, it should still be easy to use. There are new email and data systems now that let you use the services without a password; it binds to the endpoint device, so it's even easier to use. So I think it just depends on the application, the company, their maturity level, and how they craft their product.

Justin Daniels:

Because one thing you said in your comment there, Craig, was that with Microsoft Teams, they just didn't do it.

Justin Daniels (16:18):

And so then the question I have would be why. And I think a lot of times the answer to why is that they don't bring the privacy and the security people into the product development discussion. They're just not part of it. The thinking is, we'll deal with privacy and security and bootstrap it later, which is effectively what Zoom is being required to do by the FTC.

Craig Petronella (16:39):

I also think that's true. I also think that companies like Microsoft may get pressure from law enforcement and from the FBI against using something as secure as end-to-end encryption, because they want the back door; they want a way in for investigations. I think encryption as a whole is the forbidden word for the FBI and law enforcement. They do not want us all using encryption, because criminals will use and exploit that technology. I mean, if you're a criminal, I want to put you in jail, just to let you know. But the fact of the matter is, we shouldn't sacrifice our entire country's security because of that one thing. There should be other policies, procedures, and measures put into place that don't jeopardize security for everyone.

Jodi Daniels (17:36):

Yeah, that gets to the exact issue we have with transferring data from the US to the EU. I spent a lot of time working with companies on GDPR, and the whole issue that exists now with cross-border transfers, why Safe Harbor fell down and why Privacy Shield has fallen down, is because of those very backdoors. So it's a very real challenge that I am not going to be able to solve in today's conversation. What? No, you're good at solving everything! I know, I can't quite solve this one. But I do have a different question, because, Craig, you can help people solve problems. So help us understand: what is the accreditation that you have, what does it mean, and why would a company want to pick someone who has it? What are they getting over a Jodi, who could say, well, I could do that too?

Host (18:32):

Great question. So the DOD or department of defense, they pass the Baton over to a nonprofit called the CMMC accreditation body. And that nonprofit was to lead the efforts of the CMMC also lead training resources that different individuals as well as vendors and provide certification tracks. So my company has chosen well, it’s called an RPO or registered provider organization, and I’m the first registered practitioner or RP in my organization to become accredited through the CMMC AB. And what that means is I’ve, I’ve signed off on a code of conduct for ethics, and I’ve also gone through their training. They have about 12 different exams I had to go through and pass all those exams and get my accreditation. So I’ve been vetted. And basically the reason why folks that are listening want to go to the CMMC ab.org website is to find either us or other accredited companies to work with because they’re the ones that are going to be proficient. The very complex topic it’s changing very rapidly. You want to make sure that you select a company that can help. And there’s two sides. There’s, what’s called the RPO, which is what I have the registered practitioner organization. What that means is we help with the consulting, the policies, the procedures, the templates. If you have nothing, we help you create it. If you have some pieces, we help you review and use what you have and adapt that to security controls. If you don’t have the controls, we help you build the controls, build the environment architected. So we help with that whole process. What we’re not allowed to do is the formal assessment. We can’t do the formal third-party assessment. So we choose to work with other organizations which are called C3, Pao certified third-party auditor organizations. Those folks will send out what’s called a lead assessor or a certified assessor to the, to the firm, to their premise. They’ll fly them out. 
Or if they’re local, they’ll drive there. They’ll show up, they’ll look over the shoulder of their IT person and ask for two forms of evidence for each of the controls, for the maturity level they’re after with CMMC. After they go through all of this rigorous process, the lead assessor will submit the results to the CMMC Accreditation Body and they’ll get processed. If all goes well, the firm will get their accreditation with the CMMC at the level they’re after. If for some reason something needs to be remediated or fixed, they can come back to us; the RPO will help that company fix whatever they need to or fill those gaps. Then they can get retested.

Host (21:22):

That’s quite a process. So congratulations on going through it. And I think it’s really helpful to explain, because the average person might not be as familiar with it. And there are a lot of people out there who say that they can do a variety of things, and they’re not always certified in them or always as well-versed in them. So thank you. It’s really helpful to know that.

Host (21:47):

You can hire me tonight and I can come fix your plumbing. Well, where should we go from here? I think we have a final question before we get to the two most important questions. We’re talking about CMMC today, and we talked a little bit about the past, we talked about the present, and I’d love to get your perspective on the future. Based on what you’re seeing now, what are some of the trends that are going to continue in cybersecurity, or the things you’ve identified as going to be a challenge in the industry in the next three to five years? Yeah. So I think that the takeaway from the CMMC, for example, is really this third-party assessment process.

Host (22:47):

Self-assessments don’t work; that’s really the bottom line. Companies have been self-assessing for HIPAA compliance, for PCI compliance. They self-attest online, they fill out a form, it might be 30 questions or 50 questions. Vendors nowadays are getting smart. They want to score the risk of who they’re doing business with, and for good reason. So they create what are called VSQs, or vendor security questionnaires. I’m sure you’ve seen them. They’re sometimes five or six hundred questions. They’re complicated, they have multiple tabs at the bottom, they ask a bazillion questions. Most of the folks that try to fill this stuff out, it’s like a deer in headlights; they don’t even know where to start, so they need help filling out the VSQ. All of that stuff shouldn’t be so hard. Companies of all sizes should have basic cyber hygiene in place. They should have basic policies and procedures and security controls in place, and they should be audited by a third party to make sure that they’re done correctly. Because I get it.

Host (23:46):

I know that a lot of small organizations probably don’t even have an IT person. It might be the owner or a brother or somebody that comes in to set them up, right, to get them going. And that’s all great, but that person may not be a cybersecurity expert, and may not have ill intentions, but may make mistakes in regards to security. I’m sure, Justin, you remember back in the Windows XP days, a lot of people would just use “password,” and people still do this today. So scary. But now they’ll log on to systems with the same credentials, or they’ll disable the security of the operating system in favor of regular usage, to make their job easier. So my point is that you can’t do that stuff anymore. You can’t fake it. Craig, I’m going to have to interject. If you’re going to reveal my password on the podcast, which is “password,” I’m going to have to cut you off. [laughing]

Host (24:42):

And you can send me sticky notes.

Host (24:45):

That’s my other favorite slide when I do a presentation: a guy who put his password on a sticky note on his monitor, and they took a picture. Because, to your point, now we use these complicated passwords of like 10 or 12 characters with a capital letter, a number, a symbol. And if you make it that complicated, well, how do people get around it? Because they can’t remember all that, they write it down and stick it on the monitor. Well, the scary thing is how easy it is for a keylogger to capture that password. Keyloggers are malicious software that captures your keystrokes, and they exist on mobile devices as well as on Mac and Windows systems and endpoints. And there’s no known way to stop a zero-day keylogger. You can find known keyloggers with traditional software like antivirus or anti-malware software, but unknown keyloggers are easy for hackers to find, available on the black market for just a hundred bucks or less.

Host (25:38):

So the only way to stop that is to not use the keyboard, or to use a token-based system. You’ve probably seen one of these before, or maybe not. It’s called a hardware token. Yep, that one in particular is a YubiKey. You can use it in conjunction with a supported password manager: you remember one long passphrase, plus you have to use the token. So even if a keylogger on your system captured your password, attackers still can’t get in, because they don’t have the hardware token. Doing those things together and meshing in multiple layers of security vastly improves your cybersecurity. Gotcha.
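The reason a keylogged password alone isn’t enough can be sketched as a toy challenge-response exchange. This is a deliberately simplified shared-secret illustration, not the actual FIDO2/U2F protocol a YubiKey speaks (which uses public-key signatures); the names here are hypothetical.

```python
import hmac
import secrets

# Hypothetical shared secret: in a real token it lives only inside the
# hardware and is never typed, so a keylogger has nothing to capture.
TOKEN_SECRET = secrets.token_bytes(32)

def token_sign(challenge: bytes) -> bytes:
    """What the token does: HMAC the server's fresh challenge."""
    return hmac.new(TOKEN_SECRET, challenge, "sha256").digest()

def server_verify(challenge: bytes, response: bytes) -> bool:
    """The server recomputes the HMAC and compares in constant time."""
    expected = hmac.new(TOKEN_SECRET, challenge, "sha256").digest()
    return hmac.compare_digest(expected, response)

# Each login uses a fresh random challenge, so a response captured or
# replayed from an earlier session is useless.
challenge = secrets.token_bytes(16)
assert server_verify(challenge, token_sign(challenge))
```

Because the challenge changes on every login, even an attacker who records one full exchange cannot reuse it.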

Host (26:16):

Well, so that could be a really great tip. Our question that we’re asking everybody is: what is the best kind of privacy or security tip that you would want people to take away? It can be either personal or for their work environment. Personally, the token might be my tip for today, but I’d like to hear if you have any other special tips.

Host (26:40):

Yeah. So the one I showed, with the password manager paired with a token, is something everyone can do. And it’s cheap. I mean, a token like the one I showed you is less than 50 bucks on Amazon, and a password manager is probably about the same, but that’s for the whole year. So for a hundred or two hundred bucks you get vastly improved security. The second thing I can give you would be multi-factor authentication. Almost all cloud services support multi-factor authentication, where you get a text message, or you use something called Google Authenticator, and it changes every 30 or 60 seconds with a new PIN code. You should use those things for as many services as possible.
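The rotating codes that apps like Google Authenticator display follow the published TOTP standard (RFC 6238, built on RFC 4226’s HOTP). A minimal standard-library sketch of how such a code is derived:

```python
import base64
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password for a given counter value."""
    msg = struct.pack(">Q", counter)             # counter as 8-byte big-endian
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                   # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based OTP: HOTP over the current time window."""
    key = base64.b32decode(secret_b32, casefold=True)
    return hotp(key, int(time.time()) // step, digits)
```

Because the counter is derived from the clock, the code changes every `step` seconds, which is why an intercepted code expires almost immediately.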

Host (27:21):

Yeah. I really like Authy. That’s one that I enjoy, and what’s nice about Authy is it can be mobile or desktop. With Google Authenticator I’m always having to go to my phone to get it, so Authy is kind of nice because it’s right there; it’s in lots of places.

Host (27:36):

Yep. And so for our last question, we’re going to go away from cybersecurity and just ask: what is something that you enjoy doing outside of your day job? So, a few different things. I like to do CrossFit. I started doing CrossFit about four years ago. I built a gym in my garage, and I do CrossFit every day. Yeah.

Host (28:01):

Now did you take over the whole garage, or just kind of…

Host (28:04):

It’s the corner, the corner. Yep. And my wife likes it too, you know, especially with COVID nowadays; she uses the gym in the garage as well. So I’ve got a full CrossFit gym in there with a rower and an assault bike, the whole nine yards. You know, that’s awesome, because I used to have a gym and now it’s my wife’s office and the podcast studio. It’s a multi-purpose room.

Host (28:31):

Wow. Who taught you to be a lawyer?! There you go. I guess so. Well, hey, it’s been great to have you. I think we learned a lot today, particularly about the commercial version of Microsoft 365; that was particularly interesting. Last thing: if people want to learn more about CMMC and what you’re doing, where can they find you? They can go to my website, which is PetronellaTech.com, and they can download our CMMC guidebook at CMMC.PetronellaTech.com, along with a wealth of other resources. Okay.

Host (29:15):

Well, we’ll be sure to include that in the show notes. So thank you so much. Thanks again, Craig, for joining us.

Host (29:20):

Thank you so much for having me. This was fun. Absolutely.

Johnny Lee

Johnny Lee is a forensic investigator, cybersecurity and data privacy specialist, digital detective, and attorney with almost 30 years of experience under his belt. He is currently the Principal & National Practice Leader of Forensic Technology Services at Grant Thornton LLP, one of the world’s leading audit, tax, and advisory firms.

One of Johnny’s primary passions is providing advisory services to companies that are working to address complex cybersecurity, blockchain, information governance, and data privacy issues.


Here’s a glimpse of what you’ll learn:

  • Johnny Lee talks about his background as an attorney and management consultant
  • Why SOC 2 compliance is not enough to meet your company’s data security needs
  • How blockchain technology is evolving over time to address privacy issues
  • Grant Thornton’s litmus test for deducing how effectively and appropriately blockchain is being applied to business problems
  • Johnny predicts what the future of privacy and data security will look like
  • Johnny shares his personal privacy tips

In this episode…

You, like many other business owners, may think that you’ve done everything necessary to ensure your company’s privacy and security. After all, you have a SOC 2! But, is that really enough to protect your data?

Unfortunately, depending on compliance paperwork could lead to serious privacy and security issues for your business. Though having SOC 2 compliance is a necessary step in establishing protective measures for your company, it is the minimum requirement for privacy and security—not the maximum solution. So, what are the next steps you should take in your journey toward achieving privacy and security for you, your customers, and your company?

In this episode of She Said Privacy, He Said Security, co-hosts Jodi and Justin Daniels sit down with Johnny Lee, the Principal & National Practice Leader of Forensic Technology Services at Grant Thornton LLP, to discuss common misconceptions about privacy and security. Listen in as Johnny talks about why your business needs more than SOC 2 compliance, how blockchain technology is improving over time, and what the future of privacy and data security will look like. Stay tuned for more!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.



Hi, Jodi Daniels here. I’m a Certified Information Privacy Professional, and I help provide practical advice to overwhelmed companies. I’ve worked with companies like Deloitte, The Home Depot, Cox Enterprises, Bank of America, and many more. And today I’m joined by my husband, Justin Daniels.


So good morning. I’m Justin Daniels, or, as I like to say, Jodi Daniels’ husband. I’m a cybersecurity subject matter expert, and I’m also a business attorney. I help quarterback the design and implementation of cyber plans, and I also help clients when they have the inevitable breach. Additionally, I provide cyber business consulting services to companies.


This episode is brought to you by Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, go to redcloveradvisors.com.


So who do we have with us today? We have a friend of both of us. Today we have Johnny Lee, who is a forensic investigator, management consultant, and attorney specializing in data analytics, digital forensics, and electronic discovery in support of cybersecurity incident response, corporate investigations, and litigation. He also provides advisory services to companies working to address complex cybersecurity, blockchain, information governance, and data privacy issues. Good morning, Johnny, and welcome.


Good morning to you both. Thanks for having me.


Absolutely. We’re really glad that you are here today to help get us started. You know, it’s really interesting that you are a management consultant and attorney in how many different areas you’ve covered over the course of your career. So I’d love if you could kind of walk us through the career arc and where you started and how you got to where you are today.


Well, I’ll start with an anecdote about my father, a career physician who knew he wanted to be a doctor at age seven, and who thinks that my career progression is a function of attention deficit disorder. But it started in the liberal arts. I was a philosophy major determined to be a university professor, and then fell out of love with that vision upon graduation. I worked in the rare book business for a time, taught myself how to program software, learned networking, and mastered databases, and that was my career for about half a decade. Then I reinvented myself and went to law school, worked at the district attorney’s office for a short time, and once again fell out of love with that vision. Ever since, I have worked to try and marry those two real passions, law and technology.


Around the time I was doing that, in the early two thousands, there wasn’t a really established way to do it; there wasn’t a career path that was very well trod. So I worked through a series of consulting platforms: Arthur Andersen, a risk consulting firm that was formed after Arthur Andersen went under, and now, for the last 11 years, Grant Thornton, specifically focusing on forensic technology, which is the best way that I’ve found to marry those two real interests and to really leverage the knowledge that I’ve gained over the past 25 years doing professional consulting in some form or another.


So Johnny specifically in that interesting career arc of yours, how do you think being a lawyer has helped you with your management consulting and forensic practice?


So I think it’s a really understated aspect of a law degree that it is, first and foremost, one of the best business degrees in the world. And the reason for that is that it lays bare the mechanics of how businesses really operate at a tactical level over time: the regulations they deal with, the insulations and risk management protections they’re able to secure, their place in the market, their ability to protect their brand. I learned more through law school and the years since, through my legal training, about everything from basic principles of contract law to how to manage vendor relationships, right? It is a first-rate business degree, and I’ve used it every day since in that capacity. It really does give you a peek behind the curtain that allows you to look at things with the proper level of skepticism, and to apply that knowledge of how the world really works to say, that’s not the right position we should be taking, or that’s going to run afoul of a regulation downstream, or that doesn’t protect our interests in the way you think it does, because you don’t have an appreciation for a nuance in this quadrant or that arena.


So I’ve used it in that capacity mostly. I should say, for the benefit of my current partnership, that these views are mine alone and do not reflect those of Grant Thornton. For the record, I am a non-practicing attorney, which will make my CPA partners quite happy to hear on the record.


Got it, got it, love it. Are there any particular themes that you find come up over and over again?


In regard to the consulting work I do, yes. It depends on the context of that question, but there are themes that I see over and over again with regard to the way certain interactions happen with our clients. In some ways it seems elementary to say that you might have the right to do something in a contract, but if you do it, then as a matter of common sense, or of being socially conscious, you’re going to alienate your customers, or the actions you’re recommending may alienate their customers or their stakeholders. So I see that disconnect happen all the time: confusing permission, confusing the is with the ought, if you will. And that theme comes up again and again, both in the privacy arena and the security arena. In the security arena, I see that conflation happen most often in the context of confusing compliance with actual security. Both are very valuable agendas, but they’re very different animals. And we see that come up again and again, that sort of “we’re compliant, how could we possibly have been breached?” Those conflations can be very dangerous for businesses, both in their tactical security and their broader privacy philosophies. So wait a second, Johnny, you mean when I read the cloud provider’s SOC 2, I thought that meant they’re good to go?


Well, SOC 2s are scopes that are designed by management, so it’s important that you read them carefully and for their substance. The fact that they exist is evidence of nothing, except that they’ve had a third party do something for them. So it’s a great point, and I know you asked the question tongue in cheek, but I think that is the illustration of the disconnect we see. We see paperwork being held out as signifying something beyond what it really does, and there’s a danger to that. I think privacy professionals recognize that, and I know that cybersecurity professionals recognize it.


Yeah. So that’s a really interesting point. I talk with a lot of companies, and the belief is, oh, if they have that SOC 2, then that’s it, they’re good. I’m curious, what do you see companies doing? What are the recommendations we could offer to another company evaluating a vendor, to say that piece of paper is great, but here’s what else you also want to be looking for?


I think first and foremost it’s recognizing that satisfying a compliance agenda and satisfying the desire to be secure, or to hold private information truly securely, those are Venn diagrams. If you’re doing it well, they overlap significantly, but there’s daylight on both sides of that, right? There are things that relate to a compliance agenda that have little impact, that move the dial very little in terms of securing your data. That’s always going to exist, given the nature of evolving technology and the complexities of bad actors constantly innovating and studying ways to compromise data. There’s always going to be a delta between what the law or the regulation says needs to occur as a certification mechanism versus how to actually, day to day, keep your environment secure. And I think that’s just a practical reality that more and more executives appreciate every year.


But it’s an important thing for us as consultants and advisors to repeat as often as we can, because the notion that a compliance audit has achieved security is a dangerous one. To be sure, there is overlap, and there are meaningful things that management can take out of a compliance audit done very well. They can rely on those audits if they’re done by a reputable party against an established benchmark that’s meaningful in the security arena, but it is very dangerous to conflate those two things as a matter of principle. Your company is a great example. I know Justin, in the way he practices law, thinks about this holistically, and these are the sorts of things that our clients need to hear again and again, right? These are different conceptualizations of very squishy concepts. You can’t find any two security geeks who agree on much, but I think if you ask them whether a SOC is equivalent to security, you’ll get pretty uniform answers in the negative.


So Johnny, you touched on emerging technology, so I wanted to take the conversation into a more narrow area of emerging technologies. I want to have you talk a little bit about the type of work that you’re doing with blockchain technology.


Sure, I’d be happy to. I’m a dyed-in-the-wool convert, so discount accordingly, but we’ve been in this space for a little over seven years at Grant Thornton. We are doing work principally in an arena that’s very age-old, right? We are helping our clients, as a public accounting firm, do financial statement audit work, and the intersection of how blockchain relates to that, how it in some ways hinders it and how it helps, has really been a fascinating thing to watch over the last seven years. If you think about what a traditional financial statement audit consists of, some of the key testing attributes come in this concept of existence, right? Does the asset that your client is asserting to hold exist? Is it real, and is it valued at the level that your client asserts it to be in their books?


Ownership is a really critical, crucial aspect of this kind of financial statement testing as well. And in the blockchain arena, those two concepts, perhaps above all others, at least from a technological perspective, have been very challenging to effect with distributed ledger technologies. I’ll explain a little bit why, without exhausting our entire time together. With regard to existence, the ledger actually has neat mechanisms for ready identification of whether an asset exists and, at a given point in time, what that asset consists of in terms of its objective valuation. In some ways that’s a little harder than a traditional financial statement audit, because you can’t get things from an objective third party like a bank. We can’t simply get a bank statement as of midnight on December 31st and call it a day. We have to do more for some types of cryptocurrency, and our firm audits nearly 40 different nodes and thousands of different ERC-20 compliant tokens.


In some of those tokens, especially what we call the Satoshi-like tokens, Bitcoin and its derivatives, getting to a point-in-time balance is an eminently challenging affair because the ledger doesn’t hold that data. So in essence, once you have an address that you can establish is in fact under the custody and control of your client, you then need to establish what was in that address at that point in time, the end of the fiscal year, and what its objective value is. And to do that for certain kinds of technology, you need to recreate the entire transactional history of that address, because you need to tabulate what are called unspent transactions. So it’s not enough to have the address, and it’s not enough to establish that the client had custody and control of that address as of that point in time.


You then have to do all this convoluted forensic work to reconstitute what that value, that point-in-time balance, was for that address in that cryptocurrency. And this mode of testing can be vastly different from one cryptocurrency to the next. Now, in other ways we can do very ready things. For some cryptocurrencies, existence is pretty straightforward and ownership is very straightforward, because it is immutably recorded to the ledger. So in some currencies that’s even more straightforward than a traditional financial statement audit, but in others it’s eminently harder. And so we need to run both an independent node, to satisfy our charter as an arm’s-length accountant, and derivative or forensic nodes that are constantly improving the record in the background in a way that satisfies an agenda very different from what the designers implemented. It may come as no surprise to you two that blockchain developers don’t really contemplate making auditors happy as they build their new products. So I hope that helps.
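The point-in-time balance reconstruction described above can be illustrated with a toy model of unspent transaction outputs. This is a hypothetical sketch, not Grant Thornton’s tooling or any real node’s API; the `TxOutput` record and `balance_at` helper are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TxOutput:
    txid: str     # transaction that created this output
    index: int    # position within that transaction
    address: str  # receiving address
    value: int    # amount in the smallest unit (e.g. satoshis)
    height: int   # block height at which it was recorded

def balance_at(outputs, spends, address, cutoff_height):
    """Point-in-time balance: sum the outputs to `address` created at or
    before `cutoff_height` that were still unspent at that height.
    `spends` maps (txid, index) -> block height of the spending tx."""
    total = 0
    for out in outputs:
        if out.address != address or out.height > cutoff_height:
            continue
        spent_at = spends.get((out.txid, out.index))
        if spent_at is None or spent_at > cutoff_height:  # unspent at cutoff
            total += out.value
    return total
```

The key point the speaker makes survives even in this toy form: the balance is not stored anywhere; it must be re-derived from the full transactional history up to the cutoff.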


Or, I might add, privacy and security professionals, for that matter; they just develop technology. But on that score, a follow-up question, Johnny: what are you seeing as blockchain evolves when it comes to addressing privacy issues? Because one of the great things about a blockchain is that the ledger is supposed to be immutable, and obviously we’ve got privacy rules, in California and under GDPR, that talk about being able to get your data or being able to be forgotten. How are you seeing that being reconciled, or is that an issue that’s just not being addressed yet?


I think it’s being addressed, and I think it comes down to the consensus model, the design of the original implementation. There’s some really exciting promise for privacy in blockchain technology, perhaps no more so than the example out of India, where a consortium of banks who are normally fierce competitors are actually sharing information at a remarkable level of fidelity to prevent a certain kind of systemic fraud in that part of the world. They’re doing it in a way that in no manner betrays the privacy obligations they have to their account holders and their applicants, which would otherwise run afoul of either national law or their customer relations outreach. So I would say that is an example of an exceptionally well-designed program that uses firmly established cryptographic architectures, like a Merkle tree, to secure private data and yet make it verifiable if that combination of data elements is seen again in the ledger.
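As a rough illustration of the Merkle tree idea mentioned here: a single root hash can be published and later checked against a claimed set of records without revealing the records themselves up front. This is a generic sketch using one common pairing convention, not the consortium’s actual design.

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Merkle root over a list of byte strings. Each leaf is hashed,
    then pairs are hashed together level by level; an odd node is
    carried up unchanged (one of several common conventions)."""
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            nxt.append(_h(level[i] + level[i + 1]))
        if len(level) % 2:          # odd node promoted to the next level
            nxt.append(level[-1])
        level = nxt
    return level[0]
```

Changing any leaf changes the root, so a verifier holding only the 32-byte root can detect whether a presented combination of data elements matches what was committed, without the committer having exposed the raw data.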


Then there are certain implementations that we’ve seen that don’t adequately contemplate the right to be forgotten or the right to purge, that immutably record things on the ledger that are unencrypted or unobfuscated. Those are problematic, because, as you know, with the EU directive and the privacy laws coming out of California and burgeoning elsewhere around the world, those are going to be very problematic technologies: if you write it to the ledger, it is, in the common parlance, immutably there, and whatever you’ve written can’t be struck. So if you haven’t contemplated that in your consensus model, you’re going to have a lot of problems, and those kinds of solutions will fall by the wayside. Whereas something like the Merkle tree example out of India has legs. I think we’ll see those architectures adopted more and more often, because they find where to put that fulcrum between efficacy and privacy. It’s a very good question, but the answer is all over the place.


So much like what we talked about before, where oftentimes people will get a SOC report and check the mark: it’s a secure company, they’re good. A lot of times I hear, oh, well, it’s on the blockchain, so it’s secure, it’s okay. What are the challenges when someone is thinking along those lines?


You know, it’s a great question. Grant Thornton came up a few years ago with what we call the litmus test, which was as much an internal mechanism for us as it was a mechanism to help our clients figure out whether blockchain was being appropriately applied to the business problems they were bringing us to help solve. And I think something like that is probably worthwhile, right? I won’t bore you with the entire litmus test, but it basically comes down to this: if you can do what you’re proposing with a traditional database design, then all you’re doing through a blockchain implementation is building a slower database, because you’re not speaking to the two central strengths of blockchain, which have to do with functions of trust and functions of system resilience. And so if you can’t pass that litmus test, then you are by definition applying the wrong tool to the problem.


And I think that’s how we would approach it: let’s be careful about what your actual agenda is here. Your agenda is to serve clients with this kind of modality, but you also have this privacy balancing; is blockchain the right solution for that? And that’s the design consideration you were alluding to, Jodi: maybe blockchain technologies aren’t designed with auditors in mind, or privacy and security practitioners in mind, but the better ones will be, because those are necessary design inputs. I think there’s an analog to traditional software development there, right? Most software is not designed with security in mind. That’s changing as a phenomenon in the IT world, and has been for a decade, I think, but it used to be quite the afterthought. I remember doing coding at Arthur Andersen, and the security guys were brought in at the last minute, with an impossible deadline and an incredibly complex set of interwoven code that they were asked to sort of bless or rubber-stamp days before a launch.


It was a completely unfair request, and I think a great deal of that still goes on. The better companies bring considerations about privacy and security into their DevOps and their design protocols. And you see that with the cottage industry that’s springing up to serve those who are tending more toward privacy considerations in their consuming habits. Look at a company like Startpage out of the EU, a secure search engine, a private search engine. Something like that didn’t exist before, right? We had the US equivalents, DuckDuckGo and the like, for years, but this has ratcheted that up even more. And I think the ones that contemplate privacy and security in the design and through the implementation of the software are going to have a real competitive advantage, because I do think there’s a groundswell in the consuming public; those things are increasingly important every year.


Yeah. We’re certainly laughing over here about the last-minute folks coming in, being asked to bless it and make it all magically secure. I certainly still see similar concepts, whether it be from a legal point of view: here’s the contract, I’m good, right? Or, we can get that privacy notice updated, or the security piece. So we have certainly made progress; at the same time, there’s more to be done. On that note, what might you say is the future? We talked about blockchain and the evolution of incorporating privacy and security at the beginning, and how the tools are continuing to advance with the push toward thinking about privacy and security. So if you put your crystal ball in front of you, what does the future have to say? Is there a new future technology that you might start seeing more of?


I think I go back to the way I answered the last question. I think the future is going to be driven by considerations that 10 years ago weren’t part of the equation. If you look at the launch of things like Facebook and consider a launch of competing technology today, like Parler: ten years ago nobody would have paid attention to Parler, which is a social media platform designed with privacy in mind. Those are really different conceptualizations, and they’re almost bookends to the kind of contemplation you’re asking about here. People weren’t shopping for privacy 10 years ago. They simply weren’t, not in America anyway, and that’s mostly what I’m commenting on here. Folks in the European Union have a different conceptualization of privacy.


We can certainly talk about that, but I think more people are coming around to privacy as a worthwhile consideration, and are more conscious of how some businesses exist strictly to aggregate, package, and resell you as the product. Think of the disturbing trends coming out of the consumption of social media among our teenagers; research like that is really important. And if you’re watching all of that unfold and becoming an edified consumer, you’re going to shop very differently than you did a few years ago. I think the companies that are picking up on that and honestly catering to it are going to have a real edge. And I think that, above all, is the real difference in the next five or six years in terms of how products are going to be marketed.


Thank you for that, Johnny. I think we’re going to get to the last two questions that Jodi had asked, which we’d love to learn more about.


What’s the best privacy or security tip that you’d offer to our listeners?


Oh boy, I get this one a fair amount. Sticking with personal privacy, the two best things you can probably do for yourself are to research and adopt the use of a password manager and a VPN. If you study this space even at a cursory level, it will disturb you how much your internet service provider knows about you, how much your phone company knows about you, even what public records your jurisdiction holds about you, and how that information gets packaged and resold, even by governmental entities. There are steps you can take to minimize those digital footprints and, without becoming perhaps as paranoid as the speaker here, lighten your concern about exposures in that way. I don’t think people need to go down the privacy rabbit hole entirely to get the comforting news that these are manageable risks. Password reuse is one of the easiest things to conquer, and there are viable, user-friendly technologies to help you with that. So I would say those two things are probably at the top of that list.


Thank you. We’re big VPN and password manager people here as well. Do you have a favorite one that you recommend?


You know, I should probably steer clear of a specific endorsement here. There are a lot of options, and I think what’s important is to find a reputable company that’s going to be around two or three years from now, which is always a challenge, right? You have to do your homework. There are a lot of very reputable sources out there, and there are companies established to do nothing but privacy work. Look at a company like Proton, which has secure mail and calendar functions and also its own VPN client, situated in Switzerland, a remarkably privacy-oriented jurisdiction. You have Nord as a technology. There are a half dozen highly reputable firms that do VPN work, and there is a privacy scorecard you can build for some of those things.


And I’m certainly happy to send you some links for your show notes that allow you to contemplate those sorts of things: where they’re situated, whether they’re subject to third-party review, what their practices are, where the servers are, whether they store logs. I think in the US there’s an increasing appreciation among the consuming public that there’s an awful lot of warrantless searching that happens on logs and servers of that kind. There are exceptions to what most people envision as their Fourth Amendment protections that are worth looking into. Again, consider the source; I’m professionally paranoid. But these are the sorts of balancing acts that I think are important for privacy-conscious folks to contemplate as they look at a password manager or a VPN product.


Well, thanks. We would love that list, so please do send it along. We’ll make sure that we post it.


Excellent. Happy to do it.


All right. So I think the most important question: outside of being professionally paranoid when it comes to privacy and security, what do you do in your free time?


Well, this may not be a terribly interesting factoid, but I have been an inveterate reader since I was a kid, and my first real job out of college was in the rare book business. There, I think, the habit turned from reading to collecting. So I’ve been a book collector for more years than I’ll admit on this podcast; let’s call it decades. And I have a really solid collection of first editions of 20th-century detective fiction. So I spend a great deal of my time out of work reading mysteries and detective fiction and those sorts of things. That’s how I’d answer that.


Yeah, that’s a new fact. In all the years I’ve known you, I didn’t know you collect books. That’s going to be the topic for our next coffee time, whenever that happens in the future. Well, fair warning: I can go on for days about that.


Well, I look forward to it. I love to read as well. So, Johnny, how can people find you to learn more about what you do and the kind of expertise you provide?


Well, I appreciate that. I am a partner at Grant Thornton, and I lead a practice there called the Forensic Technology group. That group specializes in cybersecurity, privacy, and blockchain work, in terms of how advanced technologies get applied to those agendas. So if you go to grantthornton.com and search my name, I’ll eventually come up. There’s a good-looking Johnny Lee who lives in New York, so I’m the second one.


Nice. Well, Johnny, thank you so much for joining us today on the She Said Privacy, He Said Security podcast.


Thanks so much for having me. It was a lot of fun.


Thank you.

Jodi Daniels is the Founder and CEO of Red Clover Advisors, a boutique data privacy consultancy and one of the few certified Women’s Business Enterprises focused solely on privacy. Since its launch, Red Clover Advisors has helped hundreds of companies create privacy programs, achieve GDPR, CCPA, and US privacy law compliance, and establish a secure online data strategy that their customers can count on.

Jodi is a Certified Information Privacy Professional (CIPP/US) with over 20 years of experience helping a range of businesses in privacy, marketing, strategy, and finance roles. She has worked with numerous companies throughout her corporate career, including Deloitte, The Home Depot, Cox Enterprises, Bank of America, and many more. Jodi is also a national keynote speaker, a member of the Forbes Business Council, and the co-host of the She Said Privacy, He Said Security podcast.

Justin Daniels

Justin Daniels is a cybersecurity subject matter expert and business attorney who helps his clients implement strategies to better manage and recover from data breaches. As outsourced general counsel for Baker Donelson, Justin advises executives on how to successfully navigate cyber business and legal concerns related to operations, M&A, incident response, and more.

In 2017, Justin founded and led the inaugural Atlanta Cyber Week, where multiple organizations held events that attracted more than 1,000 attendees. Justin is also a TEDx and keynote speaker and the co-host of the She Said Privacy, He Said Security podcast with his wife, Jodi.


Here’s a glimpse of what you’ll learn:

  • Justin and Jodi Daniels talk about common privacy and security concerns when dealing with third-party vendors
  • What you can learn from Target’s gigantic third-party vendor security breach in 2013
  • The main privacy concern in Target’s security breach: unauthorized access to personal data
  • Justin and Jodi discuss the common factor in major security breaches of high-profile companies
  • Why most questionnaires for third-party vendors aren’t enough to gauge the true state of their cyber hygiene
  • The importance of determining how third-party vendors are using the data that they acquire from your company

In this episode…

Many businesses today simply aren’t doing enough to protect their security and privacy. This isn’t a result of complacency or poor leadership, however—instead, it can be traced back to cybersecurity concerns with third-party vendors.

Third-party vendors are essential to the day-to-day operations of your business. But, did you know that some of the most significant security breaches in the past 10 years have been a result of poor cyber hygiene on the part of these vendors? Most business owners and operators are unaware of just how much power third-party vendors have over their company and customer data. Thankfully, Justin and Jodi Daniels are here to save the day with their expert insights into how to proactively protect your company!

In this episode of She Said Privacy, He Said Security, Rise25 Co-founder John Corcoran sits down with Justin and Jodi Daniels to discuss everything you need to know about protecting your company from third-party vendor security breaches. Listen in as Justin and Jodi talk about how third-party vendors can gain unauthorized access to personal data, what they do with that data, and steps you can take to protect your company and customers today. Stay tuned!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.



Welcome to the She Said Privacy, He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century. Hi, Jodi Daniels here. I’m a Certified Information Privacy Professional, and I help provide practical privacy advice to overwhelmed companies. I’ve worked with companies like Deloitte, The Home Depot, Cox Enterprises, Bank of America, and a lot more. And I’m joined today by my husband, Justin Daniels.


So, Justin Daniels here, otherwise known as Jodi Daniels’ husband. I am a cybersecurity subject matter expert and business attorney, the cyber QB, helping clients design and implement cyber plans. I also help them clean up the mess and recover from a data breach, and I provide cyber business consulting services to companies. Today we have John Corcoran here; we have flipped the script, and he will be refereeing this discussion. So let the games begin, and the dog starts us off, right? All right, good. He wants to be heard. Exactly. You know, he didn’t get to introduce himself; that’s why I spoke up. Exactly. So thanks, you guys. This is going to be a good episode. So, what we’re talking about here: you both have expertise in privacy and security, and we’re going to be talking about third-party vendors.


So, companies that are using third-party vendors, how that raises both privacy and security issues, and some of the things that you need to be aware of. But first, before we get into that, this episode is brought to you by Red Clover Advisors, which helps companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. Red Clover Advisors works with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services. In short, Red Clover Advisors uses data privacy to transform the way companies do business together, creating a future where there is greater trust between companies and consumers. To learn more, go to RedCloverAdvisors.com, or email info@redcloveradvisors.com. All right, so we’re going to hop into this topic, and Justin, I want to start with you.


So, the topic is third-party vendors. Many companies, especially larger companies, are using thousands of third-party vendors, and we were talking beforehand about how that creates all of these different security risks. So just launch us into this topic: what are some of the things that companies need to be aware of when they think about security concerns having to do with their third-party vendors? So I think the first thing is that there really needs to be an awareness, from the company perspective, that you need to care about the cyber hygiene of your third-party vendors. Step one is even recognizing, "Oh, I need to care about the vendor ecosystem I have and how they deal with cybersecurity." Second is what kind of process I want to put the vendors through. There are too many times when companies just send out a vendor questionnaire and simply rely on it, without going behind it to really understand what cyber hygiene the vendor actually practices. Those are my two big ones. Two big ones. All right. And then from a privacy perspective, Jodi, what are some things that companies should be aware of when it comes to their third-party vendors?


Yeah, we really need to understand how the data is being used and processed. Are they using it only for whatever purpose you gave it to them for? Maybe it’s a payroll provider, an accounting service, a marketing agency, or some other type of company. What are they doing with the data? Are they maybe even analyzing it, aggregating it for themselves, and repurposing it? Do they share it downstream? Maybe they have sub-processors or other vendors they’ve hired to help fulfill whatever service or product we bought from them. So it’s important to understand the daisy chain, almost the domino effect, of how data is being used and processed.


Okay. So I want to dive into those and use a couple of high-profile breaches from recent years as examples. Justin, you’ve followed many of the companies that have been in the headlines in the last couple of years for different breaches. Do you want to start with one in particular?


So I think the best one to start with is Target, which really brought breaches onto the national landscape. What a lot of people don’t realize about the Target breach is that it really emanated from a third-party HVAC vendor in Pennsylvania. You’re thinking, what’s that got to do with Target? Well, back in the days when we stepped foot in a physical Target regularly, someone needed to manage the HVAC for the building for Target. And that’s what this company did, but they were also connected to Target’s network.


And so a cybercriminal hacked into the HVAC vendor, and voila, that got them into Target. But not only did that get them into Target’s network, they were able to go wherever they needed to go. So where do you go with a retailer? You go to the point-of-sale system, and then you target all those credit cards, and that’s exactly what happened. The point of that story is that the HVAC vendor is effectively part of Target’s network, because they have access to it. So how do you make sure that those third-party vendors have good cyber hygiene? Because a consequence of their bad cyber is your data breach. It seems crazy that an HVAC provider has access to Target’s point-of-sale system. How common is that? Sadly, it’s more common than you would think, because really the concepts we’re talking about are network segmentation and least-privilege access.


So what all that means, in plain English: think about if I’m the HVAC vendor for Target. Maybe the only thing I need access to is the parts of the network that allow me to keep tabs on the different locations I service and what’s going on with their maintenance schedules. Why would I ever need access to Target’s POS system? That has nothing to do with my function as an HVAC vendor. That’s where least privilege comes in, but also network segmentation. Think of a network: if you segment it into a bunch of different sandboxes, maybe one sandbox is invoicing, one is the POS system, one is operations. How do you segment those different areas so that if a hacker somehow gets access to one area, it’s not so easy for them to get into another sandbox? If it’s just one big sandbox with everything in it, once you get in one place, you can go wherever you want to go. That’s what a lot of companies don’t do well. And that was a Fortune 500 company; think of all the middle-market and small companies that don’t have the resources to even think about that kind of stuff, let alone implement it. Yeah. So, Jodi, you followed this Target breach as well. What sorts of privacy concerns were raised by that incident?
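The segmentation and least-privilege idea Justin describes can be sketched in a few lines. This is a minimal illustration, not Target’s actual architecture: the segment names, vendor roles, and policy here are all invented for the example.

```python
# Illustrative sketch of least-privilege network access: each vendor role is
# granted only the segments it needs, and everything else is denied by default.
# Segment and role names are hypothetical, invented for this example.
SEGMENTS = {"hvac_monitoring", "invoicing", "pos", "corporate_email"}

# Hypothetical policy: the HVAC vendor sees the HVAC segment and nothing else.
ALLOWED = {
    "hvac_vendor": {"hvac_monitoring"},
    "payment_processor": {"pos"},
    "it_outsourcer": {"invoicing", "corporate_email"},
}

def can_access(role: str, segment: str) -> bool:
    """Deny by default; allow only segments explicitly granted to the role."""
    if segment not in SEGMENTS:
        raise ValueError(f"unknown segment: {segment}")
    return segment in ALLOWED.get(role, set())

print(can_access("hvac_vendor", "hvac_monitoring"))  # True
print(can_access("hvac_vendor", "pos"))              # False: least privilege
```

The design point is the default: an unknown role, or a role asking for a segment outside its grant, gets "no" without any special case. In the "one big sandbox" setup Justin warns about, every role effectively maps to every segment.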


Well, anytime you have a data breach, you’re naturally falling into a privacy issue, because now someone else has unauthorized access to personal data, and what are they going to do with it? That’s the intersection of privacy and security. I often talk about them as concentric circles with overlap: when security is breached, my data is no longer private. So now companies have to think about how they’re going to communicate this to individuals. There’s a big communication and PR strategy that has to happen once a breach is determined. Many people have gotten the letter: maybe you get credit monitoring and things like that, because now that information is on the black market somewhere and could be used for identity theft and things along those lines. Then, in today’s world (Target happened years ago), we also have privacy laws to consider, things like GDPR and CCPA and a number of others around the world. They also impose obligations if you have a data breach. So not only does a company have everything I’ve just described, it also has obligations under these laws that it might need to consider as well.


So when a data breach does happen, they have additional obligations that kick into play?


They do. That’s why prevention is so important, because the time, resources, and attention you have to devote to dealing with this are substantial.


Got it. John, I was going to add that what complicates it, as Jodi alluded to, is the complicated regulatory structure, because in the United States you don’t have an overarching cyber law or privacy law. You’ve got the California Consumer Privacy Act, which is a very important privacy statute, but you’ve also got HIPAA and Gramm-Leach-Bliley; we have more of a sectoral approach. Now, when you’re a retailer (and let’s be honest, what retailer isn’t doing business in California, which I think is the fifth-largest economy in its own right?), you’ve got to really start thinking, "Oh, I need to worry about this California Consumer Privacy Act," by calling Red Clover to help you figure out what to do. Yeah. And Jodi and I did another great episode where we talked about the different regulatory frameworks, GDPR and CCPA, how that’s affecting things, and how the lack of some kind of national standard makes it difficult for companies to navigate that landscape.


Let’s talk about some of the other breaches that have been out there. A number of different retailers have had high-profile breaches. Do you want to tackle some of the other ones? Home Depot is one; Marriott is one; I think Focus Brands was another; and Lord & Taylor. What they all share in common is that all of them started with a third-party vendor that allowed the cybercriminal to get access to credit cards at the point of sale, because at the time all those breaches occurred, that was very lucrative. Now it’s graduated to ransomware: deny them access to their entire network, and not only that, but exfiltrate the data, so whether they pay or not, they’re in a really tough spot. That’s what you’re seeing now. Back when we were talking about all these retailers, it was more the third-party vendor route. You can still go through the third-party vendor to get access, but now it’s all about ransomware.


Any privacy concerns from these various retail breaches that come to mind?


I mean, really similar to what we’ve already talked about. If I’m at a point-of-sale system and I’m giving you my name, my credit card, and maybe an email, it also ties to what other systems the point of sale is connected to. Because if the entry point is the point of sale, can I easily go and grab more data from other places? So, not too much different.


But I would also say, John, to follow up on Jodi’s point: if they have access to the data and you’re a retailer or e-commerce company that does business in 50 states, you could have access to data that requires you to give breach notification in 50 states, because there’s a different law in each state. Imagine trying to do that if you don’t have the right cyber insurance, or even if you do. Juxtapose that against doing it right on the front end; doing all those notifications is a huge undertaking. So, on the notifications, just explain to the listener what that means: how you comply with those different notifications. I imagine there are different standards in terms of the notice you need to provide. So, to your point, they’re relatively similar, but there are differences. Anytime you’ve gotten something in the mail (and we’ve gotten them) that says, "Hey, we want you to know someone gained unauthorized access to your data; here’s the call center we’ve set up, and here’s the credit monitoring we’ve set up," you’re getting notice that this has happened so that you can take certain precautions. Then it’s usually followed up by some type of letter, possibly from a law firm, about a potential class action lawsuit. You get that one? Yep, yep. You get that one as well. I guess that’s the other notice we’ve received.


There’s a whole process to what that looks like. You have to determine when you have the right information, what you communicate, and then go through a whole process of determining which state has which requirements. So it’s very time-consuming, expensive, and a diversion from regular business, and you really don’t want to be dealing with 50 different states, right?
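As a rough illustration of why 50-state notification is such a burden, here is a sketch of building a per-state worklist. The deadlines and rules below are made-up placeholders, not actual statutory requirements; real notification work requires counsel and current state law.

```python
# Illustrative only: these rule entries are invented placeholders, NOT real
# statutory deadlines. The point is the shape of the problem: every affected
# state means a separate lookup, deadline, and notice format.
STATE_RULES = {
    "CA": {"deadline_days": 30, "regulator_notice": True},   # hypothetical
    "GA": {"deadline_days": 45, "regulator_notice": False},  # hypothetical
    "NY": {"deadline_days": 30, "regulator_notice": True},   # hypothetical
}

def notification_plan(states_affected: list) -> list:
    """Build a worklist of per-state notification tasks for a breach."""
    tasks = []
    for st in states_affected:
        rule = STATE_RULES.get(st)
        if rule is None:
            # A state we have no rule for is itself a task: escalate it.
            tasks.append((st, "UNKNOWN: pull in counsel"))
            continue
        extra = " + notify regulator" if rule["regulator_notice"] else ""
        tasks.append((st, f"notify residents within {rule['deadline_days']} days{extra}"))
    return tasks

for state, task in notification_plan(["GA", "CA", "TX"]):
    print(state, "->", task)
```

Even this toy version shows why Justin calls it a huge undertaking: multiply the per-state branching by fifty jurisdictions, varying definitions of personal data, and hard deadlines running from the moment the breach is determined.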


Right. And I imagine there’s a whole industry of other companies that step in and help when these sorts of things happen.


Stepping back to the overall topic here, third-party vendors: how do I figure out what is secure and what is safe? Justin, you mentioned a questionnaire, kind of a little dismissively, saying that a lot of times that’s not enough. So talk a little bit about what role the questionnaire plays and how you can improve it, if possible.


So I’ll start with an example, John, and we’ll just use you. Let’s say you want to do business with my company, and I send you a questionnaire that says, "John, do you have good security?" And I’m sure you’re going to write back, "My security is really bad; you shouldn’t do business with me."


That’s exactly what you’re going to write, right? You’re going to write back, "Our security is fine." And especially sales teams, right? Sales teams who want to make the sale are more likely to say, "Oh yeah, we’re fine." If you want to watch the blood drain from a sales team’s face, bring a lawyer or a privacy or security person into the room and watch: "Oh no, the office of no has arrived." But, a little more seriously, if all you do is send out the questionnaire, they answer it, and you don’t go behind the answers, what have you really learned? It’s just an exercise in digital paper that has no real meaning. So what can you do? Jodi and I have been investigating technologies people can use, with questions geared to a kind of framework.


It might be a NIST framework or ISO, you know, a security or privacy framework that can gauge the answers against best practices. Now you can start to compare, and we’re seeing technologies out there that can identify: if they have to have this best practice and they answered this, they need these two other types of controls in place, like maybe multi-factor authentication, to be more compliant. At least now you’re going behind the questionnaire, or constructing something that gives you data you can compare to a best practice, so you can make a more informed decision. Another helpful aspect: if you put together a good third-party vendor compliance program, you start to have definitions of the level of vendor. If we have an outsourced IT vendor, that should probably be a vendor who gets the proctology exam from a third-party vendor review perspective, because that relationship can have serious ramifications; but maybe somebody who’s just providing email, or a service that’s limited to one part of your network and doesn’t go anywhere else, is different.


They may not get the same level of scrutiny, because what they’re doing isn’t as critical to the function of your business. But that assumes you’ve identified what’s important to your business: which systems or business operations can’t I go without, the ones that shut me down? Then you build out from there: which third-party vendors service those critical business functions? Now you start to say, okay, I have to put vendors in different buckets for the kind of scrutiny I’m going to give them, along with questionnaires or technology that actually gives you actionable information about the true state of their cyber hygiene.
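The "gauge answers against a framework" approach Justin describes can be sketched roughly as below. This is a toy illustration: the control keys and the NIST-CSF-style labels are assumptions for the example, not an official control mapping or any real product’s logic.

```python
# Illustrative sketch: compare vendor questionnaire answers to a baseline of
# expected controls. Labels loosely echo NIST CSF category identifiers
# (PR.AC, PR.DS, RS.RP) purely for flavor; this is not an official mapping.
BASELINE = {
    "mfa_enabled": "PR.AC: multi-factor authentication",
    "network_segmented": "PR.AC: network segmentation",
    "encrypts_at_rest": "PR.DS: data-at-rest encryption",
    "incident_response_plan": "RS.RP: incident response plan",
}

def gap_report(answers: dict) -> list:
    """Return baseline controls the vendor answered 'no' to, or skipped."""
    return [desc for key, desc in BASELINE.items() if not answers.get(key, False)]

# A vendor that attests to MFA and encryption but says nothing about the rest:
vendor_answers = {"mfa_enabled": True, "encrypts_at_rest": True}
for gap in gap_report(vendor_answers):
    print("missing:", gap)
```

The gap list is only as good as the answers, which is Justin’s whole point: the report tells you what to go verify behind the questionnaire, not what is actually true.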


So I want to turn to you, Jodi, because you said one of the important questions to ask, under this idea of figuring out what is secure and safe, is how they’re going to use the data they acquire. Talk a little bit about that.


Yeah. So it’s important to understand what they’re going to do with it. Are they using it just to perform the product or service the company is buying, or are they maybe going to use it for themselves as well? Could they be pulling it together into a data bank of some sort? Maybe they’re going to use it just for analytics, but are they using the personal data for analytics, or are they stripping the personal data? And if they’re stripping the personal data, how are they actually doing that? Are they sharing it with other customers? You know, I had a situation once where a company said, "Oh no, we don’t use it for anybody else; it’s not personal data, and we just use it for you." And after getting on the phone with them, something just didn’t add up.


I kept asking question after question, like peeling back the layers of an onion. It turned out part of what they said was true: it was a privacy-friendly tool, and the data was not personal for that purpose. But they were actually aggregating all of the data, repackaging it, and selling it to other customers. So our company’s data was going to be used to fuel, you know, a data monetization strategy for somebody else, and they hadn’t disclosed that to us. Would we be okay with that, or not? That’s a use question: what are you doing with my data? It’s important to have that conversation with the company, and it’s often not with the salesperson on the other side, but really with their engineers and their technical architects, to really drill in on what’s happening.


And I know you’ve had experiences, stories, with clients where, by not settling for only looking at a vendor’s website but going beyond it and having a human conversation with someone, you found situations where they said they were in compliance with GDPR and CCPA, but it turned out their practices were actually not in compliance.


Well, similar to the story I just shared, that has certainly happened. And there have been times when they might still be complying with the law, but what they’re saying still isn’t quite adding up to what’s actually happening; the data is being used a little differently than described. I think sometimes companies have good intentions and are trying their best to summarize, but it can never replace the human interaction of really drilling in and saying: okay, so I send the data to you, and you do what with it? "Oh, well, we put it in this database." Okay, and so then you do what with it? "Oh, well, we share it with all these people." Okay, and so how do I get the data to you? Because I hear that you don’t have any personal data. "Oh, no, no, no."


"We totally have personal data; you’re sending it to us." And that right there uncovers whatever privacy obligations a company has. If I’m sharing personal data, that’s a flag that says I have obligations when I send it to the vendor, and the vendor has obligations too. So we can’t look only at the pretty webpage. That’s great; companies should have pretty web pages that explain this stuff. But we also have to go a step beyond and really make sure we understand the flow of data: what we send them, what they do with it, and the end result. When we look at that whole process laid out, it’s generally not perfectly described on pretty webpages. It gets uncovered through the assessments Justin was describing, or an old-fashioned conversation.
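Jodi’s "follow the flow of data" exercise can be captured in a simple record, so each vendor conversation produces a concrete artifact. This is an illustrative sketch; the field names, example vendor, and flag logic are all invented for the example.

```python
# Illustrative sketch: one record per vendor capturing what data flows out,
# why, and what the vendor does downstream. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class VendorDataFlow:
    vendor: str
    data_sent: list               # e.g. names, emails, payment details
    stated_purpose: str
    aggregates_for_resale: bool = False
    subprocessors: list = field(default_factory=list)

    def flags(self) -> list:
        """Surface the follow-up questions this data flow raises."""
        out = []
        if self.aggregates_for_resale:
            out.append("data repackaged/resold: confirm disclosure and consent")
        if self.subprocessors:
            out.append("downstream sharing with: " + ", ".join(self.subprocessors))
        return out

flow = VendorDataFlow(
    vendor="AnalyticsCo",                      # hypothetical vendor
    data_sent=["email", "purchase history"],
    stated_purpose="usage analytics",
    aggregates_for_resale=True,                # the surprise from Jodi's story
    subprocessors=["CloudHost Inc."],
)
for f in flow.flags():
    print(f)
```

Writing the answers down this way makes the daisy chain visible: a vendor with no flags is a short conversation, while one that aggregates for resale or shares with sub-processors is exactly where the drilling-in should happen.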


Yeah. And actually, to circle back to you, Justin: is it enough to have that human conversation? Or, circling back to your point about segmenting networks, maybe that human is going to say, "Yeah, we segmented our network; our network is fine, totally secure." Should companies go a step further and, depending on their size, have a professional from their team look under the hood, so to speak, at that third-party company to make sure? And you see where I’m going with this. If I come to you and I want to do business with you, and I say, "You know what, John, I want to have my third-party vendor come onto your network and snoop around," what is your answer going to be? There’s the challenge, I guess: I don’t want to do that.


Yes or no? Probably not, right? And that’s not even the right answer. The answer is, hell no, I’m not giving you access to my network. Now, in certain circumstances, if it’s a large company dealing with a smaller vendor, the large company might have the leverage to get that done. But practically speaking, what really happens is that your third-party vendor contract comes into play: the cyber insurance that you make them carry, the actual obligations that underpin what Jodi and I have talked about. What you’re now seeing in a lot of commercial contracts is a data privacy and security addendum, where a lot of these issues in a third-party contract get addressed, for exactly the reason you’re talking about: getting access to somebody else’s network is not easy. You might be able to do some security testing on their public-facing network, like their website and whatnot. So what companies end up doing is they have a contract, and they have requirements about cyber insurance, because there are some limitations.


Makes sense. That makes sense. As we wrap up this conversation, any further thoughts on either the security side or the privacy side as it pertains to third-party vendor agreements?


So I would add that I think a lot of times people assume the small SaaS company behind whatever cool tool they downloaded off the internet is no big deal. But any time you have data going anywhere, you want to understand who that company is, and you should read their privacy notices and practices. Some of the bigger companies also have security certifications that they have to go through, so that gives you a little bit of comfort. But I guess I would leave it at this: any time you’re sharing data, no company is so small that you can discount it. And on our side, from a data breach point of view, there’s certain data that’s included in the definition of a data breach per the law. But at the same time, from a privacy point of view, there might still be data that’s not in that fancy definition but that still counts, and you want to be paying attention to that. So don’t only pay attention to the security definition, and don’t only pay attention to the big companies; the little guys and all the personal data count.


Justin, any final thoughts? I guess what I would add is this: well, I’m a small company, no one’s going to target me, I’m not a problem to anyone. How many times have we heard that? A lot. And actually, you’re a prime target precisely because you’ve done nothing, so attackers will go after you. Sometimes I phrase it just like that. What I’m going to add, John, is that no company is too small to be thinking about this. In my view, there remains a real gap between what companies say they’re doing about privacy and security and the reality of, hey, we just want to get out there, get the technology implemented, and start making money off of this. There’s still a huge gap between the importance people attribute to what Jodi and I are talking about and the actions that actually get taken. I mean, don’t you still have people who say, what about that GDPR thing? It doesn’t apply to me; I don’t have to worry about it. And you have to explain to them: oh, but you do need to care.


Right. Great. Red Clover Advisors is the name of the company. Jodi, where can people go to learn more about you and the work that you do? Yeah, come check us out at redcloveradvisors.com. You can send us a message at info@redcloveradvisors.com and visit us on LinkedIn or Facebook. All right. Great. Thanks, everyone. Thanks for listening to the She Said Privacy, He Said Security podcast. If you haven’t already, be sure to click subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Justin Daniels

Justin Daniels is a cybersecurity subject matter expert and business attorney who helps his clients implement strategies to better manage and recover from data breaches. As outsourced general counsel for Baker Donelson, Justin advises executives on how to successfully navigate cyber business and legal concerns related to operations, M&A, incident response, and more.

In 2017, Justin founded and led the inaugural Atlanta Cyber Week, where multiple organizations held events that attracted more than 1,000 attendees. Justin is also a TEDx and keynote speaker and the co-host of the She Said Privacy, He Said Security podcast with his wife, Jodi.


Here’s a glimpse of what you’ll learn:

  • Justin Daniels talks about helping a real-world testing facility design and implement their cybersecurity plan
  • Potential privacy and security issues that modern companies and cities face
  • How the public’s expectations around their private information are shifting
  • The benefits and liabilities that companies need to be aware of when collecting data
  • Why should companies invest in cybersecurity protection?

In this episode…

As technology continues to advance, so does the risk of cybersecurity and privacy issues. Now more than ever, companies need to design and implement cybersecurity plans to protect themselves from potential risks and data breaches.

According to Justin Daniels, a cybersecurity thought leader, you don’t have to forgo innovation to keep your company safe from cybersecurity and privacy issues. So, how can you protect yourself from these risks without missing out on cutting-edge technology?

Tune in to this episode of She Said Privacy, He Said Security as Justin Daniels, a cybersecurity subject matter expert and business attorney, is joined by John Corcoran of Rise25 Media. Justin dives into the potential security and privacy issues that companies need to know about today. He also talks about creating a cybersecurity plan for a smart city testing facility, how to stay aware of what data your company is collecting, and the importance of investing in cybersecurity protection. Stay tuned!

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.