
Jodi Daniels  0:02  

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:17  

Hello, Justin Daniels here. I am a shareholder at the law firm Baker Donelson, passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:37  

And this episode is brought to you by... where’s my drum roll? Hello? Silent today. That was a terrible drum roll; I can’t speak today. Well, this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there is greater trust between companies and consumers. To learn more and to check out our new best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. You’re not gonna be silent the whole episode today, though, are you?

Justin Daniels  1:24  

Why? You seem to like that.

Jodi Daniels  1:27  

Well, it wouldn’t really be She Said Privacy, He Said Security. It would just be She Said Privacy with a lot of pauses.

Justin Daniels  1:36  

Well, I’ll try to chime in as I can.

Jodi Daniels  1:37  

Okay, good. Well, today we have Dan Frechtling, who is the CEO of Boltive, a developer of digital security and privacy software and inventor of secret shopper technology for compliance. Prior to Boltive, he was president of the cybersecurity firm G2, which was acquired by TransUnion, and he worked for McKinsey, General Mills, and Mattel, my daughter’s favorite company. He attended Harvard and Northwestern. Well, Dan, welcome to the show.

Dan Frechtling  2:06  

Thanks, Jodi. Thanks, Justin. It’s great to be here.

Jodi Daniels  2:09  

Now, if my daughter was here, she would just want to talk about Mattel and American Girl, but it’s a privacy podcast. So I guess we’ll talk about privacy.

Justin Daniels  2:19  

You know, American Girl is an interesting company from a privacy standpoint, with all the little kids who like it. But we’re gonna pivot and ask: Dan, with your varied background, how did your career evolve to where you’re at now?

Dan Frechtling  2:33  

Well, it was quite an evolution. I had my timing all wrong, by the way, because I worked at Mattel before I had kids. Now I’ve got three boys, so they missed the whole Hot Wheels and RC and action figure era from when I was there. But that was many years ago. I really began as a marketer. What drew me into privacy, like many people, is that you experience something that makes privacy really personal to you. For me, it was a health diagnosis. One day my wife had back pain, and we went through a whole bunch of different research trying to figure out what was wrong. We got a very serious diagnosis, and it turned out it was cancer. We discovered it was actually a rare form of cancer, related to a genetic disorder, and I did an enormous amount of additional research. As I was doing this research, I discovered something very odd: I started to see cancer ads follow me in unexpected places. When I’d go check out sports or entertainment, trying to get away from the medical research I was doing, these ads would follow me. And these ads were so pervasive that even after she passed away, which was about three years ago, I still saw the ads; they were very durable. So that left, for obvious reasons, a very indelible mark on me, and it triggered something in me that I wanted to do to fix that. Well, in one of life’s great ironies, a friend of mine was CEO of a company called Ad Lightning at the time, and his wife was diagnosed with a similar condition, and he needed to step down and take time away from the company. So he recruited me, and I left G2, the company you mentioned a moment ago, and became CEO of Boltive about two and a half years ago. So that’s my story. That’s what drew me into privacy. It was an opportunity to right a wrong that I had experienced and also to help out a friend. And Boltive, as the company is now named, has, as you mentioned, commercialized secret shoppers for privacy compliance, to help make sure that these kinds of privacy transgressions don’t happen when companies don’t want them to.

Jodi Daniels  4:51  

You know, a lot of times we’re talking about the issues and the creepy factor in advertising, and I always used to say that, just over a decade ago: you know, what was too creepy, and where was the line for people? The story that you just shared is, unfortunately, an example of that. And, you know, I’m sorry that you had to go through that, and I appreciate that you’re able to share that story so people can understand that it’s real. It’s not just theory, it’s not just academic. It’s actual people having to relive and experience something. Here in the health space, I’ve seen and heard similar stories with other medical conditions: you might be researching for somebody else, and then people think that it’s actually you, and then they’re using that data, perhaps in an insurance situation, or targeting you completely inappropriately. With that being said, there’s obviously a lot of privacy challenges in the advertising space. Can you share what you see as the privacy challenges happening?

Dan Frechtling  5:56  

Yeah, it’s really fascinating, because a lot of what you experience with these privacy errors is inadvertent. It has to do with the difficulty of orchestrating so many vendors in online advertising and ecommerce in general. If you look at the average martech stack, there are 50 vendors plugged in, and on many pages there are hundreds of tags and cookies. And if you’re a media company, you’ll deal with 20 or more supply-side platforms. So it becomes this sort of nth-party problem for what we call digital objects, which are those on-page scripts and beacons and pixels, and then the online advertising, which can come in through various methods like iframes. And when you have third parties sharing data, you’re then triggering fourth, fifth, and sixth parties, and the risk of data leakage increases exponentially, because if any step along the chain gets privacy wrong, we’ve discovered that consent can get dropped. In addition, what’s true one day may not be true another day, because it’s an ecosystem of vendors that are shifting and changing as they work together, as they partner together. So, in summary form, it has to do with the interplay of consent, digital objects, and ads, and with this layering of vendors in a world where you’ve got an ecommerce layer on many websites, a martech layer, an adtech layer, and now a privacy layer with consent management platforms and other technologies. It’s quite complex to orchestrate.

Jodi Daniels  7:33  

Many people think adtech is one of the most challenging areas in privacy, and many people really don’t understand all these tags. You mentioned fourth, fifth, and sixth parties. Can we break that down even further? Can you help explain to someone listening: what is a fourth-, fifth-, or sixth-party tag?

Dan Frechtling  7:56  

Yeah, certainly. Let me talk about it first with the tags, and we can talk about ads as well. Tags can act as containers, and they can contain other tags. Think of a matryoshka, a Russian nesting doll, where you’ve got tags within tags. But instead of just being one doll, there are multiple offspring, and because of this there’s not just a single line but a multitude of branches, almost like a tree. Containers can contain other containers. So when you visit a page and you trigger a tag to fire, it can trigger other tags that are hidden, that are not visible, that as a business you may not even be aware of. And it can go in multiple layers, in multiple waves. So that’s where there’s a challenge around digital objects. What adds even more complexity is when cookies are dropped. Many times the cookies are cryptic. Some are straightforward, like Facebook’s _fbp: if you see that, it’s a little bit easier to understand that FB might stand for Facebook. But when you get into other cookies, like the _uetvid cookie that Microsoft uses for targeted ads, it becomes quite cryptic. So you don’t even know what’s on your own page.
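
To make the nesting-doll picture concrete, here is a minimal sketch of how a single tag can fan out into a tree of downstream parties. The vendors, URLs, and depths are purely illustrative assumptions, not a description of any real page or of Boltive’s tooling.

```typescript
// Illustrative sketch only: a tag that acts as a container, loading further tags,
// so one visible tag fans out into third-, fourth-, and fifth-party scripts.

interface Tag {
  vendor: string;   // who serves the script (hypothetical names)
  url: string;      // where it loads from (placeholder URLs)
  children: Tag[];  // tags this tag injects when it fires
}

const pageTag: Tag = {
  vendor: 'Tag manager container (1st party config)',
  url: 'https://tags.example-site.com/container.js',
  children: [
    {
      vendor: 'Analytics vendor (3rd party)',
      url: 'https://cdn.analytics-vendor.example/a.js',
      children: [
        {
          vendor: 'Audience pixel (4th party)',
          url: 'https://pixel.audience-net.example/p.gif',
          children: [
            {
              vendor: 'Data broker cookie sync (5th party)',
              url: 'https://sync.broker.example/s',
              children: [],
            },
          ],
        },
      ],
    },
  ],
};

// Walk the tree and list every party that ends up firing on the page,
// with its depth: the "nth party" distance from the publisher.
function listParties(
  tag: Tag,
  depth = 1,
): Array<{ depth: number; vendor: string; url: string }> {
  return [
    { depth, vendor: tag.vendor, url: tag.url },
    ...tag.children.flatMap((child) => listParties(child, depth + 1)),
  ];
}

for (const party of listParties(pageTag)) {
  console.log(`${'  '.repeat(party.depth - 1)}[depth ${party.depth}] ${party.vendor} -> ${party.url}`);
}
```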

Jodi Daniels  9:12  

Thank you for sharing.

Justin Daniels  9:15  

So let’s talk a little bit about Boltive. I love this idea of the secret shopper, because I know that’s what companies use to go in and see how their employees are doing. So it sounds like you are taking this concept to say, hey, how can we figure out how a company is doing with its privacy compliance? So maybe you could talk a little bit about how Boltive is being the secret shopper for compliance.

Dan Frechtling  9:40  

Yeah. So we developed technology that was initially used for security, because we had to trigger malicious payloads from bad actors who were hiding malicious browser extensions and redirects and other things within online ads. So over the years, we had to develop a way to look as much as possible like real users, with real locations, devices, behavioral history, all of those things. And we found that that technology applied very well to privacy, because it allows us to visit pages as real users do and forensically capture everything that’s going on, like a smoke test for the consent systems, for the digital objects, and for the ads we were talking about a moment ago: to see, when somebody opts out, are the right objects being suppressed, and are the ads that are not supposed to be retargeted being suppressed? Or are they being enabled due to some error along the way? Being able to do that also allows us to do the flip side: when users opt in, they’re willing to share their data, so the tags and ads should be following them in line with their consent. So it’s a bit of a smoke test that works both ways. But because it’s not intrusive, it’s a low-traffic way to do it, and because it’s machine-driven, we can capture everything that’s going on, and then dial forward, dial back, and see, hey, if there was a data leak of some kind, where did it happen, and how can it be corrected?
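
As a rough illustration of the kind of opt-out smoke test Dan describes, here is a hedged sketch using Playwright: it browses like a user, opts out via a hypothetical consent-banner selector, and then checks whether requests to an illustrative list of tracker domains still fire. It is a simplified assumption of the approach, not Boltive’s actual implementation.

```typescript
// npm install playwright && npx playwright install chromium
import { chromium } from 'playwright';

// Domains treated as "trackers" for this sketch: an illustrative list only.
const TRACKER_DOMAINS = ['doubleclick.net', 'facebook.com', 'bing.com'];

async function smokeTestOptOut(pageUrl: string, rejectButtonSelector: string) {
  const browser = await chromium.launch();
  const context = await browser.newContext();
  const page = await context.newPage();

  // Capture only the requests fired after the visitor has opted out.
  const postOptOutRequests: string[] = [];
  let optedOut = false;
  page.on('request', (req) => {
    if (optedOut) postOptOutRequests.push(req.url());
  });

  await page.goto(pageUrl, { waitUntil: 'networkidle' });

  // Simulate the user rejecting tracking in the consent banner.
  // The selector is hypothetical and depends on the CMP in use.
  await page.click(rejectButtonSelector);
  optedOut = true;

  // Revisit the page as an opted-out user and see what still loads.
  await page.reload({ waitUntil: 'networkidle' });

  const leaks = postOptOutRequests.filter((url) =>
    TRACKER_DOMAINS.some((domain) => new URL(url).hostname.endsWith(domain)),
  );
  console.log(leaks.length ? 'Possible consent leaks:' : 'No tracker calls observed.');
  leaks.forEach((url) => console.log(' ', url));

  await browser.close();
}

smokeTestOptOut('https://www.example.com', '#reject-all-button').catch(console.error);
```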

Jodi Daniels  11:15  

What does that look like? So who’s typically the person who’s reviewing the findings that you have?

Dan Frechtling  11:24  

We find it’s a hybrid. Compliance and privacy groups are stepping up and have more of this capability on their audit list, so we’re dealing with legal departments, with data protection officers, with people on the more traditional privacy side. But just as much, in line with what we’re talking about today, it’s a marketing problem too. So we have this hybrid where half of our users are on the compliance side and half are on the marketing side, because they want to make sure that their campaigns are compliant and that they’re getting transparency into what other vendors are doing. And we find we can be a bit of a bridge, because there’s a lot of nagging that goes on between compliance groups and marketing groups when there’s a lack of visibility into, well, how many pixels and tags and beacons do we have on a page? Who are our ad vendors? And who are we working with today versus last week? Being able to have that common, single place to log in and see exactly what’s happening keeps the compliance people from having to nag the marketing people to tell them what’s going on, and it frees up the marketers’ time from having compliance people on their backs.

Jodi Daniels  12:35  

I was recently talking to some chief privacy officers, and they were complaining about that exact thing. The issue of knowing what’s actually there, and trying to get marketing to pay attention or do anything, is a big deal. One of the pieces I’m curious about: you mentioned many companies now have some type of consent management software that they’re using. How does what you’re doing connect or interact with those consent platforms? One of the things I’m thinking about is, you might have a list that scans, here’s all the different cookies, and I also might have some type of cookie consent scanning that’s running, and they may very likely not actually match. What happens then?

Dan Frechtling  13:21  

Well, yeah. Overall, we find that a lot of companies are inadvertently sharing consumer data. And when we talk to senior privacy leaders, they often say they’ve made these investments in consent management platforms, and they’re doing cookie scans and other forms of privacy tech to try to comply with regulations. But occasionally they say they just feel like they’re not ready, whether it’s the July 1 deadline that recently passed or the deadlines that are going to be cascading at the end of this year and into next year, when enforcement starts and cure periods end. They really discover it’s hard to orchestrate company and vendor data sharing. So it’s a bit of a trust-but-can’t-verify state, if you remember the Cold War mantra. What if there’s something wrong we don’t know about? What if something does go wrong and we need a paper trail? So we are an auditing and monitoring system that runs constantly to make sure that the consent management platform is working as intended, that the tag managers are working as intended, and that the ad vendors are working as instructed. That’s really where the secret shopper technology is most powerful, and it’s something we developed under the guidance of the co-author of CCPA and CPRA. We aren’t lawyers, but we try as much as possible, without giving legal advice, to map to the state requirements around the areas that we’re talking about with consent.
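
One concrete way to picture the mismatch Jodi raises (a declared cookie inventory drifting away from what a page actually drops) is to diff the two. The sketch below, again using Playwright, compares an assumed declared list against the cookies observed in a real browser context; the declared names and the site URL are placeholders, not real audit data.

```typescript
// npm install playwright && npx playwright install chromium
import { chromium } from 'playwright';

// Cookies the cookie scan / CMP report says should exist.
// Purely illustrative inventory, not a real scan result.
const DECLARED_COOKIES = new Set(['session_id', '_fbp', '_uetvid']);

async function auditCookieInventory(pageUrl: string) {
  const browser = await chromium.launch();
  const context = await browser.newContext();
  const page = await context.newPage();

  await page.goto(pageUrl, { waitUntil: 'networkidle' });
  await page.waitForTimeout(5000); // allow late-firing tags to drop cookies

  // What the browser actually ended up with after the page and its tags ran.
  const observed = await context.cookies();

  // Anything set in the browser but missing from the declared list is the
  // kind of drift between the scan and reality discussed above.
  const undeclared = observed.filter((cookie) => !DECLARED_COOKIES.has(cookie.name));
  for (const cookie of undeclared) {
    console.log(`Undeclared cookie: ${cookie.name} (domain: ${cookie.domain})`);
  }

  await browser.close();
}

auditCookieInventory('https://www.example.com').catch(console.error);
```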

Justin Daniels  15:00  

Hmm. Similar to what your company’s mission is, can you talk a little bit about the intersection between protecting your brand, building customer trust, and privacy? Hey, maybe that could be a new triangle for Red Clover. There we go.

Dan Frechtling  15:21  

It’s really interesting when you start to put boundaries on commerce and on marketing. There’s an analogy that was in Harvard Business Review: that privacy is to the digital age what product safety was to the industrial age. You once had cars with no seatbelts, no airbags, not even headrests, but now you expect those things. You expect cars to be safe, you expect medicine to have safety seals, but there was a time when none of those things existed. It’s a combination of grassroots consumer pressure, laws and regulation, and marketing positioning: companies like Volvo taking a safety position, or Tylenol, or Apple in privacy. At the same time, there’s a segment of consumers who care about privacy, and it’s about a third. There are various studies (BCG, Google, and Cisco have done studies), but the segment that cares about privacy and reads privacy notices is about a third. And it’s going to keep growing based on demographics, because it’s 40-plus percent of those 25 to 44 and only 14% of those over 75. So when you see the demographic change, and you see more people caring about privacy, I really believe it’s going to follow what we saw in product safety, especially as people who shop online more, and are more affluent, do care more about privacy, and they’ll switch companies for privacy reasons. So I do see it’s more than rhetoric; there really is a link between brand protection and privacy.

Justin Daniels  17:05  

So you mentioned something interesting when you talked about product safety. When we’re talking about the privacy tech and the adtech: I worked on my first large-scale AI deployment last year. So now, let’s say you’re a company and you have interest-based marketing, where you’re able to get data about Justin and say, oh, Justin really wants an ad for a Gore-Tex rain jacket, because he clearly likes to be outdoors. Now onto the scene comes artificial intelligence. The other thing you mentioned in your comment was consumer education and laws. But now we’re going to bring in this whole new technology that has really interesting implications in the privacy and adtech space, with none of that. So I’d love to hear from you, and Jodi: how can you allay my concerns that this will just be social media bad on steroids for the privacy and data collection folks?

Dan Frechtling  18:09  

Yeah, well, it’s sort of beyond my capabilities to answer that one, and I’m curious what Jodi has to say. I would just say that consumers need to be protected. You’re not going to have the means, like you do as a consumer today, to delete apps, or delete browser extensions, or use a backup email address, or clear cookies. These things that some users have the control to do are impossible in a black box of AI, when it’s well beyond anything that you can toggle on or toggle off. So I think the answer has to be, and we’re encouraged to see, that laws and regulations are coming in to try to govern that, even though it’s a very complex, arcane area to understand.

Jodi Daniels  18:56  

Wait, that’s all you got? Like, that’s all you got?

Justin Daniels  19:02  

Oh, okay. All right. Well, chalk that up to the Justin question being too complicated. I apologize to our guest and my co-host.

Jodi Daniels  19:12  

We’ve talked about it before. Honestly, I would hearken back to, Dan, what you said in terms of the comparison to when we had nothing from a product safety standpoint, and then it just became common and expected. There is no regulation right now, and as a result you’re going to have a big Wild West with these tools and technology. Regulation will attempt to catch up. Someone will do something that they shouldn’t have, and then, depending on what that is, you’ll have a consumer outcry, and that will educate more people to care, and we’ll have the same kind of spinning-wheel situation going forward. How quickly that will happen, I think, depends on how pervasively AI is used and on whichever company is going to do something that they shouldn’t have.

Justin Daniels  20:05  

So I’m gonna summarize Jodi’s answer as: history is likely to repeat itself.

Jodi Daniels  20:10  

Well, look at that. You just summarized my long answer into a short one.

Justin Daniels  20:17  

I guess, going into the next question, it’s just interesting, when you come on to talk about adtech and thinking through these fourth parties, fifth parties, and sixth parties. Like, you know, when we were out over the weekend, somebody took a picture at a kiosk, and they wanted to download the picture, and so they gave their phone number, and I have no idea, where does that go? And how will they be served ads? So I’m just curious, Dan, from your perspective: if you’re just a consumer out there, how do you start to think about even understanding the different ways your data could be collected and used to serve you ads in ways that you hadn’t ever thought possible?

Dan Frechtling  21:01  

Yeah. Well, I do love the line that history does repeat itself. Before I answer that question: I think that’s a very profound statement. If you look at the different media that have evolved over time... well, that was really summarizing what Jodi said, so you can take it. But it’s true: every new medium has its own privacy issues. Display ads, video, the metaverse, each has its own privacy issues, and now we’re seeing it with AI. So, as a consumer, I think you just take a moment to be aware of what you’re doing, of when you’re giving consent. Just take a moment, right? It’s quite easy now with technology to give consent or withdraw consent, but the knee-jerk reaction often is, I just want to get to my content, I’m going to accept all. So consumers starting to be aware that they do have choices is, I think, the beginning. And once you’re aware of those choices, you can progressively take that as far as you want. Do you want to use a privacy-enhanced browser? Do you want to do the same with your apps and turn off app tracking in all cases? Just being aware and knowing that you have that choice now, by law and by technology, I think is the best advice a consumer can use to be safe.

Jodi Daniels  22:32  

With companies wanting to have a well-thought-of brand when it comes to privacy, what are the best practices for the companies you see doing a good job at achieving privacy and still being successful in marketing?

Dan Frechtling  22:50  

Yeah, and while still being successful in marketing is the right way to put it. That’s important, because this isn’t intended to be overreach. We don’t want the pendulum to swing so far that marketers can’t communicate and people can’t discover products that they’re interested in, because the surveys showing people care more about privacy still show that people want tailored, personalized advertising. But one thing companies can do, and it’s increasingly possible, is use technology to monitor technology. With not just the layers we talked about before, commerce and martech and adtech and privacy tech, but also all the different configurations companies have to be ready for, different operating systems and browser types and device types, and of course different jurisdictions, it’s overwhelming to try to solve this problem through annual or quarterly audits. It just can’t be done. It takes technology. And so we find that our clients use our personas, our secret shoppers, in clever ways. They’ll use them to look for sensitive health information. They can use the personas to emulate children, to make sure that data protection for children, which has gotten a lot of attention this year, is being honored. They’ll look for video pixels to protect themselves against VPPA lawsuits. And, most importantly, they’ll find vendors that they don’t know they’re sharing data with. This is true on websites, and the same practice of using technology to monitor technology is true in apps also.
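
For readers who want a feel for what personas might look like in code, here is a hedged sketch: two made-up browser profiles (device, locale, location) that visit a page and record every host contacted, so unfamiliar vendors stand out against a known-vendor list. The profiles, the target URL, and the simple host logging are illustrative assumptions; real secret-shopper personas, including anything emulating a child’s profile, involve far richer signals than shown here.

```typescript
// npm install playwright && npx playwright install chromium
import { chromium, devices } from 'playwright';

// Made-up persona profiles: device, locale, and location only.
const personas = [
  {
    name: 'California iPhone shopper',
    options: {
      ...devices['iPhone 13'],
      locale: 'en-US',
      geolocation: { latitude: 34.05, longitude: -118.24 },
      permissions: ['geolocation'],
    },
  },
  {
    name: 'German desktop reader',
    options: { locale: 'de-DE', timezoneId: 'Europe/Berlin' },
  },
];

async function runPersona(pageUrl: string, persona: (typeof personas)[number]) {
  const browser = await chromium.launch();
  const context = await browser.newContext(persona.options);
  const page = await context.newPage();

  // Record every host contacted while this persona browses, so vendors the
  // site owner does not recognize stand out when reviewed later.
  const hosts = new Set<string>();
  page.on('request', (req) => hosts.add(new URL(req.url()).hostname));

  await page.goto(pageUrl, { waitUntil: 'networkidle' });
  console.log(`${persona.name}: contacted ${hosts.size} hosts`);
  console.log([...hosts].join('\n'));

  await browser.close();
}

(async () => {
  for (const persona of personas) {
    await runPersona('https://www.example.com', persona);
  }
})().catch(console.error);
```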

Jodi Daniels  24:29  

Yes, you’re looking at me strangely, for anyone who can’t see me.

Justin Daniels  24:33  

I just find it interesting. I’m just curious, for the benefit of the audience: do the two of you know where the origin of the VPPA law came from?

Jodi Daniels  24:44  

Why don’t you enlighten everybody?

Dan Frechtling  24:47  

Robert Bork. Is that the story? Yeah.

Justin Daniels  24:49  

That’s it, Dan. I think he’s got ESP; he might. I bring it up because it’s interesting that that law came out of his confirmation process, back in the day when you would actually rent videos at places like Blockbuster. They were trying to find out what kind of videos he was renting, so they passed a law. What’s interesting to me is that I think it’s coming back in vogue, because guess what kind of right of action it has: a private right of action, which is of interest to personal injury attorneys.

Jodi Daniels  25:28  

And there are all kinds of class action lawsuits out there right now. Right before our podcast recording today, I was talking to a company about whether they have any of these kinds of pixels in their video, and we had an entire discussion about trying to understand what they have, where it is, and what to consider.

Dan Frechtling  25:44  

It is fascinating how laws get passed, and then decades later they become reincarnated in forms that the writers never imagined. What I love about that Robert Bork case: of course, he was never confirmed, and who knows if his video history had anything to do with that, but it certainly galvanized the Congress, who would have said, well, it’s okay to do that to him, but I don’t want that done to me. So let us pass a law so we can protect ourselves from getting outed by our video history.

Justin Daniels  26:15  

Well, you make an interesting point about how Congress passes laws, and maybe this is of interest for privacy. I was reading an article. Unfortunately, for those of us with young kids, one of the leading causes of death is swimming accidents. And apparently, what happened back in the ’80s is that a congressional person’s granddaughter died in a swimming accident because she got sucked into something under the pool. So they passed a law requiring a new design for pools so that could never happen again. So it’s almost like, well, if one of the members of Congress had their identity stolen or something like that, maybe they might think differently about these laws that your technology helps us manage.

Jodi Daniels  27:07  

Interesting point. Well then, with all the knowledge that you have, and you have shared some really great tips so far, we ask everybody: what is your best privacy tip if you are hanging out socially?

Dan Frechtling  27:20  

If I were hanging out socially... as we talked about a little bit, I’ve got three boys, so the hanging-out-socially part of my life is kind of in the past, and maybe in the future. But if I were at a cocktail party...

Jodi Daniels  27:30  

You might be hanging out with, you know, their friends and their friends’ parents at a ballgame or a birthday party.

Dan Frechtling  27:39  

Okay, imagine, if you will, a ballgame. Then I think, for companies... for me, personally, I had to flush out a lot of misconceived notions. As a marketer at Mattel and General Mills and other places, I wanted to collect as much data as technically possible, right? That’s what I wanted to do. That’s how I wanted to be successful. I had to flush a lot of that out. As we said a moment ago, privacy is not intending to suppress or snuff out marketing; it’s intended to make it more trustworthy. So I think for companies: before you collect data, stay up to date on the rules, the state regulations, the federal guidance (FTC and OCR in particular), and the class action outcomes that we mentioned just a moment ago. Then think deeply: do you need all the pixels and the tags and the cookies and the SDKs in your apps, and all the data partners that you have? Is there a less intrusive method, and is there really a purpose, before you collect? And then, if you must collect the data, once you go through that process, it’s really important, as we see out of California rulemaking, to audit your vendors, to perform due diligence on those that you’re in contract with, to check your own consent systems to make sure they’re transmitting opt-outs properly, and then to check the other aspects, the digital objects, the ads, the supply-side platforms, the ad networks, to make sure that they’re passing consent and honoring it properly down the chain.
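
Dan’s last point, checking that opt-outs are passed and honored down the chain, can be spot-checked by looking at the consent parameters that downstream ad calls carry, such as the IAB us_privacy or gdpr_consent strings. The sketch below is a simplified assumption of that check: the consent-banner selector, the ad-request filter, and the target URL are placeholders, and a real audit would also need to cover headers, cookies, and server-side sharing.

```typescript
// npm install playwright && npx playwright install chromium
import { chromium } from 'playwright';

// Spot-check whether downstream ad/measurement calls carry an opt-out signal,
// e.g. an IAB us_privacy string (such as "1YNN") or a TCF gdpr_consent string.
async function checkDownstreamSignals(pageUrl: string, rejectButtonSelector: string) {
  const browser = await chromium.launch();
  const context = await browser.newContext();
  const page = await context.newPage();

  const adRequests: URL[] = [];
  page.on('request', (req) => {
    const url = new URL(req.url());
    // Very rough, illustrative filter for ad and measurement traffic.
    if (/ads|pixel|sync|track/i.test(url.hostname + url.pathname)) adRequests.push(url);
  });

  await page.goto(pageUrl, { waitUntil: 'networkidle' });
  await page.click(rejectButtonSelector); // hypothetical CMP "reject all" button
  adRequests.length = 0;                  // keep only calls made after the opt-out
  await page.reload({ waitUntil: 'networkidle' });

  for (const url of adRequests) {
    const signal = url.searchParams.get('us_privacy') ?? url.searchParams.get('gdpr_consent');
    console.log(
      signal
        ? `Signal present (${signal}) on ${url.hostname}`
        : `No consent signal on ${url.hostname}${url.pathname}`,
    );
  }

  await browser.close();
}

checkDownstreamSignals('https://www.example.com', '#reject-all-button').catch(console.error);
```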

Jodi Daniels  29:12  

Great tips. Thank you for sharing.

Justin Daniels  29:15  

I think the especially interesting part of what Dan said is his own personal transformation, because I’m still routinely advising clients. If you don’t need the data, you don’t want to get it because now you’re potentially creating a big liability. So it’s interesting to hear how someone in the industry has evolved their thinking.

Dan Frechtling  29:35  

Yeah, I mean, database structures were routinely read-and-write and not delete. Consumer databases were growing, intended to grow, with more and more data, because you could, because storage was cheap, and that led to a lot of the abuses and excesses that we’re unwinding today.

Justin Daniels  29:52  

So, Dan, when you’re not building Boltive, what is it you’re doing to have fun?

Dan Frechtling  30:02  

I think it’s trying to keep up with my boys. You know, I’m slowly exiting the era of having to drive three kids to various activities, as my oldest has now gotten his license and is able to pick up some of that. But yeah, working and raising kids is kind of the 24 hours a day, leaving some time for sleep. Other than that, that’s what I will say keeps you young. And having three boys means that I have to stay active, so there’s some goodness in that too, from a health standpoint.

Jodi Daniels  30:33  

I hear you on the working and parenting thing as the mass consumption of time, and in between there’s some sleep, maybe some exercise, and some eating.

Dan Frechtling  30:44  

Right. And chores. Chores have to be done.

Jodi Daniels  30:46  

Well, oh yeah, there are those too. Dan, it’s been so much fun. Where can people connect and learn more?

Dan Frechtling  30:53  

Yeah, visit our website, Boltive.com. And I love to talk about privacy on LinkedIn, so just find Dan Frechtling on LinkedIn and connect with me, and let’s talk privacy. Those are the places I hang out digitally.

Jodi Daniels  31:11  

Wonderful. Well, we will be sure to include those in the show notes. And thank you so much for joining us today.

Dan Frechtling  31:17  

It’s been a real pleasure. Thank you both for allowing me to be a guest.

Privacy doesn’t have to be complicated.