
Intro 0:01  

Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

 

Jodi Daniels  0:21  

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

 

Justin Daniels  0:37  

Justin Daniels here. I’m a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as manage and recover from data breaches.

 

Jodi Daniels  0:56  

And this episode is brought to you by my company, Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, media, and professional services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, visit redcloveradvisors.com.

 

Justin Daniels  1:30  

Today is super fun. Yes, I see you’re very perky.

 

Jodi Daniels  1:33  

Today I am. I had my coffee. Okay. But it’s a two-for-one special.

 

Justin Daniels  1:41  

That it is. We’re like Connect Four today.

 

Jodi Daniels  1:45  

It’s Connect Four, except in our three boxes. But there you go. So we have very special guests from the amazing Perkins Coie law firm. Yes, we do. You’re gonna kick us off. Yes.

 

Justin Daniels  1:58  

We have David Biderman, who leads Perkins Coie’s consumer products and services litigation practice and its food litigation industry group. David has had an active litigation practice in federal and state courts throughout the country for 39 years. He has litigated some of the most consequential food litigation matters in this emerging field and has established precedent on preemption, the reasonable consumer doctrine, and consumer protection law.

 

Jodi Daniels  2:24  

And we are joined by my longtime friend Dominique Shelton Leipzig. She is known for asking deep questions that provoke transformative answers, and she is the chair of the Global Data Innovation Team and co-chair of Perkins Coie’s Ad Tech Privacy and Data Management practice. She focuses on privacy, data strategy, leveraging data, and avoiding litigation. And when you need an expert, you want the person who wrote the book: her landmark book on the CCPA has published a second edition, so make sure you get it. She pioneered the concept of data as a pre-tangible asset and what she calls our post-data world. She has trained over 18,000 professionals on the CCPA, is a member of the IAPP board, and is a leading advocate for data ethics. We are so excited to have both of you here today. Wonderful. So I want to let you all decide who goes first, but we always like to know: how did you find your way to the path to privacy and all things data-related?

 

David Biderman  3:34  

Dominique always goes first.

 

Dominique Shelton Leipzig  3:39  

Well, first of all, great to be here, Jodi, Justin. You know, a big part of my path through privacy has been working and collaborating with Jodi, both as a client contact and then with your consulting firm as someone we’ve collaborated with. So just a plug for how amazing Jodi and Red Clover are. My journey really started in 1998. I went to a law firm that did really well, I guess, for a tiny firm. We were only 70 attorneys, 35 partners, and I was the fifth associate in Los Angeles. But one of our associates had left and gone to a new company at that time called Yahoo!. And he called one of our trial attorneys, Michael Kahn. The firm was Folger, Levin and Kahn: Peter Folger of the Folger coffee family; John Levin, whose father Jack Levin was very active in LA business; and Michael Kahn, a renowned trial attorney. And so when Yahoo! ran into issues, John, who eventually became head of litigation and then General Counsel, reached out to Michael Kahn. I guess we didn’t really think of it back then as privacy litigation; it was just what everybody in the firm did. This was literally one of our top clients. And so I grew up with privacy, ad tech, and litigation being completely intertwined in my mind. I continued that on to another New York law firm that was a bigger platform (I was at Folger for about 10 years), and I just continued the digital practice, founding the privacy practices of three other law firms in Los Angeles, up to where I am now, working with David. I’m no longer litigating; I haven’t litigated in about nine years. But I do sympathize with and understand completely what the litigators have to go through in terms of defending clients. And so that’s what I try to deliver on the counseling side: help the clients get as close to compliance as they can in the inevitable event that they are sued.
There’s hopefully less work for David’s team that way. But I’ll turn it over to David.

 

David Biderman  6:14  

Thanks a lot. Hey, it’s good to see you all this morning. Thanks for inviting us. Yeah, my journey was not as direct as Dominique’s. As you said, it’s been 39, almost 40 years (40 ugly years) practicing law, so I’ve had to reinvent myself many times. I’m a class action litigator, and I’ve litigated class actions. I actually litigated some cases with Dominique’s old firm; we were on the same side of the “v” when we represented another platform years ago. Dominique, you probably don’t remember, maybe it was before your time. There was one about gambling sites on the Internet. Remember that one?

 

Dominique Shelton Leipzig  6:54  

I remember that. I wasn’t on that case, but I remember when the firm handled it; it was a big deal for us.

 

David Biderman  6:59  

Yeah, yeah. Well, we actually tried that case anyway. So I’ve just had to reinvent myself constantly, and when Dominique joined the firm, I sort of felt a kindred spirit. And then Dominique mentioned to me, hey, you know, there’s a statute that’s going to apply in California, and it’s got a private right of action, and it’s got statutory penalties. And I thought, wow, there’s gonna be a lot of litigation involving that. So Dominique and I started collaborating, thinking about ways this statute, the CCPA, would actually result in litigation. And now we monitor CCPA litigation on a weekly basis and publish an annual year-in-review, to keep an eye on how this statute is becoming realized in the courtroom. And it’s been very interesting. Unless I’m missing something, it’s just the beginning, because you combine a private right of action and statutory damages, and that’s a recipe for class action lawsuits.

 

Justin Daniels  8:07  

You’re looking at me.

 

Jodi Daniels  8:08  

Hey, talking about reinventing himself, I feel like that’s what you do all day long.

 

Justin Daniels  8:15  

It depends on the day.

 

Jodi Daniels  8:17  

You’ve gone through many iterations as well.

 

Justin Daniels  8:19  

I have. I’ve been a corporate attorney, which I still am, but I’ve also dealt with autonomous vehicles, drones, and the latest iterations: blockchain and NFTs. So David, I understand where you’re coming from.

 

Jodi Daniels  8:37  

I appreciate that. David, you talked about the individual private right of action, and that this is just the beginning. Share a little bit more about what you’re seeing out there. What are the issues people are finding and trying to litigate, and where do you think it’s headed?

 

David Biderman  8:58  

Yeah, well, you know, I think Dominique represented the Chamber as that bill was being drafted, and the legislature was very careful to make clear that the only private right of action was based on essentially what’s called an exfiltration, or a data breach. And it went so far as to say you really can’t use this statute as a hook under some of our other consumer protection statutes, like we often do. In other words, you can’t say you violated the CCPA and therefore you violated what we would call the unfair competition law. But what we are still seeing, less now than before, is plaintiffs’ lawyers trying to test the bounds of what in fact is an exfiltration, or where in fact a private right of action may be available under the CCPA. That’s number one. And I think that’s going to settle out: the statute is so clear that plaintiffs are going to be unable to get those statutory damage hooks without actually finding an exfiltration. But the law is still a little bit unclear on that, and that’s where we saw the first wave of lawsuits: let’s test the bounds of that. There have been some decisions handed down that are fairly uniform in saying you can’t expand the right. But still, there are other state statutes, on human rights, privacy, unfair competition laws in general, that could protect you if you suffered some kind of damage. And as you know, the issue oftentimes in these privacy lawsuits is whether there has been any damage. The good thing about the CCPA, if you’re a plaintiff’s lawyer, is that damages are statutorily presumed. So that’s one thing we’ve seen: ways that lawyers are trying to find some sort of damage hook outside of the CCPA. And then I’ll shut up on this. This is a little bit wonky and obscure, but it is relevant.
The Supreme Court recently decided a case called TransUnion; y’all may be familiar with that. It basically held that even a statutory violation, in that case of the Fair Credit Reporting Act, without injury doesn’t create an injury for purposes of Article III jurisdiction, which is the prerequisite for suing in federal court. Unfortunately, in our state courts, or I guess fortunately, depending on what side of the “v” you’re on, Article III is not required. So, for example, you could have a violation of the CCPA with no damage, but the violation alone is enough to get you into court, in state court. So what I think we’re going to see is a lot more cases migrating from the federal courts to the state courts, because plaintiffs’ lawyers would prefer to be in state court: state court judges tend to be a little more liberal, and class certification rules are easier. Right now, defendants will try to remove a case from state court to federal court, and plaintiffs’ lawyers are saying, well, now there’s no Article III jurisdiction under TransUnion, and the courts are sending cases back. So I think we’re going to see a lot more CCPA litigation in the state courts, particularly after TransUnion. Long-winded answer, sorry.

 

Jodi Daniels  12:40  

No apologies needed. It sounds like a ping-pong of paper.

 

Justin Daniels  12:47  

So, building on what you’re talking about: obviously there are several states where privacy laws have come back onto the docket this year, and if I recall correctly, in Florida, where that law foundered was on whether or not it would have a private right of action. What are your thoughts, based on what you’ve both seen, as to how a private right of action may or may not make its way into the bills from some of these states, like Washington and Florida, that are going to put privacy back on the docket in their legislatures this year?

 

Dominique Shelton Leipzig  13:18  

So I have one little backstory on the Florida bill. As that was progressing, and it was looking like it was actually going to get enacted, I had some government affairs folks reach out, and they said, Dominique, we understand your firm is keeping a litigation tracker. And so I gave them David’s litigation tracker, to be able to counter the argument the consumer advocates were making that the private right of action would not blow up into a bunch of litigation in areas beyond the data breach parameters. Back then, this was 2021, so we were looking backwards at 2020, and the stats looked like maybe 60% data breach claims and another 40% of issues that had nothing to do with a data breach. And that at least put a little water in the legislature’s wine and gave them pause on the private right of action, and that’s how that bill foundered. I don’t even know if I’ve had a chance to tell David, but that is actually what happened. And those government affairs and lobbyist folks were really happy to have the tracker, which is on our website and lays out the stats, because nobody else is really keeping it across the board. I’ve seen some scattered trackers here and there, but David’s team actually reads every complaint that mentions the CCPA or anything related to California privacy rights, and they document it, so they’ve got a really good handle on it. I think the Florida bill just barely missed, so that is a very close one to watch. I think that one is prime, as well as Washington State, for possible legislation. So, you know, keep your eyes peeled for that. But the private right of action is at the core of all of it.
And I’ll just say one last thing in terms of the federal legislation. I was literally just talking with Cameron Kerry over at the Brookings Institution and the R Street Institute’s Tatyana Bolton. And, frankly, there seems to be a coalescing on the federal level around what can be compromised, potentially, on a private right of action. It used to be that business was like, no way, we just don’t want any private right of action. But now that California is already afoot, there’s some reason to believe that industry might soften a little bit. And if we can keep it within strong parameters, like what the cases have borne out, as David mentioned, that it should just be related to negligent data breach only, there might be an appetite for industry on the federal level to go in that direction. So really interesting, really interesting times.

 

Justin Daniels  16:16  

Dominique, can I ask you both a follow-up question on that? And correct me if I’m wrong, but when it comes to the California private right of action, doesn’t it really narrowly apply when you have a data breach and the data was not encrypted when it should have been, as kind of the prima facie evidence of negligence?

 

Dominique Shelton Leipzig  16:32  

Yep, that is the case. And I’ll just turn it to David to talk about how, last year, before they got sort of stamped down, plaintiffs’ counsel were trying to use things like cookies as negligent exfiltration of data.

 

David Biderman  16:55  

Yeah, they’re obviously trying to expand the definition of exfiltration, which is supposed to involve unencrypted data. And, you know, you mentioned prima facie: you’d think that if there was a leak and it was unencrypted information, it would be almost prima facie evidence of unreasonable security efforts. But there is a prong that basically puts the burden on the plaintiff to prove that the security measures were unreasonable. So, as you likely know, that’s not a defense you’re going to win on summary judgment, because if there’s a real issue of reasonableness, you’re going to have experts on one side and experts on the other. So I think the reasonableness issue is going to be one where you’re going to see a lot of litigation, a cottage industry of experts on both sides talking about what is and is not reasonable security. And to go back to your question: if there ever is a state that passes the broader private right of action, such as Washington, it is Katie-bar-the-door. Dominique, remember, we interviewed someone from Washington, DC, and we asked, when do you think there’s going to be a national statute? And he said there will be a national statute when some state creates a private right of action for any violation, because then the feds are going to step in and try to shut it down. So we’ll see if that happens, but that’s the prediction.

 

Jodi Daniels  18:31  

That’s interesting. I’m always asked about a national law, and I have all kinds of answers; I’ll add that one to my collection. One of the questions I was going to ask you is around this concept of what is considered reasonable and what is considered negligent. Based on what you’ve seen so far, can you share some of the themes that are emerging, along the lines of: if you do these types of activities, it’s likely going to be considered reasonable? I know there’s no perfect answer, and I’m not looking for that. Companies are just always wondering what is considered reasonable.

 

David Biderman  19:12  

I tend to agree with Justin that if it’s unencrypted information and it gets out, that is almost prima facie evidence of being unreasonable, because how else did it get out? It shouldn’t get out. I mean, Dominique, you tell me; you’re the one who tells people how to do this.

 

Dominique Shelton Leipzig  19:29  

I think that, on the technical side, there are sort of two answers. On the regulatory enforcement side, with the AG’s office and these notices of violation letters that are coming out, and even when the agency comes out, I think there’s going to be a lot of looking at standards like the CIS Critical Security Controls, which, back when VP Harris was our AG, she cited in her 2016 breach report as being prima facie evidence of reasonableness. And I haven’t really seen any other AG take a departing point of view. Attorney General Becerra, now our HHS Secretary, said he didn’t see any reason to depart from what she had laid down. Similarly, I haven’t seen AG Bonta go in a different direction. But I can envision that for courts it’s not as clear; they’re not necessarily looking at it as clear-cut. So we have seen some cases, and they’re outliers, that made it past a motion to dismiss on things where, for example, there was one situation where the company had a do-not-sell link on their website but got a lawsuit claiming it wasn’t conspicuous enough, and a judge let that go through, even though that claim is not supposed to be there. So I just think on reasonableness, let me just say one other thing on this, David, and Jodi and Justin. 
I remember right after the CCPA was enacted, we were asked by Chief Justice Tani Cantil-Sakauye to train the judges in Los Angeles as well as in Northern California. So we trained judges in Sacramento as well as in LA Superior, and back then Presiding Judge Brazile hosted us in LA. The people speaking were mostly Alastair Mactaggart; Senator Hertzberg; Assemblymember Chau, who’s now a judge; a lot of consumer advocacy perspectives; and I think Allan Zaremberg, the former CEO of the Chamber, and Jennifer Barrera, the current CEO. I was probably the only one holding up a business perspective at all. I can just say that the judges were mostly nodding as the current CPPA agency director, Ashkan Soltani, explained how tracking and cookies work, and the only questions we got were how they could disable cookies for their grandkids. From that perspective, I don’t think things look great. I mean, I think David and his team are going to have a really tough job making it clear that we can’t just depart from the logic to try to find some way to address these issues, because the whole society around us is pointing a finger at a lot of things related to data these days.

 

David Biderman  22:57  

You can always do something better; I mean, that’s the point, right? So, we have not seen, say, a motion for summary judgment where one plaintiff’s expert is saying it’s unreasonable and one defense expert is saying it’s reasonable. We haven’t seen that, nor have we seen any kind of third-party compliance certification treated as prima facie evidence of reasonableness. There are just no national standards, right? If there were some national standards on security, it might make a difference: you could say, hey, listen, we comply with national standards, and then maybe there would even be preemption. But you don’t have national standards. And, you know, we just interviewed a guy who wrote this book for engineers. It’s really wonky, but it’s on how to develop for privacy. Dominique understands parts of it; I don’t think I could understand the first page. It is really complicated to engineer for privacy, and I think it gets more complicated: as the perpetrators get better at infiltrating, your defenses have to get better. So I think it’s going to be an evolving standard. But I do agree, back to you, Justin: if it gets out and it’s unencrypted, there’s going to be sort of a presumption that it wasn’t reasonable.
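David’s point, that unencrypted personal information getting out is nearly per se unreasonable, suggests a simple engineering control: refuse to persist sensitive fields in plaintext at all. Here is a minimal Python sketch of such a pre-storage guard. The field taxonomy and the `enc:v1:` ciphertext prefix are invented for illustration; they stand in for whatever real crypto layer and data dictionary a company actually uses, and nothing here comes from the statute itself.

```python
# Hypothetical pre-storage guard: block persisting sensitive fields in plaintext.
# SENSITIVE_FIELDS and the "enc:v1:" prefix are assumptions for this sketch.

SENSITIVE_FIELDS = {"ssn", "drivers_license", "account_number"}  # assumed taxonomy

def is_ciphertext(value: str) -> bool:
    """Toy heuristic: our (hypothetical) crypto layer prefixes all ciphertext."""
    return value.startswith("enc:v1:")

def check_record(record: dict) -> list:
    """Return the sensitive fields that are about to be stored unencrypted."""
    return sorted(
        field for field, value in record.items()
        if field in SENSITIVE_FIELDS and not is_ciphertext(value)
    )

record = {
    "email": "pat@example.com",
    "ssn": "123-45-6789",              # plaintext: should be flagged
    "account_number": "enc:v1:9f8a",   # already encrypted: fine
}
violations = check_record(record)
if violations:
    print("refusing to store plaintext fields:", violations)
```

The point of the sketch is only that "reasonable security" can be partly mechanized: a write path that cannot emit plaintext sensitive fields narrows the exposure the CCPA’s breach provision is aimed at.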

 

Justin Daniels  24:21  

Let me flip the script a little bit. We talked about what’s reasonable. And Jodi, I think for your benefit and the audience’s: a lot of times we have to write laws that say “reasonable” because, particularly in the technology, privacy, and security space, things evolve so rapidly. So my question for the both of you is: if we had this conversation in, say, 2019 versus today, what do you think of the argument, “well, multi-factor authentication isn’t a standard”? Personally, in 2021, at least on all the deals that I do, I have a requirement that for a customer using a SaaS product, all of their employees are using MFA. It’s just table stakes.

 

Dominique Shelton Leipzig  25:02  

I like that, Justin. I just think that you’re right. There are sort of three things at play. There’s our continuing, increasing dependence on technology: we’re doing this videocast podcast on Zoom; we’re conducting health, education, finance, everything through technology. So there’s just a heightened reliance, and it’s going to continue. You talked about NFTs, and there’s just going to be a heightened reliance on that in the years to come. Then, with that heightened reliance, you’ve got heightened angst, I think, by members of society about what is actually happening with this data. How is it being used? Can it backfire to hurt me as an individual? That is what I think is driving a lot of the consumer class actions, and why we’re reading about privacy and data security on the front page of the news. So I bring it back to the third thing that I’ve been really interested in exploring these days, which is: how does data relate to the brand, the value, the trust of the organization? Even if it’s technically permissible, and even if your rules allow you to do it, how is this going to appear to the broader public if it were splashed on the front page of the Wall Street Journal or the New York Times? And David and I like to read the Financial Times too. That’s really what it comes down to. So to your point, Justin: yeah, there’s nothing that says you need MFA today or tomorrow, but how would you feel if it’s your brand, and this is something that is happening in industry, it’s totally available, and you didn’t do it? These are the types of things, and I just wonder if the corporate leaders, the officers and directors, even know the difference between what you’re talking about, Justin, and whatever’s reported by the CISO once a quarter.
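Justin’s “table stakes” requirement can be made concrete in vendor diligence. A short sketch, assuming an invented security-questionnaire schema (the `mfa_enforced_all_staff` field is illustrative, not any standard form): flag SaaS vendors whose answers don’t show MFA enforced for every employee, treating a missing answer as a gap.

```python
# Toy vendor-diligence check: which SaaS vendors don't enforce MFA for all staff?
# The questionnaire field name is an assumption made up for this example.

def mfa_gaps(vendors: list) -> list:
    """Names of vendors that do not (or did not confirm they) enforce MFA."""
    return [v["name"] for v in vendors if not v.get("mfa_enforced_all_staff", False)]

vendors = [
    {"name": "AcmeCRM", "mfa_enforced_all_staff": True},
    {"name": "BetaAnalytics", "mfa_enforced_all_staff": False},
    {"name": "GammaStorage"},  # no answer given: treat as a gap to follow up on
]
print("follow up with:", mfa_gaps(vendors))
```

Defaulting an unanswered question to a flagged gap mirrors the contractual posture Justin describes: the burden is on the vendor to demonstrate the control, not on the customer to assume it.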

 

Jodi Daniels  27:21  

Yeah, I definitely think there’s a disconnect among a variety of different leaders, and throughout the organization. We’ve been talking about a specific individual right, but there are all the other rights that obviously exist: right to access, right to deletion. Should they apply to just those in California? Or what about Jodi and Justin hanging out in Georgia? We don’t have any rights; do we get any too? And you see some companies that have decided to offer GDPR-level rights globally, some who have decided to use California as the barometer across the board, and then other companies who say internally: only if you’re in California, that’s it. And each of those decisions ties back exactly to what you just talked about when it comes to trust in the brand. What does the consumer expect? We could have a whole field day on cookie banners, we won’t, but: should you have one? What does it look like? Why do you need one? Nowadays, customers have become accustomed to certain activities, and in my view it’s tied to every part of the experience. When I’m going to buy a product or service, I expect it’s going to be a fabulous experience, I’m going to get what I wanted, I trust you as the brand, I think the thing’s going to work, or you’re going to do what you’re supposed to. I also now have an expectation of what you’re going to do with my data. And some people, from an online advertising perspective, might say, well, okay, you might follow me around the internet, collect all my data, and serve me relevant ads; that’s okay. Until they understand a little more deeply the type of data, or who else it was shared with, or whether it was part of a breach, or whether it was shared with, you know, my next-door neighbor’s cool new startup that had no security controls; then I might care. 
And so all of that is tied in. I don’t think that conversation is well understood even at a mid level in a company, let alone by many of the executives.

 

Justin Daniels  29:21  

Well, I guess I have a question for all three of you. So the Super Bowl had its ads, and everyone was talking about them, so I’d like to ask all of you about the Coinbase ad. That was the one where the QR code was just bouncing around the screen for 60 seconds. And when you talk to the marketing people, they thought it was brilliant, because everyone was trying to scan the QR code, so much so that it crashed the site. I’m thinking from a security perspective: really? You want all these people pointing a phone at a QR code they know nothing about? What does that mean from a security perspective? I’d just love to get your perspective on how you navigate knowing what some of the security concerns are, while dealing with marketing people who say, hey, we’re trying to market, and hey, Dominique, David, you guys are getting in the way.

 

Dominique Shelton Leipzig  30:13  

Yes, I am definitely used to being in the way. But I think that, actually, the advice is more welcome, and I’m starting to see it. I mean, David and I are speaking on Friday to a group of retailers about how to look ahead, look around corners, and see things anew; that isn’t in the financial context, but we were just talking to a financial institution last week about the same thing. So I do think there is an attempt by industry to look ahead and know more about the implications of what we’re doing, so we can plan. And we’ve got another one on global litigation coming up for another major, major company. So I just think, Justin, it’s communication. And this is where you and Jodi, and all of us here on this call, and everybody listening, really can play a very active role in bringing groups together, rather than just listening to one side or the other: get the security team in communication and collaboration with the marketing team, so that there’s always a conversation. And I feel like that’s the pivotal role that those involved in data can play in an enterprise.

 

David Biderman  31:41  

And more practically, you know, I do a lot of these consumer class actions, and when these marketing people get put in front of a court reporter and have to swear to tell the truth, and they have to look at some of their PowerPoint slides and explain what they were trying to sell and what they were trying to do, it sometimes causes them to back off a little bit. That happens a little bit, and the culture starts to change. And then we provide some in-house training and say, hey, listen, these are things you’ve got to avoid, and these are things you have to worry about, and here’s an example of how a deposition works and what kinds of questions you’re going to be asked, and you’re ultimately going to have to own up to this at some point. So I think that has a deterrent effect. I don’t think it’s going to stop it at the front end necessarily; Dominique may disagree. We did speak to the privacy officer from a very major platform, and he was all about trust. His first word was trust. He said, you know, this is a brand, and people should trust our brand, and they should trust our brand for everything that you said, and for privacy: what kind of data we’re collecting, what we’re doing with the data, when we destroy the data. All of that was really, really great. And, you know, we do this exercise where, with companies under the GDPR, you can look at all the cookies that are being downloaded from your site when you click on one. I mean, it goes on page after page after page. I think if consumers really knew that that’s what was dropping onto their computer, they might think twice. So anyway, and then I’ll shut up about it. 
But, you know, obviously, privacy is now kind of a marketing thing, right? We’ve got a couple of companies making it, basically, an opt-in versus opt-out selling point. And, you know, that’s changing.
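David’s “page after page” of cookies can be eyeballed with a few lines of code: parse the Set-Cookie headers a site returns and list what is being dropped. The headers below are made up for the example; a real audit would capture them from live responses (browser dev tools or an HTTP client). This uses only Python’s standard-library `http.cookies` module.

```python
# Sketch of a cookie inventory from captured Set-Cookie headers (sample data).
from http.cookies import SimpleCookie

raw_headers = [
    "session_id=abc123; Path=/; HttpOnly; Secure",
    "_ga=GA1.2.111; Domain=.example.com; Max-Age=63072000",
    "ad_slot=42; Domain=.ads.example.net; SameSite=None; Secure",
]

jar = SimpleCookie()
for header in raw_headers:
    jar.load(header)  # parses name=value plus attributes into Morsel objects

for name, morsel in sorted(jar.items()):
    print(f"{name}: domain={morsel['domain'] or '(host only)'}")
print(f"{len(jar)} cookies set")
```

Even this toy inventory surfaces the distinction that matters in the conversation: host-only cookies like the session versus third-party-domain cookies like the ad slot.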

 

Jodi Daniels  33:52  

On that note, that’s what I was going to say. Since you did ask the three of us, I get to answer too, okay?

 

Justin Daniels  33:58  

What do you mean? Go ahead, then.

 

Jodi Daniels  34:02  

That’s a good point of view. I’ve seen more companies create marketing compliance roles than before, and those sit in marketing. Whereas before, marketing might not have been as excited and welcoming about all of these new privacy laws, and they might have taken direction from those who were setting it, now they’re realizing they really need to make sure they’re doing the right thing, proactively integrating all of the expectations from customers and from legal, and they’re creating these marketing compliance roles, which I think is a sign of where we’re headed and makes me really excited.

 

Justin Daniels  34:43  

So I'm going to ask one last question, and this is on a bit of a personal level, because we've talked about privacy in the ad tech space. As we talk today, last year the Biden administration passed the American Rescue Plan Act to help modernize cities, and you're going to see things starting to come out. I've seen it already: digital license plate technology and cameras. The one that really scares the crap out of me is putting facial recognition software on a drone with a camera. So my question to the three of you: as we start to think about privacy, I'm out driving my car in the right of way, and there's a drone flying overhead taking video of me in the right of way, and it can figure out where I've been and where I'm going, and all of this information is being gathered. What are the implications there for privacy? Because I'm watching the drone industry really not think this through, much like previous industries. How do we as a society start to figure out how to educate ourselves, like with all the cookies we just talked about? Think of all the drone footage captured on a daily basis with facial recognition software.

 

David Biderman  35:57  

Where does that get us?

 

Dominique Shelton Leipzig  36:00  

Well, you know, we had a guest on our podcast, Professor Barry Friedman at NYU, and frankly, he was laying out that some of this stuff is already happening in Los Angeles with our law enforcement. So we already have a lot of that data being hoovered up. Then we had a chance to speak with Eileen Decker, who is the president of the LA Police Commission, and she talked about the context and the access controls in place, and so forth. But these are really tricky issues, Justin, super tricky, in terms of how the data is going to be used, who it's going to be shared with, and what the access issues are. And frankly, we haven't jumped into this, but the whole algorithmic fairness question is getting to be such a huge topic. I saw Senator Booker just introduced a bill last week, and it's something we've been thinking a lot about. So you're right to raise these things. I do think the drone industry is being, you know, sort of brought into community with privacy. This is something that's been talked about, but there are issues. I mean, how are you going to be able to really screen out things like facial recognition when you're using drone technology?

 

David Biderman  37:25  

Yeah, they'd better not run those drones in Illinois; they're going to get in trouble. That's one state to stay out of, because of its biometric privacy law, with big penalties under it. We'll see. I do think people are pretty sensitive to facial recognition technology. It's funny, we interviewed a guy, and he was pretty good. His job, he was called an open source investigator, and he had this whole group of people that just sat on the internet all day long, surfing for human rights atrocities, really some horrible things, like executions or whatever. Then he would use that information to basically track back and try to identify the perpetrators, or identify some of the victims, using all this public information. But he did talk a lot about the tension, because he says, you know, I can use this information for good, but others can use this information for bad. And it really is an interesting tension. I don't know how it's ultimately going to resolve, but you raise a really, really interesting point.

 

Jodi Daniels  38:49  

Well, I think as a privacy community, our little group here will have to continue the conversation, raise these issues, and collectively make sure that other industries hopefully learn from prior industries. So with that being said, we always ask our guests: when you're at your cocktail party, gathering, whatever it is, what is the best privacy tip that you might offer?

 

Dominique Shelton Leipzig  39:26  

Well, I'll just start with what I was mentioning earlier. I really encourage people I'm talking to that are in the industry, and I hate to say it, but a lot of the cocktail parties I go to are also full of people doing privacy and cyber, so I'm sure it would be nice to get your social life expanded, friends. But anyway, when I'm at a cocktail party, I often just encourage colleagues to interject themselves into the conversation, particularly as it relates to data and the strategy of the enterprise. I really would love to see more chief privacy officers and more chief information security officers in community with the C-suite, board members, etc., who are discussing strategy, so that as they work out how to pursue that strategy, someone poses the question of what data is needed to get there. That will unfurl, I think, a whole series of communications that will be beneficial to both parties. And that is really the idea behind the Global Data Innovation team I started at the firm. David and I are on it, and our first meeting is on February 28, to, you know, unpack this and make it easier for corporate leaders to get educated about these things, and for folks such as yourself and Justin, who know the area, to be able to help demystify certain things.

 

Jodi Daniels  41:04  

That's a great tip. David, do you have a tip?

 

David Biderman  41:10  

Of course, no privacy professionals can find me at a party, so I'm always there with a bunch of laymen. What I do is this: Dominique's got this chart on how ad tech works, and what ad tech is, how it's an $800 billion global industry. Most people don't even know it exists, or if they know it exists, they have no idea how it works. And I just say, you've got to look into this, you've got to see how it works. I can show them Dominique's chart, which is head-spinning, because there are so many dotted lines and, you know, you can imagine the intermediaries. But I think it's really important for the layman to just understand that that industry exists, how it works, and how it's used, and then you make your own decision about what you want to do. Understanding that industry is really, really important, and I think understanding of it is still very limited, even today.

 

Jodi Daniels  42:07  

I completely agree. It's amazing what people think they understand. They really don't know what's underneath or how it all works.

 

Justin Daniels  42:20  

Well, our last question is, when we’re not talking about blazing the path for cybersecurity and Privacy Awareness, what do the two of you like to do for fun?

 

Dominique Shelton Leipzig  42:33  

Okay, I'll go first. Well, I love to travel, so that's been inhibited the last couple of years with the pandemic, but we were able to get out to Europe in November for just a little window of time in between variants, and I'm hoping to be able to do some more travel this year. What's on the chalkboard is potentially Sicily. Adam has a film that is getting put together, so it just depends on timing and all that, but if the stars align, I would join him in Sicily and be able to see that part of the world. I've never been to Sicily, so I'm excited.

 

Jodi Daniels  43:22  

That sounds really lovely. You can bring us back some pasta, please.

 

David Biderman  43:28  

Okay, yeah. All right. So, I like to exercise. That's what I do, because it's a de-stressor, and I've got to do it every day. I love it, it's fun, but it's a necessity, just ask my wife.

 

Jodi Daniels  43:41  

Oh, well, Justin would love to exercise all day long, I think. Cool. Well, thank you both so much for joining us today for a really fascinating conversation. If people want to connect with you, where is the best place to find you?

 

David Biderman  43:59  

We have our

 

Dominique Shelton Leipzig  44:00  

Decrypted Unscripted podcast page, so I can share that with you, and maybe we can put it in the show notes.

 

Jodi Daniels  44:06  

Absolutely, we will put it in the show notes. And we need to have you

 

David Biderman  44:11  

Yes. We start out with: what do you do for fun?

 

Jodi Daniels  44:19  

We look forward to it. Well, thank you so very much.

 

Dominique Shelton Leipzig  44:24  

And we’ll be in touch to schedule. Alright.

 

David Biderman  44:26  

Thanks for having us. Absolutely.

 

Outro 44:33  

Thanks for listening to the She Said Privacy/He Said Security. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.