
Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22  

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:36  

Hi, I’m Justin Daniels. I’m a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels  0:59  

And this episode is brought to you by (really, with my hair, strange flick) Red Clover Advisors; those are our sponsors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, we’re gonna have so much to cover today. I’m so excited!

Justin Daniels  1:47  

About how many espressos have you had today?

Jodi Daniels  1:49  

You know, I had a chocolate cherry banana smoothie before the podcast.

Justin Daniels  1:54  

Oh, wow. Wow, chocolate cherry and banana. Okay.

Jodi Daniels  1:59  

It’s really quite delicious. I’ll take your number one. If you’ve never had chocolate and cherries together, you should really try it. It’s divine.

Justin Daniels  2:07  

Oh, what a fancy word.

Jodi Daniels  2:08  

Yes. Well, thank you. Okay, you get the pleasure of kicking us off?

Justin Daniels  2:13  

Yes, I do. So today, as Jodi said, we’ve got a lot to cover because we have Keir Lamont, who is the Director for US Legislation at the Future of Privacy Forum. In this role, he supports research and independent analysis concerning federal, state and local consumer privacy laws and regulations. Keir, welcome to the mayhem with the divine smoothie maker and myself.

Keir Lamont  2:40  

It’s a pleasure to be here. Thanks both so much for having me.

Jodi Daniels  2:44  

Well, we always like to start with how people got to where they are. So share, if you can, a bit about your career path to the Future of Privacy Forum.

Keir Lamont  2:55  

Sure. So as you’ve mentioned, we’re talking about the states today, and there’s a whole lot to cover, so I’ll try to keep it as brief as possible. For me, I graduated law school back in 2015, and at the time I had this kind of vague interest in technology policy and the sense that there would always be this regulatory gap, or lag, between fast-moving technology and new business practices and our lagging legal governance systems and frameworks, and that there would always be space in that gap that would raise interesting questions, and hopefully opportunities for gainful employment. So I was lucky enough to attend a law school that was really investing in building out pathways into the privacy profession. I very much took advantage of that and spent time in civil society, in academia, at a law firm, and at a trade association before coming to the Future of Privacy Forum. So I’ve worked on many of the same legal and policy questions around privacy, security, and emerging technology from different postures and perspectives. And I hope that background serves me at the Future of Privacy Forum, where we really try to play a role as an expert, nonpartisan convener of different stakeholders, to create spaces where progress on some of these thorny issues can be made. Immediately before FPF, I spent three years at the trade association, where I really felt like I was banging my forehead against the doors of Congress for three years straight, trying to get movement on federal privacy legislation. Obviously, I was not successful in that. So to anyone who’s listening who is struggling with a rapidly evolving and expanding state privacy patchwork, I just want to say: my bad, that’s on me, I really dropped the ball. But I did see that in that federal vacuum, the states were really stepping up to fill the void and be leaders on these issues, from California to Washington State to Virginia to Colorado. So I moved to FPF, where I’ve been able to work much more closely on state privacy issues. A lot of my work involves really trying to help the stakeholder community keep track of everything that’s happening in the states and separate the signal from the noise. When you have 50 states and literally thousands of lawmakers, that can be a lot happening at one time; it feels like drinking from a firehose. So I really try to help folks figure out which laws, regulations, and enforcement actions really are significant and which ones folks should be paying attention to. So that’s how I got here, and that’s what I do.

Justin Daniels  5:34  

So there’s been a flurry of activity at the state level. So correct me if I’m wrong: we have, what, 14 state privacy laws that have been passed and are in effect? And what do we have on the docket? Another...

Jodi Daniels  5:50

A lot that have been introduced

Justin Daniels  5:53

Which is why we’re... okay, that’s the term. We have a lot.

Jodi Daniels  5:57  

The day that we’re recording could be really different from when someone listens, alright.

Justin Daniels  6:03  

We will just say it’s a lot. So, Keir, very factually, with this proliferation of laws, and we have a lot on the docket, what trends are you seeing around these comprehensive state laws? There seem to be models now, like the Virginia model, California... anyway.

Keir Lamont  6:21  

Sure. So to level set just a little bit, I think that your audience will be familiar with this concept of the dreaded so-called patchwork, this potential landscape of conflicting and diverging privacy laws emerging on a state-by-state basis. Obviously, that can make compliance very challenging; the Internet obviously does not stop at state borders. It is actually my view that, to date, the worst-case scenarios for such a patchwork emerging have actually not yet come to pass. Now, don’t get me wrong, there are very meaningful differences between the different state privacy laws that will continue to create very profitable work for privacy lawyers. But at a high level, there are enough similarities between these comprehensive laws that most companies can build out compliance operations that will encompass most of the states. So that’s really been the trend, and what I spend a lot of my time doing is trying to keep an ear to the ground for new developments and potential trends that could buck the current landscape. So I will say that when it comes to the newer comprehensive privacy laws, we’ve got, I would argue, 13, or maybe 14 depending on whether New Hampshire’s is signed by the time this airs. States have largely iterated on the privacy laws that have come before them. State lawmakers don’t necessarily want to reinvent the wheel if it’s not needed; that could be very difficult and challenging. But they also typically want something that they can point to that makes their own law stronger or unique. And I think a prime example of this has been around the definition of sensitive personal data, which is a category of personal information that is typically subject to heightened protections in the state privacy laws. So last year, we had Oregon, which added status as a victim of a crime to its definition of sensitive personal data. We had Delaware specifically add pregnancy status to this definition, as well as tighten the exception for publicly available information. We also had New Jersey, which added health treatment information, not just health condition or diagnosis information, to the categories of sensitive data. In terms of trends, you know, I’ll also say that we talk a lot about so-called comprehensive privacy laws, and really that is something of a misnomer. These laws always have carve-outs for various businesses and activities: if you’re subject to an existing federal privacy law, if you’re a small business, if you’re maybe a nonprofit or a government entity. But for the most part, you would think that passing a comprehensive privacy law generally means you’re done. It was comprehensive, and it’s now time to focus on healthcare or emission standards or permitting reform or any other issue that’s out there that is very important. But a trend that we’re really seeing more and more is that passing one of these so-called comprehensive privacy laws really can just kind of whet the appetite for lawmakers when it comes to working on and passing legislation around these issues. So for example, last year we had Connecticut, which made major, significant amendments to its comprehensive privacy law before it even went into effect, adding new protections for consumer health data as well as children’s privacy. This year, we are expecting it to be very active in California for potential amendments to the CCPA. And then, every year is active in California for potential amendments to the CCPA.
In Colorado, lawmakers are currently working on legislation that would add Illinois BIPA-style rights and protections for biometric identifiers to the Colorado Privacy Act. And then we also have Virginia, which is currently coming very, very close to adding some additional user privacy protections to its privacy law, the VCDPA, and we’ll probably see those finalized in the next two weeks or so. So, overall, a lot of states are increasingly passing new laws that generally follow the same model with some tweaks around the edges. However, once they pass, lawmakers seem to be coming back and thinking about how this is potentially working in practice, what other states are doing, and how this is developing, and there is a desire to kind of open these privacy laws back up and start making tweaks.

Jodi Daniels  11:00  

With some of those laws being kind of opened back up and the tweaks happening, some of the ones we just talked about, we’re going to call them the base laws. Now, for this season going forward, are you seeing any trends where they’re trying to put all of it in, almost like a kitchen-sink privacy law? Because instead of creating one and then having to open it up for extra health or extra kids protections, are you seeing it where we’re gonna have all of them at the same time for sort of the next batch that are coming?

Keir Lamont  11:32

Sure, that’s, I think, a really interesting question. I think Connecticut has served very much as a template model for many of the farthest-reaching state privacy laws at this point. I mentioned that last year Connecticut kind of opened their law back up and added additional standalone provisions for consumer health data and for youth data. And we do see some states out there, I believe Vermont, I believe Maryland, trying to pass laws that include those additional Connecticut protections, something we’re also seeing this year. And I think it’s something where state lawmakers are drawing some inspiration from the American Data Privacy and Protection Act of 2022, which folks will remember got through the federal congressional House Energy and Commerce Committee and made all of us in the profession very, very excited for about two months there. But it actually, contrary to what some of my friends in California may argue, seemed like it went much further than any of the state privacy laws across a range of issues and scope, including a private right of action, regulation of potentially discriminatory algorithms, and data minimization. And this year, we are seeing some states increasingly start to pick up and consider including elements from that ADPPA federal framework, which obviously never passed, and include them in their privacy laws. So we see familiar language from the ADPPA being debated in places like Maine, currently, and Maryland as well. So that’s another potential major trend that could develop, where we may see states take a different approach or go beyond the existing kind of state baseline.

Jodi Daniels  13:17  

That’s so interesting, because I recall that when that didn’t happen in ‘22, everyone thought it would happen in ‘23, and it appears it took a little while to come out in 2024. Now, Justin, you started to talk about how some of those states seem to be almost kind of replicas, and people are like, “Oh, that’s the CCPA-style or the Connecticut-style bill.” I’m seeing this more and more; even in Georgia, it seems like the one introduced is a bit of a Tennessee model. I was wondering if you can help. Are you seeing that? And can you help with any of the comparisons that we’re starting to see? You know, are they really getting grouped, and what might those models look like?

Keir Lamont  14:05  

Absolutely. That’s a great question. And as I mentioned, many times the state lawmakers don’t want to start from scratch on these extremely complex issues. And if there’s a framework out there that’s working and strikes the right balance for what they’re trying to do in their own state, they will seek to incorporate elements from what has come before. So everyone has their own heuristic for categorizing the state privacy laws; I will tell you mine since, in my subjective experience, it is the best. So California, obviously, was the first mover in 2018, and to this day it really is in a class of its own, for better or for worse. Everyone thought that other states would take after California; however, the 12 or 13 states to date that have enacted comprehensive privacy laws after California have all actually rejected this California approach. They contain similar scope and similar rights, but they really follow a different model. And the most obvious reason for this is that California has very much been a moving target since almost the first day it was enacted; its requirements keep changing through ballot initiatives, amendments, multiple regulatory processes, etc. And it is very difficult to copy someone else’s work if that work is in a state of constant flux. So instead of adopting the California approach, lawmakers in Washington State worked for several years on an alternative framework, the Washington Privacy Act. Now, it’s ironic that this Washington Privacy Act never actually passed, largely due to debates over enforcement mechanisms. But I would argue that every subsequent state law shares something of a common ancestor with the Washington Privacy Act model. All these 13 or so laws share a similar structure. They share key terms rooted in the Washington Privacy Act, the WPA, and that’s good for compliance professionals. But make no mistake that these different laws do vary significantly in their scopes and how far-reaching they are, despite sharing this common kind of structural ancestor. So at the upper end of the spectrum, you have states modeled on the Connecticut Data Privacy Act. They require explicit opt-in consent for processing sensitive personal data, they have opt-out rights that can be exercised on a default basis through users’ device settings, and they also have additional opt-in requirements for certain uses of adolescent data. Then you have the somewhat narrower Virginia model, which has been followed by Indiana and sort of Tennessee, although that could be its own model. These have opt-outs for the sale of personal data like the Connecticut model; however, sale is defined more narrowly to just encompass the exchange of data for monetary consideration, and not other valuable consideration. You also have a narrower definition of biometric data that would actually potentially seem to carve out some facial recognition technologies, which are definitely a privacy priority for many people. And they don’t have the global privacy controls or universal opt-out mechanisms that I mentioned the Connecticut model has. And then finally, at the narrowest end of the spectrum, you have Utah and Iowa. Under these laws, there’s no consumer right to correct inaccurate data, there’s no right to opt out of profiling conducted for decisions with legal or similarly significant effects, they are missing restrictions on secondary uses of data for undisclosed purposes, and they have opt-out rather than opt-in for processing sensitive personal data.
And so that’s the Washington Privacy Act model. And then you have the California approach which, as I mentioned, differs, though many differences between California and the Washington framework do come down to language and framing. Instead of importing a clean processor slash controller distinction from Europe, California has businesses, service providers, contractors, and third parties. Don’t ask me to try to explain the differences between all of those. California also clearly does go further than most of the other state privacy laws. It sets up a well-funded privacy protection agency that has broad rulemaking authority. It applies to employee data, not just customer data, though that may actually be something of a drafting accident. It also establishes a narrow private right of action for data breaches. And California is also actually narrower in some ways than many of the other states; most prominently, it has more limited protections and scope for sensitive personal data. It covers less sensitive personal data and is more of an opt-out rather than an opt-in model. So just to try to recap all of that: it’s California on one hand, and the Washington Privacy Act descendants on the other. Within the Washington Privacy Act framework, I would identify three distinct clusters: the narrow Utah model, the middle Virginia model, and the broad Connecticut model. Though, as more and more state laws get enacted and these laws continue to be updated, as I’ve mentioned, this neat little heuristic that I have developed is continuing to fray.

Justin Daniels  19:36  

Seems like that is tailor-made for a visual slide.

Jodi Daniels  19:39 

It would be an amazing visual, and so incredibly well-articulated. Thank you. I think everyone listening will really appreciate understanding all of those differences. And as more of the states keep coming, it will be very fascinating to see how those models intersect

Justin Daniels  19:57  

and what will happen. So, we are also seeing action on kids’ privacy laws, I think at both a federal and state level. So what can you tell us about that? And what should we know as parents when it pertains to our kids?

Keir Lamont  20:19  

Sure. Well, it’s tough to say what you should know, because the legal system hasn’t necessarily figured out what any of us should know about this yet. I will say that, look, one thing I always try to remind people is that privacy is bipartisan. Red states and blue states alike have passed strong versions of, for example, the Connecticut data privacy framework, from Montana and Texas on one hand to Oregon and Delaware on the other. So privacy is bipartisan, and of course protecting kids online is really bipartisan. But we are seeing the emergence of a big partisan split in how state lawmakers are approaching children’s online privacy and safety issues, and it’s currently the case that both of these approaches are running into a buzzsaw of litigation and injunctions, largely on First Amendment grounds. So here’s the general landscape. California once again led the way on children’s online privacy and safety in 2022 when it passed the Age-Appropriate Design Code Act, which was largely modeled on a United Kingdom code of practice that was developed to implement the General Data Protection Regulation, or the GDPR kind of copycat in the UK, since it’s not really in the EU anymore. So, not surprisingly, picking up a framework from a dramatically different legal context and plopping it down in the United States has led to some legal questions and challenges. The California AADC required companies to conduct age estimation of their users, to conduct data protection impact assessments, and to mitigate the risks of youth accessing potentially harmful conduct and content on their services. And that law has since been enjoined as a result of a lawsuit last year. And just recently, we actually saw major consumer advocacy groups come out and file amicus briefs in that ongoing litigation, arguing that the California Age-Appropriate Design Code Act is actually harmful to privacy, because requiring companies to collect more sensitive data to conduct age estimation raises privacy implications and challenges. So the groups behind the Age-Appropriate Design Code Act are back this year with a new version of the model legislation; I call it the Age-Appropriate Design Code Act 2.0. And this AADC 2.0 approach seems to sand off a lot of the original California AADC’s more constitutionally suspect elements. Age estimation is gone, and the DPIAs have been reworked. But who can say what will happen? Right now, looking at the current landscape, I would say that Minnesota and Maryland are the states to be watching most closely for potentially adopting this new age-appropriate design code framework. Separately, the more Republican-led states are really targeting restrictions on youth use of and access to social media platforms. And it actually turns out that defining what is or is not a social media platform is very, very difficult. These new laws typically require such platforms to conduct age verification, which is more stringent than the age estimation that the California law provided for. This kind of red-state model would also require these platforms to obtain parental consent before allowing a child or a teenager to open an account on the social media platform. Sometimes these laws have different restrictions on these child accounts once age verification is conducted and parental approval is given, such as not allowing the owner of the account to access it during nighttime hours.
And also, in certain cases, requiring the platform to turn over the messages and posts that the child user sends to their parents or guardians, which doesn’t necessarily seem great for privacy or security to me. Laws of this nature have also been challenged and have been enjoined in states like Ohio and Arkansas, and there is also presently very active litigation in Utah. So at the end of the day, I think we’re really seeing a split in these concepts of whether it should be the government that has say over what is or is not appropriate in online sites or content for children and teens to access, or whether it should be parents that have that final say. However, under either approach, there’s still a big question of whether we can pass laws that won’t limit everyone’s access to lawful content, or undermine privacy and security for everyone. And this is a big live issue that lawmakers and courts are still trying to figure out. Separately from all of this, though, I will once again mention that Connecticut, last year, went ahead and updated its privacy law to add in additional children’s privacy protections. It includes a duty of care with respect to child users, which I think is a very significant, and in some ways novel, model in the privacy legal context. So in terms of laws on the books, Connecticut, I would argue, has been vastly overlooked in the current privacy discourse.

Jodi Daniels  25:58  

Are you seeing any themes perhaps amongst all those different kids’ laws regarding age? You know, are there certain scopes of age? And then what about any commonalities in terms of limitations of use? For example, you know, at the FTC level, there’s a really big focus when it comes to advertising. Are any of these kids’ laws taking that piece into consideration as well?

Keir Lamont  26:20

Yeah, it’s a great question. And I think I have to start with the baseline, which is that the existing comprehensive privacy laws that we’ve discussed treat children’s data, which typically means users under 13, as sensitive data, and that’s kind of matched to the federal Children’s Online Privacy Protection Act. And then some of the strongest state privacy laws create additional opt-in consent requirements for children aged 13 to 15, or, in the case of Delaware, up to 17, where opt-in consent is required for using data for things like targeted advertising, data sales, and, in the case of Oregon, significant profiling decisions. And then there’s the issue of knowledge standards: to what extent do you actually know that one of your users is actually a child so that those additional protections kick in? Most states have gone with an actual knowledge or willfully disregards standard at this point. And separately from the comprehensive privacy laws, most of these new child-specific laws that I’m talking about have really been setting the threshold at individuals under the age of 18 and attempting to set these heightened protections, these age verification or age estimation requirements that I’ve discussed, which certainly go beyond the knowledge standard of having actual knowledge that a user is a child; you actually have to do work to try to figure that out. And so it is an issue that is very much live and active and being debated in the states currently.

Jodi Daniels  28:09  

I really wish some of these lawmakers would observe some of these teens using these platforms and understand the way some of the companies are collecting the data, because I think even 13 to 15 or 15 to 17, those teens are not well-versed and they’re all just going to hit OK because they want to move forward. And as a parent, that’s just super scary. I’m hoping, hoping and crossing my toes and my fingers here, that we might get something more sensible to really protect our kids’ data. But I can’t solve it here today; I just had to put that out there in the universe. And you did. And I did. I feel better.

Justin Daniels  28:49  

Just remember, what do you think the average age of a lawmaker is?

Jodi Daniels  28:52  

That’s not the point. They should go and talk to their grandkids then?

Justin Daniels  28:56  

It is, actually. Half the US Senate, as I’ve always said, is basically now a retirement home.

Jodi Daniels  29:01  

Yes, well, alright, we’re gonna bring it back. Okay.

Justin Daniels  29:06  

So, Keir, if I’m a company and I’m wondering, will there be other laws like the Washington My Health My Data Act, because it’s so broad, what would you tell them?

Keir Lamont  29:19  

Sure. Well, yeah, first off, I would tell those companies that there are already other health laws like the Washington My Health My Data Act. Last year, Nevada passed Senate Bill 370, which is of a very similar scale to the Washington State My Health My Data Act; it doesn’t have a private right of action like the Washington law does, so I hope that would not be a reason that those companies aren’t paying attention to it. The Nevada law also is more limited: it covers data that businesses actually use to reveal health information, whereas the broader My Health My Data Act standard is for information that actually identifies health status. And during the hearings in Washington State, business groups would come in and argue that the Washington standard for information that identifies health status could encompass things like purchasing ginger at a grocery store, since it could have medicinal properties, or following a fitness influencer on Instagram, things of that nature. I will also note that, again, Connecticut added additional protections for consumer health data last year in its comprehensive privacy law, which does include nonprofits as well as small businesses. So that was a very big change in Connecticut. And then New York State also passed a law that limits geofencing at health care facilities, which is obviously also a component of the Washington State and Nevada health privacy laws. In terms of copycats of the My Health My Data Act framework, so far this year we have seen bills introduced in Hawaii, Vermont, Illinois, Ohio, and Mississippi. The only one to advance through a hearing so far is the bill in Hawaii. So it doesn’t seem like any state is currently close to another Washington State My Health My Data Act copycat law, at least. However, I would say that the biggest lurking bill out there that businesses should be aware of is a major proposal in New York, Senate Bill 158-D, which has actually already passed the State Senate. It is of a similar scale as the My Health My Data Act, but has different definitions and would do some very different things. For one, it requires a one-day delay between signing up for a service that uses consumer health data and then actually obtaining verifiable consent in order to process that health data. So, something to follow closely in New York State there. This bill actually did pass the Senate last year as well, but didn’t find traction in the Assembly in New York. But it’s a new year, and we will see what happens.

Jodi Daniels  32:13  

Is the definition of consumer health data in that New York bill as broad as it is in Washington’s?

Keir Lamont  32:18  

I’ve seen people argue that it could be even broader. I do not recall the specific language now, but hopefully the Future of Privacy Forum may have some resources to make publicly available on that New York bill soon, especially if it continues to advance.

Jodi Daniels  32:34  

Well, that just makes our jobs all kinds of fun now, doesn’t it?

Justin Daniels  32:38  

So, Keir, obviously all these laws are passed and companies comply with them because there are potential consequences. And I want to ask a question I’ve asked on prior episodes, and it’s this, as it relates to enforcement. One of the mechanisms under privacy laws for enforcement is the private right of action. But the more I see it work out in the wild, it seems to really only benefit the trial lawyers who bring those actions. Does it really benefit the people it’s intended to protect? Or is there a better scheme of enforcement that may be out there, like paying large fines that help further fund the enforcement agency? I’d love to get your perspective on that.

Keir Lamont  33:27  

Sure. So it’s a really important question. Obviously, the decision to include a private right of action or not in the enforcement mechanism for any one of these privacy laws is often one of the most hotly debated questions, and it is ultimately a political question; it’s not the place for someone sitting at a think tank like me to decide. Obviously, you have the Illinois Biometric Information Privacy Act, and we should remember that it dates back to 2008 and includes a private right of action, and many, many people point to that as kind of the key example of why we should or should not include a private right of action in any state privacy law. Of these comprehensive bills that I’ve discussed, for the most part enforcement has been left to state attorneys general offices. You do have California, which has a very narrow private right of action, only for data breaches and not for violations of the privacy obligations, the pure consumer privacy obligations, under the statute. And then we have the Washington My Health My Data Act framework, which, as I mentioned, does include a private right of action, but it is tied to the state’s Consumer Protection Act, which does require a showing, I believe, of injury to business or property. And there is some precedent in Washington State, I think most people point to the Hangman Ridge case, which I think is generally favorable to defendants in litigation involving the Washington Consumer Protection Act. So that’s generally the landscape now. But overarching, for the most part, most of these laws that we’re talking about have decided not to include a private right of action, with the exception of Washington State, which I believe takes effect at the end of March or beginning of April of this year. And we will certainly be following very, very closely to see how that private right of action is used in Washington State and how it plays out in practice.

Jodi Daniels  35:33  

Speaking of enforcement, we had our second CCPA enforcement action get delivered yesterday. So for those listening, that was kind of mid-February. And we have other enforcement arms who have been sharing what they’re looking for, you know, those from Connecticut and California. I was wondering if you can share what you’re hearing from regulators and what companies should really be mindful of this year. Obviously, we would encourage attention to all the different components of the laws, but at the same time, when we hear regulators say this is what’s really important, it would be great to be able to highlight those for companies.

Keir Lamont  36:14  

Sure. So as you mentioned, yesterday the California Attorney General’s office just finalized its second-ever settlement involving enforcement of the California Consumer Privacy Act. That was with DoorDash, where they reached a $375,000 penalty, largely involving allegations of data sales without proper consumer opt-outs or disclosure mechanisms, stemming from the organization’s participation in a marketing co-op. The other finalized CCPA enforcement action was back in 2022, I believe, which was a $1.2 million settlement with the French retailer Sephora, which also involved that company’s advertising practices. So I think we can draw the lesson from this that online advertising practices are and will continue to be a major topic for enforcers. I will back up just a little bit and say that I think there is kind of something of an unfortunate tendency for some groups to say, okay, we’ve got a privacy law on the books; now, if we don’t see multimillion-dollar fines on the front page of the papers the next day, then it’s a failure. I don’t necessarily agree with that. I think both businesses and regulators are going to continue to build out their compliance practices and gain experience with enforcement over time. So the ultimate impact of this new era of state privacy laws that we do have on the books right now is going to continue to evolve. We have five comprehensive state privacy laws that are mostly in effect, with another eight on the way, and Utah, Virginia, Colorado, and Connecticut all still have a so-called right to cure alleged violations. And the experience in California, and I believe the Attorney General’s office said this covered about 70% of potential violations, is that they notified the business, that business had, I think, 30 days to bring their operations into compliance, and then the matter could be dropped. And I think it is likely that under many of the state laws at this time, letters of inquiry will be sent, particularly in Connecticut and Colorado, which, as you mentioned, have been extremely active, but much of the enforcement activity happening this year is going to be out of the public eye. Now, that is different in California, where the right to cure has already expired, and the CPPA just won a favorable court judgment that said it can begin immediate enforcement of its implementing regulations without being subjected to a one-year delay. That decision has been appealed, so we’ll see what continues to happen there. But what is also really interesting to me in California is that we have seen both the Attorney General’s office and the California Privacy Protection Agency, the privacy-specific regulator, begin to pick up enforcement sweeps in sectors outside of traditional tech companies or just online advertising. You have seen, I believe, the Attorney General initiate an enforcement inquiry involving the treatment of employee data, as well as smart TVs and streaming devices, and you’ve seen the CPPA initiate an enforcement sweep involving connected vehicles. So I’m very interested to see, outside of just online advertising and data sales, are we going to see additional enforcement in sectors outside those that are typically thought of as the primary subjects of commercial privacy laws?

Jodi Daniels  40:05  

It’ll be a very fascinating 2024. I know you’ve got lots of thoughts, Justin, on connected cars, but we’ll save that for another time.

Justin Daniels  40:13  

Okay. So, Keir, when you’re out and about at a party, hanging out with some friends, what favorite privacy tip might you share if they asked you over a beer?

Keir Lamont  40:32  

You know, I would say both my friends and fiance typically have the good sense to try to get me to shut up if they see I’m at a party and I get that little glint in my eye where I may start trying to talk about privacy issues. However, my old privacy tip used to typically be that you should use a passcode for your phone and not a face scan or a fingerprint, because the law developing in the United States typically provides for greater protections against police or law enforcement seeking to compel you to turn over an access code. But a while back, I saw my grandmother open her phone with a face scan, and I started to go off on my little explanation, and she very kindly intervened to tell me that she used a facial scan because she had arthritis, and that was the easiest way for her to access her phone. And I do think that was genuinely a very important moment for me. And I think there’s a danger for a lot of folks in this field to get wrapped up in their own kind of little privacy maximalist bubble, without necessarily thinking about what practices work best for other people. Different people have different needs, they perceive different risks, and for the most part they just want their technology and devices to work as seamlessly as possible while having their privacy respected by default. So that, I think, really motivated me to think through how technology and how these new laws will work for everyone, not just people very interested in their privacy and security. And I hope that has made me a better policy professional.

Jodi Daniels  42:06 

I think that makes a lot of sense. When we talk about right-sizing privacy programs, I think right-sizing for what individuals feel comfortable with is an important step. Now, if you’re not studying privacy laws and explaining them and identifying future trends, what do you like to do for fun?

Keir Lamont  42:29 

Sure. So I would say my main hobby is that I’m a runner. It is something I picked up during COVID, when I desperately needed a solitary way to get out of the house. I know it’s not for everyone, but it has been fantastic for my mental and, when I’m not injured, my physical health. Last year, I was actually one of 11,000 people to hit the posted qualifying times to participate in the Boston Marathon, but I subsequently missed the cut because they had so many applicants they had to lower the thresholds. Fortunately, however, I have since aged into a new qualification tier. I’m a little older, I may or may not be wiser, and I will have significantly more slack in the required qualifying times if and when I’m ready to make another go of it. So that’s something that’s always on my mind, trying to stay fit and healthy and train as much as I can.

Jodi Daniels  43:23  

That sounds like a very favorable outlook; I’ve never heard it put quite that way. Okay, maybe you didn’t mean it that way. Well, sometimes it’s all about looking for the positive advantages with where you are, and I like that positive approach. Now, this has been an incredibly valuable episode. If people would like to connect with you, to follow along and learn more, where’s the best place they should go?

Keir Lamont  43:54  

Sure, obviously, the Future of Privacy Forum website; we have a mailing list, and we put out a lot of, I think, really expert, leading content. To follow me personally, and a lot of what I’m doing in tracking both state and federal privacy laws and emerging trends, LinkedIn is probably the place now, after what befell the website formerly known as Twitter in the past year. On LinkedIn, I have a privacy newsletter. I know I’m not the only one, but it’s called The Patchwork Dispatch, and every fortnight or so I try to put out what I think are the top five most important developments in terms of privacy legislation, regulation, and enforcement. So happy to connect with folks there. And again, this was a lot of fun. Thank you so much for the invitation to come on and chat through these really interesting and important issues with both of you.

Jodi Daniels  44:51

Well, we’re so grateful that you were here, and I want to put in an extra plug for that LinkedIn newsletter. What you have is a phenomenal read, and I highly recommend everyone make sure that you go and subscribe to that so you can stay in tune as well. Thank you so very much for stopping by; we really appreciate it.

Outro  45:13  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time!

Privacy doesn’t have to be complicated.