
Intro 0:00

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:21

Hi, Jodi Daniels here. I'm the founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I'm a privacy consultant and certified information privacy professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:35

Hi, I'm Justin Daniels. I'm a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels 0:58

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. You ready for a fun episode today?

Justin Daniels 1:34

I am. I hadn’t thought about the fact I can always make you laugh.

Jodi Daniels 1:38

Don’t do that, then we’re gonna have a case of the giggles again.

Justin Daniels 1:41

That’s entertaining to our listeners. 

Jodi Daniels 1:43

No, okay, we're gonna get into it. Okay. Today I am so excited. We have Omer Tene, who is a partner in Goodwin's technology group and its data privacy and cybersecurity practice, nicknamed DPC. Prior to that, he was the chief knowledge officer at the IAPP. He's also an affiliate scholar at the Stanford Center for Internet and Society and a senior fellow at the Future of Privacy Forum. And Omer, we are delighted that you are here to join us today.

Omer Tene 2:11

Thank you both, delighted to be here.

Justin Daniels 2:15

So Omer, can you tell us a little bit about your career journey to your current role?

Omer Tene 2:22

Yeah, so, you know, at my age, the career journey becomes such a long path that it's kind of hard to know where to start, but I'll start at the turn of the millennium, which actually does make me sound like a dinosaur, which I am, because that's when I got into privacy. Before that, I was a corporate lawyer and worked for a couple of law firms doing corporate M&A, but around 2003 I decided that I wanted a change, and I actually answered an advertisement in The Economist, you know, those ads in the first couple of pages for a chief economist of the World Bank and jobs like that. This one was for leading a think tank focused on data protection, which was based in London. At the time I was actually working in Paris, and I moved to London for this. I didn't really have any background in privacy or data protection at all. I didn't even know what it was, but it sounded technology related. And being Israeli, I always kind of felt held back, not really participating in the real game of Israel's economy, which is the high-tech sector, and I thought this might be somewhat related. So I took the role, and when I started doing this, these were really the first years of privacy as a field. Certainly, you know, we think about it as a profession now, and in fact the two of you are in this profession in different roles, and there are many other roles and specialties these days, but back at the time this was something new, and we were all generalists. If you think about it, those are the years that companies appointed the first-ever chief privacy officers. Companies like DoubleClick, Acxiom, IBM, those were data-intensive companies that understood that data is a big asset but also presents challenges and risks. So privacy emerged as a field. After that, I was a law professor, actually, in Israel for almost a decade.
My kids were born there, and we spent a few years there. You know, privacy wasn't like a main subject you could kind of own in the law school, so my big topic was business associations, corporate law. That was my main class. But I also did research in privacy, and I came to know the IAPP as the big organization in this field, and at some point I joined the IAPP. Initially, I was in charge of research and set up the Westin Fellowship, which has been incredibly successful, and then I became the chief knowledge officer there, basically in charge of programming, publications, research, and public and government affairs. Three years ago, I decided to make a switch and went back into big law, and that's where I am now as a partner at Goodwin, focusing on all things privacy.

Jodi Daniels 6:32

Well, speaking of all things privacy, there is a lot happening in the privacy space these days, and what I thought we would do in this episode is talk about some of the big hot topics that are coming from some of the US state regulators. And I wanted to start with New York. So the New York AG, and as a friendly reminder to everyone listening, New York doesn't actually have a comprehensive privacy law that has passed; instead, the New York AG used a, you know, consumer protection law, a regulation, to share what some of its expectations are for companies. So Omer, I was hoping you could share a little bit: what exactly did the New York AG share, and what is it that companies need to do?

Omer Tene 7:21

Yeah, thanks. Great question. And there's a lot there, actually, because, as you said, New York doesn't have a comprehensive state privacy law, although it just passed a couple of very important kids' privacy laws, and it probably will have a state privacy law in the not-too-far future. But taking a step back from the New York Attorney General guidance, which I'll talk about in response to your question, the big picture here is that it's no longer just California that has state privacy law and enforcement. You know, for a few years, we knew that there's the CCPA and that California is really becoming focused on privacy compliance: the CCPA, and then amendments and the CPRA, and then they set up the California Privacy Protection Agency. So there is a lot going on. As counsel, when we talked about US privacy, aside from the federal level, oftentimes we kind of talked about California. But now it's not just California. You have six or seven state laws that are already in force: California, Colorado, Connecticut, Virginia, and Utah from last year; a couple of laws just came into force in Texas and Oregon; Montana is coming into force later this year. And then I think there are eight or nine state laws coming into effect in 2025, and those are states with comprehensive privacy laws. Here is New York, which doesn't have one, but it does have a UDAP law, right? They're sometimes called the mini-FTC laws, the deceptive trade practices acts. And it's not just New York; Texas actually just used its UDAP law to bring an action against General Motors for collecting and selling data to different third parties, including insurance companies. Texas does have a state privacy law now, but still, the Attorney General used the UDAP law to bring that claim.
So I think we need to remember that, yes, there is California, and yes, there are a bunch of other states with state privacy laws coming into effect or already in force, but all states have UDAP laws and can deploy them in the context of privacy. So the guidance that you're referring to from the New York Attorney General is the one focused on cookies and tracking technologies. There's a lot of action always in this space, you know, with ad tech and third-party data sharing or data sales, data brokers; this is always the tip of the spear for privacy policymaking and enforcement. And the Attorney General basically said: make sure that you don't miscategorize cookies in your cookie notices or cookie management platforms, and that you don't misconfigure those platforms when you set them up to solicit consent. It warned against trackers that are hardwired into websites, different tags and pixels that can't actually be managed by using a consent management platform. And it also provided guidance on how to structure, design, and word a cookie notice so it's not deceptive. No dark patterns; for example, opt-out and opt-in buttons need to be featured with the same prominence, and requirements like that. So that was the guidance on cookies. There's also an ANPRM, an advance notice of proposed rulemaking, from the Attorney General with respect to the new kids' privacy laws, and we can talk about that if you'd like.

Jodi Daniels 12:34

Well, sure. Why don't you share a little bit more about what companies should be thinking about regarding kids, especially those who might be processing data on kids from New York?

Omer Tene 12:47

So kids' privacy is an incredibly hot topic, probably the hottest topic in privacy, at least in the United States. As you know, even at the federal level, there has been a bill that is actually moving: it has support in the Senate, passed through the Senate, and we'll see how it fares in the House. Of course, you know, with the election and the summer, there isn't much time for anything to pass. But this bill, it's called KOSPA, is kind of a combination of two federal bills, KOSA and COPPA 2.0, and definitely has some bipartisan traction on the Hill. And then at the state level, New York passed the SAFE for Kids Act, which is more focused on content moderation, but also the Child Data Protection Act, which is about kids' data. And that comes along together with the age-appropriate design codes in California and in Maryland, and with real focus and enforcement by the FTC, which has brought a couple of cases in this space against gaming companies and ed tech, education technology, companies over the past year or two. So, sorry, the Child Data Protection Act, the CDPA, which is the New York legislation, passed several weeks ago in July and goes into effect a year from its passage, so basically next summer. It applies to operators of online services that collect or process PII of covered users who are kids: not just under 13, like those protected by the federal COPPA, but any minors, basically under 18. And the law protects kids' data in two cases: either where the operator knows that the user is a minor, a kid under 18, or where the site is primarily directed to minors. This is a significant expansion of the scope of COPPA, because COPPA is just under 13. Needless to say, you know, the scope of websites and apps that are used by teenagers is vastly broader than that of those used by kids under 13. And also note the language, primarily directed to minors.
It's not just directed to kids under 13; it's primarily directed, so you capture some mixed-audience sites. COPPA, for example, only applies to data collected from kids, but the New York law would also apply to data collected about kids. So again, this is a significant expansion of the scope of child data protection laws that are already in force. And with respect to teenagers' data, the law basically says you can't sell it, and even just to use it, you can only use it if it's strictly necessary for the purpose of providing the service that the teenager is buying or getting from you, or with informed consent, pretty strictly defined and with very tight parameters. So, um, you know, New York is a big state, and if you offer your services to teenagers in the country, chances are teenagers in New York are gonna use them, and you'll need to comply.

Jodi Daniels 17:29

People are always asking about, and you mentioned this, the primarily directed language. What might be an example, not a particular company, but maybe a type of site that someone listening might not have thought, oh, wait, that could be considered primarily directed?

Omer Tene 17:47

Well, social media sites. You know, if you do an analytical breakdown of who their users are, I think you'll find that a big chunk of their users are under 18. It's a fact. And, I mean, you know, the formulation of the scope of the age-appropriate design codes in Maryland and in California is even broader than that: it's sites that are likely to be accessed by kids under 18. And within that, I would say the entire internet basically falls there, because aside from maybe, you know, 401(k) sites, even gambling or pornography sites are likely to be accessed by teenagers. So it's a very broad scope, I think, and that's one of the reasons that these laws are creating backlash from the freedom-of-speech advocacy community, which is something you don't typically see, you know, civil rights organizations petitioning against the application of privacy laws. But the reason is that these laws could have a chilling effect on adults using the internet, and kind of push the internet towards a state where you have to be identified, basically, just for the age verification and, you know, to know who's who. And of course, the upshot of that is less privacy, because, yes, we know browsing isn't really anonymous, right? That's kind of a basic that we privacy professionals know. But it's also not, you know, hard ID-based, where you have to go around with your social security number tattooed on your forehead. So I think laws that require age verification, and that create significant risk for companies that don't actually check the age of their users, have, you know, a chilling effect on speech and perhaps a detrimental effect on privacy at the end of the day.

Justin Daniels 20:44

So related to kids' privacy, there's also big news out of California regarding the CAADCA, the California Age-Appropriate Design Code Act. The privacy pieces, like dark patterns, no data monetization, and age estimation, appear to have survived. What does this mean, more specifically, for companies today? And I know many wonder about when enforcement might begin.

Omer Tene 21:09

Yeah. So, Justin, I think you're alluding to a decision that came down from the Ninth Circuit Court of Appeals just last Friday in NetChoice v. Bonta, the Attorney General of California. And the context there is that California passed a law called the California Age-Appropriate Design Code, which is modeled after a regulatory instrument in the UK issued by the Information Commissioner's Office there. It's supposed to create a friendlier environment for kids online, requiring, as you said, the absence of dark patterns and privacy by default, and less sort of ubiquitous collection of kids' and teenagers' information online. But it was challenged on a constitutional basis in California, well, in federal court in California, based on its restricting speech, also because it required companies to make sure that kids aren't exposed to harmful content, for example, and a trade group, NetChoice, petitioned against it. The federal district court in California issued an injunction barring enforcement of the law, and that was appealed by the Attorney General of California. A few days ago, the Ninth Circuit basically upheld the injunction from the district court with respect to some parts of the law, and those are the parts where the law actually requires companies to monitor and essentially to censor harmful content that kids can be exposed to. But it also held that the law doesn't obviously violate the First Amendment with respect to its privacy provisions, those that don't directly address speech. Now, it sent this back for further discussion in the district court. So, you know, it doesn't facially violate the First Amendment, but the district court could still hold that it does. But right now, the injunction was vacated with respect to those parts of the law, so technically they are in effect. Now, you asked about enforcement. I still think that the California Attorney General won't enforce this law until the issue is fully debated in court.
But you know, there's a similar law already in effect in Maryland, and that one hasn't been challenged, and it also doesn't have the kind of obviously speech-related provisions in it; it doesn't mention the words harmful content, for example. In short, I think businesses should be prepared, over the next couple of years, to comply with laws such as the California Age-Appropriate Design Code. And similar to the general state privacy law situation, which, as we talked about, kind of started in California and spread to the rest of the nation, we've already seen one state follow suit, and I think we can expect to see other states adopt similar legislation.

Justin Daniels 25:34

You know, it's interesting that you bring up the point about the First Amendment. I go back to law school, and I remember the penumbra of the implied right of privacy in the Fourth Amendment. And I'm just curious, Omer, from your perspective, as somebody who has been very thoughtful about policy with your work with the IAPP and others, how you see this evolution in privacy and technology laws intersecting with the First Amendment and the Fourth Amendment. Because you mentioned earlier, you know, there's a substantial connection with teenagers and pornography sites, and a lot of parents don't want their teenagers going there. But just any thoughts you could share with our audience around how you see this evolving landscape of privacy and technology versus the First and the Fourth Amendments?

Omer Tene 26:23

Yeah, well, it's a great question, Justin, and there's a lot there, probably too much for me to chew on without taking notes, just because, you know, the First Amendment and the Fourth Amendment each have, like, a pretty enormous penumbra in terms of their privacy footprint, and very different ones, too, because, you know, the Fourth is more about government access and law enforcement search, while the First is more about free access to data, I would say, and freedom to use data, you know, any way you want, really. And this has already reached the Supreme Court of the United States in the IMS Health case, although that was, you know, a very commercial-speech-focused context. I think where the rubber hits the road on the First Amendment side of your question, Justin, is with respect to publicly available information; the context these days is the use of publicly available information to train AI. That's where it usually comes up. And if you think about it, on the one hand, the privacy advocates say that AI companies can't just sort of suck up the entire volume of data that's out there on the internet and use it to train models, because that's a repurposing of data, a recontextualization of data, in ways that people couldn't foresee and think about when they shared that data in public ways. Now, some of it could have been shared under the law, because, you know, you have government databases that are accessible by anyone, but some of it could have been shared voluntarily on social media or, you know, on a podcast like this; I'm talking, and I just shared information about my career path. And, you know, the privacy community is saying, no, you can't just take all this data and use it; that's a violation of privacy. Whereas the freedom-of-speech side is, hey, it's public, I can do anything with it, right? Like, it comes with no strings attached.
And if it's publicly available, I can't unsee it; like, you know, I'm free to sort of learn from anything that's out there, and so is my AI. So I think it's a real kind of huge threshold issue, which is yet to be fully litigated: what does publicly available information encompass, and to what extent are companies or governments free to just collect it and use it for any purpose they want?

Jodi Daniels 29:49

Thank you. So, Omer, when you are not advising clients, what is your best personal privacy tip that you might offer your friends?

Omer Tene 30:08

So one thing that I take care to never do is to advise my friends, and I really don't have any sort of personal privacy advice. I can tell you that, you know, I do tell my kids, as a concerned parent of three teenagers, so they're right at that age, that anything they post or share on any online platform they should think of as out there, not in their control anymore, publicly available, and probably available forever. And you know, it's not just kids; we are all myopic about, you know, envisioning what's going to happen 20 or 30 years down the road. It's important, I think, to educate kids to understand that all kinds of foolishness that they do, totally age-appropriate, you know, will stick with them once it's kind of etched into the fabric of the internet. And it's kind of related to what we just talked about in response to Justin's question: if it's publicly available, even with all the policy in the world, it's going to be very hard to put the genie back in the bottle.

Jodi Daniels 31:41

That's good advice for kids and adults, I think. Yep, everyone should heed that advice: remember, what you put out there, it's there.

Justin Daniels 31:49

But you know, when he said he shouldn't give friends advice, maybe I should take that to heart about not giving legal advice to my biggest pro bono client.

Jodi Daniels 31:55

Oh, no, no. That doesn’t apply here.

Justin Daniels 32:01

So when you are not practicing privacy, Omer, what do you like to do for fun?

Omer Tene 32:09

Um, I'm pretty boring, guys. Like, I'm kind of a typical boring lawyer. I like to read. I like to read novels and keep up with the news a lot: The New York Times, The New Yorker, The Economist. I try to stay active, but, you know, I don't climb any high mountains or run marathons. So I would say that, yeah, it's a pretty mundane existence, the existence of a privacy lawyer.

Jodi Daniels 32:48

Well, Omer, we're so grateful that you came to share with us today. If people would like to connect and learn more, where could they go?

Omer Tene 32:55

Um, well, email, LinkedIn. I used to be very active on Twitter, but once it kind of became X, I stopped. I still have, like, a profile there with a lot of followers, but I don't really check it much. Yeah, the usual ways. It's easy to connect, I feel, these days.

Jodi Daniels 33:19

Amazing. Well, Omer, thank you so very much. We appreciate it.

Omer Tene 33:23

Thank you — pleasure being here.

Outro 33:31

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.