Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:22

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:36

Hello, Justin Daniels here. I’m a shareholder at the law firm Baker Donelson. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I’m the cyber quarterback, helping clients design and implement cyber plans, as well as helping them manage and recover from data breaches.

Jodi Daniels 0:55

And this episode is brought to you by — that was a terrible drumroll. Oh my goodness, Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, visit our website, and check out our new bestselling book, Data Reimagined: Building Trust One Byte at a Time. Well, today, we’re apparently going to teach Justin how to do some drumrolls. And we’re going to talk about some really cool cutting-edge technology and privacy and all kinds of things, because we have an unbelievable guest, Jules Polonetsky, who serves as the CEO of the Future of Privacy Forum, a Washington, DC-based nonprofit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Justin, this is right up your wheelhouse with emerging technologies. And we already had so much fun that we had to finally hit record, because we were having such a good conversation, but we weren’t recording it.

Justin Daniels 2:17

Indeed, we did. So why don’t we get to it? And we’ll have a great discussion. So, Jules, how did your career evolve to your present position, leading the Future of Privacy Forum?

Jules Polonetsky 2:30

A lot of fate and a lot of opportunity that just developed at the right time. I started out as a lawyer and realized I wasn’t a very good real estate lawyer, and went to work for my local congressman, and moved to Washington and worked for another congressman named Chuck Schumer, when he was in the House. And my eyes were opened to the fact that you could be fairly young, and if you were willing to show up and work hard and listen and learn, you really could work on issues that mattered to my neighbors, my parents, the neighborhood I grew up in, which was a sort of blue-collar immigrant neighborhood, Brighton Beach, Coney Island, the shorefront area of Brooklyn, where people worked hard and wanted to get good value when they spent their money. If they were going to splurge because they’d saved up for a TV they wanted, you know, and it was on sale, they didn’t want to be ripped off and learn that it wasn’t really on sale, but just, you know, claimed to be 50% off when the price had been jacked up, you know, two days before, and they were going to be taken advantage of. So I kind of had this sensibility that government could really make sure that businesses and other people and other agencies treated people fairly. If you worked hard, you ought not to be ripped off because someone else had a little more power, or a little more time than you, when you’d worked a long, hard day and just wanted to, you know, be with your family. I said, I can do this. And I ran for office and was a state legislator for a number of years, where I took on the funeral industry. And I want to tease out some themes, because they’ve been very helpful to me over the years in understanding and thinking about privacy and data protection.
I learned that all of the funeral homes serving my neighborhood and serving different communities, Jewish communities, Catholic communities, African American communities, you know, which all seemed like these local, communal, quasi-religious, clergy-connected, you know, entities, all belonged to a large corporation based very far away, that was focused, as corporations are, on profits and so forth, and motivated to sell things. And I didn’t realize it, but, you know, a funeral is the third largest purchase most people make in their life: home, car, and a funeral. Now, you shop around when you buy a house, and you’ve got all kinds of legal protections in what you sign, and you can back out of it, and there’s all kinds of oversight. And obviously, you know, you’d better know what you’re doing when you buy a car, because you won’t get a good price unless you go in there, you know, somewhat informed. But think about a funeral. Right? You don’t have a lot of planning, often. You don’t really want to talk about it. Can you imagine calling up a bunch of funeral homes and saying, my Herman, he’s not doing well, what’s the deal, you got a deal for me this month? Because Shorefront said they’ll give me this price. No, you call up the home that seems to serve your community. And you say, what’s appropriate to do here, based on my tradition, my religion, and my background? Right? You know what I expect. And I realized, when I was a legislator, the senior citizens in my district were coming into my office. And they’d just paid more than they ever imagined from their limited savings. And they realized they’d been sold stuff that was not even appropriate for what they needed, or what they wanted. We know in the Jewish community, for instance, certainly in the more religious community, embalming or, you know, flowers, these are not appropriate, nor is an elaborate casket. But, you know, the salesperson said, this is what’s right for your, you know, your Herman.
And so people need information so they can make smart decisions, and they need that information in advance; they need it disclosed. And it’s not the case that businesses will always do that, if the market doesn’t support that sort of, you know, competition and that sort of service. I became a Consumer Affairs Commissioner for New York City and had the authority to do something about some of these things. And so I, you know, really built up this passion for trying to make sure people got a fair deal, got a fair shake. The first day I took office, the mayor called me and said, I just did my show on Friday on the radio, and people were calling in complaining that Circuit City had sold them big-screen TVs. And then when people, you know, decided to return them, there was this 20% restocking fee, and they were all unhappy about it. And the mayor, I’d just been appointed, said, what are you gonna do about it? I said, I’m on it. And we sent our inspectors to Circuit City, and when they came back, they said, well, it’s on all the receipts. And I’m like, when do you get the receipt? You learn about this policy from the receipt, afterwards? Aha. And so we forced them to refund all of these restocking fees, because they hadn’t disclosed this significant part of the terms, right? It wasn’t the price. But the fact that you might not want it, it might not work the way you expected, it might or might not fit on your wall. You know, this matters when you buy something. So it’s not just the price, right? It mattered that the full terms of the deal were known before people transacted. So now, some of you are going to zoom out on this and say, did you learn anything years later, having worked in industry? Here’s what I learned years later. Why was Circuit City doing this? Turns out, people had started buying these giant big-screen TVs for the Super Bowl, frat parties. You know, people would buy them, have a blast, and then return them.
So that’s why Circuit City, which was getting all these, you know, people kind of just renting a free TV, returning it, box open and so forth, wanted to discourage this by having this restocking fee, which was going to be enough to, you know, convince people to think about whether they were really doing it. So, what lessons are there for me? Right? One is, yes, the market doesn’t always support the kind of disclosures, the kind of information, right, when competition or different levels of knowledge, or just time, convenience, whatever it is, get in the way. We talk about dark patterns. I’m like, what dark patterns? Consumer protection law, for years, has said no, you can’t use the little print to take away what the big print gives, right? So I came to privacy really with a passion from the consumer protection law that I enforced in New York City, which was, as in many cities or states, very similar to the US FTC standard for deception and unfairness. We had, as many jurisdictions do, a mini FTC. One day I read the newspaper, and I see that some company called DoubleClick is in all kinds of trouble. Cookies, tracking, I didn’t know anything about this. I had an AOL account at the time. You know, there were dot-coms populating New York. I had heard of this company because they had a big sign up that said, Welcome to Silicon Alley. New York at the time was trying to brand itself Silicon Alley, and the DoubleClick people were smart enough, even though they were digital, to buy a giant billboard, realizing that when the media wrote about Silicon Alley, what were you going to show? There were no alleys to show. That sign became the backdrop of thousands of stories reporting about what was going on in Silicon Alley. So I had some awareness.
But I became their chief privacy officer when they were looking for somebody with a reputation and some expertise in consumer protection issues to sort of come in and help work on their policies, as well as work across industry in trying to set the first standards for how cookies and tracking and behavioral targeting and that entire world would develop, as an early developer of it. So that’s how I got into privacy.

Jodi Daniels 10:41

Well, first, the Circuit City story reminds me: we have a big life event happening in our family soon, and I’ve been dress shopping. And so many of the sites have big disclosures that I can’t return my dress, or there’s a restocking fee. So the whole restocking fee and disclosure piece has percolated past New York City to other sites, so congratulations. And I didn’t buy from them; I bought from other places that would let me nicely return. But a really fascinating connection, I think, that you were able to share. And the work that you’re doing now at, I’m going to say, FPF, the Future of Privacy Forum, many people don’t really know what it does and what it is. So can you provide a little bit of a foundation and overview of the great work you all are doing?

Jules Polonetsky 11:30

Yeah. We do three things, of which only one is highly visible. After serving as chief privacy officer at DoubleClick, and then later at AOL, I thought that there wasn’t a place for my peers, who were increasing in number as companies around the world started having these roles, whether they called them a chief privacy officer or whether there was a legal lead. As the teams dealing with the complexity of these issues started growing, I felt that I didn’t have a place to connect and work and develop best practices. We would see laws proposed, and we’d say, that wouldn’t work. Why don’t they understand better? Well, the government affairs people were talking to them, but the folks who worked day to day with product teams, and were working on creating policies, and often dealing with new issues that were gray, unclear exactly what data is sensitive or not, where do you draw the line, wanted to talk to each other. If you don’t talk to each other, you don’t have best practices, or laws; the norms can sometimes go, you know, to the bottom. Again, when the market works right, everybody’s treating the consumer well, but it doesn’t always work right. And sometimes it takes, you know, talking amongst industry, and there wasn’t a place for that. The IAPP was great at putting on big events and conferences and training and so forth, but I wanted a place that was a little bit more intimate, smaller, for different sectors, auto, banking, finance, to put their heads together and say, what do we think the right rules are here? And can we share that? Can we learn from each other? Can we write them down? Can we propose them? Can we put them out, publish them? So my goal was to create a place that was in the center of these debates, that was more progressive, quicker than the trade groups, ready to have regulation, right, optimistic about what could be done with tech and data, but not there to say, we don’t want rules. We want rules now. All right, and should it be a law?
Or should it be a policy, or should it be a code? Let’s talk about that. But in the early days, the trades were generally, we’re here to oppose legislation, don’t break the internet, support innovation, right? And all my peers were like, oh no, bad things will happen. We wanted it to work. We wanted to support responsible advertising, we wanted to support health research, we wanted to support, you know, cars that can be safer. We believed in the mission of the organizations we were working at, we believed in the counsel that we were giving, but indeed the bad things that the critics warned about would indeed happen if we didn’t have the right policies in place. And civil society and academia sometimes weren’t at the table to have all the details and be informed about it. Or sometimes they were just pessimistic, and sometimes they were right, but they weren’t necessarily always eager to sit down and say, can we hammer out a compromise? Can we figure out how we can make sure health data is available for research, instead of only talking about the risks? Right? Can we figure out how to do this right? I didn’t think there were enough people there. And that was our goal: could we sit in the center, with an advisory board that includes academia, civil society, and the senior executives at companies? So that was the mission. Fifteen years later, we have about 200 or so member organizations. And what we do are three very separate things. One is we convene peer-to-peer conversations. So almost every day, every week, in multiple cities throughout the world, we will pull together 10, 15, 20 people. What are we going to do about LLMs? What are we doing about compliance with state laws that seem to be contradictory or need more clarity? How are you moving data from Europe when, I don’t know, it’s sort of illegal to move data from Europe? What are you doing about Google Analytics? Are you pulling out of China? What are you doing? Right?
There are so many issues where, you know, even the best counsel can give you advice, but can’t say, you’re good, right? The Washington My Health My Data Act just passed, right? I haven’t talked to anybody who knows how they’re going to comply with it. Right? Other than not being in Washington. But we need to figure it out. Right? So we’ll quickly pull together folks, and we’ll have those conversations. So: peer to peer, learn from each other’s tools, just moderate, be quiet, let people talk and learn from each other. That’s number one. That often leads to, oh, and you know what FPF should do about it? You ought to write this down. You ought to take this input. And by the way, go talk to academia, civil society, the FTC, government DPAs. So we run working groups that are focused on AI, on smart cities, on ed tech, on ad tech, on, you name it, any issue that is on the agenda of busy people in this space. We have a working group that might be doing, oh, maybe an explainer, maybe a training, maybe a code, maybe a best practice, maybe a symposium, because none of us knows the answer, and it’d be nice to have a lot of people banging on it. The third thing that we then do is, hopefully informed by these conversations and smart about what’s going on, I then tell the team, go out now and do good. You’re not representing anybody right now. We want you to be a thoughtful expert, helping policymakers get their agenda done. You’re not there to yell about advertising as good or bad. If somebody wants to regulate, bring the expertise you have and say, well, here’s what would actually work. Here’s what would be really hard. No, we don’t have a way to verify every teenager’s age. So if you’re worried about teens accessing content, here’s some things to think about. Here’s what works today. And here’s where what you’re proposing is going to have all the compliance people saying, how do I do this? I don’t really know; you’re just creating confusion and risk, right?
So we try to be a thoughtful voice, optimistic about what can be done with tech and data, helping educate, and providing policymakers and civil society and academia with the information and expertise to regulate well, to criticize well, to provide the right sort of input, the right sort of compromises.

Jodi Daniels 17:16

That’s a very helpful overview. Thank you so very much. I think we want to dive a little bit into some of the leading topics that are on your agenda today.

Jules Polonetsky 17:26

Well, we were talking about LLMs before. Before we hit record, Justin was sharing some of his optimism that it’s going to be a useful tool in, you know, certain areas, with appropriate oversight and, you know, skepticism. And I share that optimism, as well as the concern that all the bad things that critics are pointing out are also there. Here’s the bright side, and I’d love to hear Justin’s reaction to this. Here’s the bright side. Many of us, right, have been working on the issues around AI for a long time. Here’s how you need to do this sort of risk assessment; beware, the data can be bad; there can be bias, there will be bias, right? And then all of a sudden, here comes this, you know, incredible consumer-facing thing. Everybody can look at it and play with it, and actually see that, yes, the data is biased, and see that you’d better not be relying on it. You know, this may have been the best way for the world to take a hard look at AI, because I think we were all trying to convince business people who were using what they thought were good datasets, right? We had to say, hey, you could end up biased. What do you mean? We’re just using the data, the data about our customers; we’re optimizing based on what they want. We had to convince them that, oh, you could end up with a biased result, and maybe it would be something you could easily determine, or not, right? You assumed that if there was a lot of data, it was gonna end up being, you know, more accurate than less data. But it was a hard lift. And it’s not a hard lift anymore, right? It’s clear. We’ve launched to the public the best example of bad data: bias, don’t trust it, it’s faulty. And you don’t have to twist the business people’s arms anymore. They get it. Now, how we solve it and how you go forward, that’s the challenge.
But we don’t need to twist their arms and say, don’t rely on this, it can be really wrong, or it’s going to be biased because it’s built on all the garbage and hate that’s out there. So it’s a great teachable moment, because think of how famously, obviously bad, as well as incredibly useful, this is. What do you say, Justin?

Justin Daniels 19:40

Jules, I thought about this. And I think the best example I can give you, when we start talking about bias, is I was pretty sad to see that they settled the Dominion-Fox lawsuit, and here’s why. Whether they won the lawsuit or not, had they gone to trial, you would have seen a parade of people have to get put on a stand under oath and have to admit to a lot of disinformation that had gone on for weeks and months and months. But because Dominion’s a for-profit corporation, they settled for what they did. Fox could have been held accountable, but settling was probably more in the best interest of the shareholders than taking the risk, because Fox would have appealed for years. But where I’m heading with this is, I think people’s appetite for what they really think is misinformation or bias in this day and age has become kind of warped. It’s really hard these days to figure out what is a common set of data that we can agree on: hey, this is the data. And when I read that people think, oh, that $787 million settlement sends a message? My view is no, I don’t think it does, because it’s just a cost of doing business. And so my concern, and I bring it back to commenting on your question, is, if you’re a business, and I’ll use the Samsung thing as an example, why would you go and try to debug your code with ChatGPT without fully thinking through the consequences? And yet, I see that behavior, Jodi sees that behavior, you see that kind of behavior all the time. And I think AI is now going to accelerate a lot of these bad trends, absent a federal privacy law.

Jules Polonetsky 21:43

Indeed, the best shot we now have at getting a privacy law is the fact that my old boss, Senator Schumer, has said, we’re going to do AI regulation. And I think those of us in data protection kind of say, yeah, how do you do that without a lot of that being data protection regulation? So this could be the way, if the House passes the bill that it has some consensus on, although we’ll see how much consensus there really is once that moves. But right, there’s at least a Democratic and a Republican, you know, leader who have been in place, who had some degree of consensus. Obviously, on the Senate Commerce Committee, we don’t yet know exactly what Senator Ted Cruz’s perspective on this is. We know that Senator Cantwell is still deeply opposed to the House framing versus the one that she’s put together. But once the majority leader steps in, and it becomes an AI regulation piece, we could see this getting unstuck. Now, there’s a lot of work to do for that ADPPA to be ready for prime time. And I hope that there will be time, an opportunity, to kind of relook and wise up some of the language there. But, you know, I wasn’t optimistic, because I thought Congress was still pretty much stuck. With the AI regulatory effort, and Schumer, who is a master at understanding how to move bills with his members, we could be seeing some action.

Justin Daniels 23:15

Ah, I guess my question for you, from an FPF organizational view, and given your own background in legislation: don’t you think our Congress’s inability to pass any kind of laws that really deal with the 21st-century economy, meaning privacy, security, to a lesser degree blockchain, that’s more isolated, all of that is now going to come home to roost because of AI and how it applies in all these different areas? And can’t you just contrast our approach with what Italy did? They’re banning it. Where are the protections with GDPR?

Jules Polonetsky 23:50

It does appear that Italy may have backtracked a bit on their quick decision. And it appears they’ve given ChatGPT a path to relaunch very quickly, if they provide more detail about their explainability, some more policies, and what their legal basis is. It looks like they’ll be back in action. And of course, the regulator is independent and that sort of thing. But I think there was a large reaction to that action by many in Italy and beyond. And again, the DPAs do what data protection law, you know, mandates, but there was certainly a huge reaction in Europe, which is looking at another wave of US-led innovation. Right? The dominant theme of regulation and political activity, when it comes to tech, the last couple of years has been: we made a mistake allowing US tech to colonize our data and our consumer services, and we will not allow that to happen again. When it comes to AI, we Europeans will lead the world with a structure for trusted AI, and here we are, moving quickly to do that, and the world is focused on the EU AI Act. And all of a sudden, boom, here’s this wave of Google Bard, Microsoft, ChatGPT. And the Europeans, as much as they want to win with regulation, don’t want to go through another phase of, oh, we are not leading when it comes to technical innovation. So I see a lot of pressure on regulators around the world. You know, I look at this like when Uber rolled out. Again, I came out of New York City politics, as we mentioned, and for my first fundraiser, I was advised not to have a fundraiser in Albany: you’re a junior guy, you’ve got no clout, everyone knows the junior people have no clout; it’s the leadership. And that was very much the case in Albany years ago, and it’s still in large part the case. We really had no, you know, say; the speaker and the committee chairs really were the bosses. They said, raise money in your district. I said, well, my district is a poor district, and I don’t have rich family members.
But I’ve got to raise money, because that’s how you run for office, for reelection, so I’m going to do it. So I held a fundraiser, and nobody came. Nobody came. The speaker, who was a friend and mentor, came. And he brought along two people. Who did he bring along? I had some friends that came, but the only people who came with checks were the trial lawyers, because the speaker said, hey, why don’t you help this guy, and they had enough money to help everybody and make sure everybody was well aware of their particular issues. And I chatted with one guy; he was, you know, a top law school grad, and we had an interesting legal conversation and all that sort of thing. I didn’t fully agree with a lot of things, but, you know, we had a good conversation. The other guy went down and sat in the corner, and was counting out checks: check after check after check. I think, I’ll go over to him. So, hi, who are you? I saw you came in with the speaker. He says, oh, I represent the taxis. I’m like, oh, well, you must be going to a lot of events tonight, you have all these checks. He says, no, these are all for you. What do you mean, they’re all for me? Each taxi in New York City was its own little corporation, for liability purposes. And each company could only donate a certain amount of money. But he represented thousands of individual taxis. And so he was doling out thousands of dollars, the maximum he could give me, from a huge number of companies. So I had this impression in my mind of the taxi people, because of their politics, and because when they didn’t like things, they had, like, literally shut down the city by, you know, blocking up the bridges. How could you launch a business that would violate all of the Taxi and Limousine Commission rules and take away, you know, the livelihood of the taxi drivers? I’m like, regulators are going to shut this thing down. And regulators did want to shut this thing down, and not let them pick up at airports, and, you know, seized vehicles.
But you saw what happened, right? People wanted their Uber. Right? And you were like a Luddite if you were not allowing people to have their ride share. Like, what are you talking about? This is so much better. I can call it with an app. Right? And I think regulators are looking at this, and they’re saying, this seems to be a pretty big thing. We need to figure out how this works, and understand how data protection law is going to apply. So I think they’re under a lot of pressure. I mean, again, if they’ve got the guts, they’ll say no, but a lot of them are under a lot of pressure to figure out how to apply data protection law. And it’s going to be pretty tricky. You know, we’ve been chatting around the office: is there personal information in a model? Right? Clearly, generative AI uses public data on the internet, and other data sources, and clearly there’s lots of personal information in there, just like search engines spider all this stuff. But then once you build the model, you’ve turned these words into scores that simply give you some statistical probability of what the other words in a sentence are. So, Jules Polonetsky is a globally unique name, for better or worse. It’s probably not in the database as Jules Polonetsky. It’s in the database as: the word Jules shows up, the word Polonetsky shows up, the word privacy shows up, other terms. And when you say, give me a bio of Jules Polonetsky, it writes a pretty darn good bio, except it changes my schools, my college and law school, even though that information is on Wikipedia. It’s on the FPF website. It’s on many, many sites where my bio has appeared. It’s right there. So how could it be so dumb? On the other hand, how amazing that it writes a pretty darn good bio, predicting what words are going to appear. And yes, Jules usually appears next to Polonetsky. But let’s say you’re Josh Smith, or, you know, John Green, or, oh, I don’t know, every other name that probably has more than one individual.
And it just predicts that these names belong together with these other pieces of information. I guess I’d consider that algorithm to be de facto personal information. But it’s a little bit harder than just sort of the obvious answer, and we’re still sort of chewing through it. And I was just reading OpenAI’s letter responding to the privacy consultant and advocate Alexander Hanff, in the Nordics, who made a deletion request. I just tested it; he’s been deleted, you can’t search for his name anymore. He said, delete me. But in their response to him, you know, they walk through whether they believe they have personal information, and whether they have it incidentally, and what their legal basis is. So it’s going to be an interesting time for all of us who have been applying data protection law to public datasets, to big data, to sort of look again. You get no excuse by creating technology that doesn’t fit in the law; it’s your job to fit in the law. GDPR, and sale opt-outs, and all the rights that we have in a number of states are here, and it’s your job. You can’t just say, well, you know, it doesn’t work; we created something that can’t comply. But I think there are going to be some hard analyses and hard questions, and probably a lot of technical work needed to get this stuff right for global data protection laws.
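Jules’s point above, that a language model stores statistical co-occurrence of words rather than records about individuals, can be sketched with a toy bigram model. This is an illustration only, not how any production system actually works; the names and corpus below are invented placeholder text:

```python
from collections import Counter, defaultdict

# Invented toy corpus; every sentence here is placeholder text.
corpus = (
    "jules polonetsky leads a privacy forum . "
    "jules polonetsky writes about privacy law . "
    "privacy law evolves quickly ."
).split()

# Count bigram frequencies: how often word B follows word A.
counts = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    counts[a][b] += 1

def next_word_probs(word):
    """Probability of each word that follows `word` in the corpus."""
    total = sum(counts[word].values())
    return {w: c / total for w, c in counts[word].items()}

# "jules" is always followed by "polonetsky" in this corpus, so the
# model "predicts" the pairing without storing a profile of anyone.
print(next_word_probs("jules"))    # {'polonetsky': 1.0}
print(next_word_probs("privacy"))  # 'forum' 1/3 of the time, 'law' 2/3
```

Nothing in `counts` is a record about a person; it is only co-occurrence statistics. Yet querying it reproduces the name pairing, which is why whether such a structure constitutes personal information is the hard question being chewed on.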

Justin Daniels 31:41

So, Jules, I know I’m kind of hijacking this episode, so maybe I’m going to defer. Jodi, would you like to say something?

Jodi Daniels 31:49

So, okay, the hot topic of every webinar and every panel for the next couple of months is 100% LLMs?

Jules Polonetsky 31:51

In my experience, no matter where you start, we end up talking about ChatGPT and AI. It’s unavoidable.

Justin Daniels 32:02

So here’s another question I have for you, Jules, related to what you’re talking about. You’re saying you think we’re at a point where maybe AI regulation motivates regulators to act more proactively? So let me ask you this question. Another area where we really have a divide, or an intersection, with data privacy is public safety. In my experience, and I’ve worked on multiple digital camera projects in smart cities, and platforms, in my view, you’re gonna have cameras in schools all over the country in a very short period of time. And it won’t be long before they add image capture, meaning some form of facial recognition or something, so that if someone bad comes onto the premises, or has the outline of a weapon, something will go out quickly. And so my question to you is: that’s already starting to happen, and we’re not really having this debate. And public safety is a very big concern that I suspect is going to overwhelm privacy. So if we can’t even have a discussion and debate about that, what makes you think AI will get treated differently?

Jules Polonetsky 33:16

Look, I’m both pessimistic and optimistic, having, like you guys, been doing this for a long time. On one hand, I’ve seen the evolution from, in the US at least, privacy being viewed as a consumer protection thing and nothing more than that. So as long as I’m giving you the right disclosures, and I’m maybe giving you an opt-out choice, I’m not being fraudulent, I’m not being deceptive. Move on, right? And I think, to the credit of the civil rights movement, the reactions to, you know, police killings, the abuses that we’ve seen when facial recognition has been used to, you know, arrest innocent people, I think we’ve moved. And I think it’s not just the Europeans who now talk about autonomy; they may not use the words civil rights, they might talk about it as human rights. And it’s not about privacy anymore. If I was starting the organization from the beginning, I don’t know that I would say Future of Privacy Forum, although perhaps privacy and data protection are eating up a lot of other issues. Right? But GDPR is supposed to protect the rights and freedoms of the individual under the charter. And I think we recognize, when it comes to issues like, you know, Dobbs and abortion data, everybody in the privacy world sort of jumped in. And is it exactly a privacy right? I don’t know if it is or isn’t; clearly it’s about, you know, individual autonomy, right, and, you know, allowing people to make decisions about their bodies, whether we call it privacy or autonomy or civil rights or women’s rights, whatever you want to call it. So I think there’s a generational change where policymakers, when they wake up, don’t think privacy law anymore. It’s too important. I’ve worked on a couple of presidential campaigns.
And, you know, the tech policy group writes up all kinds of documents, but they’re meaningless until the senior campaign staff say, this is going up on the website, or the candidate is going to say something about this in a speech, or this is in the briefing book and he may be asked a question — and all of a sudden, what you did, you know, matters. Otherwise, it’s a bunch of prepared materials. So every year, in both the Biden and Obama campaigns, you know, I was part of various teams who were scribbling away: here’s what the rules ought to be. And then come the debates, and somebody in our group would raise a hand and say, hey, somebody from our group should be part of the debate prep, right? And the much more expert campaign people would say to us, stop — there is not going to be a debate question about privacy. It’s going to be about terrorism, abortion, the economy, inflation — you know, the issues people vote on. And we’d say, no, no, no, but it’s really important. But guess what: privacy is now in the State of the Union speech, right? It may not always be called privacy — maybe it’s big tech is too powerful, maybe it’s we need to protect, you know, reproductive rights. So the reason I think we’re going to do this — it’s got nothing to do with “AI,” nothing to do with privacy. These are now actual rights and freedoms that the media, that individuals, that people of every age care about. And it happens that data protection is a pretty good tool for some of it — maybe not all — but it’s a pretty good framework to start from, to understand the structure. So that’s why I’m optimistic. Now, is Congress going to be smart enough to get it right? You know, the administration drafted a pretty decent bill in the Obama days, with smart people like Weitzner and Cam Kerry, and they spoke to everybody. And we all pounded on it and tweaked it.
And they had a pretty decent bill, which, if we went back to it today, we might say, hey, that’s a pretty good basis, right? Because that’s what you can do on the executive side: you can have experts, you can spend a lot of time. Congress people run around with their hair on fire — they’re raising money, they’re campaigning, they’re criticizing each other, they’re yelling on Twitter, you’re woke, you’re right-wing — they’re trying to figure out whether the country will go into debt because of, you know, resistance to the debt ceiling, and what about China and Russia. Right? Like, I get that you’re a busy person in Congress. When I was a staffer, I had hundreds of issues on my plate, and, you know, Schumer wanted to be the lead on every one of them. So we didn’t have a lot of time to become the leading experts on the substance of an issue. And here we’re talking about regulating much of the American economy, right? GDPR took seven years — seven years. And frankly, most of it was built on data protection law that already existed under the Directive. Each country, right, didn’t invent this stuff; 80% of it was already, in theory, law in large part in different countries. And it still took them seven years, with mandarins, experts, and all these different drafters. So — again, the ADPPA needs a lot of work, but I’m amazed. There are smart staffers; some of the members are smart. Schumer is enormously smart — he knew more than I did about any of the issues I ever worked on, while he was doing 100 other things and I had the luxury of paying attention to a few issues and reading everything. And he would, in two words, you know, rip our pieces apart, because it turns out we actually hadn’t gotten to the core of the issue. So I do wish the Biden administration, which is now finally staffing up with some of the people who I think could be quite useful — I don’t know yet how they insert themselves. Congress is doing its thing.
And you know, you don’t get to show up and say, let’s start from scratch. That may or may not be good politics. But Deirdre Mulligan, who just came on as one of the deputy CTOs, is a brilliant, brilliant person — understands industry, understands data, is a deep, deep academic, started out in civil society. I’m hoping that she’ll play an important role; how that ends up becoming part of a drafting process, I don’t know. In some ways, I wish we could go back to scratch. I had somebody the other day say to me, why don’t you guys just take GDPR, mark it up, make some improvements? I mean, it’s sort of forbidden in Europe to talk about amending GDPR. But the Commission has a review process — they have to look at it every couple of years to figure out whether it’s, you know, suitable. And all their AI regulation, in effect, is an update to the GDPR, even though it all says, oh, consistent with GDPR — in many places they’re proposing things that create great tensions with GDPR. So I think it’s possible, but I think it really requires some of the real experts, who I do think are now at the administration, to get in there and help. And maybe this process, with the Senate moving on AI, and maybe some new eyes on the ADPPA, will be able to unstick it.

Jodi Daniels 40:14

Well, we might have to have a part two, because there is so much to unpack here — and to see what even happens with Congress, and with countries around the world. There are so many other topics we didn’t get to, and sadly we don’t have all day to record, because we’re both busy helping organizations deal with all of these different topics. So Jules, with everything that you know — and if we stick with the theme of AI — we always like to ask people what personal privacy tip they might offer their friends. So if you think about all your friends, they’re probably trying to use all kinds of cool AI to look up their profiles, build their bios, all kinds of other interesting things,

Justin Daniels 40:57

talk to their wives, so that they don’t have to be told — whatever she said,

Jodi Daniels 41:03

Jules, with everything that you know, how might you suggest individuals use these different tools, from a privacy perspective?

Jules Polonetsky 41:12

I think there is very little one-size-fits-all advice. You know, you need to take advice based on where you are in the world, and I mean that in a nuanced way, right? What works for me — as a senior person in my career, as a white, straight male, as the boss of my organization — is that I can have things out there. I have a public presence. For me, it’s not about being private; it’s about shaping and structuring and advocating and promoting. But that’s not going to work for somebody who maybe is more junior, who has to worry about the next employer, who has to worry about an ex-spouse, who has to worry about being discriminated against, about divulging some information that might be revealing in a way they don’t want, and so on and so forth. So here’s my advice: don’t take anybody’s general advice, because it really matters who you are and where you are in your career. It might be perfectly fine for you to be out there, shaping your identity instead of letting others shape it. Other people may have a very different risk profile, and yeah, maybe you really do want to lock down this and lock down that. Because, you know, in privacy we ask: what’s the benefit? What’s the risk? How do I assess these things? In our personal lives as well, we need to assess what makes sense for me. And this can change, right — where you are at a particular time, what country you’re in — these factors shape how you’re going to be perceived. So privacy is all about context, as the famous privacy philosopher Helen Nissenbaum says. During a time of COVID, or risk, or emergency, our notion of privacy might be different; I might have obligations. We didn’t really have time to get into the security-privacy tradeoff. Different countries might have a different profile. It might be all well and good for one country to say, hey, we want a lot of transparency, we don’t want any surveillance.
Another country may say, hey, we’re under threat, right? We had bombings; we have to protect our citizens. But you want to make sure there’s some democratic oversight that ensures that’s not an excuse for over-surveillance, right? We need to adapt — whether it’s peacetime for a country or for an individual — the right protections for different times, different threats, different contexts.

Jodi Daniels 43:45

I think that makes a lot of sense. Just like companies need to right-size, individuals need to right-size.

Justin Daniels 43:53

So when you’re not out there discussing all these great, groundbreaking privacy topics, what do you enjoy doing for fun?

Jules Polonetsky 44:06

During COVID, I started studying wine. I didn’t know much about wine; it wasn’t part of my life. When I grew up, it was sort of a thing you had, you know, Friday night as part of bringing in the Sabbath, but it was sort of sweet wine. And I said, you know, I want to learn more, I want to understand, and I’ve been taking wine certifications and training. And it has helped me understand European data protection law in a very significant way. Do you realize that we Americans destroyed all the European vines, because of pests from America that came across — but then we rescued it all, because North American rootstock was then used, so all of the European grapes actually have American roots with European vines grafted on top? So there’s a little bit of a lesson there: we caused the problem, and we helped solve the problem. But number two, the best regions in Europe have very strict rules. In France, there are regions where irrigation — whether you can add water — is regulated very specifically, yes or no. If you buy a wine from the Bordeaux region, you don’t get to call it Cabernet or Merlot; you call it by the name of the appropriate village, and it can only be labeled with that village name if it qualifies. So those of us who show up and try to understand cultures around the world — hey, why are you regulating this, why are you regulating that, and who is it really helping? Is it helping consumers, or is it protectionist? — should understand that there is a long legal history, whether it’s the regulations around wine or regulations in other areas, and other cultural differences, that really shapes how policies and issues are dealt with. You know, we get shocked when we learn that in some countries your income is published, because that’s what an equitable society does.
Or people who’ve underpaid their taxes are published in a register — a court just struck that down, deciding that it was, you know, not proportionate. So wine has been a recent hobby, but I think there are some lessons there for the rest of our lives as well, from that sort of education.

Jodi Daniels 46:18

That is fascinating. I did not know that, and I’m excited to learn more. So Jules, thank you so much for sharing this and all the insights that you did today. If people would like to learn more, where should we send them?

Jules Polonetsky 46:29

Or of course, find me on LinkedIn, Twitter, or Mastodon.

Jodi Daniels 46:36

Jules, thank you again, we really appreciate it.

Jules Polonetsky 46:40

Great to be with you.

Outro 46:45

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.