
Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:36

Hi, I’m Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I will don my hat and lead the legal cyber data breach response brigade.

Jodi Daniels  1:03

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com.

You ready for some fun?

Justin Daniels  1:42

Yes, now that we have our new backdrop.

Jodi Daniels  1:44

I know those of you listening can’t tell, but you will have to go check out our video on YouTube. Because forever I’ve had a really blank blue wall behind me. For years, I had nothing to put there. And now I have a beautiful banner. I’m so excited about my banner. I touch it all day because it makes me happy.

Justin Daniels  2:07

Breaks up the monotony of your blue background.

Jodi Daniels  2:10

So you don’t like my blue background, is that what you’re trying to tell me? You ignored it for years, you never highlighted it, and now you are pointing out it was terrible.

Justin Daniels  2:19

For our listeners, I tried in Utah to say, why don’t we buy this really cool picture? She wouldn’t have any of it.

Jodi Daniels  2:26

So I’ve tried. Actually, for everyone listening, if you do ever want to buy a picture, we did learn the hard way, because I bought something I liked, but you can’t have glass behind you. It has to be some type of painting or picture that doesn’t have the glass reflection. Otherwise, you have no privacy, which wouldn’t really work for a privacy person all day long. But it could be entertaining. Instead of being entertaining, though, we’re going to actually focus on our guest today, because we have Bill Piwonka, who oversees all marketing functions for Exterro. His background is firmly rooted in B2B marketing operations. During the past 30 years, Bill has led marketing teams and initiatives spanning strategy, product marketing, product management, demand generation, marketing communications, and business development. And today, we’re really excited that you’re here to talk to us about Exterro and data retention, and what you’re seeing in the marketplace. So Bill, welcome to the show.

Bill Piwonka  3:23

Thank you very much. I’m really happy to be here.

Jodi Daniels  3:26

Wonderful. All right. Now it’s your turn.

Justin Daniels  3:29

I get to speak now?

Jodi Daniels  3:30

Yes, you get to speak now.

Justin Daniels  3:31

All right, I’m speaking now. So, tell us about your career journey that led you to Exterro today.

Bill Piwonka  3:41

You know, it’s interesting, because I didn’t come to Exterro with a legal background or a privacy background. For people who don’t know, Exterro is a software vendor focusing on data risk management. The company was founded with the notion and vision that if we could combine the disciplines of data science and process orchestration, we could help our customers respond to requests for data more effectively, efficiently, and inexpensively. We initially focused on e-discovery. For anybody not familiar with that, it’s the law in the US where, if you have civil litigation, you have to share whatever is potentially relevant to the dispute with the other side. And you can imagine, in a very large organization, how difficult that could be, spanning different countries and hundreds or thousands of different employees. But from the very beginning, our founder had this vision of looking at how in-house legal departments manage their operations. If you think about other departments in the organization, there are always software platforms: you’ve got Salesforce, you’ve got Oracle, you’ve got SAP, you’ve got all kinds of other technologies that help parts of the organization manage their business. And what he saw was that legal didn’t have that. He also saw that with the proliferation of the amount of digital data, as well as the number of sources of data, the way we communicate, all those sorts of things, there was a convergence that was going to happen between legal, compliance, privacy, and litigation, and so he started building a company focused at the very core on managing data risk. So, getting back to your actual question of how I got here: I actually came from a company that created social login. So, you know, login with Facebook, login with Google, login with whatever. A little company in Portland, Oregon, called Janrain created that. And prior to that, I was working for a company that did employee hotlines, so a little bit of privacy-related stuff there. But my background was much more about coming in and helping technology companies scale and build marketing teams and their organization. And when the CEO of Exterro, Bobby Balachandran, reached out to me, I looked at what they were doing and was convinced that they had the right vision. And more than 10 years ago, I came on board.

Jodi Daniels  6:33

Well, it’s always lovely to hear how people find their way. And it sounds like privacy and risk and data have been a part of your prior companies. And with Exterro, data retention, and trying to help companies find all that data and decide how long they should and shouldn’t keep it, is certainly a really big focus. I’d love to dive a little bit deeper on how Exterro is helping companies solve those challenges today.

Bill Piwonka  7:01

Sure, and again, it all starts with your data, right? If you think about it, you can’t really respond to requests for that data unless you know where it is, who owns it within your organization, what third parties have access to it, what they’re doing with it, and what regulations apply to the retention or disposition of it. And when I talk about a request for data, that could be an e-discovery request, it could be an internal investigation, it could be a data subject access request, it could be a breach response, where we know that data was potentially compromised. Is there any personal data within that corpus? And if so, to whom does it belong? And then, where do they reside in the world? Because there are going to be differing reporting or notification laws depending on the scope and where they are. And all of those things require that basic knowledge of your data, which sounds easy until you really understand just how difficult a task that is. Again, you think about some of the surveys and reports that are out there of corporate data doubling in size every 18 months. And if you think about it from the perspective of a large organization that is multinational and has operations throughout the world, really getting a handle on your data is exceptionally difficult. And so that’s one of the things that we excel in, and it is, I think, the foundation of whatever you’re trying to do from a compliance or privacy perspective.

Jodi Daniels  8:48

Okay, hashtag know your data.

That’s your favorite hashtag, for anyone who’s new and doesn’t know it yet.

Justin Daniels  8:54

I want my Red Clover “know your data” t-shirt.

Jodi Daniels  8:56

Yeah, I know, one day, one day — gonna have it.

Justin Daniels  9:02

Well, I’m interested in a perspective on this, Bill, from both you and Jodi. So in my world, increasingly, clients are asked to provide their security questionnaires. It could be a SOC 2, it could be the architecture of their network. And I find increasingly, I’m writing into contracts things such as: sure, we’ll let you see it, but it’ll be a screenshot or a read-only link that still resides on our network. You can look at it, but we don’t want you to keep it, because I don’t even want to get into worrying about data retention or anything like that. And I’m just curious, Bill, from your perspective, in certain areas where we’re dealing with highly sensitive data, are you starting to see companies push back and say, hey, look, we don’t even want you to have it, because we don’t want to get into these data retention issues? I know from a cybersecurity perspective, there are certain things I just don’t want them to keep. I can’t trust that they’re going to retain it or get rid of it when they say they will.

Bill Piwonka  9:59

Oh, 100%. And, you know, if I move into our e-discovery area, you think about litigation, and there are always going to be times where you need to work with your outside counsel or potentially a legal service provider to help cull through all that information to try to find the relevant or responsive data, review it, and then determine what you’re going to turn over to the other side. Think about the risk when you’re turning over all this sensitive data. With lawsuits, you’ve probably got a lot of your executives’ information, even at just the most basic level: what is their email address, or what kind of IP address do they have for their laptop, that sort of thing. And we’ve seen breaches at law firms in this space, or loss of data in transfer, and it’s a huge security risk. So one of the things that we really encourage organizations to do is keep the data in house; there’s a way to do that, and then put all the appropriate security measures around it. The less that you’re transferring, the lower the risk you’re going to have.

Jodi Daniels  11:23

Well, in keeping with our She Said Privacy/He Said Security theme, you asked your security question, so I’m going to ask my privacy question. Bill, how has the influx of privacy laws impacted data retention programs?

Bill Piwonka  11:39

I know you know this. I mean, the thing is, I just —

Jodi Daniels  11:41

Our listeners want to hear your perspective.

Bill Piwonka  11:45

Well, it’s humorous at one level, because we have had retention policies forever, right? But most of those retention policies were in a three-ring binder and sat on a shelf and got dusty. Nobody wanted to actually push the button and delete. Of course, you want to get rid of data: by having the data, it’s discoverable in litigation, it’s potentially putting you at risk if you have a security incident, and it costs money for storage. There’s no reason to keep the data. That being said, nobody wanted to delete the data. The alternative was, well, storage is getting cheaper and cheaper. And yes, there’s risk that the data might get compromised or might be discoverable, but I’m afraid of making that career-limiting move of pushing the button and getting rid of data that, oops, I shouldn’t have gotten rid of. So nobody really operationalized retention.

Justin Daniels  12:51

With the laws coming in now, it’s part of the laws, right?

Bill Piwonka  12:56

It’s not just that you have to have a retention policy, you actually have to execute on it. And the enforcement agencies are ratcheting up the fines and their actions, particularly when there’s a security incident and it’s determined that there was data that had no business purpose for being retained, and that should have been defensibly disposed of as part of the retention program. So what we’re seeing is GDPR, and all the very similar laws around the world, you know, India, and Canada, and everywhere else: all of those have this component of ensuring that you have an operational retention program, meaning you are actually doing it. And that’s what’s driving the focus on retention, much more, I think, than the risk of the data actually getting breached or something like that. It’s the enforcement agencies. And I think you’re going to see that more and more as enforcement actions continue to increase.

Jodi Daniels  14:11

Well, it does appear that we need some laws to get companies to dust off their paper policies and build a program that will actually work. It is scary, though. I’ve seen some companies that had such a stringent deletion policy, they actually deleted their data inventory and had to start all over. So it is important to make sure that the retention program actually works on the data that needs to be deleted.

Justin Daniels  14:38

Interesting.

Jodi Daniels  14:40

True story. Like, I’m not making it up.

Justin Daniels  14:47

I’m just curious, Bill, kind of building on what Jodi and you were talking about with the influx of privacy laws impacting data retention programs. When I think about your e-discovery business, and I’m not a litigator, it really comes up for me when we have to go through an inbox to find keywords to try to assess what notifications we’re going to have to make under various breach notification laws. And I think what’s probably going to happen in that part of your business is that artificial intelligence comes onto the scene to turbocharge how you conduct e-discovery. But at the same time, it comes onto the scene when we haven’t even grappled with the implications of social media, and there really aren’t any laws outside of FTC Section 5. We’d love to get your thoughts around that, and how you as an organization, and maybe you personally, are thinking about how this will work in light of what we’ve seen before.

Bill Piwonka  15:44

In terms of how AI is going to affect both e-discovery as well as the broader space?

Justin Daniels  15:49

Yeah, and the fact that you don’t have these regulations yet. So, to the point you and Jodi were talking about, you need these laws to get people to do retention programs. Well, now here comes AI that people are onboarding, and I’d like your view on how that will shape your industry, or maybe things in general. And here we go again, with technology far outpacing our ability to understand it and potentially put guardrails around it.

Bill Piwonka  16:14

So, I mean, the genie’s out of the bottle. AI is going to get implemented not just in legal technology, but everywhere. I think the notion of responsible AI is not mandated right now, but over time we’ll see more and more laws and regulations applying to it. I think companies have to start with that. It’s sort of like privacy by design: you’ve got to start with responsible AI by design. Things to be thinking about: how do you train the models? The first question any of our customers ask is, is my data training somebody else’s model? And that’s clearly something that nobody wants. It’s one thing to train the model on your own internal data, but if you have the risk that it’s going to be exposed somewhere else, or that somebody else’s data is impacting it, that’s clearly an issue. The other thing, which I think the developers and engineers understand but we’re not there yet, is how do you ensure that you remove all bias? And how do you get rid of, you know, the hallucinations and that sort of thing? So we are very much at the early stages. But AI is absolutely going to revolutionize and make things better, as we implement it in a responsible way. As an example, if you’re a data breach attorney, what’s one of the most difficult things to do? GDPR says you have 72 hours to determine what reporting or notification notices you have to send out, right? Well, how do you do that if you’ve got terabytes of data, and maybe there are structured databases with account numbers, or healthcare numbers, or something? Who do they belong to? And where do those people reside? That’s a perfect opportunity for AI to come in and determine, yes, I’m finding personally sensitive data, and now I’m able to pull those entities out and recognize them. And now I can correlate those numbers with the actual consumer or employee whose data it actually belongs to, and I understand what my reporting or notification obligations are. And I can do so in a much more timely manner than the brute force of hiring hundreds and hundreds of reviewers to go through it and try to figure that out. It’s a great use of AI. But you also have to make sure that you’re doing it in a responsible way.
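
To make the entity-extraction-and-correlation step Bill describes concrete, here is a minimal, hypothetical Python sketch, not Exterro’s implementation: it scans a blob of compromised text for two assumed identifier patterns, maps hits against a made-up customer index, and groups affected people by jurisdiction so a notification window can be scoped. Every pattern, identifier format, name, and deadline below is an illustrative assumption.

```python
# Hypothetical sketch: regex-based PII detection over breached text, then
# correlation against an assumed customer index to scope notifications.
import re

# Illustrative identifier patterns (formats are assumptions, not standards).
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "account_number": re.compile(r"\bACCT-\d{8}\b"),
}

# Made-up lookup: identifier -> (person, jurisdiction of residence).
CUSTOMER_INDEX = {
    "jodi@example.com": ("Jodi D.", "US-GA"),
    "ACCT-00112233": ("Bill P.", "EU-DE"),
}

# Illustrative notification windows in hours; real rules are far more nuanced.
NOTIFY_WINDOW_HOURS = {"EU-DE": 72, "US-GA": None}

def scope_notifications(compromised_text: str) -> dict:
    """Group affected people by jurisdiction with a rough deadline."""
    affected = {}
    for _label, pattern in PII_PATTERNS.items():
        for match in pattern.findall(compromised_text):
            hit = CUSTOMER_INDEX.get(match)
            if hit:
                name, jurisdiction = hit
                affected.setdefault(jurisdiction, set()).add(name)
    return {
        j: {"people": sorted(names), "window_hours": NOTIFY_WINDOW_HOURS.get(j)}
        for j, names in affected.items()
    }

if __name__ == "__main__":
    sample = "Ticket notes: contact jodi@example.com about ACCT-00112233."
    print(scope_notifications(sample))
```

A real system would rely on trained entity recognition and a proper identity graph rather than regexes and a dictionary, but the shape of the problem, detect entities, resolve them to people, resolve people to jurisdictions and deadlines, is the same.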

Justin Daniels  18:58

I guess I would say two things to that. One is, as you know, with the SEC cyber rules, you’ve got to figure out if something’s material without undue delay. GDPR is 72 hours, and you’ve got other regulations on the federal level with financial services. I can tell you from experience, you don’t know squat in 72 hours. You’re just trying to figure out what’s going on. Most companies have no breach response team; it’s, hey, Justin, we heard you know how to do this, and I’ve got to spend the first 24 hours just putting together a team. And then the other thing that concerns me is, companies are going to start putting AI into their tools. And if you’re the company buying those tools, before you start using them or relying on the information they’re helping you produce, remember that AI hallucinates. There needs to be a vetting process. If I were putting on my legal hat for Bill’s firm, I would be very reluctant to use artificial intelligence to start giving information to a customer until I really had a good handle on how it worked, and until I had put solid parameters around what is an acceptable level of mistake, because like humans, machines are going to make mistakes. So those are the things I would be doing that might delay me from quickly wanting to do this. Because if we had a problem, and it was found to be attributable to AI at this stage in the game, I think that could be a reputational extinction-level event for a company that has spent years building up its reputation in e-discovery, or pick your space.

Bill Piwonka  20:39

I 100% agree. But I would also say that that’s probably no different from any technology you’re going to use, right? Regardless of whether it’s in legal or anywhere else, you want to make sure that whatever you’re using is fit for purpose. And so, yeah, you should do proof of concepts, you should do testing, you should run it through its paces to ensure that the results that are coming out are what you expected and can be verified. So then, when the actual incident occurs and you are under the gun and trying to comply with the various regulations, you have that confidence.

Jodi Daniels  21:23

I think we all agree on that. And for companies who are now trying to figure out how to get started on a data retention program, maybe all they have is a really dusty policy, or maybe they don’t even have a dusty policy. We often come across companies where, when we do data inventories and ask all the business owners about data retention, we get back: I don’t know, not sure, or indefinite. Which means there’s certainly a lot of work to be done in this area. What would you suggest as maybe just three starting points for those companies?

Bill Piwonka  21:58

Besides calling you?

Jodi Daniels  22:01

Well, you’re right. And sure, they can call you, they can call any of us. But the idea is: I have all these challenges, and we know we have to do something. Where, in your eyes, Bill, do they start? What makes it stick? What makes a data retention program successful?

Bill Piwonka  22:25

It may be a little bit of a cliché, but it goes back to people, process, technology. If they have the dusty old retention policies, dust them off and look at them: what are they, and are they still relevant to the business of today? Because depending on when they were written, they may be completely out of date. And you’ve got to have the right group of people to start that program, and that probably does include your outside counsel and potentially advisors like yourselves, because if you don’t know what you’re doing, you don’t want the blind leading the blind, right? But it starts with know your data. What data do you have? Where is it? Who owns it? What third parties have access to it? And what’s the context of it? Because it may be that you’ve got sensitive data that is used in one particular business context, and that triggers a retention obligation in one jurisdiction. But that same PII could trigger a different retention obligation because of the way it’s used somewhere else. So you have to be able to understand: how am I using this data? What’s the business purpose? And then how does that relate to the various retention or disposition regulations around the globe, wherever you operate? Because it doesn’t really matter what the retention obligations in India are if you don’t have operations there.
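
As a rough illustration of the inventory-and-context questions Bill lists, here is a minimal, hypothetical Python sketch, not an Exterro feature: a structured record for each data asset, plus a retention lookup keyed by data category, business context, and jurisdiction. Every field name, category, and retention period shown is a made-up example.

```python
# Hypothetical "know your data" record and a context-aware retention lookup.
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str                      # e.g., "CRM contact table"
    location: str                  # system or region where it lives
    owner: str                     # accountable business owner
    third_parties: list[str] = field(default_factory=list)
    category: str = "personal"     # e.g., personal, financial, health
    context: str = "marketing"     # business purpose it is used for
    jurisdiction: str = "US"       # where subjects / operations are

# Illustrative retention matrix: (category, context, jurisdiction) -> months.
RETENTION_MONTHS = {
    ("personal", "marketing", "US"): 24,
    ("personal", "payroll", "EU"): 120,
}

def retention_for(asset: DataAsset):
    """Months to retain, or None if no rule is recorded for this combination."""
    return RETENTION_MONTHS.get((asset.category, asset.context, asset.jurisdiction))

crm = DataAsset(
    name="CRM contact table",
    location="us-east SaaS CRM",
    owner="VP Marketing",
    third_parties=["email marketing vendor"],
)
print(retention_for(crm))  # 24
```

In practice the record would live in an inventory tool and the retention matrix would come from counsel, but the lookup shape is the point: the same category of data can map to different obligations depending on its business context and jurisdiction.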

Jodi Daniels  24:03

Justin, then you get the first phone call. Then they decide on the software and the consulting piece. So see, you do get calls.

Justin Daniels  24:13

You told me I don’t get called.

Jodi Daniels  24:15

I know, I take it back.

Justin Daniels  24:16

Play back the transcripts?

Jodi Daniels  24:19

Well, you know, what I’ve seen.

Justin Daniels  24:24

What? I may fall off my chair.

Jodi Daniels  24:26

You can go back to the transcript.

Justin Daniels  24:31

All right. Well, Bill, as we sit here in February of 2024, what are your thoughts around the big privacy challenges that you think companies are facing at this moment in time?

Bill Piwonka  24:46

If I take a US-centric approach to it, I think that what we’re seeing is we’re still at the very early stages. It’s interesting: when we go to the IAPP shows, or the privacy, security, and risk conferences, in the conversations we have with people who stop by our booth, there are a lot of people that are really new to the privacy area, and they’re still struggling with what they need to do. And I think when we had the CCPA as kind of a driving force, you had a lot of organizations put in almost checkbox-type technology and programs, so they could say, oh yeah, I do have a consent program, or I do have a data rights management program. But they often were not very robust, and they may not have really addressed the bigger challenges; it was sort of like, hey, if the regulators come in, look, I can check the box and say I’ve done all these things. Look, I have a privacy policy. And I think what we’re getting to now is a much more sophisticated recognition that the check-the-box approach just isn’t appropriate anymore. And we’re going to see more and more legislation in the US. Certainly, the rest of the world is leading by example; I don’t think we can say that we’re leading the world in terms of privacy regulations. But I do think that organizations are taking privacy more seriously now. There’s still a tremendous amount of education and maturation that needs to happen, though.

Justin Daniels  26:54

Is that consistent with what you’re seeing, Justin? I would say so. I guess where I’m struggling a little bit is that we really haven’t grappled with what the attention economy means, with all the data that’s created in all these places. We have no federal law, as we’ve talked about many times on the show. Now, AI is going to iterate faster than we probably ever thought possible, in ways we don’t anticipate. And you’re going to overlay that on a privacy and security regulatory scheme that is patchwork and behind. And that’s the part that has my attention, and probably not in a good —

Bill Piwonka  27:45

And I think that just underscores how early we are in the game.

Jodi Daniels  27:50

I think so, and extremely early for sure. And the challenges that companies have are getting more complex: the AI pieces that you’ve described, the patchwork approach that’s here and will continue in the US, the different kinds of data, the struggle between what the business wants versus the litigation risk in certain parts of the world versus the privacy laws, and trying to blend all of that together. That is a complex challenge.

Bill Piwonka  28:19

I also think once California starts enforcing employee DSARs, we’re going to see the plaintiffs’ bar weaponize the DSAR process. I mean, if you think about consumer rights requests, you think about the process. Okay, we have to have a way to take the request in, and that’s not that difficult. We have to be able to verify they are who they say they are; again, not that difficult. And then how many places does an organization store consumer data? Probably not very many. There are a couple of databases, the point of sale, and those sorts of things. So responding to a DSAR for a consumer isn’t a really difficult challenge to overcome. If you start thinking, though, about where the personal data of your employees resides, and how difficult it would be if somebody says, hey, I want all the emails between Bill and Jodi. And because this is Jodi requesting, you have to redact out Bill, and then anybody else if they were on a thread. Or how many team sites, how many Slack channels, how many documents? The challenge of addressing that is going to be very, very difficult. And it gets that much harder if the request is to delete it, because now we get back to retention. Is any of that information on a legal hold? Or is any of that information under a retention obligation from a law, whether in the States or somewhere else? And can you harmonize all of that and do it within the timeframes? Because if the plaintiffs’ bar has weaponized the DSAR, they’re going to hold you to your timeframes. So how do you get your arms around all that?
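
Here is a minimal, hypothetical sketch of the harmonization check Bill describes: before honoring a deletion request, a record is screened against legal holds and any still-running retention obligation. The fields and dates are illustrative assumptions, not how any particular product or law defines them.

```python
# Hypothetical deletion-eligibility check: holds and retention obligations
# are evaluated before a deletion request is honored.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Record:
    record_id: str
    on_legal_hold: bool
    retain_until: Optional[date]   # latest date any retention rule requires

def deletion_decision(rec: Record, today: date) -> str:
    if rec.on_legal_hold:
        return "defer: legal hold"
    if rec.retain_until is not None and rec.retain_until > today:
        return f"defer: retention obligation until {rec.retain_until.isoformat()}"
    return "eligible for deletion"

print(deletion_decision(
    Record("email-123", on_legal_hold=False, retain_until=date(2026, 1, 1)),
    today=date(2024, 2, 15),
))
```

The point of the sketch is the ordering: holds and retention obligations are checked before any deletion is queued, which is exactly the harmonization problem raised above.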

Jodi Daniels  30:24

Some very good questions that I do not have the perfect answer to at this moment. But it will be very interesting to see the enforcement actions come and what those precedent cases will look like. Now, with everything that you know so far, we always like to ask everyone: what is your best privacy or security tip that you might share with your friends at a party?

Bill Piwonka  30:45

Yeah, so, you know, we don’t get into lots of deep privacy conversations where I can start talking about understanding your data, so I’m going to say it goes back to passwords. It was crazy when I was at Janrain, which was, again, the social login company: we’d look at breaches and look at what the passwords were, and it’s ridiculous how many of them are still 1234, or password, or whatever. And, you know, I’m helping my father, who’s 87. He has a PhD from MIT, and he can’t remember any passwords, so his defaults are, well, not good, right? So, biggest tip: figure out a sentence or something, and then modify it based on the site, so that it’s long and strong and uncrackable, but easy for you to remember. Don’t take shortcuts with your password. That’s what I would tell my friends.

Jodi Daniels  31:59

It is still shocking that people use 1234 and password. I mean, you read about it, but to hear you say that, Bill, it’s just very fascinating to me.

Justin Daniels  32:07

It’s one of the best lines from Spaceballs. That one’s for you, Bill.

Jodi Daniels  32:12

What, movie quoting? That’s for you; it’s not my specialty. I’m really good when there’s a phrase and I can break out into song, because I recognize it from a song.

Justin Daniels  32:21

Well, maybe I’ll have ChatGPT do a Red Clover song.

Jodi Daniels  32:25

No, no, no, that’s not it. In other words, someone says something and I recognize that phrase from a song and then I just keep singing the song.

Justin Daniels  32:33

Well, Bill, since we’re being a little frivolous, when you are not practicing privacy, what do you like to do for fun and frivolity?

Bill Piwonka  32:45

I’m big on activity. So usually my early morning is about a five-mile walk with my dog. I love to play golf, I love to hike, I love to do those sorts of things. And then in the evening, it’s cooking. I’m a bit of a foodie. I’m big on craft-made cocktails; I make my own bitters, I make my own syrups. My wife and I love to try new recipes and spend time in the kitchen. And maybe going back to the movie quoting, I’m of the generation now where I’ll be sitting in a staff meeting and I’ll pull out a Spaceballs quote or something else, and I’ll just be met with blank stares. There’s been more than one occasion where I’ve said, okay, when the meeting is over, we’re all staying in here, I’m pulling up the YouTube, because that was a really funny quote, and I used it purposely and perfectly, and you are all going to see just how witty and clever I am. And then I’m still usually left with blank stares.

Justin Daniels  33:49

But you know, Bill, it’s funny you say that. Because when I have said in the past, when it comes to data collection, just say no, nobody gets it. Nobody understands you’re trying to quote Nancy Reagan and her drug campaign in the 80s. And I’m like, it’s perfect, and she’s like, it’s in your own head.

Bill Piwonka  34:07

Mine is the Seinfeld one: you know how to take the reservation, you just don’t know how to hold the reservation. And people don’t get it.

Jodi Daniels  34:18

Well, we appreciate you sharing. I might have to grab some recipes from you on a favorite dish that you’ve tried recently. If folks would like to connect with you, learn more about Exterro, and maybe connect with you personally, where’s a great place for them to go?

Bill Piwonka  34:33

You can reach me at billpiwonka@exterro.com, or you can easily find me on LinkedIn. And if you want to learn more about Exterro and our privacy solutions, or our broader data risk management solutions, come to our website at exterro.com.

Jodi Daniels  34:48

Wonderful. Well, Bill, thank you so much for sharing all of your fabulous data retention information and cooking info with us. We really appreciate it.

Bill Piwonka  34:59

Very welcome. Thank you for having me.

Outro  35:07

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.