
Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:21

Hi, Jodi Daniels here. I'm the founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:35

Hello, I am Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels 0:58

And this episode is brought to you by... you have the same problem, you have to say, ding... Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. I am feeling very Monday morning, end of school, middle of May, tired. I think you're being too hard on yourself. Wow, I'm feeling okay. It might be an afternoon cup of coffee kind of day, which I never, ever do. All right. Well, if that's what you feel you need. I know, but we're very excited. You've been talking about this episode guest for, like, a billion years.

Justin Daniels 2:01

Well, today we're going to have a little discussion about AI tools, particularly for legal professionals. And so I'm really appreciative that we have Anita Gorney today, who is the head of privacy and AI legal at Harvey AI. So Anita, say hello.

Anita Gorney 2:18

Hi. Thank you so much for having me.

Justin Daniels 2:23

So, just before we get started: Anita happens to be the head of privacy and AI legal at Harvey AI, and she's based in New York. Harvey is an AI tool for legal professionals and professional service providers. Before Harvey, she was privacy counsel at Stripe. She studied law in Sydney, Australia (we'll have to talk about that) and began her career there before moving to London and then New York. Welcome to the show.

Anita Gorney 2:47

Thank you so much for having me. I’m very excited to be here.

Jodi Daniels 2:51

So Anita, we always like to understand a little bit more than our brief intro of your career journey.

Anita Gorney 2:58

Sure. So as Justin mentioned, my career started in Sydney, Australia. That's where I grew up, and that's where I went to law school, and my first job out of law school, probably like so many other privacy professionals you have on this podcast, was not in privacy. I was a litigator, and I really enjoyed it, but it was always my dream to move to London and work in the UK as a lawyer, and it's actually quite a common path for Australian lawyers to take, to work for a couple of years in the UK. So I got my visa together, I packed my bags, I promised my parents that I would move back to Australia within a couple of years, and I headed to London, where I got a job at a big UK law firm called Kennedys. I was still working in litigation, but it was around 2013 or 2014, and the GDPR was being drafted and was in the news a lot, and I was just really drawn to it. I found it to be a really interesting area of law, and something that was so relatable, right? I am a consumer. I use the internet, and I was becoming increasingly aware of how companies were collecting my personal information. So I did whatever I could to get some privacy experience: I attended seminars, I eventually did my IAPP exams, and I did whatever I could to do some incident response work and some GDPR work. And then in 2018 we decided that we were going to move to New York. It was actually my husband's career that instigated the move, and I was very fortunate that Kennedys had an office in New York, and they offered to transfer me here. The first 10 months or so of being in the United States were very difficult, because I was working full time, I was still remotely working for some of the partners in London, I was working for some of the partners in New York, and I also had to sit the New York bar exam so I could officially practice here. So it was a very busy time.
But once I got all of that out of the way and I was a fully licensed New York attorney, I thought it was a really good opportunity to pivot my career and become that full-time privacy attorney that I wanted to be. So I got a job at Stripe, where the chief privacy officer hired me to work on her team, and my role was really focused on the commercial side of privacy: negotiating DPAs with our customers and doing all the third-party risk management work. But the role also involved incident response work and product counseling, and really the whole gamut of privacy. And then last year, probably exactly a year ago, a really good friend of mine started talking to me about this new AI startup called Harvey, which was in the legal tech space. She thought that I would be really interested in it, and she encouraged me to look at their website, and lo and behold, they were looking for somebody to head up the privacy function; the general counsel was looking for somebody on his team to take on all the privacy work. And not long after, he offered me the job. I've been at Harvey for about a year, and I absolutely love it. I get to work with a great group of people, and we're building something that's really interesting and that I can personally use in my day-to-day work. So yeah, that's how I got to where I am.

Jodi Daniels 6:59

So let's learn a little bit more about what Harvey AI is all about and the problem it's trying to solve.

Anita Gorney 7:04

Yeah, sure. So Harvey is an AI app, or a platform, and it was built specifically for lawyers and legal professionals; some other professional services, like tax professionals, use Harvey too. The problem that we're trying to solve is inefficiency: we're trying to reduce the amount of time a lawyer has to spend on that very manual, repetitive work that most of the time they can't bill for if they work in private practice, and really free up their time for the higher-value, more strategic legal work, which is probably the reason why they became a lawyer in the first place. So within Harvey, within the platform, there are a number of tools that help lawyers do this. The first tool is called Assistant. Assistant is a Q&A tool that's great for brainstorming and summarization, where you can ask it a question and it'll give you a response. So, for example, you might say to Harvey, the company received a deletion request; do we have an obligation to delete the individual's data? And then Harvey will give you a very structured answer back whose foundation is legal analysis, so it will explain to you, generally, these are the concepts within the law, these are the things you should consider in deciding whether or not you have to delete the data, and these are some of the ways that you should think about doing it. In Assistant, you can also upload a document, which helps ground the output in that specific document. So you could upload the article of the GDPR that talks about data subject rights, and then the output you'll get from Harvey will be quite different. It'll be grounded in the GDPR, so it will reference the specific article in the GDPR that talks about deletion. It will say, these are the obligations that you have, these are some of the exceptions.
These are the time frames that you have to respond in, and this is how you should think about it. Another really cool feature of Assistant that I like is that it also suggests follow-up questions. So it'll give you five or six, like, hey, maybe the next question you want to ask is this or that. And so it really does help with the brainstorming and helps you think about the problem in different ways. We also have a tool called Vault, which is, I think, our most popular tool, and it's designed for large-scale document review. So you can upload thousands of documents to Vault and then extract data at scale. One way that I used it recently was when we were doing our sub-processor update. We were updating our sub-processor list to add a new sub-processor, and I uploaded all of our customer agreements and told Harvey, this is our standard language for notifying customers of sub-processor updates; do any of these contracts have language that is non-standard, yes or no, and if yes, what is that non-standard language? If I had done that manually, or somebody on my team had, it could have taken us a couple of days, but with Harvey we were able to do it in a matter of three or four hours, including the verification. So Vault is really popular. It's popular for due diligence exercises, for contract review, for some litigation tasks. And then we have another tool called Workflows, which guide a user through a process to produce a very specific outcome. So if you're doing a due diligence exercise, it will prompt you through the steps of the exercise and get you to the desired output.
So really, Harvey is trying to be that full tech stack solution for legal teams: if you need technology to help you with something as a lawyer, you can just open up Harvey and it'll be there for you.

Jodi Daniels 12:01

Your new little legal assistant.

Justin Daniels 12:07

Yes, I used it the other day to summarize. I had to look at a bunch of covenants in a loan document, so I uploaded it and had Harvey go and summarize all the affirmative and negative covenants. And obviously I read them, but that saved me time. Yeah, so that was my use case.

Anita Gorney 12:20

I love to hear about Harvey in the wild.

Justin Daniels 12:30

I have other use cases. But I think another thing that's interesting, because I go through this all the time when I vet legal tools for my firm as well as for other clients, is how Harvey addresses concerns around security and privacy in diligence with law firms. Like, for example, you upload all those client documents to Vault; how does security work to make sure that you can do that and have confidence that we're upholding our ethical standards as lawyers for confidentiality?

Anita Gorney 12:58

Yeah, I mean, that's a great question. I think really early on in the company's life, the founders and leadership identified that in order for this product to be successful, because of who our customers are, we had to earn their trust, and in order to do that, we really had to put privacy and security at the forefront of everything we do. And I have a story which demonstrates this. I think it was my first or second week on the job, and there was a company all-hands meeting, and the CEO, Winston, who's also a co-founder, was talking about privacy and security. I mean, he must have spent at least 20 minutes talking about trust, talking about customer trust, and framing it in terms of privacy and security. So I think that really demonstrates that it's a top-down approach at Harvey, and it is really central to everything we do. We built out a very strong security framework: we have ISO 27001, we have our SOC 2, data is encrypted in transit and at rest, and we have extremely strict access controls and retention policies and penetration testing. But when you're talking about AI specifically, one of the most important measures that we put in place is that we don't train on customer data and content, and our third-party model providers also don't train on customer data and content. That is something our clients, our customers, really like to hear. And luckily, we have this amazing security team that we work quite closely with to make sure that we are fully transparent with our customers and that they understand this. We have a security portal on our marketing page at harvey.ai where you can go and access all this information, and we're constantly building out documentation to be transparent with our customers.

Jodi Daniels 15:25

I love how you have a public-facing page that talks about that. In a completely different space, I was helping a company vet an AI tool. They don't have any such page, and even in their description, they don't address the common question: are you training or not training on the data? There's just nothing. Nowadays, many companies list all their privacy features, security features, and AI use features. And I brought Justin into the fun of reading terms, and we actually even found conflicting terms, so we were not clear on what this very popular tool out in the wild is doing. So the fact that you're able to very clearly and publicly show, here's what we, Harvey, are doing with your data, you can read it here, is really helpful.

Anita Gorney 16:12

We have a security welcome pack in our security portal that we give to prospective customers and to customers, which is very clear and sets out all of these commitments that we've made and that we have from our third-party model providers, like no training. We also let customers set their own retention periods. And like I was saying, we have very strict access controls, so we have a process for making sure that data is completely eyes-off: our model providers don't review the data that we send to them; they process it just for the purpose of producing the output.

Jodi Daniels 16:55

That's really, really helpful for prospective customers to be able to see.

Justin Daniels 17:01

So Anita, can you be a little bit more specific? You said something earlier that I wanted to talk a little bit more about, which is: how does Harvey operationalize protecting the confidentiality of end users' prompts from being used to train LLMs? Hopefully you can walk us through it a little so that the audience can understand how that process actually works.

Anita Gorney 17:25

Yeah. So like you say, confidentiality is really important, for all the reasons that confidentiality is important, but also because of who our customers are; they have their own very strict regulatory obligations with respect to confidentiality. I think all the things that I mentioned, like the no training on customer data, go to respecting confidentiality. But one of the questions I get asked a lot is, how do you ensure that my data doesn't get mixed in with another customer's data, or that another customer isn't going to see my data? And the process that we have put in place is logical separation in our infrastructure, so that every user is associated with a unique workspace, and workspaces can't communicate with each other; there's no access between the different workspaces. And we do work to validate the separation: the penetration testing that we do specifically focuses on workspace isolation and making sure there is no mixing of customer data.

Jodi Daniels 18:55

Justin, maybe you want to talk a little bit more, because I heard you recently talking about how, when you're writing a prompt and thinking about different tools, people should be mindful of the confidentiality piece. I think so many people are focused on, here's my document, but the prompt, I think, is also —

Justin Daniels 19:12

Yes. And so Anita, I actually have had this conversation now with my publicly traded clients, and I say to them, what security are you putting around the prompts that you do? Because if I were a threat actor thinking about how I might attack a publicly traded company, I might want to hack in and see what they're prompting, to go after the prompts themselves. And so I guess what that goes to is, maybe you can talk a little bit about how, I believe, if the customer asks, you can have it so that those prompts get deleted and are not stored anywhere after they're done. Because obviously, depending on the prompt, even at a law firm, you could learn a lot about what they're working on. Are they working on a Walmart merger? That could move markets. So would you talk a little bit about that as well?

Anita Gorney 20:06

Yeah, sure, and that's what I mentioned earlier: we give our customers the power to set their own retention periods, so customers can choose how long they want us to retain the data. Some customers have selected zero-day retention, no retention at all, so the data is automatically deleted within a certain period after the prompt is run and the output is produced, within that day. We also give customers the ability to go into the app and delete the data themselves. And of course that definitely helps with confidentiality, because if the data is not there, you can't breach it, right? So yeah, that's another one of the tools we offer that customers feel really good about.

Justin Daniels 20:57

I guess, Jodi, from your perspective, and Anita probably shares this, that's another way of practicing data minimization: limiting your footprint of data that could be captured. Maybe it's not personal information, but it's certainly proprietary, and that's a way to operationalize data minimization.

Jodi Daniels 21:15

That's a good point. Now, Anita, earlier you talked about privacy by design, and how it really comes from the top down, with executives including it in large all-hands meetings. For those listening, what might be another suggestion or tip that you would offer people on how to build privacy by design into their AI tools?

Anita Gorney 21:41

Yeah, well, I think the first thing you have to do as a privacy professional, and there's a lot of responsibility on you here, is build trust with the product team, so that they feel comfortable coming to you when they want to do something different or when they want to collect new data. Have regular check-ins with them, weekly check-ins, so that they'll come to you with new ways of thinking about the product. Getting started on your privacy impact assessment early in the process is really useful, because you definitely don't want to be slowing down the product teams. You want to make sure that you're evaluating the product or the new features, and if there are risks, you're coming up with ways to mitigate those risks well before they're even considering putting the product in GA. I think transparency is also really key here. Developing documentation to give to customers, or to put on your website, that makes people completely aware of what data you're collecting and how you're using it addresses concerns specifically from customers. And another privacy-by-design tool that I think is really important, and that we've touched on, is giving users control, letting them set their configurations themselves: whether that's data retention, or who within the customer's workspace can see data, see another colleague's prompts, or share prompts. Giving customers control over their own privacy settings is another way that we think about building privacy by design into our AI tool.

Jodi Daniels 23:37

And I imagine in scenarios like this, the default configuration is really important too. Maybe it's not on by default that all the prompts are shared with everyone in the firm, but that's a choice, if the firm, or whoever is using the tool, chooses to turn it on. So the control matters, but thinking about which way the default goes is equally important.

Anita Gorney 24:02

Yeah, that's a really good point, Jodi. So we recently launched a new feature within the app, and we didn't automatically turn it on for everyone; it's an opt-in feature. We have given our customers all the information we can about what the feature does and how it works, and then the admin has control over whether or not they want to turn it on. And questions come to my team a lot from customers, like, can you explain how this feature works and what data it's collecting? We provide the information that they're looking for, and they can make an informed decision.

Jodi Daniels 24:41

That makes sense. So communication and old-fashioned talking: despite all the AI robots in this discussion, actually talking to people is still important.

Justin Daniels 24:53

So Anita, when you're out in New York, maybe at a cocktail hour or whatnot, and people ask you: what is your best privacy tip that you might have for our audience?

Anita Gorney 25:05

Yeah, so I, you know, had a feeling you might ask me this question from listening to previous podcasts you’ve done.

Jodi Daniels 25:11

Yeah, we ask everyone those questions.

Anita Gorney 25:15

And I actually thought it might be fun to ask Harvey, what is Harvey’s best privacy tip?

Jodi Daniels 25:19

Oh, what does Harvey tell us?

Anita Gorney 25:23

Well, Harvey says one of the most effective privacy tips is to use strong, unique passwords for every online account and to enable two-factor authentication whenever possible. This approach significantly reduces the risk of unauthorized access to your personal information, even if one of the passwords is compromised.

Justin Daniels 25:44

Hey, Anita, I want to know: can you show us a couple of the other suggested follow-up prompts?

Anita Gorney 25:51

On this topic? Hold on a second, I have to go back into it and have a look at what the follow-ups were.

Justin Daniels 25:58

I think it's interesting, because it's a really cool feature of Harvey that it suggests things. So I'm just curious as to what it's suggesting.

Anita Gorney 26:08

Yes, okay, here it is. So the follow-ups are: Why is it important to never reuse passwords across different sites or services? What are some of the common forms of two-factor authentication that can be used? How often should you review your accounts for suspicious activity to ensure your online privacy? And how does using a password manager help in generating strong and complex passwords?

Jodi Daniels 26:39

There we go. Harvey in action.

Anita Gorney 26:43

Yeah, yeah. I mean, those are great. That's a great example of follow-ups, just helping you reframe how you think about something, or giving you a question that you wouldn't have thought to ask. For my personal tip, I thought maybe I'd give you a tip for working in privacy, and maybe this is a cliche tip, but building strong networks is really important. Whether that's ex-colleagues, or people you meet at seminars, or even people that interview you. I'm still in touch with a woman who interviewed me years ago; I never ended up working for that company, but we keep in touch, because benchmarking is a really important tool when you work at an enterprise company, seeing how other people are dealing with new regulations or changes in regulations. And so being able to have conversations with peers is a really good tip for becoming a better privacy lawyer.

Jodi Daniels 27:44

Now, when you are not doing all things privacy, what do you like to do?

Anita Gorney 27:50

For fun? I asked Harvey this question as well. I wanted to see if Harvey knew who I was, and I'm very proud of the response it gave me, because it's very privacy-forward. It said: I don't have access to your personal preferences or past activities, so I can't say what you like to do for fun, but here's what people like to do for fun.

Jodi Daniels 28:15

So what is Anita's answer?

Anita Gorney 28:17

So, as you can probably tell, because I've lived in three different countries, I do like to travel. I like to see new places and new cultures. I also like to go to live music. And something that I do, and maybe this isn't so much for fun as for relaxation, is jigsaw puzzles. Especially if I'm working late into the night and then have to go to sleep, I will do, like, 20 minutes of a jigsaw puzzle, and it really helps me unwind from the day. But also, right now in my life, I would say that I'm very time-poor. I have a young toddler, and I'm working for a very fast-growing AI startup, so some of that stuff does take a back seat right now.

Jodi Daniels 29:13

Understood. Well, our younger daughter likes jigsaw puzzles, but they would definitely not help her go to sleep, because she'll just sit for a really long time to finish one. We have quite a collection of them. Well, Anita, we're so grateful that you joined us today. If people would like to connect and learn more, where could they go?

Anita Gorney 29:29

Yeah, so they could go to harvey.ai, which is our website. We have some great product videos on there, and you can reach out to our sales team there, or you can reach out to me on LinkedIn. I'm Anita Gorney.

Jodi Daniels 29:46

Yeah, wonderful. Well, thank you so much for joining us. We really appreciate it.

Anita Gorney 29:50

Thank you so much for having me.

Outro 29:56

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.