
Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:22

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:36

Hello, Justin Daniels here. I am a Partner at the law firm Baker Donelson, practicing as a corporate M&A and tech transactions lawyer. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels 1:02

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, how are you this morning?

Justin Daniels 1:41

I’m doing well. I know we ran into each other at coffee this morning and you couldn’t shoo me away fast enough.

Jodi Daniels 1:48

It’s true. Your shoes did not get a good fashion grade.

Justin Daniels 1:57

Well, luckily for me, I was meeting with a good friend who wasn’t too interested in what I looked like and more interested in what I had to say.

Jodi Daniels 2:03

I suppose so. But when I highlighted to the friends I was meeting for coffee how wrong your shoe choice was for your outfit that day, they commented, “Is that what it’s always like? Does he always have that answer?” And you just did the flip of my hair. For anyone not watching this but listening: in our intro he’s flipping my hair. Yes, this is actually the banter that we have all the time, even at the coffee shop.

Justin Daniels 2:30

Or on the public speaking dais. So let’s turn…

Jodi Daniels 2:33

Let’s begin. So today, this is really fun. We have Nia Castelly, who is an expert in digital privacy, IP, and product law, and who holds a law degree from Columbia Law School. Nia is the Co-founder and Legal Lead for Checks, a Google-owned company. She spent nearly five years as a legal adviser for the Google Play Developer Console, Policy, and Operations teams, and she is a Certified Information Privacy Professional with her European designation. Thank you. Massive brain freeze at the moment. But Nia, we are so excited to have you here today. So welcome to our show.

Nia Castelly 3:05

Thank you. Great to be here.

Justin Daniels 3:18

That’s it, yes. See, that’s where the bad shoes had to bail you out today.

Jodi Daniels 3:25

Did you bail me out today? The designation? Oh, see, I already forgot. No, we just carry on as usual. Can we talk about Nia? We came here to hear about all the cool things that she is doing.

Justin Daniels 3:37

Okay, so how did you evolve to your current role?

Nia Castelly 3:43

Well, it’s been a bit of a long road. I’ve been practicing for over 20 years, and I’ve been at Google for just over nine years. Prior to joining Checks, which I’ll talk about in a bit, I was, as you mentioned, product counsel for the Play Store. There I counseled a number of teams, including the Play Console you just mentioned, which is really the developer side of Play; this is where developers come to publish their apps and make sure they’re readily available on all of your devices. I also supported our operations and policy teams, which help ensure that the store stays safe and developers have a great experience producing apps. While working on that for five years, one of the things I had the privilege of working on (and “privilege,” for those in the privacy space, is surely an understatement) was the GDPR efforts. During that time we heard from a lot of developers: “Wow, this is hard.” We were like, “Yes, yes, it is.” “Can you help?” Back then there wasn’t a lot we could do, because we were all trying to figure it out together, how best to comply with it. GDPR, I think everyone can agree, was a seismic shift in how companies, and people at large, thought about privacy, especially in the mobile app space. Fast forward about three years: my co-founder, Fergus Hurley, and I had worked together on Play, and we had both seen from the product side that developers will use the tools, if they’re available, to do great things. We saw the need to help in this space, and that’s where the idea for Checks came in. Google has an in-house incubator where Googlers with ideas can go build them, and we were fortunate to be accepted to that program. Over the last three years we built Checks, which (I guess you’re going to ask me that next) is Google’s AI-powered compliance platform for mobile apps. What we effectively do is help companies quickly discover, communicate, and fix compliance issues, because we know that all companies, particularly mobile app providers, struggle with privacy compliance. It’s complicated, right? It certainly can be. So we’re really on a mission to take the complexity out of compliance. I’ll talk more about that later, but that’s how I got where I am.

Jodi Daniels 6:29

I’m excited to hear more. I have like 400 different questions, but don’t worry, I won’t ask them all. So let’s segue a little bit and talk about cultural differences and how they may impact how a company approaches privacy in different countries.

Nia Castelly 6:46

Yeah, I think of it as two types of culture, right? There’s the culture of the company, or even of the product, at the product level for large companies. And then there’s the external culture, where you’re located. Internally, I think culture starts at the top, particularly in the privacy space. The leaders, whether product leaders or leaders of the company, set the tone for what type of company or product you want to be. And you also need to be globally aware, which goes to the external cultural differences around privacy. We know, and we’ve heard, that those in the US have a different approach to, or thoughts about, privacy than perhaps people in the EU, or any of the European countries, where they are much more sensitive to this. So really, both internal and external cultural differences make a difference, because even if you’re not in those countries, if you’re providing a global product, you need to be sensitive to that and make sure differing views around the globe are incorporated into what you’re doing.

Jodi Daniels 7:47

That makes a lot of sense. Thank you for sharing.

Justin Daniels 7:58

So you talked a little bit about how mobile app developers have challenges with privacy compliance. Are you starting to see privacy regulations making developers at innovative startups actually shift left to more of a privacy by design approach?

Nia Castelly 8:15

Absolutely. We’re definitely talking about privacy more. As I mentioned, when GDPR came into effect, that really was a seismic shift in the conversations and thinking around it. It made many companies, both small and large, wake up to the fact that they need to start thinking about this more carefully and, as you mentioned, think about how to apply privacy by design. But even so, it’s still tough. Like I mentioned, it’s complex, and compliance may not feel like a priority when you’re trying to get out there and have a great product launch, or when you’re launching a company and need to make money. So we really do need to shift the narrative toward privacy being a differentiator. I think we’re starting to see that, because consumers are starting to think about it more and talk about it more. I’m sure you’ve both seen that in all of the conversations around AI, a big part is AI safety and AI governance. Consumers are saying, “We don’t know what’s happening, but it sounds scary, because it’s Terminator, so someone needs to be governing it.” So of course it makes sense that privacy, too, comes into this conversation. With the constant dialogue around that, I think companies are really understanding that this is something they should incorporate early, sooner rather than later.

Jodi Daniels 9:59

Can you explain a little bit more about some of the challenges? You had shared that developers were saying this is really hard. I’m curious, which pieces would they often say were really hard? And then how does Checks help them solve for that?

Nia Castelly 10:18

So I think the first complexity for non-lawyers is: what am I supposed to do? What does that regulation mean, and how does it apply to me? That’s always the first issue. But specifically in the mobile app space, though it’s true for all sorts of business-related endeavors, you want to get out there, and you want to get out there fast. Building quickly, innovating quickly, launching new features quickly is a really big paradigm of that space, and compliance, by definition, is simply going to take a little bit longer. The second piece is the technical one, just the way mobile apps are developed. Often they use things called SDKs, software development kits, which are just third-party code. When you’re building an app, or any type of software, if you have to write each line yourself, you’re not going to go quickly, and there are a lot of repetitive functions, whether login screens, tracking analytics, or accepting money. That sort of code is used in lots of apps, so there is off-the-shelf code, or software, that developers can incorporate into their apps. But the downside is that it’s third-party code, and if you don’t take the time to vet it and understand how it’s actually working, it can have unintended consequences in your app, including data collection and data sharing. And when we think about privacy requirements, what you’re doing with data is really the biggest one. So what Checks does is, we have what we call a triangle. We look at what you’re required to do, which can be based on global regulations or even platform policies, another big set of requirements mobile app developers have to comply with; what you say you’re doing, which is in your disclosures, usually your privacy policy, what you’re telling your users you’re going to do with their data; and what you’re actually doing, which is the hard part, because, as I said, you have different code, it’s being updated, maybe you’re using third-party code, and you don’t really know what’s happening. When any of these three things are out of sync, those are the insights we provide. We let you know: you say you’re not collecting location data, but in testing your app, we see location data being sent off your device to a third-party endpoint. A lot of those are often unintended consequences; you didn’t realize that’s what the code was actually doing, and you can go make adjustments. So it protects privacy in that way.
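To make the “triangle” concrete, here is a minimal sketch of the comparison idea: flag data types observed in an app’s traffic that the privacy policy never discloses. All names and data are hypothetical, invented for the example; this is an illustration of the concept, not Checks’ actual implementation.

```python
# Hypothetical sketch of the "triangle" comparison: required disclosures vs.
# declared disclosures (privacy policy) vs. observed behavior (runtime traffic).
# All names and data below are illustrative assumptions.

OBSERVED_TRAFFIC = [  # e.g., from automated testing of the app
    {"data_type": "location", "destination": "analytics.example-sdk.com"},
    {"data_type": "device_id", "destination": "ads.example-sdk.com"},
]

DECLARED_IN_POLICY = {"device_id", "crash_logs"}             # parsed from the privacy policy
REQUIRES_DISCLOSURE = {"location", "device_id", "contacts"}  # e.g., regulation/platform policy

def find_mismatches(observed, declared, must_disclose):
    """Flag data types seen leaving the app that the policy never discloses."""
    issues = []
    for event in observed:
        dtype = event["data_type"]
        if dtype in must_disclose and dtype not in declared:
            issues.append(
                f"Undisclosed collection: '{dtype}' sent to {event['destination']}"
            )
    return issues

for issue in find_mismatches(OBSERVED_TRAFFIC, DECLARED_IN_POLICY, REQUIRES_DISCLOSURE):
    print(issue)
# -> Undisclosed collection: 'location' sent to analytics.example-sdk.com
```

In this toy run, location data appears in observed traffic but not in the policy’s declarations, so it is flagged, matching the out-of-sync example Nia gives.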

Jodi Daniels 13:23

Where in the lifecycle do you find people using the product?

Nia Castelly 13:29

So many different places. Ideally, it should be prior to launch. Like I said, we just came out of the incubator, and we’re still in beta. Where we started is helping companies look at their publicly available apps, apps that are already launched, because there’s a high priority on ensuring those are correct. But we subsequently rolled out pre-publish features, so before you actually get out there, you can make sure your app is doing what you intended it to do. And that’s really where we’re seeing most of our customers want it to happen: after they’ve built the app, or during the testing phase, they can run the checks to make sure it’s operating as intended.

Jodi Daniels 14:20

Very cool. I can think of lots of companies that need to be doing this very thing.

Justin Daniels 14:26

So you talked a little bit about the app having artificial intelligence capabilities. I’m just curious, as you were creating your product, how were you thinking about AI governance and using AI within a tool, making sure that the tool itself acts consistently with your company’s mission, which in turn is to help other companies with compliance?

Nia Castelly 14:52

Yeah, as I mentioned, we leverage AI in our product in two ways. One, we leverage it to parse privacy policies at scale. These are the things that no one wants to read; as a user, you probably don’t read the privacy policies for the apps and products you use, or maybe you do, but many people don’t. So we use artificial intelligence, large language models, to parse these, identify where certain disclosures are happening and what is being said, and do that at scale. That goes to your previous question of how we make this less complex: by providing that information to customers quickly, in real time, as it changes. The other way we use AI is in testing the app. I mentioned that one of the nodes we look at is what your app is actually doing, so we use AI to navigate the app in an automated fashion and test what’s actually happening when a user might be using it. In terms of where we’re going with AI compliance, or AI governance, we believe the framework we’ve started with, the three prongs I mentioned, makes a lot of sense for AI governance too. Similarly to an app, you can have an AI model that you didn’t develop yourself. There are lots of models available out there that you can incorporate into the technology you’re creating, but they’re created by a third party. Do you really know how each model was built, and whether it complies with what you as a company want in terms of your own AI governance priorities? So you want that same sort of testing, to ensure that how you’re implementing AI technology in your products, and what the output looks like, what you’re actually conveying to your users, is going the way you expect it to. That’s where we hope the next part of our product will be going.
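As a rough illustration of the first use of AI described here, parsing privacy policies at scale, the sketch below prompts a language model to extract which data types a policy says the app collects. The prompt wording, the category list, and the call_llm placeholder are assumptions for illustration, not Checks’ code; any real LLM client could be swapped into call_llm.

```python
# Illustrative sketch: use an LLM to extract declared data types from a
# privacy policy. Prompt, categories, and call_llm are hypothetical.
import json

DISCLOSURE_CATEGORIES = ["location", "device_id", "contacts", "crash_logs"]

def build_prompt(policy_text: str) -> str:
    """Ask the model to map policy language onto a fixed set of data types."""
    return (
        "From the privacy policy below, list which of these data types the "
        f"policy says the app collects: {DISCLOSURE_CATEGORIES}. "
        "Answer with only a JSON array of strings.\n\n" + policy_text
    )

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM client call (provider-specific, not shown)."""
    raise NotImplementedError

def declared_data_types(policy_text: str) -> set[str]:
    """Return the set of recognized data types the policy declares."""
    raw = call_llm(build_prompt(policy_text))
    # Keep only known categories so a noisy model answer can't inject new ones.
    return {d for d in json.loads(raw) if d in DISCLOSURE_CATEGORIES}
```

The output of a step like this could feed the declared set in the earlier triangle comparison, keeping the "what you say you're doing" node up to date as policies change.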

Jodi Daniels 17:08

Overall, how do you think privacy regulations are being impacted by concerns around AI?

Nia Castelly 17:19

From a privacy standpoint, a lot of our existing regulations are already relevant and helpful in terms of how they apply to AI technology. If you’re putting data into the models you’re building, where did you get it from? Do you have the proper consents? Are you disclosing the purposes? All of those things are very relevant and can be applied by regulators today. But as new regulations come out, whether in the US as many of the states come on board with their own versions of privacy regulation, or as amendments happen to the larger ones, like in California or Virginia, I think there will be more specific references to AI technology, and specifically to how the technology uses personal data, in an effort to avoid ambiguity and address unique use cases of AI. But as I mentioned, I think a lot of existing laws, and a lot of the ways companies already do business and develop products, will hopefully go toward ensuring they’re developing AI-fueled products in the right way.

Jodi Daniels 18:35

That makes a lot of sense. And by the way, in the midst of our recording, we have Delaware, which signed its law yesterday. So we are officially at 12 states in the United States that have a privacy law.

Justin Daniels 18:50

Delaware, when does that go into effect?

Jodi Daniels 18:53

Well, I’m going to claim that I’m hungry and my brain can’t remember, because there are now 12 states and I can’t keep up with which one is which.

Nia Castelly 19:00

And please don’t ask me because I have a cheat sheet.

Jodi Daniels 19:04

I have a cheat sheet too. There’s a lot now. I was really good on my base five, but we got seven new ones, and we’re all digesting all seven. So if you come back to me, we will have that answer very shortly.

Justin Daniels 19:18

Right? We’ll be held accountable on the next show.

Jodi Daniels 19:20

Somewhere between 2024 and 2025. But there’s only one that’s 2026, and that’s far away.

Justin Daniels 19:30

So, Nia, from your experience as a seasoned privacy practitioner, and just as a consumer, do you have a best privacy tip you would like to share with our audience?

Nia Castelly 19:40

Well, I actually have two: one for users and one for companies. One of the things I say to my friends and my family is, you care the most about your privacy. Companies will tell you what they’re doing; hopefully the good actors will give you all of this information about what they’re going to do with your data, or they’ll give you controls so you can opt out or make informed choices. But it’s up to you to do that and be an informed user of products. So that’s the tip I always give to my friends. I think it’s probably annoying, because it’s what I do, but I still think it’s important that they read the information and make informed decisions, because if you choose not to use a company’s app or product based on what you’ve read, and lots of people do that, it could make a difference in how that company operates. And on the flip side, my privacy tip for companies is: start now how you want to finish. Whether at the company level or the new-product level, what feels like a delay now in shipping something quickly, taking the time to implement privacy by design, will be infinitely less costly than having to implement it down the road once your product or company is highly successful. So that’s what I tell them: start how you want to finish.

Jodi Daniels 21:11

And when you are not advising and thinking and reading all things privacy, what do you like to do for fun?

Nia Castelly 21:18

Well, I have three kids, so any fun time is completely devoted to them and whatever they want to do at the moment, which is often Disneyland, which is not my favorite place. But once I get there, it’s usually a happy place. So yes, that’s what I usually do for fun.

Jodi Daniels 21:39

I’m a big Disney fan over here, though not Disneyland, Disney World for me, so I could use a little extra Disney fun myself.

Justin Daniels 21:46

Yeah, she’s a fan of Disney’s privacy policy on their cruise ships.

Jodi Daniels 21:51

Strike that from the draft? Yes, of course, I went and edited it. As you would say, I took your advice. That’s what I did.

Justin Daniels 21:57

And of course, like you said, Nia, she’s annoying.

Jodi Daniels 22:02

Okay, now Nia, if people would like to learn more, and especially to check out Checks, where should they go?

Nia Castelly 22:14

Oh, it’s very simple. checks.google.com.

Jodi Daniels 22:21

Wonderful. And we thank you so much for sharing all this information with us today. Everyone building a mobile app, make sure you check it out.

Nia Castelly 22:31

Thanks for having me.

Jodi Daniels 22:32

Thank you.

Outro 22:38

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.