
Intro: 00:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels: 00:21

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and certified information privacy professional providing practical privacy advice to overwhelmed companies.

Justin Daniels: 00:36

Hello, I am Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies on the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk, and when needed, I lead the legal cyber data breach response brigade.

Jodi Daniels: 01:00

And this episode is brought to you by... whoop! That was a really wimpy whoop. Red Clover Advisors. We help companies comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business.

Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, we’re a little sleepy this week because we’re recording after the whole spring forward thing, and we had spring break last week, and so our view is just not pretty. Snow and mountains. How are you doing there, Mr. Colorado?

Justin Daniels: 01:50

I’ve come back to a pile of work that needs to be done. So vacation seems like a long time ago.

Jodi Daniels: 02:01

We’ll just have to have another vacation. Well, I wish everyone listening a wonderful spring break. Hope you had a good one, or hope you have a fun one coming up here.

So today we have Brian Mullin, who is the CEO and co-founder of Karlsgate. He is also the creator of Karlsgate Identity Exchange, a groundbreaking solution for zero-trust remote data matching and integration. Brian has over 30 years of experience in data privacy and security, with leadership roles at companies across the data-driven marketing ecosystem. Well, Brian, welcome to the show.

Brian Mullin: 02:33

Thanks for having me. Good to be here.

Jodi Daniels: 02:35

Absolutely.

Justin Daniels: 02:37

So why don’t you tell us a little bit about your career journey?

Brian Mullin: 02:41

Yeah, I’ve got a lot of experience in the industry because I grew up in a family business. That means even in high school, I was already working with large data sets. So it’s a strange background, but it gave me an early start at looking at data processing, data engineering, and all the issues around data privacy. With that early start, I ended up getting opportunities to, you know, lead software development at Dun & Bradstreet or head up platforms at Acxiom. So I’ve kind of been around the industry at the big data players.

So I’ve got a lot of experience on the data engineering side, making data products and understanding regulations and compliance issues around privacy. That was a big part of my evolution in the space. And when you work in this industry for a long time, you start to realize that working with data in this way means trafficking in a lot of other people’s personal identity. And that’s sort of an ugly truth: data needs to flow to be valuable, so there’s a lot of activity around your personal identity.

So it really came to me that if I’m going to stay in this industry, I need to come up with a way that we can work with sensible insights around consumers and marketing, but do it in a way that doesn’t necessarily have to traffic in people’s personal identity. That was the start of a journey for me around ten years ago, looking at technologies that can really address the problems associated with connecting data without trafficking in your identity. And that’s what led to the creation of Karlsgate and the technology we’re pioneering, which basically allows two different companies that have data about people to understand that they have shared identities, but not actually reveal those identities in doing so.

So that’s sort of what’s brought me all the way to this point.

Jodi Daniels: 04:35

Seems like we could take inspiration from this: in high school, helping in the family business. Maybe I should find something for our kids to be doing here.

Brian Mullin: 04:45

Yeah. Early start.

Jodi Daniels: 04:47

They have to listen to a lot of what we say, but I like that idea. Maybe there’s something I should leverage. Well, let’s talk a little bit more about all these data sets and how companies are protecting them. A lot of times they’re using things like firewalls, and there’s policies, and there’s legal contracts like Justin might be creating.

And still, there’s risks that businesses have to work through. So what are the challenges in these different approaches?

Brian Mullin: 05:19

Yeah, the thing is, you know, you definitely need the legal framework that makes sense. Everyone should have that. But by the same token, when you share data, you give away a copy, right? And you sort of lack complete control at that point. So just relying on a legal contract, like a BAA in healthcare, you’re really delegating responsibility without any meaningful consequence.

So you’re looking for someone to blame if there’s a problem more than you’re preventing problems from happening. And certainly if you look at what’s been happening in healthcare and healthcare breaches, patients are getting hurt because their privacy is being breached, and some vendors can’t even handle the liability of what happens. So it’s just not a great story to say you had a legal framework, but the building burned down. I just feel like more active controls over sharing data can really help, supported by the legal frameworks.

But I think we need more technology in that space as well.

Jodi Daniels: 06:20

Well, Mr. Lawyer, what say you?

Justin Daniels: 06:23

Well, we often categorize data as either public or private, assuming private data is secure. So why does this binary approach, in your view, no longer work? And how does it contribute to today’s privacy and security challenges?

Brian Mullin: 06:36

Yeah, I think that’s true. I think everyone is looking at it like: this is private data and this is public data. And public data can be shared without any concerns because it’s always been public, and private data is protected and not shared. But that’s not actually true, right?

There are plenty of very sensible ways that, say, in healthcare, your provider needs to share your health records with other providers. They need to share that information with your insurer, and so on. So there’s definitely data collaboration that needs to go on. So if it’s private and being shared, it has a dual nature, and that binary framing isn’t really helpful for understanding what’s happening.

So just realize every copy is a loss of control. Private isn’t really private; it’s intended to be private, but after that loss of control there’s a transition to another state. So I think that’s something we need to be very careful about: understanding that there’s some data that’s public and some data that’s private.

But private data is still shared in sensible ways, and we just want to think about what that looks like in both our legal frameworks and our technology frameworks.

Jodi Daniels: 07:46

So let’s talk more about what that technology framework is going to be, because you just shared that there’s a lot of places where data collaboration needs to happen. I need to get data from point A to point B, and how can companies do this? I think you have a couple different ideas.

Brian Mullin: 08:05

Yeah, sure. So the way we frame it is to say, hey, there’s a new category called protected data. It’s sitting in between, right? Overall it’s private data, but there are some reasons you need to share it.

That doesn’t mean it’s a free-for-all at that point. It doesn’t mean open the doors. It means have a controlled and protected relationship with whoever you’re sharing that data with. And in that realm, we have a lot of different options as far as technologies these days. Once you start sharing data, the privacy-enhancing technologies, or PETs, are where people are focusing their energy.

So being able to use some sort of advanced cryptography to transform your data, so when you hand it off to somebody else, they don’t get to see everything; they can only use it for certain types of analytics. In those analytic use cases we see things like fully homomorphic encryption, we see federated learning, or we see differential privacy.

These types of technologies help so that when you share, you can still control what gets revealed. And I think that’s an important aspect: if data originated in this private category and then transitioned to a protected category, what’s the protection, right? My focus in my career has been on the linkage problem, though, which is basically defined by two different data sets, living at two different entities, that need to understand their intersection. So I can’t just hide the analytics; I have to reveal how I know how to match information. If I have a patient that’s covered under an insurance policy, and a provider wants to reach out to find out if I have health coverage, they need to learn, on an individual patient basis, that I have something.

So that’s the problem we’re really focused on at Karlsgate: the linkage problem. It’s basically, how do I create technology that allows a transaction between two entities where I’m not revealing anything new to the other party in finding a match? Picture I had an address book and you had an address book, and we wanted to find out how many people we both know at the same time, right? Either I hand you my book and you see all the names, or you hand me your book and I see all your names, which is not what we intend, right?

So that’s the approach we have, which is called partition knowledge orchestration. It’s basically a PET that allows us to do this transaction where you keep your book and I keep my book, but we create these cryptographic artifacts that are sent to a third party. This blind, neutral facilitator compares what we’ve generated. That neutral party in the middle doesn’t know any of the cryptography we used to generate these fingerprints, but uses them to compare. And if it sees a collision between the two parties, it can communicate that, hey, you have a match there.

I don’t know how I know that; I just know that you have a match, because you generated the same code for each of those individuals. So what we do is allow companies to basically merge or match or intersect two different data sets, but they don’t know how, right? They don’t know the names that were used to actually do that matching. That’s been a big focus of what we’re doing.

And it’s kind of a new frontier space, which the other PETs don’t really go after. But the linkage problem comes up all the time, whether it’s in healthcare or in online advertising. All these ecosystems are built on the transfer of data, but they’re transferring identity in a way that lets other people learn new information that wasn’t part of what the collaboration intended.
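To make the address-book idea concrete, here is a toy Python sketch of this kind of blind matching. It is a simplified illustration of the general pattern, not Karlsgate’s actual partition knowledge orchestration protocol; the shared key, the names, and the use of HMAC-SHA256 as the keyed fingerprint are all illustrative assumptions.

```python
import hmac
import hashlib

def fingerprints(records, shared_key):
    # Each party derives an opaque fingerprint per identifier, using a key
    # the facilitator never sees (HMAC-SHA256 as a keyed hash).
    return {
        hmac.new(shared_key, name.lower().encode(), hashlib.sha256).hexdigest()
        for name in records
    }

def blind_facilitator(fps_a, fps_b):
    # The neutral party only sees opaque codes. A collision means a match,
    # but it cannot recover the underlying identities from the codes.
    return fps_a & fps_b

# Key negotiated between the two data holders only (hypothetical value).
key = b"negotiated-between-the-two-parties-only"

alice_book = {"Dana", "Evan", "Farah"}
bob_book = {"Evan", "Farah", "Gus"}

matches = blind_facilitator(fingerprints(alice_book, key),
                            fingerprints(bob_book, key))
print(len(matches))  # the facilitator reports 2 collisions, never any names
```

Each party keeps its book; only keyed digests travel, and the comparer learns the count and positions of collisions, not who matched.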

Jodi Daniels: 11:45

Right. And we’ve talked a lot on this show before about clean rooms, and you mentioned online advertising. Can you share how what you’ve just described is the same as, or different from, a clean room? Because I think a number of people have started to move toward clean rooms, since online advertising is such an important factor in their company. I think that would be really helpful.

Brian Mullin: 12:10

Yeah, it’s a really important one, too. So a data clean room is a way to accomplish some of this matching in a controlled space, right? Two parties agree on who they trust to do this operation for them. They sort of surrender their identity information to that data clean room. The data clean room does the matching and then reports back.

And I think that’s a much better approach than what we had before, in many ways. But what has always bothered me about it is that now we’re just moving where the trust lives. We’re moving it to a third-party platform and saying, oh, you do it. But you still have a fundamental delegation of responsibility to a third party. Another problem with data clean rooms, for me, is that if I trust a certain data clean room but you don’t, and I still want to do business with you, I have to pick another data clean room that you trust.

So now I have two data clean rooms, right? And then another partner uses somebody else, so I have a third data clean room. Now my data is being propagated to all these other rooms.

So my risk of breach keeps on multiplying. What we do is what we call a protected data pipeline, which says: don’t stop at a room. Draw a network between each of your parties so that there is no third party that takes the data. In our approach, there’s no room, because there’s no intermediary that actually takes custody. You keep custody through the whole transaction, and you signal your matching over network protocols.

So it’s a similar approach overall as far as outcome, but it’s a fundamental difference that we don’t delegate to somebody else that takes possession of identity. We say everyone should keep their data; they shouldn’t trust anyone, basically, because then you avoid all the legal questions of having given custody to somebody else. And then what? What’s the consequence?

How about you just never give it? That’s why we’re really focusing on this area, because we think it’s sustainable. You don’t need any clean rooms. You don’t need four.

You don’t need ten. You just don’t need any, because you just signal in a way that does that. If I’m going to use my browser for a secure transaction over SSL, right, I’m not setting up a place that I need to trust; I just go direct to every website I visit. It’s a similar idea, where you really want a connectivity tool more than you want a place.

Jodi Daniels: 14:42

Thank you. That was really, really helpful and a good comparison.

Justin Daniels: 14:45

So many companies rely on policies and contracts to secure their data, but as we know, sometimes this creates an illusion of security. So what technology solutions do you think exist today that ensure real protection, beyond this amorphous idea of trust?

Brian Mullin: 15:04

Yeah, some of the technologies we just talked about focus on that. I think if your practices start with sending a copy of the data to somebody else, those are the ones you want to look into first and ask what’s actually needed in that transaction. Can we limit the number of individuals I’m transmitting, meaning can we know who’s already matching between us before we start? Another thing is, what attributes are you sharing, and can you cut down on what information you’re revealing?

But basically, whether it’s cryptography or some of these other more advanced PETs, those are the ways people should start looking at how to maintain as much control over protected data as they can through a transaction, instead of saying, here’s all the data, please don’t lose it.

Jodi Daniels: 16:00

So one of the other questions I had as I was listening to this conversation: depending on the partner, they’re trying to match something a little bit different. You might have cookie IDs, you might have the insurer conversation, right? There’s lots of different kinds of matches that someone’s trying to do in this approach. How does that happen?

How do you handle all these different situations with sort of the same single piece of data?

Brian Mullin: 16:30

Yeah, that’s the tricky part of some of these ecosystems: they’re established with a lot of middlemen and a lot of persistent identifiers. It could be your cookie, which is a way of tracking you. In healthcare, it could be your Social Security number. A lot of internet services use email or hashed email to communicate with each other.

But one of the problems with persistent IDs is that they really lead to a situation where every middleman becomes a data aggregator. And I think that’s what we should all be worried about: middlemen collect these IDs as they transmit them, and then they build a data product of their own that uses your identity. So the more places you stop along the way that include an ID someone else could look up, the bigger the exposure. That’s really what we want to minimize.

So again, if we can come up with end-to-end connectivity instead of going through an ecosystem, where we don’t drop an ID or a cookie to somebody else, they don’t need to know the cookie. If you know who I am and I know who I am, we don’t need cookies for that. For example, there are new privacy laws coming out in Australia, and they’re shifting the advertising landscape there. So we’ve been working with a lot of their large TV broadcasters, and what they’re doing now is directly connecting to brands themselves.

What that means is that instead of this whole ecosystem of tracking cookies or emails, brands have customers, and they can say: if that customer is watching a program, they’re a viewer right now, and I want to advertise to them. And they can do that because the broadcaster knows the identity of the person signing into their streaming service, and the brand knows the customer they have on file. So they can go direct.

And there was never an ID in that. This is one of the ways they can reengineer these patterns, where you’re enabling brands to go directly to the activation platform and basically have a direct relationship. But just remember, every time there’s a persistent ID that you’re uploading somewhere else, that’s an aggregation, right? That’s another loss of identity. It’s another collection where you don’t know what happens to it.

Where is it flowing after that? So we like to think that if you can build direct relationships, with technology that makes it easy and automated to connect directly from the people you know to the people they know, instead of a whole layer of “I’m not sure, but maybe the cookies will match,” that’s how we improve this industry.
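A small Python sketch of the persistent-ID point above (my own illustration with made-up keys, not anything from the episode): a plain hashed email is the same value wherever it travels, so any middleman who sees it twice can link the records, whereas a digest keyed per partnership cannot be correlated across relationships.

```python
import hashlib
import hmac

email = "pat@example.com"  # hypothetical identifier

# A plain hashed email is the SAME value everywhere it is shared, so any
# middleman that sees it in two data flows can link the records: it acts
# as a persistent ID even though the raw email is "hidden."
plain = hashlib.sha256(email.encode()).hexdigest()

# A digest keyed per partnership differs between relationships, so codes
# collected from one data flow are useless for linking against another.
code_for_partner_a = hmac.new(b"key-shared-with-partner-A",
                              email.encode(), hashlib.sha256).hexdigest()
code_for_partner_b = hmac.new(b"key-shared-with-partner-B",
                              email.encode(), hashlib.sha256).hexdigest()

print(code_for_partner_a == code_for_partner_b)  # False: no cross-partner linkage
```

The design choice is the same one Brian describes: make the identifier meaningful only inside one direct relationship, so an intermediary cannot aggregate it.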

Jodi Daniels: 19:02

Fascinating. I really like the Australian example, so thanks for bringing that one up.

Justin Daniels: 19:09

So with data moving through these complicated relationships, you know, cookies becoming IDs, passing through multiple partners in that kind of complicated ecosystem, how can organizations ensure privacy and security when data flows across third, fourth, fifth parties, changing so many hands?

Brian Mullin: 19:32

Yeah, I think that’s the part of the industry we need to be most concerned about, because I don’t think there’s any way to limit or control that, or even get visibility into it. So if I go to a brand website and I get a consent message that pops up, and I agree, what am I agreeing to when they don’t control the data anymore, because it went somewhere else? Even if I trust that brand completely and they have the best intentions, once they’ve lost control of that data because they uploaded it to an advertising platform, it’s not like they can enforce that consent anymore. They don’t have physical control over the data; they don’t know what’s happening to it.

So again, I think if we can keep moving toward privacy-enhancing technologies that allow direct connectivity, which says, if I know somebody and you know somebody, let’s just establish that fact and not let intermediaries collect information along the way. Online advertising is one of the places this matters most; people are pretty aware that their online activity is tracked by a lot of big companies. That’s where I think we should have what we call authentic audiences: I go to a website, I know I’m using it, I’ve consented to it, I like this brand, I trust this brand.

That’s an authentic audience, right? Someone’s interacting with them in an authentic way. But when the data starts flowing through the ecosystem, there are all these players you’ve never even heard of, and you’re like, I’m not even sure what this is about.

When we talk about consumer privacy, it’s a matter of understanding what that should really mean. Sometimes people want this notion of perfect privacy, but if you really break it down, there are some things they expect to happen. Like we talked about with health records: you expect your health insurer to pay the bills, right? But if you never tell them about the claim, they couldn’t.

So you definitely imply that there’s some sharing going on. I think having more controlled flows, and having this definition of protected data, is the way to go, where it’s not exactly private anymore, but it shouldn’t be a free-for-all either. So how do we come up with mechanisms to track that?

Jodi Daniels: 21:49

So Brian, with all this discussion, privacy pros listening might say, okay, I get this, this might make some sense, but trying to convince other people to change what they’re already doing might be kind of tricky. What might you tell them? To say, no, this is going to be really beneficial for the company, this is going to help create good, trustworthy discussions for both businesses and our end users?

Brian Mullin: 22:15

Ah, it’s a great question. It’s the question, right? So, you know, I’ve been at this particular mission for ten years now, and it is very hard to get people to change their practices. It’s not that they don’t want to do something better; it’s that they’re trying to figure out what that better should be for them.

I do think that’s something we’ve learned is a big obstacle. What we’ve been doing is trying to make sure that our product doesn’t just help privacy; it makes the workflow easier than your current tools. Because if I can make your workflow easier, more basic, more streamlined, more efficient, you’re likely to want to use it anyway, and the privacy comes along with it. So I think all of these technologies have to make sure they’re not building a complicated mousetrap.

You know, one of the technologies that has a lot of promise is federated learning, but that’s not a very straightforward project to implement right now. What does that mean for you? Can I install something today and use it today? No.

What does my partner have to do? Well, they have to figure out the process with you, too. So I think having these complicated journeys with these types of technologies is a non-starter; IT people can’t switch to that. You need to give them something simple: install a new box, push a button.

And I get what I was going to do anyway. I think that’s really the future. We really have to make privacy as easy as not having privacy. And I go back to the browser example: if you go to a website with HTTPS, right, you have a secure transaction.

It’s sort of a zero-effort way to at least have encryption on the internet. That’s the level of effort I think we need to go toward: okay, add a letter to my URL and I have better security. That’s it. So I really think we have to focus on that, because extra burden is just not what anyone is looking for.

So we’re really focused on making the efficiency so powerful that that’s the message, and of course you also modernize your security and privacy practices along the way. It’s been a big focus to make sure people have a good incentive to upgrade to this new approach.

Jodi Daniels: 24:39

I really like what you said about how you’re trying to bring it into the business, and not just for privacy. I encourage everyone, anytime they’re trying to help implement some privacy obligation, requirement, compliance measure, or idea, to ask what’s in it for the business. How can it be better for them? And oh, by the way, you also happen to have complied with a good privacy measure.

Brian Mullin: 25:00

Yeah, absolutely. That’s a big one.

Justin Daniels: 25:04

So can you share with us a favorite personal privacy or security tip you might have for our audience?

Brian Mullin: 25:11

Sure. I mean, what I’ve seen, in both my personal and professional use, is that some of these ways to enhance your security can be a pain in the butt. The technologies that protect you better are the ones that are automated up front. So I would encourage people to set up their privacy practices up front and do a little work, whether it’s things like password managers or key management, or whatever it is in their business that’s a hassle around security. Put in the extra effort one time, because it’s easy to use once it’s all automated and set up in your systems. But man, if there’s extra work to be done, we all skip it.

So really, take a look at what you want to accomplish and say, hey, it’s worth a little effort this one time, because if you don’t automate it, no one’s going to use it. So that’s mine: it’s better to have something you’ll actually use than to know what’s out there and avoid it because “I’m not going to do that today.”

Jodi Daniels: 26:13

100%. Now, when you are not building a company and working in privacy and security, what do you like to do for fun?

Brian Mullin: 26:21

Ooh, for fun. So I guess for outdoor activities, I’ve been really loving disc golf lately. That’s been a lot of fun. As far as inside fun, we play a lot of board games, and we’ve had this board game club that’s been meeting for 30 years straight, every week for 30 years.

So that’s been a lot of fun to keep up with.

Jodi Daniels: 26:44

Wow, how fun. I haven’t heard of disc golf since I bought you some, Mr. Justin. They’re collecting dust now.

Justin Daniels: 26:51

Indeed, it was a pandemic activity.

Jodi Daniels: 26:54

Well, Brian, if people would like to connect and learn more, where can they go?

Brian Mullin: 26:59

Really, just Karlsgate.com, to learn about what we’re doing. And you can download and tinker with the software yourself. This isn’t a “call us” or wait-for-some-big-demonstration situation. You can play with it, tinker with it; in 15 minutes you could be using it.

So we encourage people to try it out, learn it, understand what it is, because I think good technology is that way: you can grab it and tinker with it, and it doesn’t take some big project. That’s what we focus on. And yeah, we encourage people to just try it out and see it for themselves.

Jodi Daniels: 27:27

Amazing. Well, Brian, thank you so much for joining us today. We really appreciate it.

Brian Mullin: 27:30

Thank you. Thanks for having me. Great talking to you.

Outro: 27:36

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.