
Intro 0:01

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:22

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women’s owned privacy consultancy and privacy consultant, and a certified information privacy professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:36

Hello, Justin Daniels here I am an equity partner at the law firm Baker Donelson, I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback helping clients design and implement cyber plans as well as help them manage and recover from data breaches.

Jodi Daniels 0:58

This episode is brought to you by Red Clover Advisors. You really had coffee today, the non-coffee drinker at the coffee shop?

Justin Daniels 0:58

I actually had tea at the coffee shop, because my guest was like, “Yeah, I’m good with water.” I’m like, well, one of us needs to buy something if we’re sitting here.

Jodi Daniels 1:15

Well back to our regular scheduled program. So Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our new best selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com.

Justin Daniels 1:50

All right, well, how’s your, what is today? Tuesday?

Jodi Daniels 1:52

All day.

Justin Daniels 1:54

I see you’re in very much Red Clover colors.

Jodi Daniels 1:57

I am. I felt like it was a Red Clover kind of day. Okay, matching you and your red checkered shirt.

Justin Daniels 2:05

Yep, just kind of like that. But anyway, well, let’s introduce our guest; I will do that. So, Andrew Hopkins believes data security has to start from the data and work outwards, not start from the perimeter and work inwards, and that each record can be individually secured. Andrew leads a startup that tackles data security by eliminating the ROI data criminals earn by stealing data. Andrew, welcome. How’s your day going?

Andrew Hopkins 2:37

It’s going great, thank you. I’m also on coffee, so this could be fun.

Jodi Daniels 2:42

We are recording in the morning, so we all appear to be very happy and chipper today. We’re on it. Well, Andrew, we always like to start with how you got to where you are. So if you can share a bit of your career evolution, that would be great.

Andrew Hopkins 2:55

Yeah, how I got where I am is really by luck and by chance over the past 20 years, and you can see the gray hair. That’s not my entire career; I’ve done a couple of stints at Accenture, with various different things in the middle. Had some fun at Accenture, but decided I needed a change a couple of years ago. And out of the blue, I had a call from a guy I last worked with more than 20 years back. The short version of the story is he asked me to look at what he had built. In my entire career, I have never seen anything quite so disruptive. And so I didn’t plan to join a startup, but here I am. And we are building this platform out and bringing it to market, hopefully as we speak. And it’s too much fun. I couldn’t say no to doing it.

Jodi Daniels 3:44

That’s always a good thing, to have too much fun. Do I ever have too much fun? You have too much, that is true. I can imagine. And you already got started earlier in our prior discussion.

Justin Daniels 3:57

I know, I have to ask every show. So Andrew, can you talk a little bit about what PrivacyChain is and what its mission is?

Andrew Hopkins 4:08

So PrivacyChain is a distributed data management system, which means that we can store, secure and manage data, any kind of data, on any device, anywhere. And the big difference is that we do everything at the individual record level. So think of a record as a contract, a document, a video, an employee record, a piece of source code; we really don’t care what it is. But we will store each of those records as a separate, self-contained, self-aware object that knows everything about itself, and that can be seamlessly delivered from one system to another. From a security standpoint, each record is individually and uniquely encrypted with its own key. And we also can set access, privacy and usage controls at the individual record level. And so the implication, just from a data security perspective, is that we continue to protect data after exfiltration. How we do that is, if somebody steals 20 million records, they have to run 20 million separate decryptions, each of which yields a single record. So in your introduction, you talked about impacting an attacker’s ROI: it’s not feasible for somebody to steal 20 million records and then individually decrypt each of them separately, when each decryption delivers a single record. In the short term, talking about mission, this is a data security play. We’re hugely focused on it, we think it’s really, really important, and we want to make a difference. And so initially, we were looking to get this into the market and get it into the hands of people who can make best use of it. Data security today is very perimeter-based, as you all know. And to your introduction, Justin, about helping people recover from data breaches, that’s what we’re trying to deal with as well. You’re not going to avoid a data breach, because people make mistakes, but how do you then minimize the impact? Part of it is making the stolen data completely unusable to an attacker.
And the other part of that is to accelerate recovery and resilience. That’s what we’re after. In the longer term, we want to give you control back of your own data on your own devices, where you have complete control over who can see your data, what they can do with it, and how long they can do it. And you can do it from your own devices, including a smartphone. Our vision, ultimately, is to change the world of data today, and move it from “everybody else owns it” to “I own my own data.”

Jodi Daniels 7:03

So Andrew, you talked about this idea; I’m going to call it micro-encryption. I think most of us are used to the overall encryption concept. Can you dive a little bit deeper and explain this idea of micro-encryption?

Andrew Hopkins 7:17

Yeah, so the encryption itself is standard; our default is 256 bits, but we can use any level of encryption that somebody might want to use. The difference is that we encrypt each record separately. And so what that means is that, as I explained, each record has to be uniquely decrypted. But all the tech, and the encryption, as I said, is standard. So the difference is not in the encryption and how it works, but in how we deploy it at the record level. And let me take a step back on this. If you think about data generally, centralized data storage made an awful lot of sense in the days of mainframes, when all the processing power was centralized, and therefore it made sense for the data storage to be centralized as well. Well, that’s not what we live in anymore. This is a decentralized world. If you think of IoT, mobile phones, the processing power is out there, an astonishing level of processing power. So we want to put the data there. And in order to do that, we have to be able to take data down to its most granular level, and in the context of usability, that is an individual record. And so that’s what we’re doing. Does that help explain it?
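
The micro-encryption idea Andrew describes can be sketched in Python with the third-party cryptography library: each record gets its own fresh AES-256-GCM key, so each successful decryption yields exactly one record. The function names and storage layout here are illustrative only, not PrivacyChain’s implementation, and a real system would keep keys in a key store rather than beside the ciphertext.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_records(records: list[bytes]) -> list[dict]:
    """Encrypt each record under its own fresh 256-bit key."""
    out = []
    for plaintext in records:
        key = AESGCM.generate_key(bit_length=256)  # unique key per record
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
        # Keeping the key beside the ciphertext only makes the sketch
        # runnable; a real system would hold keys in a separate key store.
        out.append({"key": key, "nonce": nonce, "ct": ciphertext})
    return out

def decrypt_record(obj: dict) -> bytes:
    # One decryption yields exactly one record: 20 million stolen records
    # would mean 20 million separate keys and 20 million decryptions.
    return AESGCM(obj["key"]).decrypt(obj["nonce"], obj["ct"], None)
```

The point is visible in the shape of the data: stealing the ciphertext list alone recovers nothing, and compromising one key exposes only that single record.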

Jodi Daniels 8:36

It does. Thank you. I appreciate it. It’s always fun to learn something a little bit new.

Justin Daniels 8:42

So Andrew, one of the themes of our show is that one of the challenges with both privacy and security, when it comes to user adoption, is that it can be inconvenient. And so in that context, I wanted to learn more about how easy it is for people to use data that is encrypted with your tool, and not suffer through, you know, “I have to wait for it to decrypt and it takes too long, and I’m just going to do a workaround.”

Andrew Hopkins 9:11

It’s a great question, and it’s something that we really focus on. You as an end user, let’s say that you are working on your tablet or your laptop. When you open a file, let’s say it’s an Excel file for the sake of argument, you are opening the file just as you would from File Explorer, or from Finder on an Apple device, and you’re opening it in Excel, right? So nothing changes in the context of how you use an Excel file. You can make your edits, so you can do everything you would normally do, and you click Save, and it goes back onto PrivacyChain, because we’re sitting on your device. So from that perspective, using a file is no different. Now from the security standpoint, you have your own individual profile. And when you open a file, or make a request to open a file, we’ll match your privileges against the requirements of that file, and we will give you access if, in fact, you are entitled to have it. That’s the only difference you’re going to see: you may not have access to all the files that you originally had, because we’re also removing the onus upon you, as an individual user, to make a security decision, because we can set those controls up front. We can say, for example, that I’m working on IP that by definition should be secret, so anything that I create, anything that I edit, is automatically classified as secret, which means that anybody who wants to see it has to have a secret clearance. I haven’t done anything, I haven’t changed a thing, my user interface is the same. So we’re trying to make this as invisible to the user as we possibly can, because of the inconvenience factor.
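
The privilege check Andrew describes, matching a user’s profile against a record’s requirements and auto-classifying anything touched under an IP project, might look roughly like this. Illustrative Python only; the clearance labels, levels, and field names are assumptions for the example, not PrivacyChain’s actual schema.

```python
from dataclasses import dataclass

# Hypothetical clearance ladder; higher number = more privileged.
LEVELS = {"public": 0, "internal": 1, "secret": 2}

@dataclass
class Record:
    name: str
    body: str
    classification: str = "internal"

@dataclass
class User:
    name: str
    clearance: str

def open_record(user: User, record: Record) -> str:
    """Match the user's privileges against the record's requirements."""
    if LEVELS[user.clearance] < LEVELS[record.classification]:
        raise PermissionError(f"{user.name} lacks {record.classification} clearance")
    return record.body

def save_record(record: Record, body: str, project_is_ip: bool) -> Record:
    # Policy, not the user, decides the label: anything edited under an
    # IP project is automatically classified as secret on save.
    record.body = body
    if project_is_ip:
        record.classification = "secret"
    return record
```

The user’s workflow (open, edit, save) is unchanged; the only visible difference is a denied open when the policy-assigned classification outranks their clearance.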

Justin Daniels 11:07

So Andrew, what you’re saying is this tool also handles identity access management. If I’m Justin and I need to go get the sales forecast because I have to update it, and I’m allowed to have access to it, it’s seamless; I don’t realize that PrivacyChain is decrypting it in the background. However, if I don’t have access to it, now I’m going to run into a situation where I have to call the IT team. But that’s by design, because it sounds like you’re addressing the least privilege access idea with identity access management. Is that correct?

Andrew Hopkins 11:46

Yes, it does. I mean, we’re essentially taking your access controls today and bringing them into the platform, and we can add more as you go. Another way to think about it is we’re doing zero trust at the individual record level. Right? Because the other thing that I will add to that is, yes, if you have access to it, we will give you that file, but we may give it to you in a format that blanks out personal information. We can control the privacy, and what you can see in a file, through controls both at a sort of role level, but also at the individual level. There’s an interesting conversation around AI here, which we’ll get to as we talk about AI, but when you open that file up, you’ll have access to what you have been granted access to, not necessarily all the information in that file.
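
Blanking out personal information by role, as Andrew describes, can be sketched like this. The roles and field names are invented for the illustration; they are not PrivacyChain’s.

```python
# Hypothetical mapping of roles to the fields they may see.
VISIBLE_FIELDS = {
    "analyst": {"name", "department"},
    "manager": {"name", "department", "salary_band"},
    "executive": {"name", "department", "salary_band", "date_of_birth"},
}

def redact(record: dict, role: str) -> dict:
    """Return the record with fields outside the role's view blanked out."""
    allowed = VISIBLE_FIELDS[role]
    return {k: (v if k in allowed else "****") for k, v in record.items()}
```

Every role opens the same underlying record; only the projection differs, which is the record-level version of least privilege.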

Jodi Daniels 12:40

In the spirit of talking about AI, I don’t think we can have a podcast or conversation anywhere these days without putting those two little letters in. How will AI impact PrivacyChain and deliver value for customers?

Andrew Hopkins 12:54

Well, it’s a couple of things here. If I take that example that I’ve just given you, on the control of output, or the control of what somebody can see from a file based upon privilege, think about putting that in front of a ChatGPT interface. So let’s take an example. Let’s say that an organization uploads all of their employee records into their model. But within those employee records there is sensitive information, like date of birth. Now, let’s say that the two of you and I are at different levels in the organization, but all of us ask the same question. And the question is: how old is Joe, and is he eligible for a commercial driver’s license? Now, I’m a lower-level supervisor, so I ask that question, and all I’m going to get back is: yes, Joe is eligible for a commercial driver’s license. Justin, you may be at the manager level, and you will get back: Joe is 28 and eligible for a commercial driver’s license. And Jodi, you as an executive might get all of that information.
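
Andrew’s three-tier answer about Joe can be sketched as a simple output filter in front of a model: the underlying record is the same, but the response is shaped by the requester’s role. Illustrative Python; the role names, the toy data, and the age-21 eligibility threshold are assumptions for the example.

```python
from datetime import date

EMPLOYEES = {"Joe": {"date_of_birth": date(1997, 3, 14)}}  # toy record
CDL_MIN_AGE = 21  # assumed eligibility threshold for the example

def age(dob: date, today: date) -> int:
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def answer(subject: str, role: str, today: date) -> str:
    """Same underlying record, different answer depending on privilege."""
    dob = EMPLOYEES[subject]["date_of_birth"]
    a = age(dob, today)
    eligible = "eligible" if a >= CDL_MIN_AGE else "not eligible"
    if role == "supervisor":   # least privilege: yes/no only
        return f"{subject} is {eligible} for a commercial driver's license."
    if role == "manager":      # may see the derived age, not the raw DOB
        return f"{subject} is {a} and {eligible} for a commercial driver's license."
    # executive: full detail, including the underlying date of birth
    return f"{subject}: born {dob}, age {a}, {eligible}."
```

Nothing is withheld from the model itself; the control sits on the output path, which is why the input can stay as complete as possible.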

Jodi Daniels 14:05

In that story.

Andrew Hopkins 14:07

I know how this works.

Justin Daniels 14:10

Why is that a surprise?

Jodi Daniels 14:13

I just thought we should highlight it. Andrew, please continue.

Andrew Hopkins 14:18

So the point is that the underlying information is the same, but even in a ChatGPT or an AI environment, you can control the output based upon the rights and privileges of the people making those requests. And we think that’s really, really important, because you don’t necessarily want to limit the input to a model; you want to get as much in there as possible, so the model is as complete as it possibly can be and the results are the best, but you can control the output. You can also prevent anybody seeing the output from the outside, as long as you hide it behind, you know, bring it into a sort of private PrivacyChain environment, if you so choose to do so. So that’s one example where we think this plays a really brilliant and important role in the context of AI. The second thing is, and I’ve purposely not dived into how we do what we do, because that would be a longer conversation, but because we’re doing everything at a record level, and we’re storing each record with its own metadata, we can tell each record to calculate. Let’s say it’s a sensor stream, for example, or let’s say it’s data off a firewall: we can tell that piece of data, or that stream of data, to calculate average, calculate mean, calculate median, calculate standard deviation, and to alert when a reading comes in at more than one standard deviation. So we’re essentially moving what we call machine learning as far off to the edge as you can possibly get. And so the way we think about it is we’re essentially supercharging AI, by making the information that feeds it more intelligent, and pushing some of that functionality out to the individual record level.
So we think AI is a game changer. For how we will use it, we will use it in the way I’ve just described. The fact that we can do some fairly interesting analysis and machine learning and alerting at the individual record level, we think, plays very heavily into things like IoT: monitoring equipment, but also monitoring security, monitoring firewalls. You’re getting the data off the firewall; we’re picking it up, we’re securing it, we’re tracking it, we’re keeping an immutable history of it, and we can analyze it at that level as well if we need to. But it doesn’t all have to be pulled into the center for you to think about what’s going on and make decisions. Make sense?
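
The record-level statistics Andrew mentions, a running mean and standard deviation with an alert when a reading strays more than one standard deviation, can be computed in a single streaming pass with Welford’s online algorithm, which suits the edge because it never needs the full history. This is an illustrative sketch, not PrivacyChain code.

```python
import math

class EdgeMonitor:
    """Running mean/variance via Welford's algorithm, kept beside the data
    stream itself; flags readings more than one standard deviation from
    the mean observed so far."""

    def __init__(self):
        self.n = 0       # readings seen
        self.mean = 0.0  # running mean
        self.m2 = 0.0    # sum of squared deviations

    def push(self, x: float) -> bool:
        alert = False
        if self.n >= 2:  # need two points before a deviation is meaningful
            std = math.sqrt(self.m2 / (self.n - 1))
            alert = abs(x - self.mean) > std
        # Welford update: constant memory, one pass
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return alert
```

Each device (or each record’s metadata) carries only three numbers, yet can still decide locally when to raise an alert instead of shipping the whole stream to the center.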

Justin Daniels 16:46

So Andrew, as we talk today, can you share a little bit about where you’re at in the development of this tool? Is it fully developed? Do you have beta customers? Where exactly are you in the process?

Andrew Hopkins 16:56

So we’ve built the back end, right? And the way this works is the back end, the core functionality, stores the data, and the front end tells it what to do. So the front end essentially is the interface with an end customer or users; it’s the ingestion process and the building of the rules that define what the back end should do. We’re in the process of building our first version of that. Ideally, what we’re going to do is basically do this on Microsoft files initially, so that we can go into an organization and say to them: look, you’ve got Microsoft files that are unprotected across all these devices across the organization. Why don’t we come in and capture them, secure them, and give you visibility into what’s where, so that you can start to control these more accurately? That’s something that we think we’re going to launch with. At the same time, we are talking to potential channel partners. That’s my background at Accenture; I know a lot about channel, and who might be useful with this. And we’re building the front end. You know, we’re bootstrapping this at the moment. We’re absolutely open to investment from the right source. I’ve looked at the VC market, and that’s a different conversation, but the right source and the right partner is exactly what we’re looking for right now. Otherwise, we’re keeping going ourselves. There’s an awful lot of interest; we just have to get it there.

Jodi Daniels 18:17

Where do you think this fits within the stack of other tools that companies are using?

Andrew Hopkins 18:23

Mostly complementary, right? If you think about firewalls and all of those different things, and I’m not going to go into them all, most of this is going a step beyond where people are today. I was talking to one of the largest zero trust organizations just last week, and what they’re doing is applying zero trust at the device level, the application level. And the natural extension is: let’s do it at the data level. Not all data necessarily, but let’s really do it to the data that you care about. So we think it’s an extension. Having said that, there are things that you’re currently spending money on as an organization that might become less and less important. I mean, data backup is one of them, because we back up and clone at the individual record level, and we know where all these clones are, so we can rebuild very, very rapidly if a device is compromised. Because we know what was on that device, we know which records were there, and the clones regularly sync, so we can rebuild it. So there are some things we’re somewhat challenging, but most of this we think is complementary, and it just expands what exists today to a level of detail, a level of granularity, sorry, that doesn’t exist today. Well, thank you for that.
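
The record-level rebuild described above can be sketched as: keep a manifest of which records lived on each device, and restore each one from its synced clone. Illustrative Python; the manifest and clone-store shapes are assumptions, not PrivacyChain’s data model.

```python
def rebuild_device(manifest: set[str], clone_store: dict[str, bytes]) -> dict[str, bytes]:
    """Rebuild a compromised device from its record manifest.

    We know which record IDs were on the device (the manifest), and each
    record's clone is synced elsewhere (the clone store), so recovery is
    just re-fetching those records.
    """
    missing = manifest - clone_store.keys()
    if missing:
        raise LookupError(f"no clone found for records: {sorted(missing)}")
    return {record_id: clone_store[record_id] for record_id in manifest}
```

Because recovery works record by record, it replaces whole-device backup images with a lookup over already-synced clones.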

Justin Daniels 19:44

Given your background with privacy and security, could you share with our audience what your best privacy or security tip might be? Maybe it’s encrypt all the records at a...

Andrew Hopkins 19:55

Data level? Well, yes, at that level, yes, absolutely. But failing that, I would say: don’t click on anything. It’s that simple. I mean, it’s not that simple, but that’s one of the things. It’s really interesting what’s going on, even on LinkedIn. Coming back to LinkedIn, full circle: on LinkedIn, I get some awfully interesting requests from people who have done this and done that.

Jodi Daniels 20:21

Don’t click on anything. You have to share more of a story. I think many people listening are all using LinkedIn. So can you share a little bit? It sounds like you have an interesting experience.

Andrew Hopkins 20:39

Well, I’ll get a request from, let’s say, a very good-looking individual, who is an investor, and investors are interesting to me at the moment, and who claims to have done this and done that, with a great track record. Connect, and then you get subjected to all sorts of interesting things: taking conversations offline, looking at other sites. Just don’t do it. I mean, it’s common; virtually every day, I am getting that kind of request and just deleting them. And it’s rough, because there are probably people out there that are legitimate and want to talk to me, and I just say no to it, because I can’t tell. It’s really challenging, even on LinkedIn.

Justin Daniels 21:25

We haven’t even talked about deepfakes yet, or how AI weaponizes that whole process.

Andrew Hopkins 21:29

Oh, okay. Do we have time for that one? Sorry, go ahead. All right. So that is a whole interesting story for us as well, coming back to the whole AI thing. Because of the way that we bring in a record, let’s say the record is a video that you’ve taken on your phone, we’re bringing in the metadata. We’re capturing who, when, where, why, and all of the stuff that comes with it, and we’re storing it together as an object. And then we are creating an immutable history of everything that happens to that video. Every edit, every change is recorded. So when that video is published, let’s say through the web or a media outlet, you can click on it and see where it came from. So let’s say it came from me, as a human, and I’m a journalist, and I want people to believe in what I have created. They can click and say: oh, this is really Andrew’s, it’s immutable, and I can see the history of it. And if Andrew built this, I can trust it, because I trust Andrew as a journalist. On the other hand, with something coming from AI, you can do exactly the same thing: where was this created? How was it created? And so, because we’re doing everything at that individual record level, we can create an immutable history and provenance of everything that’s published out there. And we are, in fact, talking to C2PA and others that are trying to tackle this problem, about helping them achieve that.
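
The immutable edit history Andrew describes is, in spirit, a hash chain: each entry commits to the one before it, so any rewrite of the past breaks verification. This is an illustrative Python sketch; PrivacyChain’s actual mechanism isn’t disclosed in the conversation.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry

def add_edit(history: list[dict], author: str, action: str) -> list[dict]:
    """Append an edit to a record's tamper-evident history."""
    prev_hash = history[-1]["hash"] if history else GENESIS
    entry = {"author": author, "action": action, "prev": prev_hash}
    # Hash the entry body (author, action, prev) deterministically.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return history + [entry]

def verify(history: list[dict]) -> bool:
    """Re-walk the chain; any altered entry invalidates everything after it."""
    prev = GENESIS
    for e in history:
        body = {k: e[k] for k in ("author", "action", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True
```

A viewer clicking on a published video would, in this model, re-verify the chain and see every edit back to the original capture.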

Jodi Daniels 22:55

It would be very interesting to see where that goes, the idea of sort of a digital certificate, a digital stamp. Similar concepts to what you’ve just described are circulating in conversation.

Andrew Hopkins 23:07

I’d love to compare notes. We think it’s another very, very important thing, and we’re passionate about it.

Jodi Daniels 23:15

Media is a very big thing, for a lot of different reasons: the value of the content and the source, the idea of deepfakes, you know, taking pictures and video and sourcing them back. All of that literally could be a whole other episode. It could. But Andrew, when you’re not talking all things privacy and security, or building the company, what do you like to do for fun?

Andrew Hopkins 23:38

Well, I like to hang out with my wife, and we’re very relaxed; I’m typically a homebody, although we do like good food and good wine. We are surrounded by animals: dogs, a cat, and sugar gliders. And we still like to travel, but haven’t had much time to do that. I have a pretty relaxed, and you could say boring, life, but I love it.

Jodi Daniels 24:00

Relaxed is good. You might have heard our dog in the background; he sometimes likes to make himself known on our podcast episodes. Andrew, where can people learn more about PrivacyChain and connect with you?

Andrew Hopkins 24:15

LinkedIn is the best way to do it.

Jodi Daniels 24:18

Whether you accept them on LinkedIn is the question.

Andrew Hopkins 24:21

Well, you typically can get me if you DM me, but yes, if there’s a reference to the podcast, I will absolutely pick up and respond, for sure.

Jodi Daniels 24:32

Put that in the message.

Andrew Hopkins 24:33

Yeah, put that in the message. There is a website, but it’s not terribly informative, for some fairly obvious reasons; we’re fairly selective about what we tell people. I mean, there’s nothing I’d really share today about how this works, which I could do on a different podcast if you wanted to. But we’re fairly careful about what’s on the website, and I’m happy to talk to people about what we’re doing and how it works.

Jodi Daniels 24:57

Wonderful. Justin, any parting thoughts?

Justin Daniels 25:01

No, this was informative. When I think about the debate around not paying ransomware, or making it illegal, Andrew’s tool would be very helpful in the argument for

Andrew Hopkins 25:12

why it could be outlawed. So yes. informative. Yes.

Jodi Daniels 25:16

Well, Andrew, thank you so much for joining us today. We really appreciate all the expertise that you share.

Andrew Hopkins 25:22

Thank you for having me. I really enjoyed it. It’s good to meet you both.

Outro 25:29

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.