
Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

 

Jodi Daniels  0:20  

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.

 

Justin Daniels  0:35  

Hi, Justin Daniels here. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I’m the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

 

Jodi Daniels  0:50  

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, e-commerce, media, and professional services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. We have a fun episode in store for you today.

 

Justin Daniels  1:27  

Indeed, especially since you’re wearing your special penguin socks.

 

Jodi Daniels  1:31  

I really like to talk about my penguin socks, you know. And I am cold. It doesn’t matter that we live in Atlanta and it is warm outside, I am cold. And I have cute feet now.

 

Justin Daniels  1:41  

Okay. Well, today we have a great guest: David Kruger, who is the co-founder of Absio Corporation and co-inventor of software-defined distributed key cryptography. He has 30-plus years of experience in two disciplines, safety engineering and software design and development. David, welcome to the show.

 

Jodi Daniels  2:04  

Well, David, we always like to start with understanding how people got to where they are. So can you share a little bit about your career and how it evolved into Absio?

 

David Kruger  2:15  

Well, as I said, I’ve had kind of a weird dual-track career. Back in the 80s, I was beginning to do some software development work and got contracted by a company to develop a safety management system for them. So I got into safety and software development essentially at the same time, on the same project. And that dual track has continued ever since. I’ve been largely involved in safety and security in the physical industries: transportation, chemical plants, mining, you name it. But all during that period, I was also having to develop software or manage software development teams, because we never could quite find the tools we needed to do what we needed. So we ended up writing a bunch of custom software, some of which was just used in house and some of which was sold to third parties or resold. So that’s sort of my background. I’ve always had those two things running in parallel with each other, which does give you kind of a different viewpoint on safety and security problems.

 

Justin Daniels  3:30  

So, David, why don’t we start from the top and talk a little bit about your thoughts on what data ownership means in terms of what you’re doing. And what is securing data at creation? What does that really mean?

 

David Kruger  3:44  

Okay, so this is where the safety engineering portion comes in. And this is, I think, the fundamental message; if anybody listening to this gets one thing, hopefully this is their takeaway: data is physical, even though we don’t see it. When you’re a safety person, you’re always looking at some kind of hazardous substance, some hazardous, tangible material, something you can sometimes see, taste, feel, or touch that’s inherently hazardous, and you want to always keep it under control. That’s the goal of safety. It’s more preventative than mitigating. You want to keep the genie in the bottle at all times, because you’re dealing with a hazardous substance. Well, we don’t tend to look at data as physical, but it is, and it’s something that can be physically hazardous. When you stop and think about it for a moment, data is, to coin a phrase, quantum small. It’s not something we can perceive with our eyes, but it’s binary patterns impressed on matter, in the case of, say, a DVD, or patterns of energy, and it’s as physical as a brick. So start from the basic thought that this is a hazardous material. If this information, the digital, physical representation of information that we perceive in our minds, gets into the hands of somebody who’s not supposed to have it, or is used in a way that I don’t want it used, it becomes hazardous to me or to my company. So we set out from the get-go, when we started this company, to treat data as a physical, hazardous object, with the safety goal of never, ever letting that data get out of control. So that’s the basic thinking behind our approach.

 

Justin Daniels  5:41  

So as a follow-up to that, when we talk about securing data at creation, what is meant by that?

 

David Kruger  5:47  

Okay. Data is always in one of three states: it’s either in storage, or it’s in transit, or it’s being used. And if there is something inherently hazardous about data, let’s say it’s a piece of intellectual property that the company owns and really, really wants to keep secret, then that control, that securing, needs to be able to start at the moment the digital object is created. Whether that digital object is semi-structured or structured is irrelevant; it’s information-bearing, and that information can be hazardous if it’s used by the wrong people or used in the wrong way. So if you’re going to control it, because it’s hazardous, you really need to do it at the moment it’s created. And then you need to persist that control throughout its lifecycle, no matter what state it’s in. Because if it ever gets out of control, it can do you or your company harm.
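The idea of securing data at creation, a unique key born at the same moment as the object it protects, can be sketched in a few lines of Python. This is an illustrative sketch only, not Absio’s implementation; the SHA-256 counter-mode keystream below is a toy stand-in for a real authenticated cipher such as AES-GCM, and the class name is hypothetical.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy SHA-256 counter-mode stream cipher (stand-in for a real AEAD)."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        block = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[offset:offset + 32], block))
    return bytes(out)

class SecureObject:
    """A data object whose unique key is created the moment the object is."""
    def __init__(self, plaintext: bytes):
        self.key = secrets.token_bytes(32)  # key is born with the data
        self.ciphertext = keystream_xor(self.key, plaintext)
        # plaintext is never stored: the object is encrypted from creation

    def read(self) -> bytes:
        return keystream_xor(self.key, self.ciphertext)
```

Because each object carries its own key from birth, control over the data never depends on a network connection or a central key server, which is the property David describes.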

 

Jodi Daniels  6:49  

That’s a really interesting analogy to the hazardous arena.

 

David Kruger  6:54  

Actually, and this is an important point, it’s not an analogy. Data is physical. Data is real. It has to be physically controlled.

 

Jodi Daniels  7:04  

Well, I understand, but I think a lot of people might not have equated data to a hazardous chemical. If you let a hazardous chemical out, significant physical damage can happen to people, and someone with your safety engineering background would manage that situation to never let the chemical out. As you’re explaining it, if people understood that data is physical, just like in that other arena, they would look at it in a very similar fashion, whereas today I think they don’t necessarily think of it that way. So when I say analogy, I mean the explanation is analogous to me. I understand what you’re saying, that it is physical; I think for many people that’s a new concept, and I think the comparison to the chemical piece is very unique and will help people really understand it.

 

David Kruger  8:20  

Yeah. I’ve done a series of ongoing articles in Forbes that all start with the physicality of data, because we’ve found that basic construct to be the most helpful way for people to understand that we approach this from, again, more of a safety than a security standpoint. Security tends to be reactive rather than proactive: bad things are going to happen, so we’ve got to do something about it. Whereas in safety, the basic mental construct is that if a thing is hazardous, you never, ever, from the get-go, let it get out of your control. And then you have to define what that actually means based on the hazard of the object you’re trying to control. So that’s the difference in mindset. It’s subtle, but in our view it’s a critical distinction that people unfortunately don’t make.

 

Justin Daniels  9:12  

So we know that Web3 has been in the news a lot lately, and one of your co-inventions is really about distributed cryptography. It would be really helpful if you could explain for our audience: what is distributed cryptography?

 

David Kruger  9:29  

So, software-defined distributed key cryptography. What we saw as problematic about cryptography goes right back to your question, Jodi, and the physicality of data. If you are going to control some particular piece of data throughout its lifecycle, if the basic goal is to never let it get out of control, you have to be able to create the key at the moment you create the data. Because, again, you’ve got three states of data: it’s either in storage, in transit, or in use. So distributed key cryptography was designed to enable the application to create the keys it needed to make sure that data never got out of control. You couldn’t have a reliance on a connection, and you couldn’t have a reliance on a piece of hardware; you needed to be able to create really strong keys locally. But to meet that goal, you had to break one of the cardinal rules of cybersecurity and figure out a way to safely store the keys with the data. So one of our patents is for an obfuscating file system. When a piece of software that uses our product creates a piece of data, it creates the key for encryption and sticks it in storage, whether that’s on a local drive or up in the cloud; we’re agnostic as to where it’s located. If you’re able to break into that computer and look at the file store, what you’ll see is just a bunch of randomly named, randomly located, nonsensically named digital objects. You can’t tell which is a key and which is content, that type of thing. And the reason we did that was so you could safely store keys and content together, and the keys would always be able to accompany the data.
But you couldn’t independently discover which one was which unless you did some kind of brute-force attack, which would be computationally very expensive to do. So the distributed portion of this is that the keys and the data can safely coexist. The software-defined portion is that we’re agnostic as to the architecture of the application and the software language that’s used. Those are the fundamentals of software-defined distributed key cryptography: no dependencies on architecture, no dependencies on language, no dependencies on hardware.
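The obfuscating file store David describes, where key blobs and content blobs sit side by side under meaningless random names, might look roughly like this sketch. The class, method names, and in-memory index are hypothetical; Absio’s actual on-disk format is not public, and in a real system the index itself would be encrypted.

```python
import pathlib
import secrets
import tempfile

class ObfuscatedStore:
    """Keys and ciphertext stored together under random, meaningless names.

    An attacker listing the directory sees only random hex filenames and
    cannot tell which blob is a key and which is content.
    """
    def __init__(self, root: str):
        self.root = pathlib.Path(root)
        self._index = {}  # logical name -> on-disk blob name (would be encrypted)

    def put(self, name: str, data: bytes) -> None:
        blob = secrets.token_hex(16)           # e.g. '9f3ac1...': reveals nothing
        (self.root / blob).write_bytes(data)
        self._index[name] = blob

    def get(self, name: str) -> bytes:
        return (self.root / self._index[name]).read_bytes()

# A key blob and a content blob land in the same directory, indistinguishable.
store = ObfuscatedStore(tempfile.mkdtemp())
store.put("report.key", secrets.token_bytes(32))
store.put("report.ct", b"...ciphertext bytes...")
```

Without the index, telling a key apart from content reduces to the brute-force search David mentions, which is the point of the design.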

 

Justin Daniels  12:09  

So it sounds like, when you’re talking about this distributed cryptography as it relates to deploying your product, if the key is with the data, then I assume there’s a separate key that mates with the key associated with the data to unlock it when you want to use or otherwise view the data?

 

David Kruger  12:26  

The relationship between the keys and the data is kept in an encrypted database. So again, go back to those three states of data. Actually, security for data in storage and in transit is pretty simple. I mean, we know how to do cryptography, we know how to do public key infrastructures, things like that; there’s nothing really new about that. The challenging portion comes from maintaining control of the data when it’s being used, because, with a couple of exceptions, you have to decrypt data to be able to use it. So part of what SDKC, software-defined distributed key cryptography, does is bind a control layer to data, so that you can give that data instructions: this is who can use it, what they can use it on, where, when, for how long, and for what purposes that particular piece of data can be used. We give those kinds of instructions all the time. For instance, you send a simple email and you ask the recipient, hey, don’t forward this to anybody else. You’ve given them instructions; you’ve made a copy of a piece of data that’s hazardous and sent it to them. But you don’t really have any way to enforce that. The data itself doesn’t have its own controls; it’s not self-protecting, and it’s not self-controlling. What our technology, what SDKC, does is bind a control layer to that data. And then the software developer can provide controls, so that a user, to go back to the simple email example, doesn’t just type a note that says don’t forward this to anybody; they put a control on the data that means the recipient can’t forward that email to anybody else. So part of this is that binding the controls to the data lets you control what happens to that data when it’s being used, when it’s decrypted, not just protect it when it’s stored or in transit. Does that make sense?
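The control-binding idea, attaching usage instructions to a piece of data so they can’t be separated or altered, can be sketched with an HMAC. This is a simplified illustration with hypothetical field names, using a shared secret in place of a full key infrastructure; real enforcement would also live inside the application that performs the decryption.

```python
import hashlib
import hmac
import json

def seal(key: bytes, policy: dict, ciphertext: bytes) -> dict:
    """Cryptographically bind a usage policy to a ciphertext."""
    msg = json.dumps(policy, sort_keys=True).encode() + ciphertext
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return {"policy": policy, "ciphertext": ciphertext, "tag": tag}

def use(key: bytes, sealed: dict, action: str) -> bytes:
    """Verify the binding, then allow only actions the policy permits."""
    msg = json.dumps(sealed["policy"], sort_keys=True).encode() + sealed["ciphertext"]
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sealed["tag"], expected):
        raise ValueError("policy or data has been tampered with")
    if action not in sealed["policy"]["allowed"]:
        raise PermissionError(f"'{action}' is not permitted by the data's owner")
    return sealed["ciphertext"]  # the caller may now decrypt and act
```

In this sketch, a “don’t forward this email” instruction becomes `{"allowed": ["read"]}`: a compliant client can decrypt and display the message, a forward request fails the policy check, and stripping or editing the policy invalidates the tag.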

 

Justin Daniels  14:31  

It does. I guess one other follow-up, just wanting to learn more: a lot of times, security and cryptography are the enemy of efficiency. So if you have to decrypt the data, does that happen in a way that doesn’t impede the user’s ability to quickly decrypt and use the data?

 

David Kruger  14:50  

Yeah, it does. And again, this is one of the reasons why we do things locally. Typically, when you see decryption taking a long time, it’s because you’re having to fetch keys from, say, a hardware security module that’s up in the cloud or something like that; it’s latency in the communications. Because we’re encrypting and decrypting wherever the data is, it’s very, very fast. We originally developed this, and built all of our core packages, under contract to US Army intelligence. If you can remember this far back, our first instantiations were on EVO 4G phones, which were god-awful bricks of phones, very slow, with very low computational power. And we were decrypting images and things like that out of good-guy/bad-guy databases, that type of thing, so a fairly heavy computational load. We had to figure out how to do this fast enough on those low-power mobile devices without really changing the user experience. And you had to be able to do it in a battlefield environment, where you couldn’t rely on a good, fast connection, or on any connection at all. So there’s no noticeable cryptographic lag. The one exception would be if you’re encrypting or decrypting a very large file. But that doesn’t have anything to do with the time to fetch a key; it’s just computationally expensive to, say, encrypt or decrypt a two-gigabyte file. It’s not going to be done instantly; you’ll wait a few seconds.

 

Jodi Daniels  16:26  

Can you share a little bit about how your technology would work in practice? Maybe, for example, identity and access management, or secure messaging, or another example that you want to share.

 

David Kruger  16:39  

Let’s take identity, the IAM portion, first. What we would do is embed the IAM rules in the digital object itself, so that the rules, the instructions for what you can do once the data is decrypted, are always cryptographically bound to the data, and you can always make sure they’re respected. In our company, we don’t write software applications; we write software developer tools: SDKs, and a broker application for managing keys and content, synchronizing them across devices, having a backup, that type of thing. So really, any kind of application where you’ve got control of both sides of the application, and you want to make sure the data is secure and can only be used in the way you want it to be used, is an application that could use our technology. And of course, that could include any kind of secure messaging app. But really, our reason for being is to give software developers the ability to protect and secure data from the moment it’s created until the moment it ceases to exist, and never have it out of control in between. So we’re really in the developer space, giving developers an enabling technology to secure and control the use of that data.

 

Justin Daniels  18:05  

When you say out of control, as we’ve talked about it today, do you mean somebody accessing the data, whether inside the company or outside, who shouldn’t be? What does out of control mean?

 

David Kruger  18:16  

Okay, so basically, think about the object of every cyberattack. I mean, we talk about protecting the network and things like that, but cyberattackers aren’t after your network; if they want one, they can go buy their own. The object of every cyberattack is to gain control of data and do something with it that the data’s owner doesn’t want done. The most common thing is to make a copy of that data. You know, we talk about sending people a file, but we don’t send people the file. It’s not like taking a can of peas off of your shelf, where you can see that it’s missing; we make a copy, and we send them the copy. So the goal in stealing data is to make an illicit copy and transport it out of there. And the goal, again, is to never have that out of control. If they get a copy of data protected with our technology, it would be an encrypted file with its own unique key. If they get a copy of that file, they can’t do anything with it. Theoretically, they could brute force the encryption, but that’s very, very unlikely to happen. So it’s about being able to control a copy that’s taken from storage, or a copy that’s intercepted in transmission, and to control it when it’s decrypted, when it’s being used: to give the instructions and have them bound to the data, so that the software that has permission and the user that has permission to decrypt that data can only use it in ways the owner of the data has permitted. That’s what we mean by staying in control.

 

Jodi Daniels  20:00  

That’s fine. I just want to make sure you’re happy.

 

Justin Daniels  20:02  

I’m just trying to think.

 

David Kruger  20:05  

Oh, let me ask you a question. Does that definition of control, and that definition of security, make sense to you?

 

Justin Daniels  20:14  

It does. I think the only thing I would add: when you talk about data, in a typical ransomware attack, when they encrypt the network, they’re basically denying you access to your data. They may not exfiltrate it, but again, you’re not controlling your data, because the threat actor has come in and is now denying you access to what you need, in terms of your data, to carry on your business.

 

David Kruger  20:41  

Yeah, yeah. Ransomware is sort of a separate issue, and sort of not. Part of a framework that’s maybe useful to think about, and one we always recommend, difficult as it is to apply: there are three attack surfaces, in general categories. You have to compromise the data object, you have to compromise the user, or you have to compromise the hardware. So when you look at ransomware attacks, our two cents is that it’s always a failure of authentication. Somebody failed to authenticate the user, the software, or the hardware, and that’s how you end up subjected to a ransomware attack. And we really can’t help with that; that’s a separate issue.

 

Justin Daniels  21:18  

No, that makes sense.

 

Jodi Daniels  21:20  

Sure. So David, if someone wanted to learn more about all of this, are there tools that you might suggest? Are there communities or other resources where people could learn more?

 

David Kruger  21:34  

Well, our target market is really software developers. For software developers, we have a pretty typical structure on our website: go to docs.absio.com, and all of the developer documentation is right there. And it’s pretty well done, even if we do say so ourselves. And of course, if people want a code demo or something like that, or want to know if we can help them with a particular use case, they can reach out to us and we’ll be delighted to talk to them.

 

Jodi Daniels  22:03  

Excellent. And I always like to ask: we’ve talked a lot about securing data objects, but perhaps a different kind of tip. What might be your best privacy and security tip? Say you’re out with friends, they hear what you do, and you would tell them to do what with all of their data?

 

David Kruger  22:22  

Get yourself a secure messaging app, a well-done secure messaging app, and abandon email to the extent possible. SMTP email is the giant sucking chest wound of cybersecurity.

 

Jodi Daniels  22:36  

You really feel it’s like that?

 

David Kruger  22:39  

It’s a forty-something-year-old protocol that was never designed to be secure. You can put patches on it, and you can put a shiny new coat of paint on it, but it is the Swiss-cheesiest of all communication protocols. So to the extent that you can, use Signal or something like that for your personal communications. And if you can go out and get a third-party secure messaging app that’s capable of managing and controlling files, and there are a few of those around, use it for your business. To the extent that you can, abandon email; it is just an accident waiting to happen. And I’m saying that as a safety guy: it’s an accident waiting to happen. It’s going to hurt you if you use it long enough.

 

Jodi Daniels  23:25  

You mentioned on the business side that there are a couple out there. Can you share a few that you think highly of?

 

David Kruger  23:33  

Yeah, one of the ones that I think pretty highly of is Wire; they’re a pretty good one. And there’s another one that’s also quite good whose name just went right out of my head; I’d have to go look it up to tell you what it is. But if you go into the secure messaging app space, you specifically want to look at how their cryptography is done and whether it’s well accepted, and you want to look at how they manage and secure information stored on end devices. There’s a pretty good variety of them out there. Pick one, use it right, and try to discourage the use of email as much as possible. Several of them have a free web client that people who don’t pay for the service can use, and that’s fine. But if you’re sharing any kind of serious business information, where you as a company are liable for what the people you transmit your information to do with it, where you have sort of an audit relationship and so on, it may be painful, maybe a little bit expensive, but get off of email. We know that’s the primary vector; most cybersecurity attacks start with email. So again, simple safety guy right here: it’s dangerous, quit fooling around with it.

 

Justin Daniels  24:53  

There you go. Email as toxic waste. I like it. David, thank you so much for being on our program today. If people want to connect or learn more, what’s the best way to do that?

 

David Kruger  25:03  

Just go to our website; we have a contact form. It’s Absio, Alpha Bravo Sierra India Oscar, dot com. A little pilot lingo there for you. Just reach out to us; our phone numbers and email are in the contact form, and we’d be delighted to talk to you.

 

Jodi Daniels  25:21  

You know, I remember when you actually used to call an airline and talk to someone, and they would give you the confirmation number, and that’s exactly how they would say it. So anytime I do the same, I always think that I’m an airline person.

 

David Kruger  25:35  

Well, I’m a pilot, so we think in those terms.

 

Jodi Daniels  25:39  

Absolutely. Well, David, again, thank you so much. We really enjoyed the conversation.

 

David Kruger  25:43  

Thanks, you guys. Have a good day, and I hope your feet stay penguin and warm. Thank you.

 

Outro  25:49  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

 

Privacy doesn’t have to be complicated.