Protecting Consumer Data From Third Parties
Ian Cohen is the Founder and CEO of Lokker, a company committed to protecting businesses from third-party privacy risks. Before Lokker, he served as CEO of Credit.com, where he transformed the company into a trusted high-growth hub for consumers seeking guidance on credit and finance. Ian is also a Board Member of Uqual, an Industry Advisor at Long Ridge Equity Partners, and an Advisor and Investor at PolyScale.
Here’s a glimpse of what you’ll learn:
- Ian Cohen’s inspiration for focusing on consumer privacy and founding Lokker
- How Lokker collects and analyzes data
- The surprising results of Lokker’s data privacy research project
- The compliance challenges of third-party data
- The privacy risks companies should address when deploying data collection technology
- How privacy laws affect business compliance
- Ian’s advice on third-party data sharing
In this episode…
Data collection has become increasingly opaque, and companies like Meta and Oracle are facing lawsuits for unauthorized data tracking and sharing across third parties. With data sharing largely unregulated among companies, how can you protect customer data?
When collecting consumer data, companies often struggle to interpret the data and lack knowledge about its location and usage. With the GDPR (General Data Protection Regulation) in Europe and new state privacy laws emerging in the US, businesses must go beyond internal privacy programs to regulate external data sharing and comply with the law. Ian Cohen stresses the importance of establishing awareness campaigns and fostering transparency and visibility among third parties.
In today’s episode of She Said Privacy/He Said Security, Jodi and Justin Daniels host Ian Cohen, Founder and CEO of Lokker, to discuss protecting consumer data from third-party access. Ian explains how Lokker collects and analyzes data, discusses the compliance challenges of third-party data, and offers advice on third-party data sharing.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: firstname.lastname@example.org
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Ian Cohen
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal bestselling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:21
I'm Jodi Daniels here. I'm the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I'm a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies. Hello, Justin
Justin Daniels 0:37
Daniels here I am an attorney and passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback helping clients design and implement cyber plans as well as help them manage and recover from data breaches.
Jodi Daniels 0:55
And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, and to check out our Wall Street Journal bestselling new book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. And there's the giggle monster.
Justin Daniels 1:38
I think I deserve a new co-host chair. I've decided I want a more comfortable chair.
Jodi Daniels 1:48
I see. We'll have to see how the podcast budget has been structured,
Justin Daniels 1:51
or something. Anyway, moving along to today's lovely guest. We have Ian Cohen, CEO and Founder of Lokker, dedicated to providing solutions that empower companies to take control of their privacy obligations. Before founding Lokker in 2021, Ian served as CEO of Credit.com and CPO of Experian, where he focused on consumer-permissioned data. Welcome to the show, from the giggle monster and me. Hi, Ian.
Ian Cohen 2:30
Hi, hi. Nice to be here. Well, we're sorry about your chair. I can't see it, but I imagine — I don't see any back to it, so it can't offer you much support. I would suggest getting something that's more appropriate.
Jodi Daniels 2:44
He just has, you know, the extra folding chair, whereas I have the really good office chair. After all, we record this in my office space. So I'm the short one, and I get the taller office chair; you're taller, and yet yours is shorter.
Justin Daniels 2:58
This used to be my gym room, and it's gone.
Jodi Daniels 3:01
Long gone. All right, we're here. Back to the show —
Ian Cohen 3:06
nice to be here. Nice to be here.
Jodi Daniels 3:08
Absolutely. Well, we shared a little bit in your intro about founding Lokker in '21, but we always like to help our audience understand a little bit farther back. So can you share: how did your career evolve to founding this company today?
Ian Cohen 3:26
So way back at Credit.com, we figured out something very simple: if you give a consumer access to their data, one, they like you, which is always a good thing, and two, they generally can make better decisions. So in the early days, we tapped into the market of consumer-permissioned data. And that's a very simple business model: you buy data, you give it to the consumer, they own it. And then suddenly, if they're applying for a credit card or a loan or something like that, they converted at five or six times the rate of a customer that doesn't know their credit score, for obvious reasons. So it started there, and then it emerged into all kinds of other data: if you unlock data and give it to consumers, what else could they do with it? Could they make smarter decisions? So it was really about breaking open that black box. And then after Experian — a very interesting case there, where they were opening up products that created relationships with their customers — I started looking at different kinds of privacy models, both direct-to-consumer and B2B. And over the years you had, you know, obviously Mozilla and the folks over there, the Brave browser, DuckDuckGo. And I still saw the same problems growing with the use of third-party and cloud software, which we'll get into. So I just tried to figure out what would actually work, because the complexity of what's going on on the privacy side and, you know, the lawsuits — they're very complicated, and I just don't think most consumers have the ability to manage that. I think the burden is falling on the companies, as it probably should. So my thought, coming out of my background, was: let's create consumer tools, let's create a consumer marketplace. But all the companies that I saw doing that — the own-your-own-data blockchain services, the Brave browser — they're awesome companies.
But it seemed to me that the real problem was going to fall on the companies taking responsibility for their own privacy. And so that's why we founded Lokker: to give companies the tools to protect their customers and, by proxy, protect their reputations.
Jodi Daniels 5:41
So for those who aren't familiar, can you give us a little glimpse, a little bit more, about what Lokker does?
Ian Cohen 5:47
Jodi Daniels 8:07
Very helpful metaphors and analogies. So thank you so much. I really appreciate that.
Justin Daniels 8:14
So in a recent research project that Lokker conducted, can you share more about how your company collects data to draw its conclusions?
Ian Cohen 8:22
Yep, yep. So when I show people what we do, they ask, "How did you get access?" And it's always a funny moment, because I have to explain — probably not to the people listening to this podcast — that, oh no, we're just grabbing the data coming out of every browser session and running it through all kinds of tools. This data is all out there all the time. So technically, we do it through headless Chrome. We set up browsers in different areas of the world, and we go to the website and scan it as if we're a user — obviously we have an automated way of doing that. And so I can get into what we found out and what we were trying to discover, but the how part was very simple: we have a scanning tool. What happens behind the scenes is a bit more involved, but we're resolving what customers are really seeing — the data customers are really losing at their endpoint, the browser.
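The core of the scanning approach Ian describes — load a page like a real user, capture every network request it fires, and see who the data goes to — can be sketched in miniature. This is a hypothetical toy, not Lokker's tooling: it assumes the request URLs have already been captured from a headless-browser session, and it uses a naive last-two-labels domain heuristic where a production scanner would use the Public Suffix List.

```python
from urllib.parse import urlparse

def third_party_hosts(page_url, request_urls):
    """Split the network requests captured from one page load into
    first- vs. third-party, and return the third-party hosts.

    Naive registrable-domain heuristic: the last two dot-separated
    labels. (A real tool would use the Public Suffix List, since
    e.g. "example.co.uk" breaks this shortcut.)
    """
    def registrable(url):
        host = urlparse(url).hostname or ""
        return ".".join(host.split(".")[-2:])

    first_party = registrable(page_url)
    return sorted({urlparse(u).hostname for u in request_urls
                   if registrable(u) != first_party})

# Request URLs a headless-browser session might capture on one page
# load (hypothetical examples):
hosts = third_party_hosts(
    "https://hospital.example.com/find-a-doctor",
    [
        "https://hospital.example.com/static/app.js",
        "https://connect.facebook.net/en_US/fbevents.js",
        "https://www.google-analytics.com/analytics.js",
    ],
)
# hosts == ["connect.facebook.net", "www.google-analytics.com"]
```

The first-party script drops out; the two tracker hosts are what a scan like the one Ian describes would surface for review.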
Jodi Daniels 9:22
So I imagine that what they're really seeing, and what's really leaking, is going to surprise some people. So can you share a little bit about some of the conclusions the research has drawn? Or some examples that have surprised people? What do they not know?
Ian Cohen 9:40
Oh, I think the thing that surprised me was that more than 40% of the healthcare companies in the country have not just a Meta pixel on their site, but Snapchat, Facebook, TikTok, and just about everything else. Nobody expected that. And I think people immediately want to, you know, find a bad guy, and they turn to the organization. In reality, a lot of these organizations have no idea what's going on behind the scenes on their site. So, what surprised us — we knew we were going to find a lot of stuff, because we see behind the scenes all the time. First, what a lot of people, when they first see it, don't understand is the breadth of what happens behind the scenes with trackers. And once you see it, you totally get why all these lawsuits are happening, you get why GDPR is in place, and you sort of get a different sense of outrage, because you're not just sharing with, you know, the service you contracted with, but every third party and so forth. So I think what really surprised us — if you just look at the lawsuits — like, Meta got sued for grabbing PHI, protected health information. And, you know, I'll say "alleged," but it's there in plain sight. And then the hospitals that they were dropping a pixel on also started getting sued. And these are good hospitals that are in the business of saving lives. Maybe they had an awareness campaign, maybe they had a fundraiser, but at some point they dropped these social media tags on their site, and they just sort of spread like viruses. Okay. Then the next lawsuit that we tracked — so we launched in July, and the second we launched, the lawsuits started. And yeah, I don't like seeing these things happen, but we built the company exactly for this reason, because we saw this coming. I mean, we didn't think it was gonna come right at our launch — we weren't that smart. But okay, the second lawsuit was Oracle.
So Oracle — it's a big database company, but one of the things they do is they're a registered data broker, and Oracle has four and a half billion consumer records. For reference, there's 8 billion people on the planet. So it's a big bunch of data. And the first lawsuit I mentioned, against Meta, was a HIPAA violation — sharing the data. And I'll get into the specifics. But then Oracle was a whole different use case: there was so much data there that people were able to use the data as a proxy for sensitive data that customers had opted out of. And I think it kind of gets to the heart of the issue when you have this much data. People say, "I have nothing to hide," and, you know, "who cares." And the problem is — and we found this out way back with AOL — that once you have enough data, you can de-anonymize or re-identify anything. And that's exactly what they were getting sued for. Then you have the wiretapping lawsuits against a bunch of companies for using these session recording tools that marketers use to track heat maps on their site and user feedback. And it all sounds fine, but at the end of the day, they're violating wiretapping laws. And then the fourth lawsuit was about CPRA, but I'll hold off on that. I think the biggest surprise for me — and for most people — was that medical data was getting shared with Facebook. And the reason that happens, and the reason it can happen inadvertently, is: if I go to the site of a hospital, and I type in cancer symptoms, and then I land on a page — find a doctor, request appointment — and then I look up contraindicated medications, these are all in front of the login, not behind the login. But that data is all getting shared with the social media company, and oftentimes with your social graph. So it's way more problematic than I think people understand.
So I'll wrap it up by saying: modern-day advertising and retargeting meets healthcare, and the two don't mix. Long answer.
Jodi Daniels 14:04
I had several agencies who were inquiring about the Meta pixel and healthcare-related lawsuits. And at the same time, part of the problem was also that you have people who very well might have wanted to put the pixel on for really good purposes — an awareness campaign, like you just pointed out — but the diligence and understanding of what's happening, and the almost kind of QA process, the quality assurance process, of understanding, once it's on, what else it is picking up, was missed. There are a lot of companies, for example, who will do analytics pixels, and they'll do a check to make sure personal information is not going over. And in this situation, what was also happening was you'd have my name get sent over. So, right — not even just the tracker trying to connect all the dots, but actually "Jodi made this kind of appointment with this kind of doctor at this kind of time." That's a lot of information that isn't part of an awareness campaign.
Ian Cohen 15:00
We actually see it — we can see not only the pixel, we can see what data it's gathering, oftentimes, because we're running the data through a DLP service. So, you know, a lot of people run data loss prevention on their server side; we do it inside each session. And what happened, to your point, was exactly right: the hospital is trying to do the right thing. And a lot of the breaches that have happened are similar — like, somebody sponsors a nonprofit marathon, right, and they use a third party. Anyway, in this case, specifically, what's happening is: one, the words — if you search for diabetes, "diabetes" and "make an appointment" get passed in the URL string, so they can see all that. Then you have these form fields that pre-populate with your browser, right? And they're all picked up too. I mean, you can configure your way around this, but you're talking about a humongous job to do manually. So I agree that companies oftentimes — most of the time — don't know that they're doing this. They're well intentioned, but they can't see it. I mean, they literally can't see it; it's happening off premises for them.
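The leak Ian describes — search terms and appointment details riding along in the URL string sent to a tracker — is easy to illustrate. Here is a rough sketch of an in-session DLP-style check; the watchlist, tracker URL, and parameter names are all invented for illustration, and a real DLP service uses far richer pattern- and model-based detection than a keyword set.

```python
from urllib.parse import urlparse, parse_qs

# Toy watchlist for illustration only.
SENSITIVE_TERMS = {"diabetes", "cancer", "appointment", "medication"}

def flag_sensitive_params(request_url):
    """Return the query parameters of an outbound tracker request
    whose values contain watched terms, mimicking an in-session
    data-loss-prevention check on the browser side."""
    params = parse_qs(urlparse(request_url).query)
    return {
        key: values
        for key, values in params.items()
        if any(term in v.lower() for v in values
               for term in SENSITIVE_TERMS)
    }

# A hypothetical analytics beacon fired from a hospital search page:
leaks = flag_sensitive_params(
    "https://tracker.example.net/collect"
    "?dl=%2Fsearch&q=diabetes+symptoms&uid=123"
)
# leaks == {"q": ["diabetes symptoms"]}
```

The page path and user ID pass through silently; the search query is the parameter a check like this would flag before the beacon leaves the browser.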
Jodi Daniels 16:09
An interesting time that we are in.
Ian Cohen 16:12
Indeed. That's another way of saying: know your data.
Jodi Daniels 16:16
It is, but it's complex. I think what we're talking about here is that we need some tools to be able to scan, because even if you know the data, you don't realize that the fields in the form were getting collected and sent off.
Ian Cohen 16:33
Well, I can tell you, when you visualize this for somebody, the problem is kind of overwhelming. Because you can go to a single page — I'm not going to name names here — but you can go to a single page of a hospital website, similar to the one we were just talking about, and if you look at all the third parties... they actually probably placed a lot more third parties in their tag manager than you would have thought. But when you get out to all the people that gets shared with, you're talking about hundreds — 500 different third, fourth, fifth parties that the data is getting shared with. And again, it's just cloud software uses other cloud software uses other cloud software, so you get an exponential growth. And when you visualize that for somebody, suddenly it makes sense. Then it comes down to: so now what? And I think it really is a problem where, first of all, if you can't see it, you obviously can't manage it. And traditionally — you've been in this business a long time — it's been the relationship between the chief privacy officer, the GC, and the CISO. So there's a lot of space, and a lot of stuff happens in there, right? And sometimes it's diffuse about who owns that responsibility. Right. And so, I mean, I think the problem is really interesting. And I think the words "privacy" and "privacy controls" have been, I guess, synonymous with compliance for a while. But all of the things we're talking about — all of these lawsuits, all the different ones I just named, and you could bring up many more — have nothing to do with cookie consent. Nothing whatsoever. That's really the point. I think my big takeaway that I share is that these companies are getting sued for really bad things most of them didn't know about. Some of them were a little bit arrogant, but most of them didn't know it. And it had absolutely nothing to do with cookie consent — they had their cookie consent policy up.
Now, the regulations can nudge things in the right direction. But ultimately, it's the companies that want to pick up the reins and do better that drive the change, I think.
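The "cloud software uses other cloud software" growth Ian visualizes can be modeled as a walk over a load graph: one installed tag pulls in parties that pull in more parties. The dependency map below is entirely invented for illustration — real dependency chains are discovered at scan time, not declared in advance.

```python
from collections import deque

# Invented dependency map: each tag or script a site installs can
# itself load further third (fourth, fifth...) parties.
LOADS = {
    "tag-manager": ["analytics", "social-pixel"],
    "social-pixel": ["ad-exchange-1", "ad-exchange-2"],
    "ad-exchange-1": ["data-broker"],
    "ad-exchange-2": ["data-broker"],
    "analytics": [],
    "data-broker": [],
}

def downstream_parties(root):
    """Breadth-first walk of the load graph: every party ultimately
    reached from one installed tag."""
    seen, queue = {root}, deque([root])
    while queue:
        for dep in LOADS.get(queue.popleft(), []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return sorted(seen - {root})

parties = downstream_parties("tag-manager")
# One installed tag reaches five downstream parties.
```

Even this tiny made-up graph fans out from a single tag manager to five parties; on a real page, where each node may load dozens of others, the same walk is what produces the hundreds of fourth and fifth parties Ian mentions.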
Justin Daniels 18:49
So what privacy risks are companies failing to address as they deploy new technology that gathers data? We're talking about their failure
Ian Cohen 18:59
to address it. So there's a lot of "unknown knowns" — to go back to the, you know, "known unknowns" —
Justin Daniels 19:05
You're not going to go Donald Rumsfeld on us, are ya?
Ian Cohen 19:07
I'm not gonna go down the Rumsfeld path, but I can't help citing a good line — a good line is a good line. The list is so long. The problem is that business as usual just isn't working, suddenly. And it's not working because the plaintiffs' bar sees an opportunity. So once you crack open the world we're talking about and show it to people, it's a field day for regulators and a field day for the plaintiffs' bar — and, you know, you're an attorney, so you can address this better than I can on the defensive side. So what's the question again? Let me try this again — I want to give you a shorter answer.
Justin Daniels 19:47
Not at all — just your thoughts around the kinds of privacy risks that companies are failing to address when they deploy new technology, be it their website or a new product. It just seems privacy continues to be an afterthought, for many of the reasons we're talking about here, where companies just don't know.
Ian Cohen 20:06
It's that they just can't control what's going on between themselves and the browser. They're just not in control of it. You're sharing data with a bunch of people, so you have to ask yourself at any given moment: is there any data on this page that the rest of the world shouldn't know about?
Justin Daniels 20:27
You're basically saying, If I understood you correctly, is if I have a medical condition, say I'm looking at prostate cancer, or something like that, and I go to a hospital website, I could start getting ads for prostate care or whatnot, because I don't even realize my data was being shared that I was on the site with some third party, who's now targeting ads to get me to buy something related to what I
Ian Cohen 20:51
was researching. Exactly right. That's right. And even if you can say, "Well, the ad company promises it suppresses this kind of data," layers of algorithms deep, the data is still out there. And then later you look at — you know, I'm not implying that Oracle is doing this — but you look at a data set that's big enough and a machine learning algorithm, and you just can't send that data out there and say, "Oh, we'll figure it out later." So, you know, there's a bunch of stuff — they really need a view into what's happening behind the scenes of their page. What I say they have to worry about is that the browser is, for all intents and purposes, a new endpoint. Okay? People spend all this money on endpoint detection and response and internal privacy and security for their employees, but what's happening on the other side of the equation is the problem. And you can even see, in terms of the breaches that have happened and the kinds of incidents, people are targeting that side of the house more than the server side now, because it's easier to target.
Jodi Daniels 21:59
So to help change and influence company behavior, how do you see the privacy law landscape potentially impacting that change?
Ian Cohen 22:11
Jodi Daniels 23:24
Curse-free podcast — maybe, like, the PG curse level.
Justin Daniels 23:29
Oh, shit, censorship, get her off.
Ian Cohen 23:33
"Oh, shoot," you know. So anyway — "holy crap," I mean, that's probably my most polite way of saying it. So you see this, and I think that's the big change. The big change, in a nutshell, is suddenly a company says, "You've got to inventory everything." And when they do that, first they're trying to figure out how, and they spend a lot of money doing it — it's actually very easy to do. And the next step is, once they see it, there's a wake-up call. So I think, you know, any company that sees that is suddenly asking a bunch of other questions. So that's what I see as the big difference: just the awareness, the visibility, and it's coming into focus. So the lawsuits — yeah, they're gonna happen. A lot of people think, "What's going to happen to me?" They're starting with the biggies; it'll work its way down. And the other thing I think is happening that's just, you know, huge for giant companies is that GDPR is getting real teeth, right? I mean, heavy fines every single day. There's a tracker you can log on to — it's called, I think, the GDPR Enforcement Tracker — and you can see, every single day, multiple companies getting fined. And the fines are one thing; the reputation loss is quite another. So I think most companies here that are multinationals have to build a business that complies with California and a business that complies with GDPR, and those two things alone drive a lot of change. You know, even if the laws aren't perfect, or they only talk about cookies and people interpret them that way, there's this sea change where suddenly people get a sense of what's going on. And there are always good actors that step up — the best actors, the early adopters — and everybody else tends to want to copy them.
So I see that change happening very quickly between now and, you know, Q2 of next year, which is two quarters after the law goes into effect. If I give you an inventory of everything, it's kind of like a full-body scan. Back in the '80s or '90s, you could get these full-body scans, and they were horrid, because you would go to the doctor, you'd get a scan, and they'd come back and tell you anything and everything that could possibly be wrong with you. And that would make most people freaked out and miserable. So with full-body scans, and with genetic cross-section tests, there's been a lot of, you know, work about what you should tell people — should you tell people? If you tell me everything that could go wrong in a scan — yeah, you know, it's a pretty big wake-up call, and it does not make people happy campers. So then the question of what you do about it comes into play very, very quickly. And once you're asking that question, you're off and running to doing the right thing, right? And so I think that these nudges — which is, I think, how regulations are supposed to work — have this ripple effect that gets magnified. And so what we're seeing is, you know, some of the largest companies are going to do better. And when they do better, they're gonna want to tell everybody about how much better they're doing. And when they do that, other people want to copy them, because this becomes the standard. And I think what Apple's done — you know, whether it was through self-interest or not — when they set up their Private Relay, it's great for end users. But if you're a company, you can never rely on everybody being on Private Relay or everybody using a Safari browser, so you have to come up with something that is browser-agnostic, system-agnostic. And yeah, I see a lot of change. I'm very optimistic.
I don't know how you feel about it, but just based on Congress, I do not see the American Data Privacy and Protection Act passing anytime soon. And even if it did pass — I mean, there are real organizations, like Cisco, that have to do their day jobs regardless of whether there's a new law or not. But I'm not holding my breath on a federal law, even though most companies would prefer it, I think.
Jodi Daniels 27:33
We talk about this a lot with a number of different people, and my answer is always: it's a political dart. I think thoughtful legislation is not that particular version, and I think many privacy professionals would say a thoughtful version is still a couple of years away. But politics are politics, and either side could throw a dart — you never know. So I don't know. I think we'll have more states. In this next legislative session, I think we'll have more state bills introduced, and we'll probably keep picking some up over time.
Ian Cohen 28:07
We work with some of the AGs — former AGs — and I agree. It's a winning political issue, first of all, for both sides. Right? It's one of the few issues you can pick up, and privacy — whether the reason is, you know, "be fair" or "get out of my private life," whatever it is — everybody wants it for some reason. It's kind of foundational in the US, as it should be. So yeah, it's a winning political issue, and there ain't many of those these days. So why wouldn't you take it?
Justin Daniels 28:39
Unless it's a political ad being texted to me — and they won't get rid of that now.
Jodi Daniels 28:46
They won't get rid of that. So, Ian, with all this information that you know about how data is used and collected, what is your best privacy or security tip that you might offer your friends when you're gathering together at a party?
Ian Cohen 29:01
Jodi Daniels 29:05
It's interesting you say that. I recently was talking to a company, and we did a very light-touch cookie scan, a tracker scan, and came across some different trackers that we know are part of the advertising ecosystem. And the company's answer was, "No, we don't do any advertising. We don't do paid advertising." And they had no idea about these different trackers. It was a very interesting conversation. Needless to say, there'll be some investigation as to where these tags are, what they're doing, what's happening. It was likely, maybe, one campaign one time; the tags were left on the site, multiple people have come in since that activity, and now they have to figure out: what is this, why is it here, and what do we do now?
Ian Cohen 32:32
Look, you know — I don't want to implicate anybody specifically, again, because they don't have a chance to defend themselves, but I will anyways: there's a bunch of social media tracking companies that put, you know, the little logos for all the social media companies that you can configure and just drop on your site. And if you look behind the scenes, there are dozens of trackers they're sharing with and triangulating with. So, "Hey, I just put some social media tags on my site — I'm not advertising." Yeah, but they are, you know, and by proxy, so are you. And that's the problem, right? These third parties, as I started with, have other third parties. That sounds complex, but at the end of the day, when you just see it, it's like: hey, I dropped this nice little thing on my site, and all of a sudden five advertisers and the BlueKai tracker and the retargeting companies are in there. And it's just a mess. It's a mess. And if all those trackers were people standing around you — if you could see a bunch of paparazzi around you, rather than a bunch of dots on a screen — I think you'd get the idea.
Jodi Daniels 33:45
We say that all the time. When we do presentations, I'll often say: imagine you have someone behind you, stalking and tracking every single thing you do — kind of a day in the life. You probably would turn around and question why this person is following you and notating every single thing you've ever done. And that's what we do in the digital space.
Ian Cohen 34:03
I love that analogy, because it just really drives it home — like somebody's shopping on a street corner with a crowd of people following them around taking notes. And, you know, I think that blocking it — so, I won't get into exactly what and how we do it, but blocking this stuff, drawing a fence around the site, and starting with a clean baseline... because what we're really trying to do is start with a clean baseline. Say, "Hey, I want all these services on my site. I want to have a modern site that runs fast and has the newest and best services for my customers." And so you're using a lot of tools to do that, and those tools introduce everything we're talking about. So what do you do about that? You just create a clean baseline, so I can say, "Hey, look, I share this data with Visa, and I don't want anybody else near that data." Whether you're talking about really serious stuff like skimmer attacks — Magecart attacks — or you're just talking about the more commonplace occurrences of, when you go to the website, why is the data going to 9, 10, 14 different countries? You just need to start with that clean baseline. That's the goal — I probably should have started there. And how do you get to that goal? Well, you do your spring cleaning and all the stuff we talked about, but it really is about just getting a clean baseline. And from there, you know, for newer companies it's easier — if you're starting a company, you're in an awesome position to not crowd up your site with as much junk. If you're an existing company, it's harder, but it's doable.
Justin Daniels 35:34
So when you're not helping companies put a fence around their website and learn all this stuff, what do you like to do for fun? Travel?
Ian Cohen 35:47
I've been — so I've been on the road for the last week, but it was mostly for work. I love traveling. My wife is from the UK, so we travel in Europe and the UK. We actually have an office in Dublin — a small one — but it gives me a really good excuse to go over there. I did surfing for a while. I'm from New York City, but I took up surfing when I got out here to try to be more California-like. I'm awful at it, but I love it. Jet skiing — anything on the water is great. I'm way too old, but I decided to take up rock climbing in the gym, so I'm doing that. And, you know, writing — I also like to write articles and publish those. I just love getting around. I love talking to people, meeting new cultures. You know, we went to Mauritius, the island of Mauritius off, you know, the east coast of Africa, a couple years ago. So, um, yeah, you know, it's a good life. And during COVID — we live up here in Northern California, which, when there are not giant fires threatening to burn down our house, is beautiful. And, you know, we live in a very nice part of the country, so we also get out and do a lot of hiking. Yeah.
Jodi Daniels 37:02
Ah, one of your favorite activities — hiking in the outdoors. Yes. And where can people go to learn more about you and Lokker?
Ian Cohen 37:10
Sure. Well, to get the study I just talked about, just go to lokker.com — lokker.com. And we call it a research paper — well, we were told, "Don't call it a white paper; nobody will read it. Call it a research study." So it's a research study. But everything we just talked about — I think it's a quick visual guide to what's going on, and from there the narrative is all kind of spelled out in that research study. But go to lokker.com, watch the videos — and we've done a couple of articles recently, for any information we can offer, so you can check those out too — or reach out to me.
Jodi Daniels 37:47
Excellent. Well, Ian, thank you so much for sharing all of this great information. We really appreciate it.
Ian Cohen 37:52
My pleasure. Great to meet you guys. And, you know, get a better chair — I have some recommendations; reach out to me later. It was good to meet you guys. Thank you very much. Likewise.
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.