Understanding Privacy and Security Regulations in the Ad Tech Space

Yacov Salomon is the Founder and Chief Innovation Officer at Ketch, a coordinated set of apps, infrastructure, and APIs designed to build trust with customers and grow with data. He is also the Chief Technology Officer at Stanza. As a recognized authority in machine learning and AI and a seasoned tech expert, Yacov has built industry-leading innovative technology and teams at startups as well as Fortune 500 companies across many verticals. Before Ketch and Stanza, he was the Head of AI and Innovation at Superset Venture Studio and a Lecturer at the University of California, Berkeley.


Here’s a glimpse of what you’ll learn:

  • Yacov Salomon shares how his career evolved to founding Ketch
  • The three privacy issues surrounding ad tech
  • What are the implications of consumer data collection?
  • How companies can track their ecosystem to control and protect consumer data
  • Yacov explains the challenges companies face when complying with federal privacy laws
  • Advice for businesses developing privacy programs
  • The future of privacy laws and regulations
  • Yacov’s best privacy and security practices

In this episode…

In the evolving privacy and security space, advertising technology is becoming increasingly invasive, with major companies like Sephora facing settlement actions over how they process consumer data, especially in digital advertising. So, how should you manage consent and develop privacy programs to control and protect your customer’s data?

When navigating federal privacy laws surrounding ad tech data, Yacov Salomon recommends establishing a permission layer. This provides customers the option to consent before releasing their data to third-party systems. By implementing automated technology into your company’s ecosystem, you can identify confidential data and regain control over it to maximize customer trust and comply with privacy guidelines.

In today’s episode of She Said Privacy/He Said Security, Jodi and Justin Daniels sit down with Yacov Salomon, Founder and Chief Innovation Officer at Ketch, to talk about how companies can manage ad-tech, privacy, and consent requirements. Yacov discusses the three privacy issues surrounding ad tech, the implications of consumer data collection, and how companies can track their ecosystem to control and protect customer data.

Resources Mentioned in this episode

Sponsor for this episode…

This episode is brought to you by Red Clover Advisors.

Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.

Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.

You can get a copy of their free guide, “Privacy Resource Pack,” through this link.

You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.

Episode Transcript

Intro  0:01  

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels  0:22  

Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.

Justin Daniels  0:37  

Hi, Justin Daniels here. I am passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.

Jodi Daniels  0:52  

And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, ecommerce, media, and professional services. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more, and to grab our new book, Data Reimagined: Building Trust One Byte at a Time, go to redcloveradvisors.com. Well, today is a super fun day. But before we dive in to our amazing guest, do you have any idea what today is, Justin? September 29? Yes. Do you have any idea what happened on September 29? You’re going to tell me. I am: 20 years ago, we went out on our first date.

Justin Daniels  1:54  

Wow, you really remember.

Jodi Daniels  1:57  

I do. I am an elephant.

Justin Daniels  1:59  

What did we do on that day?

Jodi Daniels  2:00  

We went to dinner at a restaurant that sadly is no longer here. And it was a very special restaurant, because we went back for many celebrations, including the night we got engaged. So now that we have that underway, we can switch gears, because it’s a happy day, and we’re going to chat marketing and privacy. So we have Yacov Salomon, who is a recognized authority in the fields of machine learning and AI, and a seasoned technology executive who has built industry-leading innovative technology and teams at startups, as well as Fortune 500 companies across multiple verticals. Before Ketch, where he is now, he was the VP of Engineering at Salesforce, leading the AI, ML, and analytics engineering group, and VP of Data Science at Krux, which was acquired by Salesforce. Yacov lectures on machine learning and AI at UC Berkeley and holds a PhD in mathematics from the University of Melbourne, Australia. We’re so excited that you’re here to join us today.

Yacov Salomon  3:02  

Thank you very much for having me. I’m looking forward to the conversation. Absolutely.

Justin Daniels  3:07  

So why don’t we dive in and ask you how your career evolved to founding Ketch?

Yacov Salomon  3:14  

Great question. I have two answers to it. There’s the external and the internal kind of reasons. The external is circumstances: as you described, we got acquired. All the founders here were part of this startup called Krux, a data management platform operating in the marketing and ad tech space. I was leading the machine learning and AI efforts there, and we got acquired by Salesforce in 2016. I found myself running the machine learning, AI, and analytics functions for a large portfolio of acquisitions and products that were built at Salesforce. My CTO from Krux was then the CTO of Marketing Cloud, and it was 2017, and the number one priority was to make Salesforce GDPR compliant. So we rolled up our sleeves and went to work, and we built something that the industry held up as a gold standard for what to do with GDPR. But we all felt deep inside that we weren’t sure we’d done a good job. We felt like we left the system a lot more brittle. It was costing a lot more to run, in AWS bills, and we knew that if another regulation came around the corner, we would have to go back and do all the work again. So we all felt like there’s got to be a better way to do it, and if we were to build a solution for our previous selves as technologists, what would it be, and how could we make it simpler? That’s the external, the circumstances reason. Then there’s the internal itch I wanted to scratch personally, so to speak, in founding Ketch, coming off two decades of being what I often call an accidental data scientist. When I started, it wasn’t a thing, but I happened to be at the right place at the right time to ride this wave, and I’m very fortunate for that. As an industry, machine learning and AI have grown phenomenally over the last decades; we have cutting-edge technology, cutting-edge algorithms. But we’re still relegated to solving the less important questions. I often say that if somebody goes to the doctor, and she orders them to go through all these invasive procedures that reveal all this sensitive information, we never think twice; we know that there is an absolute value we’re going to attain from that information, and that’s all that’s going to be done with it. If my machine learning algorithm is ever to have the chance to deliver the same, it’s clear to me that in many places that level of trust needs to exist. So my personal wish for what we’re doing here at Ketch is to unleash data in trusted circles, to enable this trust technology, so that machine learning, intelligent systems, and data can go and solve the real hard problems that we have.

Jodi Daniels  5:56  

Well, congratulations on the huge run that you had leading up to Salesforce and the great work that you did there. You’re absolutely right, people are always referring to Salesforce as the gold standard, oftentimes, when they’re looking at compliance. So part of what we’re going to be talking about, and where people get confused, is ad tech. And I shared in our little pre-work and pre-call here that some people think ad tech is the hardest part of privacy to understand. So I was hoping that you might be able, in our brief time here today, to share with our audience what makes ad tech so complex to understand. And then we’ll kind of dive into how that ties to, you know, privacy laws and regulations and GDPR.

Yacov Salomon  6:46  

Yeah, I think there are three things that make ad tech difficult. The first one is the complex ecosystem. The second one is what I term the identity problem. And the third one is the consent problem. So we can dive into each one of them. The ecosystem problem: in most people’s minds, you know, you watch a sporting event and an ad comes on, and for the consumer there are really only two parties involved: the context where the ad is displayed, the publisher, as we usually call it, and then the brand behind the ad, the advertiser in this case. And it feels like this is only a two-player game. The reality, of course, of the ad tech industry is that there is a plethora of systems and services and providers behind the scenes to enable that interaction. And we’ve built that up over many years, especially with the advent of programmatic advertising. There is the supply-side platform and the demand-side platform and the exchanges and the optimizers, and so on and so on. And it’s amazing, and it’s all geared around delivering the right message at the right place at the right time to the right person. So we’ve evolved there a lot. But that complex ecosystem creates a very complex data flow reality; I call it data supply chains. Data flows through many, many hands and many, many systems, delivering value and producing some incremental intelligence wherever it goes. But at the end of the day, it’s hard to say, at any step of this big data supply chain, who owns the data, where it came from, what regulations apply to it, and so on. So it’s really that complex ecosystem that is the first part. The second one, tied to that, is the identity problem. I don’t know how many folks are aware of it, but we all present ourselves on the internet through our browsers or mobile devices, and we get all sorts of identities attached to us: there are ones attached to the browser and to the mobile device, but there are also ones that we attain by logging in in places; we have our email, we have a username. And then there are other ways in which identity gets manifested, in the form of cookies, first party or third party. And those identities are what mark the data and where it goes, to be able to do the work. It’s what we call the device graph. It’s, again, a complex network of identities, and to be able to associate data from one place to another, you need to have a good understanding of that identity world. And that is very tough, especially now that third-party cookies are going away, for example. And the third problem is this notion of consent. We think of consent, and I think we’ll talk about it more, as giving permissions. And there are regulations that necessitate giving those permissions at a certain granularity. For example, GDPR says, okay, you can’t just say, do you agree or not agree to collecting my data; you have to be a lot more specific about what you are going to do with the data. So when it’s just a company speaking to a consumer directly, that can be quite direct; they’ll say, okay, we’re going to do these five things, do you agree or not? When you go into an ad tech scenario, beyond just saying what you’re going to do, there is this plethora of who’s going to do it. And if you were to expose to an end consumer the full spectrum of optionality, this provider can do these things, and that provider can do those things, it would be a phone book, a long, long list of checkboxes you’d have to sort through to do permissioning, and no consumer would be able to make those decisions clearly. So it’s very tough to strike the right balance with ad tech of how many permissioning levers you can afford a consumer at the end of the day, and how you translate that to the rest of the system. So it’s difficult, and there are tactical examples. Kochava is a great example; there was the FTC versus Kochava. Kochava is a location data service that exists in many mobile apps, and for many of us it collects data in the background. And of course, you can cross-reference that information with things like real estate data. And then it becomes extremely problematic when it comes to Planned Parenthood and other issues that we have on the agenda now, and being able to identify users and where they visited for different purposes. So all of that complexity manifests itself in real-world problems.
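
To make the identity problem a bit more concrete: the device graph Yacov describes is essentially an identity-resolution exercise, stitching cookies, device IDs, and hashed logins that were observed together into one cluster per person. Below is a minimal sketch of that idea using a union-find structure; the identifier names and pairs are made up for illustration, and real device graphs rely on probabilistic matching and far more signals.

```python
# Minimal identity-stitching sketch: observed identifier pairs (e.g. a cookie
# seen alongside a hashed email) are merged into clusters with union-find.
# All identifiers below are illustrative placeholders.

class IdentityGraph:
    def __init__(self):
        self.parent = {}

    def _find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def link(self, a, b):
        """Record that identifiers a and b were observed together."""
        ra, rb = self._find(a), self._find(b)
        if ra != rb:
            self.parent[rb] = ra

    def same_person(self, a, b):
        return self._find(a) == self._find(b)

graph = IdentityGraph()
graph.link("cookie:abc123", "email_sha256:9f2d")
graph.link("email_sha256:9f2d", "mobile_ad_id:IDFA-42")

print(graph.same_person("cookie:abc123", "mobile_ad_id:IDFA-42"))  # True
```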

Jodi Daniels  11:40  

I really like how you summarized that into those three different buckets. I think that was super clear. Justin, I think you’re going to help us maybe dive into the consent piece a little bit?

Justin Daniels  11:49  

Well, I guess I’m going to throw this to the two of you when I think about what we’re talking about. And I’m going to bring up Sephora, something that Jodi was talking about a little bit earlier this week in The Wall Street Journal. And so, Yacov, if I go to Sephora’s website, and I’m tooling around because I want to get my wife some goop or some perfume, Sephora knows I’m there. And then, is what really happened that caused them to get a fine that Sephora said, we can take Justin’s user information and sell it to Facebook, and then Facebook will run an ad on Justin’s Facebook feed, so he’ll get another ad from Sephora to go buy Jodi something else? And the bottom line is, I never consented to Sephora selling my stuff to Facebook so they can throw me ads. From an ad tech perspective, is that what happened here? Or can you help enlighten our audience?

Yacov Salomon  12:46  

Yeah, so the short answer is yes, that is what’s happening there. And like I said, here we’ve extended the example from you and Sephora to another third party, in this case Facebook, but I want the audience to understand that a lot of times it’s third, or fourth, or fifth, or sixth, or even seventh order. So everything I’m about to say just gets extremely magnified with the complexity of the ecosystem. But in a sense, yes. And these days it sounds ridiculous that we’re even talking about whether this is a right you should or should not be afforded, but we have to understand that when we started down the track of marketing and ad tech, the idea was to really benefit the consumer. Advertising and marketing were the means by which we paved the road of building the internet, right? We’ve created all these services, and that’s what companies used, and the idea behind it was, how do we get you a better message? It’s only now, after years of doing that, that we’ve woken up to the reality that there are negative externalities to it, which is this idea that there is an invisible side of what happens to my data. And so, at the very least, what regulations demand is that companies are transparent about what happens. So if you think about GDPR, from a technical perspective, I usually say what it introduced is what I call the permission layer. No more data just running around free, going into systems and doing what it needs to do. Going forward, as well as all the pipelines that the data flows through, we need pipelines for permissions. And so in that example of what happened with Sephora: that permission layer needs to do three things. The first thing is it needs to collect those signals; it needs to ask users, what do you want to do? The second thing it needs to do is propagate that to all the other systems that hold the data, that collect, store, and analyze the data. And the third thing it needs to do is make sure that those systems give effect to what is required based on the permission. So if you say, don’t collect my data for this purpose, it really doesn’t get collected. If you say, delete my data in the system, it really gets deleted. Where it became a little bit of a gray area is in points number two and three. Sephora collected consent; I don’t think that was the issue. But the extent to which it got propagated, under the CCPA at least, and I’m not a lawyer, so you guys are a lot more expert in this field, but the way I interpreted the CCPA was that it was not as explicit about what selling of data really means. And a lot of companies had the opportunity to maybe interpret moving the data around to deliver more value to the consumer, presumably some better ads, as not really selling, but rather providing a service. Well, the Sephora ruling says, no, that’s not the case: every time you move the data around, every time you include another third party that leverages the data to do some work, we will consider that a sale, and therefore you need to apply all the required permissioning through all the different levels. But the problem is that...
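
To make the three jobs of the permission layer concrete (collect the signal, propagate it, enforce it), here is a minimal sketch in Python. It illustrates the pattern Yacov describes, not Ketch’s implementation; the system names, purposes, and the notify() transport are placeholders.

```python
# Minimal sketch of a "permission layer": collect a consumer's choices,
# propagate them to every downstream system, and check them before any use.
# System names, purposes, and the notify() transport are illustrative only.

from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    user_id: str
    purposes: dict = field(default_factory=dict)  # e.g. {"targeted_ads": False}

def notify(system: str, record: ConsentRecord) -> None:
    # Placeholder transport; a real system would use webhooks, queues, or APIs.
    print(f"-> {system}: {record.user_id} {record.purposes}")

class PermissionLayer:
    def __init__(self, downstream_systems):
        self.downstream_systems = downstream_systems  # e.g. ["crm", "ad_platform"]
        self.records = {}

    def collect(self, user_id, purposes):
        """Step 1: capture the signal from the consent UI."""
        self.records[user_id] = ConsentRecord(user_id, dict(purposes))
        self.propagate(user_id)

    def propagate(self, user_id):
        """Step 2: push the choice to every system that holds the data."""
        for system in self.downstream_systems:
            notify(system, self.records[user_id])

    def allowed(self, user_id, purpose):
        """Step 3: enforce; deny by default if no explicit grant exists."""
        record = self.records.get(user_id)
        return bool(record and record.purposes.get(purpose, False))

layer = PermissionLayer(["crm", "ad_platform"])
layer.collect("user-42", {"analytics": True, "targeted_ads": False})
if not layer.allowed("user-42", "targeted_ads"):
    print("skip sharing user-42 with ad partners")
```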

Jodi Daniels  16:10  

I was going to add in that part of it. So especially under the CCPA here in the US, one of the distinctions, and this is part of the challenge with how companies handle it, is that I don’t have to collect consent to drop the different trackers, and there can be hundreds of them. So for anyone listening, if you’ve ever tried to go to a recipe site, I think it’s a great example, and it takes a long time to load, that’s because that recipe site is giving you great free recipes, but it’s being paid for by ads. And those ads are trying to be tailored to you through a long list of ad tech. But there’s no requirement here to say, yes, it’s okay to drop those trackers, or pixels, or tags, or whatever flavor word people want to use; they’re different, but synonymous in my little ecosystem. So here, I could go to Sephora, or any other site in the US, and I could do my shopping, and then those pixels could still fire, and they could share the information. The modern laws, for those who have them, like the CCPA in California, and what we’re moving towards with the other four states coming in ’23, will let us say, no, I can opt out of these cookies. Now in theory, we also have the Digital Advertising Alliance, which has been around for 10 years, and that’s supposed to say all sites are supposed to let me opt out of that. But it’s complex, as we’ve been talking about.

Justin Daniels  17:40  

I guess here’s what I want to ask you as me being kind of the audience today is I handled data breaches. And so when I’ve come in almost every time, the company doesn’t know where their data is, isn’t the kind that is going to require breach notification. And so what I’m trying to understand is, and I think of code you’ve been dealing with this is on your second layer about the identity. And how our companies if they can’t even figure out where their data is for breach notification purposes, when a breach happens, which is worst case. How are they able to implement systems so they can track in this ecosystem, you talk about identity and consent, so that all of this is done? It just seems to me from my perspective that that isn’t happening. And it’s just going to get worse. Because now we have all this other technology coming onto the scene, Blockchain autonomous vehicles pick your flavor, where the date of creation is a 10 or 100 times x, what it’s been

Yacov Salomon  18:43  

Yep, no, you’re absolutely correct. When we talk about the evolution of the market in terms of privacy, we talk about comply, control, and secure. And I think the first mental-model shift that companies have to go through is this idea that privacy is important. By and large, we’ve been beating this drum for years now, coming up to five, and I think a lot of people have picked up already that, okay, privacy is not just a fad that is going to come and go; this is something that we’re all going to have to start respecting. The dark side of privacy, as far as companies are concerned, is the fact that, yes, some of this flies straight in the face and against the grain of how we’ve been handling data up to this point. You know, the advent of big data meant that we all collected first and asked questions later. It was this idea of cheap storage and cheap processing, so just collect as much as you can. And the example of all these third-party JavaScript tags firing on a site, collecting all that information and shoving it over to third-party services to be used later on for something, has been the model we’ve been operating under. Absolutely, one of the things that any organization needs to do, as it starts trying to grapple with how it’s going to implement the notion of complying with privacy, is to take control of the data it has. At Ketch, one of the first things we’ve built is what we call data discovery and classification: technology that goes into your systems, searches through the mountains of data you have, and picks out the schemas, as we call them, the main themes of your data, the structure in it, and where the sensitive data lives, so that you can start planning where you want to go first, how you’re going to apply it first, which system needs your attention. Absolutely, a lot of companies are behind the eight ball here, and they’re not yet in control of that. Definitely, all the companies with a lot of legacy systems find it a lot more difficult, because we talk to financial institutions that have terabytes of data that haven’t been structured, just sitting in flat files everywhere, and getting control of that will take time. So this is almost like the digital transformation, which required companies to change fundamentally how they operate. I think now we’re in what I call the trust transformation, and that will be a big shift, again, in how we control data and how we understand the data we have.
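
As a rough illustration of the data discovery and classification step described here, the sketch below pattern-matches sampled column values to guess where personal data lives. Real discovery tools use much richer detection (metadata, ML classifiers, lineage); the table and rules in this example are invented for illustration.

```python
# Toy data-classification sketch: flag columns that look like they hold
# personal data by pattern-matching sampled values. The table and rules
# below are made up for illustration only.

import re

PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "phone": re.compile(r"^\+?\d[\d\s().-]{7,}$"),
}

def classify_column(values, threshold=0.6):
    """Return the first label whose pattern matches most sampled values."""
    sample = [v for v in values if v][:100]
    for label, pattern in PATTERNS.items():
        if sample and sum(bool(pattern.match(v)) for v in sample) / len(sample) >= threshold:
            return label
    return None

table = {
    "contact": ["jodi@example.com", "justin@example.com", "yacov@ketch.com"],
    "notes": ["called back", "left voicemail", ""],
    "tax_id": ["123-45-6789", "987-65-4321", "000-11-2222"],
}

for column, values in table.items():
    label = classify_column(values)
    print(column, "->", label or "no sensitive pattern detected")
```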

Jodi Daniels  21:26  

While I’m part of that transformation, we have different regulations, I hinted right, California doesn’t require me to opt in first. But under GDPR, I do have to opt in first. And there’s 150 privacy laws, all with their own flavor. So that’s really challenging for companies to do. Can you talk a little bit about, you know, what are the the key problems you’re seeing companies have and how you’re helping those companies solve them? Absolutely. And,

Yacov Salomon  21:55  

Absolutely. And again, we have the hindsight now, in 2022, that we didn’t have in 2018. I remember even Marc Benioff at Salesforce was saying, okay, GDPR is out, we’re just going to apply GDPR globally; this is going to be the standard, let’s not worry about this region versus that region. And, you know, as a first thought, as a first principle, it maybe kind of makes sense. But when you think about it over time, you see a system that is evolving all the time, and it’s not that feasible. Because if I’m a citizen of a certain country, or a certain state, or a certain region, and some other part of the world changes its regulations, I don’t necessarily want my services to change all of a sudden, right? If GDPR 2.0 comes out in a couple of years, and I’m in Texas, why all of a sudden do I need to do something different? So I think the reality is that, as a humanity, as a society, we’re going to keep evolving our notions of what privacy is, and there are going to be a lot of experiments about what the right set of regulations is. Is it opt-in? Is it opt-out? What do the regulations require? What are the limitations? This will keep evolving, and I suspect that everywhere is going to be a little bit different. I mean, even under GDPR, I know that countries like Israel have a different flavor of GDPR than, say, countries like Germany. So there’s not going to be one size fits all, unfortunately, in this kind of world. So the reality is that what we need to do is make it easy for companies to apply global policies through what we call clicks, not code. How do we enable, you know, the TurboTax kind of scenario? How do we enable professionals and GCs and policymakers to go into a system and, regulation by regulation, define the policies that embody the posture a company wants to take with respect to the regulations in that particular jurisdiction? And that, I think, at the heart of it, is where a lot of the privacy-enhancing technologies are evolving: this ability to front a simple UI where professionals can define the regulations and change them in code, sorry, in clicks, not code.
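
One way to read "clicks, not code" is that the legal posture lives in configuration that a privacy professional can edit, while the application simply resolves it at runtime. Here is a simplified sketch under that reading; the jurisdictions and rules are placeholders, not legal guidance and not Ketch’s actual schema.

```python
# Sketch of "clicks, not code": the legal posture lives in configuration that a
# privacy professional edits, and the application just resolves it at runtime.
# Jurisdiction rules here are simplified placeholders, not legal guidance.

POLICIES = {
    "EU":         {"default": "opt_in",  "rights": ["access", "delete", "portability"]},
    "US-CA":      {"default": "opt_out", "rights": ["access", "delete", "opt_out_of_sale"]},
    "US-DEFAULT": {"default": "opt_out", "rights": []},
}

def resolve_policy(jurisdiction):
    """Pick the most specific policy configured for a jurisdiction."""
    return POLICIES.get(jurisdiction) or POLICIES["US-DEFAULT"]

def may_process(jurisdiction, has_explicit_consent, has_opted_out):
    policy = resolve_policy(jurisdiction)
    if policy["default"] == "opt_in":
        return has_explicit_consent   # GDPR-style: nothing without consent
    return not has_opted_out          # CCPA-style: allowed until opt-out

print(may_process("EU", has_explicit_consent=False, has_opted_out=False))     # False
print(may_process("US-CA", has_explicit_consent=False, has_opted_out=False))  # True
print(may_process("US-CA", has_explicit_consent=False, has_opted_out=True))   # False
```

When a new law arrives, only the POLICIES table changes; the enforcement code stays the same, which is the point of the configuration-driven approach.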

Jodi Daniels  24:15  

I like clicks, not code. I haven’t heard that one. I like it. I mean, our next t-shirt.

Justin Daniels  24:21  

It will just be on the back.

Jodi Daniels  24:22  

Yeah, we could. We have a lot of t-shirts that keep coming out of the show. We could have a list of, here’s all the

Justin Daniels  24:27  

t-shirts. So, Yacov, for a company starting their privacy program, what advice would you give them?

Yacov Salomon  24:38  

Um, there’s a lot of practical advice that I can get to in the moment of, you know what to do first and what to do second, but I wrote a piece a few years back and I don’t think get a lot of attention then or maybe it will be time that says, Don’t worry about regulation. And what I meant by that then is that first, like all companies, the first thing you need to worry about is your customers, your, your, your users, and you need to connect with them and what, where they’re at and what they want. And I think that if you do that, right, you know, almost always, you’re going to be on the right side of the fence, as far as all these things come to be. So in terms of how transparent you are about what you do with either in terms of how much do you follow through on your statements, or what you do and don’t do with data, how much you’re clear. And when things go wrong about, you know, acknowledging that something went wrong, and you’re going to change that. This all builds, we’ve seen that again, and again, and again, builds trust builds, brand, and so on, so on. So the first thing I will say is, think about your consumers think, think, think about your customers and think about what they would prefer, and how would they like you to deal with the data? The second thing I would say, very importantly, is sorry, I don’t know if you’d want to jump in with a question, but that

Justin Daniels  25:57  

I was just going to say, maybe you should repost your prior piece; it might now be relevant, and people will be in a different frame of mind to consider what you’re saying.

Yacov Salomon  26:06  

That’s right. The second thing I would say is remember that, especially in 2022, you’re not alone in the sense that, you know, we’ve we’ve established quite a few by now Hatton’s of how you can efficiently become compliant. Using technologies that are already out there available. You might still be, you know, an organization might still contemplate do we build with we buy, but at the very least, there are templates, there’s blueprints, about, you know, the set of technologies that you need to be compliant, okay. So if it’s a content management platform, it’s a right orchestration platform, if it’s propagations, of consent, and signals, and so on, so on, so on. So we have, the tools are there and you you, you can implement them. And the good news about these tools is that you can go ahead and implement them, and worry about how you want regulation to affect you later. Because a lot of these tools, if they’re done correctly, should not just apply for regulations. Today, you’re not buying the GDPR solution, or the CCPA solution, you’re buying a technical solution that you integrate with your system that has to prove the test of time, as regulation changes, policy changes, you’re going to natively adopt, because he had the configurations to do so. And once you have these two, once you thought about your customers what they want, and you and you started implementing some of the capabilities to be compliant with your data stack, then you can come together and figure out what are the policies that matter for the company. As you apply those technologies that help you there, the first thing you’ll do, as we discussed, is discover and classify your data. So you’re going to start having a picture also of what your data is. And from there, we move forward.

Jodi Daniels  27:56  

Thank you for that. I love when we put customers first and then we think about the regulations. And when you were saying we don’t buy a GDPR or CCPA solution, that’s exactly what so many companies have done: there was the GDPR one, now I need another one. And we have four new laws coming into play in 2023, and we have this maybe-national law here in the US. We do kind of like to ask our guests for their crystal ball view into the future. So what do you think is going to happen with privacy going forward? And for everyone listening, I really hope that you heard the idea: regardless of these different regulations, you have to build a fundamental base, and then any of these new laws that come will be much easier to apply. But Yacov, what does your crystal ball say?

Yacov Salomon  28:51  

Yeah, I do think I have a crystal ball, of course, like everybody, but I don’t know if it’s the right one. I think we’ve already touched on one thing: the layer of privacy that has to do with society is going to continue to evolve, both because more and more places are going to start contemplating what the right laws and regulations are, and they’re going to have the benefit of the regulations and laws that came before them, and because the shortcomings of current regulations are going to be updated. There’s going to be GDPR 2.0, there’s going to be GDPR 3.0, and we’ll keep going, I think, on the society kind of level. My hope is that very soon, and I wrote a piece about that as well, we’re going to move from what they call a command-and-control regulation paradigm, where regulators come in and it’s really about fines and defining the bad behavior, to more of an incentive-based kind of framework, right? Like we’ve done with the environmental movement, where all of a sudden we created market incentives that made the industry rise up and come up with solutions. So I really want to see the privacy and trust world evolve beyond "let’s not get fined" into how we evolve the intricate conventions we have in society around data, you know, fiduciary relationships and circles of trust, and enable those on the internet, to create all the scenarios, if I link back to the top, of how data is going to get used through intelligent systems. So I think in that world, incentives will come up in one way or another, and we’re going to start building towards the upside. On the other side, from a technical perspective, I’ve mentioned already that we always saw this as a three-step kind of evolution: comply, control, and secure. I think control is where the world is right now; comply is kind of accepted, and now we’re trying to figure out how to make control work. So I think the next step is really about securing the data, and that’s where cybersecurity is going to meet privacy in many ways, right, two sides of the same coin. And what I mean by that is, we want to make sure that all of these, we call them permits, and the permit is the most fundamental object here, the contract that is signed for every piece of data in use and every use case, what can and cannot be done, are enforceable by default, and that includes securing the data, in many ways, as it goes through the system. And we already see a lot of techniques coming out of academia and getting implemented in various places, what I call modern cryptography: things like homomorphic encryption, and differential privacy, and multi-party computation, zero-knowledge proofs, and so on. They’re going to be some of the tools of the machinery that we’re going to implement to really make permissions stick, and at the same time let data do what data’s day job is, which is to fuel intelligent systems and deliver value. So I see, technically, we’re going to bring a lot more of that machinery into the fold under privacy-enhancing technology. And from a society perspective, I hope we move towards an incentives- rather than control-based kind of program.
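
Of the privacy-enhancing techniques named here, differential privacy is the simplest to show in a few lines: add calibrated noise to an aggregate so that no single person’s record is revealed. A toy sketch follows, with an illustrative epsilon and made-up data; it is not a production mechanism.

```python
# Toy differential-privacy sketch: release a count with Laplace noise calibrated
# to sensitivity / epsilon, so no single individual's presence is revealed.
# Epsilon and the data below are illustrative only.

import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) via the inverse CDF of a uniform draw."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon=0.5):
    """A counting query has sensitivity 1, so the noise scale is 1/epsilon."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Made-up visit log; roughly 1 in 7 records match the sensitive predicate.
visits = [{"user": i, "visited_clinic": i % 7 == 0} for i in range(1000)]
print(private_count(visits, lambda r: r["visited_clinic"]))
```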

Jodi Daniels  32:05  

I certainly agree on the incentive part; I think that’s why we’re starting to see privacy as a big part of ESG these days. And you are smirking over here with your privacy-enhancing technologies.

Justin Daniels  32:20  

You mentioned cryptography, and then you get all happy. Fascinating topic for a variety of industries. But speaking of privacy and technology, do you have a tip that you’d like to offer as a best practice for friends when you’re, you know, at a cocktail party out in the eastern hills of Oakland?

Yacov Salomon  32:43  

In the context of privacy, we’ve said it several times in this conversation, and I think you pointed it out first: it all begins with the data. At the end of the day, understanding what data you have is the pivotal and most important foundation here. Whether you’re a data scientist, or an analyst, or an engineer, or a privacy lawyer, knowing and understanding what data you have is the key. So if you do nothing else, finding a tool that helps you discover and classify the data you have is the best thing you can do to start moving forward.

Jodi Daniels  33:27  

Now, when you’re not providing privacy tips, and building privacy companies, I do like to do for fun

Yacov Salomon  33:36  

So I have two daughters, aged six and eight, and the family really loves hanging out outdoors. In fact, last Wednesday and Thursday we were in Yosemite with friends, and we were fortunate enough to make it to the top of Half Dome and back. It was an amazing trip; I highly recommend it. Even if you don’t make it all the way to the top, it’s a beautiful, beautiful part of the world. But yeah, looking forward, this weekend I’m going climbing with friends as well. Some of the joys of being in Northern California are the access to beautiful national parks and the fantastic outdoors.

Jodi Daniels  34:19  

Absolutely. Well, here in Atlanta, we’ll just take our colorful leaves right now in the fall season. Well, Yacov, it’s been a true pleasure to talk to you. If people want to connect and learn more about you and Ketch, where can they go?

Yacov Salomon  34:32  

So ketch.com is our website; you can go there. You can find me on LinkedIn or on Twitter. And if you want to email me directly, it’s yacov@ketch.com; I’m very happy to strike up a conversation. I will also say there is one more project that I’m involved with in this space. It’s called the Ethical Tech Project, and you can look that up as well. What we’re trying to do there: it’s a nonprofit, interdisciplinary kind of effort to try and advocate for ethical data usage. We have some interesting work going on there, putting together a reference architecture: what does the privacy stack look like? So if you want to get involved with that as well, by all means.

Jodi Daniels  35:19  

Excellent. Thank you so much for sharing. We appreciate it.

Yacov Salomon  35:24  

Thank you so much. Thank you for having me.

Outro  35:32  

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.