Debra Farber is the CEO of Principled LLC, a privacy-first tech advisory. Debra is a global privacy and security advisor, investor, and privacy tech enthusiast. She has over 16 years of privacy and security leadership experience at companies like Amazon Web Services (AWS), BigID, Visa, and IBM. She currently serves on multiple advisory boards for organizations including The Rise of Privacy Tech, D-ID, and Taskbar.
Here’s a glimpse of what you’ll learn:
- Debra Farber describes her winding career path in the world of privacy and security
- How can companies start setting up a privacy by design program?
- Bridging the gap between the business and privacy sides of your company
- Debra discusses the emerging updates in the privacy tech space
- What is Web 3.0?
- How Blockchain can be used for privacy purposes
- The looming privacy challenges companies face today
- Privacy pro tip: remember that privacy is about protecting people
In this episode…
Once your company has checked off the basic privacy requirements, how can it continue to move forward? What should you be implementing next?
According to Debra Farber, the first step is to do an inventory of your current practices. Where are your potential privacy problems? Are you over-collecting data that may be causing compliance issues later in the process? By mapping your biggest privacy challenges, you can begin to work backward and prevent problems from happening. This way, you can create a privacy program that is uniquely designed to meet your company’s needs.
In this episode of the She Said Privacy/He Said Security podcast, Justin and Jodi Daniels sit down with Debra Farber, the CEO of Principled LLC, to talk about building a better privacy plan for your company. Debra discusses how to recognize your weak spots, the key to bridging the communication gap between different departments, and the new trends and updates in the privacy tech space.
Resources Mentioned in this episode
- Debra Farber on LinkedIn
- Debra Farber on Twitter
- “TROPT Defining the Privacy Tech Landscape Whitepaper” by Debra Farber
- “Architecting for Privacy and Data Protection on Hedera” by Debra Farber
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: firstname.lastname@example.org
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.
You can get a copy of their free guide, “Privacy Resource Pack,” through this link.
Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:21
Hi, Jodi Daniels here. I'm Founder and CEO of Red Clover Advisors, a certified women's privacy consultancy. I'm a privacy consultant and a Certified Information Privacy Professional, and I provide practical privacy advice to overwhelmed companies.
Justin Daniels 0:36
Hi, Justin Daniels here. I am a technology attorney who is passionate about helping companies solve complex cyber and privacy challenges during the lifecycle of their business. I am the cyber quarterback, helping clients design and implement cyber plans as well as helping them manage and recover from data breaches.
Jodi Daniels 0:55
And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, and professional and financial services. In short, we use data privacy to transform the way companies do business. Together, we're creating a future where there's greater trust between companies and consumers. To learn more, visit redcloveradvisors.com. You ready for a fun discussion?
Justin Daniels 1:32
You had me at Web3 and NFT.
Jodi Daniels 1:34
Oh my goodness, it's gonna be so much fun. I am delighted to introduce Debra Farber, who is the CEO of Principled LLC and a global privacy and security advisor, investor, and privacy tech enthusiast, with 16-plus years of privacy and security leadership experience, including at AWS, BigID, Visa, and IBM. And the fun fact is that Debra and I last remember seeing each other when we used to get together at in-person IAPP conferences, having coffee at a big round table. I'm so excited to be able to see and chat with you again today.
Debra Farber 2:14
It's a pleasure to be here. Thank you so much for inviting me. And it has been too damn long.
Jodi Daniels 2:19
I know. Thank you, Zoom, for reconnecting us. So with that in mind: so many interesting companies that you've worked at, and a wide swath of experience. We always like to ask everyone to help all of us understand the career journey that got you to where you are today.
Debra Farber 2:40
Yeah, so it's definitely a winding, weird path. But I guess, getting into privacy 16 years ago, it would have to be an odd path. I originally went to law school with the hopes that I'd end up in intellectual property law. While there, I took copyright law, and the professor was also teaching privacy law, and I really enjoyed his style of teaching. So I ended up taking privacy law, and then realized that so much of it was addressing new technology. Instead of being about well-settled law, it was about new laws and statutes that would cover the space, and at the time it was sectoral: HIPAA for healthcare, Gramm-Leach-Bliley for financial services. But I was looking to eat up anything technology related, and thought, this is the future, right? With privacy law, we're trying to prevent mistakes as products are being built, so that we can protect people. And that jazzed me up, got me excited. I don't want to say the law is forward thinking, specifically, but it felt like it was set up to allow future innovation through guardrails that were set up by the law. After I graduated law school, there were very few law firms, if any, that were focused on privacy law; it would really be more sectoral, like a healthcare attorney doing HIPAA work. So I ended up going into industry and started my career path at American Express, managing online privacy. I maneuvered around after that and went to work for a very large, well-funded startup called Revolution Health, founded by Steve Case, the founder of AOL. It ended up being a massive failure because it was a little before its time, before smartphones, and it was trying to do too much social stuff.
But at that company, that's where I really learned about the software development process, and how you fit privacy around the SDLC so that we're building compliant and ethical apps, whatever we're building. After that, I went to work for IBM, where I was working with government agencies specifically. This was in DC, and I was working mostly with federal agencies like Veterans Affairs, and there I learned a lot about the very peculiar, specific public-sector privacy requirements. And then I thought, you know what, DC is too slow. Everything happens so slowly, because policy has to happen first. I really wanted to go out west where the innovation is, so I moved out to the Bay Area, and I was there for about eight years. I've done a lot, so I would say half my career has been on the consulting end, and the other half has been more operational GRC and strategy. Mostly I want to stress the strategic part, and how required that is to be able to advise product teams on how to build compliantly.
Jodi Daniels 6:07
Completely agree on the need for strategy and understanding how privacy is going to fit into the whole company. It's not just, here's a checklist, because those are the things that have to actually get done, but once you're done with them, it can't go in the digital dust file; it needs to actually live and breathe. And I bet it's hard to imagine: 16 years ago, when you started, law firms barely had a privacy practice. Now there's such a fascinating war for talent, and so much need for privacy these days. It's been really fun to watch.
Debra Farber 6:41
Well, that's great to hear, because I'm definitely not in that legal space. I actually have never practiced, and I still never have. But I absolutely think my law degree has been super helpful, especially being so early in privacy, in navigating this field, particularly on the regulatory side. I mean, there's a lot of legislative stuff that I'll look at and advise on, but the advice comes through a lens of consulting and standards; I don't want to interpret law. So it's kind of a fun way to use my law degree without actually practicing law. And then, just to round it all out, the innovation side is really what I've been impassioned by ever since. I've gone on to some other companies; BigID was one. Originally I was an advisor to them when they were still pen on paper, just building out the product with a few people, and then I came on board as their head of privacy strategy once they raised their money. Now they're the second privacy unicorn. For anyone who's listening, BigID is in the data discovery space, trying to find all that sprawl of personal data across a large organization and then being able to surface it for various purposes, including giving it back to an individual when they request it. And that began my career in what is now being defined and called the privacy tech space, where we're building products and services specifically for privacy problems. We're about 15, maybe 20 years behind security in that vein, and I think there are a lot of people who don't realize that privacy and security are separate things that need their own sets of tools, processes, and talent.
Jodi Daniels 8:35
Well, that's why we have the She Said Privacy/He Said Security Podcast, because I spend all day long trying to get people to understand that privacy and security are not the same thing. And then it's always very fascinating. All right, well, I'm also outnumbered here by the attorneys.
Justin Daniels 8:54
What do you mean? Everyone says that; they wonder if you are an attorney.
Jodi Daniels 8:59
No, but I did not go. I thought you were an attorney.
Justin Daniels 9:05
You spent a lot of time at the law school.
Jodi Daniels 9:07
I married a law school grad. Why wouldn't I get it? We're good. No, I'm not an attorney, but I did really spend time with the law. When I went to business school, it was across the street from the law school at Emory. I went there undergrad and grad, and genuinely spent a lot of time studying at the Emory law school library. People around town genuinely thought I went to law school. Some great social engineering. It's been fascinating. So now I've just come full circle myself into understanding the law and operationalizing it into business. Yes.
Justin Daniels 9:45
I think it's funny what you said about using the law degree in your consulting work. So I do work around drones, and I have positioned myself as: hey, I have my pilot's license for drones and I can help you consult on this, and you get the law part for free. It's just included in the deal.
Debra Farber 10:01
Oh, that's so great. Yeah, I like that. Wait, do you actually have a pilot's license? Yep. Well,
Jodi Daniels 10:10
Cool. Come hang out in Atlanta, and you will get to meet Traveler the drone; he comes out and takes aerial pictures. Yes. I'll have to take you up on it. Please do.
Justin Daniels 10:21
Or you can see us sooner, when we're in San Francisco. Jodi and I are speaking at RSA, and our topic is drones and autonomous vehicles: privacy and security versus surveillance.
Debra Farber 10:32
Oh, I love it. I think that's great. Amazing. Yeah. All right.
Jodi Daniels 10:36
So I'm gonna wheel us back in a little bit. We could chat all day long. Debra, from your view, how would you describe the current state of privacy programs in companies today?
Debra Farber 10:52
I would say that the current state for large organizations is maybe a lower-maturity, full-on privacy program that mostly addresses compliance, mostly addresses notice and choice and the things you can easily show a regulator you're complying with, but is probably really, really light on fixing any of the privacy problems in the first place. So it probably doesn't have a huge, robust privacy by design program; it probably has maybe one or two privacy engineers, if that, trying to figure out how to address a very large problem within the organization. I think most companies misunderstand privacy tremendously.
Jodi Daniels 11:42
If you were to break that privacy by design piece down a little bit more: for a company that has done the compliance, check-the-box piece and wants to move up a little bit to privacy by design, what would you say they should be doing next?
Debra Farber 12:00
Well, first I would say that they should do an inventory of their current practices, right? You don't want to just start setting up a privacy by design program that's a completely separate entity, a separate thought process, from the current design process. You want to fit it into the current procedures, processes, and way of working for engineers. So the first thing I would do is understand what the product development process is like now. Where are those potential privacy problems? Where is there over-collection of data you don't really need that's maybe causing more of a compliance problem down the road? You want to map your biggest privacy challenges and compliance hurdles. Are you getting many requests for one thing over another? Are you getting requests because your disclosures are unclear? Are you getting them because your product is opaque and people feel like they can't trust it? You've got to first understand the privacy challenges within your organization, then work backwards from that, and then prevent those problems from happening, whether with tech or with the way you've architected your networks or your services or products. So I'd do that, and then absolutely key is training. If you don't understand the principles of privacy, and how they're different from security, it's going to be very difficult to get behavioral changes within the organization. So train your product development managers and train your engineers. Quite honestly, Justin, I don't know how you feel about this, but personally, I want to see the day where you're not even allowed to push any code to production unless you've had secure code training.
You know, I think there are a lot of mistakes and a lot of potential risks introduced in the coding process. There are privacy tech companies now working to address some privacy problems through the development process, and I think that is going to be a lot of help in de-risking the product development process.
Justin Daniels 14:24
Debra Farber 15:11
It's a really good point, because the sales process definitely slows down when you're introducing privacy requirements. I think, like what you're saying, you should talk about that sooner. If that's going to be a sticking point, then maybe putting the terms up front makes a lot more sense, even if you don't have the actual pricing down or whatnot. You want to have a meeting of the minds started earlier on, where your privacy requirements are addressed. I don't really have a full-on answer for how to fix it, but I think you've identified a major problem: salespeople are often an afterthought when it comes to privacy and security. They're like, we're just selling this product; someone else is going to deal with that. But if it's going to be stopping their sales process or slowing it down, or maybe making it possible to lose the sale, what I have seen is that the salespeople get pretty annoyed at the legal folks or the privacy folks if those folks have a requirement that's going to put a bump in the road without giving them a way to easily jump over it. So what I've seen is salespeople getting really annoyed at the privacy people for not having more sales-ready information that would satisfy the prospect up front, instead of later on at the contracting phase.
Now, I don't know whose job it really needs to be, but there definitely need to be some executives from the privacy side and from marketing and sales to come together, be friends, and figure out how to get the right resources for the privacy team to create that marketing content. Because a regular marketer who's not in the space cannot just all of a sudden write a document around privacy. I mean, they can, but you could tell that they did, because it's not written in the way that privacy professionals need, or expect, to hear information. So I think it comes down to having a better relationship with sales, and making sure it is clear what the privacy team is capable of doing and not doing, given the resources they have. Because again, I think it always comes down to the privacy team being completely under-resourced. And if sales has the expectation that the privacy team is going to be supporting marketing, then the team needs the resources.
Jodi Daniels 17:46
I love that you brought up the connection between privacy and marketing, and really creating those sales materials. In talking with companies and working with them, those that really bring it up at the beginning not only don't slow the process down, but can actually gain more customers. Because if you have all of that at the beginning, when a prospect is evaluating Company A versus Company B and it's already there for them to see, it can create more customers up front, which is better.
Debra Farber 18:15
Absolutely. Yeah. I mean, lead with trust. If you hide the ball, it makes it seem like you're trying to be sneaky, right? That's not a partner I want to go put lots of money into, buy their service from, and feel like I'm going to have a good experience with. You want to know that the product folks and your customer success manager, after the sale is made, are still respecting you by telling you the truth about the product. You basically want to make sure you have a good understanding of how you're going to be treated up front, before you actually buy in and get stuck with a new vendor.
Jodi Daniels 19:00
Exactly. You mentioned that privacy is lagging security when it comes to technology. What are you seeing emerge in the privacy tech space? What can you share?
Debra Farber 19:13
Oh gosh, there's an explosion of privacy tech. I think the perception of privacy right now is at a very low point, at least in the United States. There are many people who feel like the genie's out of the bottle and can never be put back: there is no more privacy, surveillance capitalism is here to stay. And I'm here to tell them that's totally valid, but times are changing, and I'm seeing the future in the companies. So I'm an advisor to The Rise of Privacy Tech, which has been defining the space: what is privacy tech, what's privacy tech adjacent, what is completely not privacy tech. We've come out with a white paper that articulates and breaks down the area. I'm not sure if you've seen it yet; if not, I can send you a copy. In it, we've really broken down the privacy tech landscape as it's shaping up right now, the different sectors, so to speak. We're trying to build a tech stack, but that'll come later as we're talking about different use cases. So here we have examples like the SDLC process and where privacy fits in. For instance, I'm an advisor to a company called Privado that's doing static code analysis for privacy workflows, as opposed to security; we haven't really seen that on the market yet. Then there's the post-production side: how do you manage privacy over the lifecycle of the data, once you have established a relationship with a customer? So we broke it out based on the lifecycle of development and the lifecycle of the data. There are hundreds of privacy tech companies right now, many of them B2B, some of them B2C, which is interesting to see: tools made for individuals to try to take more control over their data when they're dealing with other companies.
You know, obviously we've got tools around data mapping, data governance, finding all that data. BigID came on the scene with that, but now I'm seeing privacy tech vendors doing that for mid-market companies. In the past, the discovery capabilities of a BigID were really reserved for the enterprise, based on the pricing, so small and medium-sized businesses that were trying to comply with GDPR and CCPA didn't really have tools at the time. Now a lot is coming to market for small and medium-sized businesses. I'm also seeing people using, believe it or not, blockchain as access control. I actually do think the blockchain space is going to be a really good tool for privacy, in that it allows you to use zero-knowledge proofs to extract just one fact about an individual, like, are you over 21, for access to a physical or digital space, without compromising any of the other information about you. So instead of having to show a license to a bouncer that has my address on it, when I don't want them to see my address, I just want them to know I'm over 21. I think there are a lot of deployments based on new architectures like blockchain that will actually be privacy enabling. There's a lot going on right now; it's an explosion, and it's really hard to even keep track of it all.
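[Editor's note] The over-21 example Debra gives can be sketched very loosely in code. This is not a real zero-knowledge proof (production systems use dedicated ZKP schemes such as zk-SNARKs); it is a simplified hash-commitment illustration of selective disclosure, and all names in it are hypothetical.

```python
import hashlib
import secrets

def commit(value: str, nonce: str) -> str:
    """Hash commitment to a single credential attribute."""
    return hashlib.sha256(f"{value}:{nonce}".encode()).hexdigest()

# Issuer (say, the DMV) commits to each attribute on the credential separately.
attributes = {"name": "Debra", "address": "123 Main St", "over_21": "true"}
nonces = {k: secrets.token_hex(16) for k in attributes}
credential = {k: commit(v, nonces[k]) for k, v in attributes.items()}

# The holder reveals ONLY the over_21 attribute and its nonce to the verifier
# (the "bouncer"); name and address stay hidden behind their commitments.
disclosed = ("over_21", attributes["over_21"], nonces["over_21"])

def verify(credential: dict, field: str, value: str, nonce: str) -> bool:
    """Bouncer checks the revealed value against the issued commitment."""
    return credential[field] == commit(value, nonce)

assert verify(credential, *disclosed)   # over-21 confirmed
# The address commitment alone reveals nothing about the address itself.
```

The design point mirrors what she describes: the verifier learns one predicate, not the whole document.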
Jodi Daniels 23:06
Justin Daniels 23:07
So you mentioned the blockchain. Let's talk a little bit about that. Can you educate our audience a little bit as to what Web3, or Web 3.0, is?
Debra Farber 23:19
Sure, yeah. So in Web 1.0 we had just static content on the web: you put up a page and someone else can read it. In the Web 2.0 space, we have dynamic content, which made the web pretty social. We were able to have bidirectional conversations and much more; we called it the social web. And that also brought us the rise of big tech and surveillance tech. Because the original web was designed to share data, it didn't have protocols in place to secure data; the security stuff was added on after the original web was put in place. But that didn't always address the privacy problem, or the fact that, with surveillance capitalism, people have lost control over their own data. So Web3 is a set of protocols and new technologies, some actually in place, some being developed now, that will allow people to have control over their own information through technology like decentralized identity. We'll be able to find a virtual space, plug in there, and manage our own keys to our own identity. We might even manage several identities, because we don't have to have just one. Privacy-wise, in one particular space, maybe I like to share a lot of details about my health, so my identity there is going to be a little different; maybe I have an avatar, depending on the space. Then I want to plug into another space where maybe I'm into anime and talking about something else. I'm not into anime, I'm just giving an example. And I could plug in with a completely different identity. It's still me, but different aspects of me that I want to share with that organization.
And I'd own the keys to each of these identities. Instead of having to rely on third parties and intermediaries to facilitate my identity, whether it's an Okta or a Facebook login or a Google login, I'd log in myself, and I'd have the keys on the hardware that I'm showing up with. So decentralized identity is a huge part of Web3. But there's so much more, right? There are so many other technologies, and it's going to enable us to basically cut out intermediaries. When we talk about being decentralized, we mean that power is not centralized in one organization anymore. It gives us a lot more freedom. I mean, that's the stated goal: to give people more freedom to have control over their own identities, and over how information about them is shared.
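[Editor's note] The "several self-managed identities" idea can be sketched as follows. Real decentralized identity (DIDs) uses asymmetric keypairs so a service can verify a signature with only a public key; the symmetric HMAC stand-in below is used purely to keep the example standard-library only, and the `Persona` class and its methods are hypothetical.

```python
import hashlib
import hmac
import secrets

class Persona:
    """One self-controlled identity; a person may hold several."""

    def __init__(self, label: str):
        self.label = label
        # The user holds this key on their own hardware, not a platform.
        self._key = secrets.token_bytes(32)

    def public_id(self) -> str:
        """Identifier derived from the key; this is what a service sees."""
        return hashlib.sha256(self._key).hexdigest()[:16]

    def sign(self, challenge: bytes) -> str:
        # With real DIDs this would be a digital signature verifiable via
        # the public key alone; HMAC here just models "only the key holder
        # can produce this response".
        return hmac.new(self._key, challenge, hashlib.sha256).hexdigest()

# The same person plugs into different spaces with unlinkable identities.
health_me = Persona("health-forum")
anime_me = Persona("anime-server")
assert health_me.public_id() != anime_me.public_id()

# Login without an Okta/Google/Facebook intermediary: the service issues
# a fresh challenge, and the holder answers with their own key.
challenge = secrets.token_bytes(16)
response = health_me.sign(challenge)
```

The point of the sketch is the ownership model: the keys never leave the user, so no central login provider sits between the person and the space they join.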
Justin Daniels 26:06
It sounds like what you're really talking about, what's exciting in the privacy space, is how we can use decentralized digital ledger technology to enable people to interact with each other while also protecting their privacy. Are there any other specific examples that you can share? Because quite frankly, one of the challenges that I see when I work with companies in the Bitcoin and non-fungible token spaces is: how do you know that the person with that digital wallet is actually Debra? So enter, maybe, some of this decentralized way of authenticating people's identity without having, to your point, to give the bouncer the address. Can you talk a little bit about some of the things you're seeing, or the excitement, in that particular area?
Debra Farber 26:55
So when you say that particular area, do you mean like the authorization, and like KYC? And like,
Justin Daniels 27:01
I guess what I mean is, when you talked about putting technology around it, instead of giving the bouncer your license that has your name and your address: just give the audience an inkling of the opportunity with blockchain and how it could be used for privacy purposes.
Debra Farber 27:18
Yes. Okay. So just to set it up first: for the last six months, I've been working with a company called Hedera, which is actually a competitor to blockchain in that it is hashgraph technology. The founder created something that we believe is better than blockchain. It hits every button; it solves the blockchain trilemma, in that you can have all three of speed, decentralization, and security without compromising any of them. The technology is different from blockchain, and it's patented, but it is a distributed ledger tech, and I do believe it's going to be the future trust layer of the internet. So my view of blockchain is through learning Hedera Hashgraph first and then comparing everything to it. It's a very unusual view; it's just how I got into the space. On distributed ledgers: I wrote a white paper for Hedera. It can be used by anyone who's a dApp developer, a decentralized application developer, or building a DAO, whatever you're building on a blockchain. This is for you, even though I wrote it for Hedera. I'll send it to you so you can share it with your audience if you'd like. It pretty much goes through how you architect in a Web3 environment, how you architect for privacy and for being a privacy enabler, and it gives a crash course on data protection law for developers without the legalese: here are the requirements, here's why it's important, and here's what you need to do. That said, my advice would be: don't put personal data on the blockchain. What you can do instead is put hashes on the blockchain, or hashgraph, that correlate with the data you're holding in a database off-chain, and you've obviously got to secure that like normal. What you're doing is making the data pseudonymous.
Because this way, the company collecting the data and putting it on the blockchain is only putting hashes there. The point of the hash is verification and timestamping: when did this happen? Hedera has a consensus service capability that timestamps, with fair ordering, every single transaction, so any developer can query the mirror node and find out what transactions have happened and in what order. As a result, I've seen this used for consent. Okay, I'm getting into too many things at once here, but I've seen this technology used where you could use an NFT to store someone's consent. And for you, Justin, this would be a native NFT, not one that is meant to store value and be traded amongst other people; Hedera has an NFT capability that's just native to their chain. You could use that non-fungible token, which means it's unique, to store, say, Debra Farber's consent in this moment. So let's say it's a clinical trial in healthcare, and you want to make sure that everywhere you need my consent along the way, you've gotten it. Maybe I've revoked it; maybe I've given my consent again for the clinical trial. But then you have to update that consent again, say for another medicine trial related to it, or something else. My understanding is that for clinical trials, it's a constant negotiation process of consent, not just a one-time thing. As a result, you can keep updating your system to store that consent as an NFT, and it's 100% auditable in real time. So what I'm super excited about is that we're going to start seeing real-time systems being able to show us our personal data, but only to the right people, only to us, even using NFTs as access control to data. And then being able to
have instant audits. With all of these discovery processes today, we're trying to find the data that has been let out of the barn. Instead, we'll be able to say: this is the current state of your personal data right now, and it's instant. You can basically pull it instantly, and you don't need all this complex software trying to search for where that data is. So if you build it right, and this is what I mean by shift left, if you build things right and you architect for privacy problems today, then we can prevent all that paper chase: constantly trying to tag the data later on, sort it out, and find that data after the fact.
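The consent flow described here, a unique record that is updated as consent is granted, revoked, and re-granted, with every change timestamped and the full history auditable, can be sketched as a simple append-only log. The `ConsentToken` and `ConsentEvent` names are invented for illustration and are not Hedera's Token Service API; on Hedera, the ordering and timestamps would come from the consensus service rather than a local clock.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentEvent:
    subject: str    # whose consent this is
    purpose: str    # e.g. a specific clinical trial
    granted: bool   # True = consent given, False = revoked
    timestamp: str  # when the change was recorded

@dataclass
class ConsentToken:
    """A unique (non-fungible) consent record whose full history stays auditable."""
    events: list = field(default_factory=list)

    def update(self, subject: str, purpose: str, granted: bool) -> None:
        """Append a new consent event; nothing is ever overwritten or deleted."""
        ts = datetime.now(timezone.utc).isoformat()
        self.events.append(ConsentEvent(subject, purpose, granted, ts))

    def current_consent(self, purpose: str) -> bool:
        """Latest event for a purpose wins: consent is a negotiation over time."""
        state = False
        for event in self.events:
            if event.purpose == purpose:
                state = event.granted
        return state

token = ConsentToken()
token.update("debra", "trial-A", True)   # initial consent
token.update("debra", "trial-A", False)  # revoked
token.update("debra", "trial-A", True)   # re-granted for a follow-up phase
print(token.current_consent("trial-A"))  # True, with the full audit trail kept
```

Because the event log is append-only, an auditor can replay it at any moment to see exactly when consent was given or withdrawn, which is the real-time auditability Debra is describing.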
Jodi Daniels 32:22
So, Debra, if you think about companies today: you started sharing that many of them are at a low maturity level, and we just talked about some really cool, fascinating, sophisticated privacy technology. What do you think are some of the big challenges that companies are facing today?
Debra Farber 32:44
Well, I think it depends on the size of the company. With enterprises, I think the biggest challenge is that so much technical debt exists in the privacy space that compliance for privacy is a really big project at larger companies. If you've been around for 30 or 40 years and you've got hundreds of thousands of employees, data is everywhere, and cleaning that up is a huge mess, as is finding talent that wants to clean it up. And honestly, it's not just developers, but if developers have written code and things are now stuck in that code over time, it's really been an uphill battle to get them to want to work on fixing their code versus innovating. It's the same for privacy professionals. I burned out at Amazon over this; I was working on trying to help onboard companies to data deletion tools, and it's mind-numbing work, and it truly burnt me out. So I think it's going to be hard to continuously hire good people to clean up after people who've moved fast and broken things for years. I think that's a big challenge right now. I think you've got to hire a lot more privacy engineers, a lot more privacy product managers, and privacy consultants and advisors to get a team together and come up with a strategy around maybe re-architecting some things, rather than cleaning up exhaust at the end, because that's just a constant paper chase. You're never going to stop the exhaust that way; it's a constant problem. So fix that problem up front. Yes, it's going to cost money now, and it's going to cost more money. But it is better than keeping the status quo, which will most likely result in fines and potentially lawsuits, depending on whether we get a federal privacy law.
Yeah, so I would say, for small businesses, they don't know where to start. And a lot of the technology that was out there for privacy is not exactly easy for a one-engineer shop or so. What I'm seeing now is a lot of no-code solutions coming to market that are going to help small and medium-sized businesses that have a smaller tech shop or don't have the funds for a large privacy project or to hire a consultant. So that's what I'm seeing address that gap.
Justin Daniels 35:36
You know what I think? I think if Jeff Bezos is listening, you should be offering more Amazon people in the privacy department a seat on the next spaceflight.
Debra Farber 35:47
Space, if they want it! Yeah, I mean, that's fascinating. You know, I'd love to see if you can make that happen.
Justin Daniels 35:57
Might be a little above my pay grade. But anyway, as a privacy pro, what is the best personal tip that you might give at a cocktail party?
Debra Farber 36:08
Yeah, so my favorite tip whenever I'm asked this: very often, when we're talking about privacy in a company, we're talking about how we shield the company from liability for being non-compliant. And I just want to remind everybody that privacy is about protecting people. Even though we get a lot of pushback within our jobs at times, that is the one thing we can always emphasize, and we can even get others to wrap their brains around the whole problem when you reframe it from "privacy is about data protection" to, well, it obviously is about data protection, but reframing it as: privacy is about the people we're servicing. They're not just consumers; our product isn't just being consumed by some entity. Privacy is about fulfilling our obligation as a company to make sure this data is protected, because we want to protect this person. We don't want them to feel surveilled. We don't want them to have exposure. Everything should be working backwards from that framing. It humanizes it; it really makes people realize, oh, I guess I really should fix the code because it might actually harm a human, as opposed to, is a regulator going to come knock on our door? And why would a developer care about that? I think it will really spur more humanity. You really see yourself in it when you're talking about human beings, instead of some
Jodi Daniels 37:47
capitalistic consumer. Completely agree. I tell people all the time: you're talking to humans. Companies are buying because one human decided to evaluate another company designed by humans; it's humans making decisions and choices all the time. Now, when you're not busy advising companies and trying to change the privacy landscape, what do you like to do for fun?
Debra Farber 38:12
Fun? Well, I'd say two things right now. One, we got a puppy recently, and honestly, he's been taking my fiancé and me on a lot of walks, so I've definitely gotten a lot more active, and we've done parks and just play all day with the puppy. His name is Jamison; he is the best. And, sorry, you asked what kind of puppy he is? He's a Beaglier, so he's part Beagle and part Cavalier King Charles Spaniel. Great, friendly personalities, part hound, and wonderful. The other thing I've been doing for fun lately is really diving into the world of NFTs, at least in the Hedera space, where they're only a fraction of a cent, like half a penny, to send. So it makes it economically viable, and I'm really learning what it is that people find interesting about trading non-fungible tokens. I believe Justin and I could probably have a discussion for hours on this; I can even see the wheels turning right now. I know we don't have time for that, but it is definitely a hole I've fallen into over the past few weeks, and I'm enjoying it way more than I thought I would.
Jodi Daniels 39:28
If you were to create a little community of people who like to do this and find this fun, you'd have at least a group of two.
Debra Farber 39:35
Excellent. Oh my gosh, okay, great. Well, maybe we'll talk about that offline or in another venue.
Jodi Daniels 39:44
Well, I'm so glad that you joined us today. If people would like to learn more about you, connect, and see what you're doing, where and how should they do that?
Debra Farber 39:52
That's a great question. So, to connect, you can find me on LinkedIn, and I'm everywhere all over the web pretty much as @privacyguru. That goes for my email address too; it's privacyguru on Gmail, Twitter, LinkedIn, whatever. So find me on social media; I love social media when it's done right. And yeah, I don't really have a website right now; I'm doing contracting work. But most of the work I'm doing, and I didn't actually say this at the beginning, is that I'm on the advisory board of nine privacy tech startups. So especially if you're in that area, if you're endeavoring to build something in privacy tech, or want to figure out how to get involved with The Rise of Privacy Tech, the organization that is defining this space and bringing the whole space together, then let me know. I'm here to help, and I love when people reach out.
Jodi Daniels 40:48
Well, thank you again. We really appreciate such an enlightening conversation about a really fun emerging space in the technology world.
Debra Farber 40:55
Oh, it's so great to be here. Thanks for having me.
Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven't already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.