Intro 0:00

Welcome to the She Said Privacy/He Said Security Podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.

Jodi Daniels 0:21

Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.

Justin Daniels 0:36

Hi, I am Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade. And this

Jodi Daniels 0:58

episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Well, hi. What’s up? We haven’t done this in a little while. Indeed, and that is because, to all of our listeners, long time and short time, we’ve made a change. In 2026 we are going to be releasing every other week, and we got a little bit of snow and ice, so we didn’t start in January. So we’re kind of catching up. There might be a week or two that might have been a little lost. So after five and a half years of recording almost every week, we’re mixing it up, and we’re going to every other week. So if you missed a week, you didn’t miss anything; just go back and re-listen to your favorite episode. And thank you so much for joining us. And it is a little cold where we are, so all of us are in dark... I guess we’re in, like, winter blues. Literally, if you’re watching the video, yes, we’re both wearing dark blue. Okay, there we go. All right. Well, today we have Amy Worley. Oh, wait, no, you’re doing... Oh, I’ll do the introduction. We haven’t done this in so long, I’m so confused. Had a lot of coffee today. It was a little bit more regular. That’s true. That might explain it, or my brain is frozen. I don’t remember what to do. Okay, we have Amy Worley today, who is a seasoned executive and thought leader in cybersecurity, privacy, and AI governance.
She leads BRG’s global data compliance practice. With a unique blend of legal, technical, and strategic expertise, Amy brings a multidimensional perspective to digital risk management and value creation. Amy, welcome to the party.

Amy Worley 3:11

Thank you. I’m very happy to be here, especially after your break. So glad to be the first one back. Wonderful. Now it’s your turn. Really, yes, we just find

Justin Daniels 3:22

it funny because Amy is going to talk about AI governance, which is exactly what you’re doing in California.

Jodi Daniels 3:27

No, we’re not talking about me. We’re talking about Amy.

Justin Daniels 3:29

 Okay, Amy, can you tell us a little bit about your career journey?

Amy Worley 3:34

So I would describe it as swervy. When I first got out of law school (and I don’t like to tell anybody when that was, except if you’re watching the video, you can probably see my hair is gray), I started out actually as a constitutional lawyer doing Fourth Amendment search and seizure, like privacy in the context of the Constitution. I did that for a little while. I like to tell people I decided that I needed to be able to buy peanut butter and not charge it on my Visa card, and so I moved into a larger law firm, and my career and the privacy statutes just sort of happened together. So I am a trial lawyer by background and by training. Starting just before 2010, I moved more and more into privacy work and data breach response as the data breach statutes came into being. I did the law firm world until 2016, then I went in-house, and I was the global chief privacy officer for a multinational pharma. And in 2019 I joined BRG, where I lead our information compliance practice. So I certainly did not set out to be a privacy lawyer. I set out to be a constitutional lawyer, and I sort of stumbled into privacy law.

Jodi Daniels 4:53

Very fascinating, and we could probably have an entire episode on how those two pieces are intersecting at the moment. But instead, we’re going to talk about your book, because you have an exciting book called The Confidence Advantage, and a really interesting framework as well, sort of this confidence by design. So I was hoping you could tell us a little bit more about the premise of the book and the design framework, and then we’re going to get into more

Amy Worley 5:19

of that? Yeah. So the confidence by design framework is really born of what I was seeing in my practice. I work with clients on cybersecurity matters, on AI governance matters, and on privacy matters, and I was really just seeing a lot of silos: information silos, structural silos between those teams. And we’re at a point right now where everything is digital, right, where business is a digital enterprise, and so it’s my view that we should be talking about digital trust writ large. So what I did with the confidence by design framework is really align NIST privacy standards, NIST security standards, and NIST responsible AI development standards into one framework. And following the 11 principles, organizations can hopefully use the same language to talk about digital trust, to manage it at the enterprise level, and to be able to educate the basic workforce member: what are the things that I need to know, to do, and to not do to help us leverage this data to grow the company, in a way that also appropriately manages risk?

Jodi Daniels 6:34

You liked that? Yeah,

Justin Daniels 6:36

you were I thought that was interesting to put all three together, because I’m thoroughly familiar with two out of the three,

Jodi Daniels 6:43

interesting, because you can go learn about the third

Justin Daniels 6:45

one, honestly, I want to, I’d like to get a copy of the book. Anyway, I can send you one.

Jodi Daniels 6:49

You might... happy to send you one. Yeah, you might be able to have, like, special access.

Justin Daniels 6:54

If I could get a copy, I’d be happy to give you a review. Awesome.

Amy Worley 6:58

It’s also available on Amazon, and the Kindle downloads are really inexpensive on purpose, because my goal here really was to start a conversation, not to sell a bunch of books.

Jodi Daniels 7:09

Well, yeah, my daughter, our daughter, has the exact same thing: Mommy, have you made money on the book? And we said our goal is not to make money on the book. Our goal is to educate and inform. So, exactly. Thank you to all of you who have spent 99 cents on Kindle. Thank you.

Justin Daniels 7:29

So you argue that trust is a competitive advantage in today’s regulatory environment. What does building trust look like in practice, and how should privacy and compliance leaders think about earning it?

Amy Worley 7:40

Yeah, so this is why the book is called The Confidence Advantage and not the trust advantage. So I argue there’s a distinction, right? So Jodi, if I say I trust you, I mean that in like, the nicest way, Jodi, I trust you. But if I say, Jodi, I have confidence in you, what I mean in the context of this framework in this book is all evidence would establish that you’re worthy of my trust. I have confidence because I follow your work and I’ve read your book, and I have confidence that you understand data privacy. And so when we talk about developing a competitive advantage with a confidence by design program, we mean establishing an evidence based program that proves that your customers, your stakeholders, should have confidence in how your organization manages data.

Jodi Daniels 8:30

Amy, what would be maybe one or two examples of evidence that would be really common in this kind of a framework? Sure.

Amy Worley 8:39

So the book actually goes through lots and lots of metrics, but the 11 principles are “we” statements, right? We are proactive, not reactive. We design for transparency and explainability. And so we would look for things like, on how many products do we have transparency notices and, in the case of AI, explainability metrics established, and we look at some specific evidence of explainability. In some cases it’s as simple as the Flesch reading ease score. In other cases, it’s doing some surveys: do people understand what they’re reading, in terms of explainability? Data minimization is actually a core principle, and all my privacy friends are jumping up and down excited that I put data minimization as a core principle, but it’s also absolutely crucial for data security, because data is an attack surface. And so we would look at things like: how often is our data retention policy enforced? How automated is it? How much data beyond a certain age do we have sitting around? Things like that. So for all 11 principles, I offer metrics so that you can measure each of them. Because one of the things that I’ve seen as a consultant is that a lot of times we see these not-implemented programs, right? And I think one of the problems in the implementation gap is, what are you measuring? How do you tell if it’s implemented well? Really important.

Justin Daniels 10:13

So Amy, I have a question. I literally was at a conference speaking on AI, and I get a lot of phone calls where people are asking me about tools, or can you review the terms of service? And I have to back them up and say, well, before we do that, we need to understand the technology. But where I have to back them up even further, and this is the question I ask, is: well, this is a tool, and how you handle suppliers is really part and parcel of your larger AI governance approach. And what I’d love for you to share is, I’m still finding people are calling me very tactically to talk about tools and skipping over the whole AI governance part, with the idea, I guess, of we’ll just figure that out later. But really, that should come first, not later.

Amy Worley 11:01

Oh, thank you for that question. I have a whole chapter on this. I always say tools will magnify your mess. If a mess is what you start with,

Justin Daniels 11:10

that’s well said. I definitely want this book. 

Jodi Daniels 11:14

You might be borrowing my language. 

Justin Daniels 11:15

No Anyway, please go ahead. 

Amy Worley 11:21

A tool is a way to help create efficiency around a process. And so you have to have a process in place first. And the 11 principles are also AI governance principles, right: safety, quality, explainability, transparency. And yes, I get asked this all the time. As a matter of fact, people will ask it during the negotiation of the statement of work: what tools are we going to use? I’m like, I don’t know what you’re doing yet. So yes, what we first figure out is, what do we want to do with the data? And then we say, yes, we can do that if we... and then we build the “if we’s” and get all of those written down, and then we decide what tools will make those processes efficient. And in most cases, it’s not a tool, or, as we say in the South, a tool. It’s one, two, or maybe three tools.

Jodi Daniels 12:18

That was funny, okay, we’re supposed to stay in privacy.

Justin Daniels 12:22

Well, I guess a follow-up question I have for you is about when you’re on these engagements. I just met with an AI committee a week ago, and I’m like, I don’t want to show you guys tools. Really, what we have to discuss is: what is your strategy? What friction points do you have? What are you trying to solve? And can AI really help you solve that? And I’d love to get a sense for your approach, because again, I keep getting people who send me tools, and I’m like, are you sure your first use case should be customer facing when you’ve never really implemented AI before?

Amy Worley 12:51

Yeah, oh my gosh, I feel like you are the choir that I’m preaching to. So, and this is also discussed in the book, the first thing we do on an engagement is a pre-mortem, and that is: this project, our AI governance project, is dead on the table, and we are so sad about it. And then I ask everybody how it died, and they always know. They know that we went customer facing too soon, or we didn’t have appropriate resources, or we’re asking people to wear too many hats. And so we do this dead-on-the-table pre-mortem to scope it right and get really clear on what they are trying to get done with AI governance. And part of the rules of the pre-mortem is, the tool is not dead, the project is. So we get really granular on the human part of this, and then when we leave, hopefully, we have a plan to implement AI governance without killing the patient. We know what the roadblocks are going to be, everybody has said what winning looks like, and we can go forward from there, leaving our poor, pretend-dead project where it lies.

Justin Daniels 14:04

Because it sounds like (and yes, I know I’m monopolizing with my co-host) you read the statistics around 95% of these AI projects failing, and it sounds like what your book, your process, and your methodology are designed to do is to say: yeah, we’re not going to fail because we did the right things up front, we just didn’t implement

Amy Worley 14:23

something. Yes, and it really was driven from my own frustration for clients who were experiencing that failure. You know, part of what we’re seeing is that a lot of the delayed governance work that the security team has been begging for, or, Jodi, that the privacy team has been begging for for years, that’s been deprioritized, those are prerequisites to stand up AI. And so I have a client whose CEO was like, we’re building all the AI, and we’re building it right now. And the privacy team and the security team said, that’s awesome. Can we now do the things like access control tools, identity management, data classification, tagging that we’ve been asking for? And the CEO was like, we don’t have time for that. And then they learned really quickly that the AI won’t work without it, at least not in a trustworthy way. And the book has some pretty scary case studies where, for example, AI pulled up material non-public information because it wasn’t appropriately classified in their system. And for those of you that don’t know, that would be the type of stuff that sent Martha Stewart to jail; that’s the good insider trading information. So I think part of the reason for that failure is that we haven’t been doing the things we were supposed to be doing with data, and now we’re trying to, you know, optimize, and we have a Ferrari with no brakes.

Justin Daniels 15:47

So isn’t what you just said about “we don’t have time for that” another way of saying the race to market is what matters, and all of these things about privacy and security get shunted to the side until, of course, it inevitably blows up like it has with other technologies that have preceded this one?

Amy Worley 16:08

Yeah, and I argue in the book, you know, we have a $1.5 trillion investment in AI, right? So we can’t afford to get this wrong, and that’s part of the reason for the framework coming out right now. I also cite a McKinsey study that if you get digital trust right, you are one and a half times more likely to grow your bottom line by 10%. So while I am certainly a compliance professional, you know, a recovering lawyer, there are also really good business reasons to do this right now. I mean, the average cost of a data breach last year was $10 million, and AI is resulting in data leaks and data breaches at astonishing rates.

Jodi Daniels 16:52

I was just going to add, especially for anyone listening who might be struggling to get their privacy program and security program the attention they need: I have seen a lot of companies have success. Maybe not the case study, Amy, you shared, but there are other companies who are now able to wave the wand and say, wait, you have to get these things in place, and then you can have your AI projects. So piggybacking on your AI friends or AI projects, and making sure there’s some AI governance, is one of the ways to help move the privacy and security program forward, and I’ve seen success with that.

Amy Worley 17:27

And I would say the book is actually written not necessarily for privacy pros and CISOs, though I want them to read it and I want to have a conversation about it. The first four chapters of the book are the business case, for C-levels, for board members, for investors, of why this stuff is a commercial imperative at this point. It’s definitely required from a regulatory perspective. It’s absolutely required for doing the right thing and creating customer trust. But it’s also business enablement. Love it.

Justin Daniels 18:03

So since you sit astride both privacy and security, would you like to share with our audience a special tip that you might have in one of those areas?

Amy Worley 18:14

So it’s funny you should ask, because this is one that drives my children crazy. I have teenage kids. I always say, buy dumb things. So I don’t have a smart... I don’t do Alexa. I’m probably going to make people’s phones blow up. I don’t do Alexa or Siri, or, you know, a smart refrigerator or a smart Nest, or any of those things, because, to my mind, they are all surveillance opportunities. And so my own sort of personal privacy hack is, where it makes sense, buy dumb things. Now, I do have a smart TV because, to my mind, there’s a consumer balance there, right? And when it tells me what I want to watch, it’s usually right. The other thing I’ll say is I have MFA on everything. On everything. I have it on my credit cards. I have it on every account that I have. And both of those are just kind of simple ways to protect both privacy and security. And when

Jodi Daniels 19:11

you’re not writing a book and advising clients and talking about frameworks, what do you like to do for fun?

Amy Worley 19:17

This is going to sound like the nerdiest thing ever. I will preface it by saying I’m from Eastern Kentucky, and so I’ll give you a reason why. But I make quilts, and I do it the old-fashioned way, by hand. And I think it’s because I go so fast all day, every day, and I live in the tech world all day, every day, that there’s something both creative and soothing and relaxing about sitting and making something with my hands. I love it. Well,

Jodi Daniels 19:44

Amy, thank you so much for coming. If people would like to connect, learn more and grab a copy of the book. Where should they go?

Amy Worley 19:51

The best place is to go to the book’s website, which is www.confidenceadvantage.io. There’s a link to Amazon there. There’s also a resources page. So for all of us out doing this work: every study I cite, every reg, every statute, every case study, I put all of that up on the website. Because, again, the idea behind the book is to have a conversation and help folks in this space have access to this material. The contact me form comes directly to me. If you have thoughts about the book, let me know. I want to talk about it.

Jodi Daniels 20:23

Well, we are delighted that you could come here and share, and we’re happy to help continue and spread the conversation. So thank you so much for joining us, and thank you to both of you.

Outro 20:38

Thanks for listening to the She Said Privacy/He Said Security Podcast. If you haven’t already, be sure to click Subscribe to get future episodes, and check us out on LinkedIn. See you next time.

Privacy doesn’t have to be complicated.