Jodi Daniels is the Founder and CEO of Red Clover Advisors, a privacy consultancy that integrates data privacy strategy and compliance into a flexible, scalable approach that simplifies complex privacy challenges. A Certified Information Privacy Professional, Jodi brings over 27 years of experience in privacy, marketing, strategy, and finance across diverse sectors, working with and supporting organizations from startups to Fortune 500 companies.
Jodi Daniels is a national keynote speaker and has been featured by CNBC, The Economist, WSJ, Forbes, Inc., and many other publications. She holds an MBA and a BBA from Emory University’s Goizueta Business School. Read her full bio.
Justin Daniels is a corporate attorney who advises domestic and international companies on business growth, M&A, and technology transactions, with over $2 billion in closed deals. He helps clients navigate complex issues involving data privacy, cybersecurity, and emerging technologies like AI, autonomous vehicles, blockchain, and fintech.
Justin partners with C-suites and boards to manage cybersecurity as a strategic enterprise risk and leads breach response efforts across industries such as healthcare, logistics, and manufacturing. A frequent keynote speaker and media contributor, Justin has presented at top events including the RSA Conference, covering topics like cybersecurity in M&A, AI risk, and the intersection of privacy and innovation.
Together, Jodi and Justin host the top-ranked She Said Privacy/He Said Security podcast and are the authors of the WSJ best-selling book, Data Reimagined: Building Trust One Byte at a Time.
Here’s a glimpse of what you’ll learn:
- Jodi’s Top 5 lessons learned from IAPP GPS 2025
- Justin’s Top 5 lessons learned from the inaugural Atlanta AI Week
- How state regulators are collaborating on privacy enforcement
- Ethical and national security concerns surrounding AI
- The importance of testing cookie consent platforms and honoring Global Privacy Control signals
- How deepfakes present cybersecurity threats and impact trust in digital content
- Why privacy notices need to match actual privacy practices
In this episode…
From a major privacy summit to a regional AI event, experts across sectors are emphasizing that regulatory scrutiny is intensifying while AI capabilities and risks are accelerating. State privacy regulators are coordinating enforcement efforts, actively monitoring how companies handle privacy rights requests and whether cookie consent platforms work as they should. At the same time, AI tools are advancing rapidly with limited regulatory oversight, raising serious ethical and societal concerns. What practical lessons can businesses take from IAPP’s 2025 Global Privacy Summit and Atlanta’s AI Week to strengthen compliance, reduce risk, and prepare for what’s ahead?
At the 2025 IAPP Global Privacy Summit, a major theme emerged: state privacy regulators are collaborating on enforcement more closely than ever before. When it comes to honoring privacy rights, this collaboration spans early inquiry stages through active enforcement, making it critical for businesses to establish, regularly test, and monitor their privacy rights processes. It also means that companies need to audit cookie consent platforms regularly, ensure compliance with universal opt-out signals like the Global Privacy Control, and align privacy notices with actual practices. Regulatory enforcement advisories and FAQs should be treated as essential reading to stay current on regulators’ priorities. Likewise, at the inaugural Atlanta AI Week, national security and ethical concerns came into sharper focus. Despite promises of localized data storage, some social media platforms and apps continue to raise alarms over foreign governments’ potential access to personal data. While experts encourage experimentation and practical application of AI tools, they are also urging businesses to remain vigilant to threats such as deepfakes, AI-driven misinformation, and the broader societal implications of unchecked AI development.
In this episode of She Said Privacy/He Said Security, Jodi Daniels, Founder and CEO of Red Clover Advisors, and Justin Daniels, Shareholder and Corporate Attorney at Baker Donelson, share their top takeaways from the 2025 IAPP Global Privacy Summit and the inaugural Atlanta AI Week. Jodi highlights practical steps for improving privacy rights request handling, the importance of regularly testing cookie consent management platforms, and the need to ensure published privacy notices reflect actual practices. Justin discusses the ethical challenges surrounding AI’s rapid growth, the national security risks tied to social media platforms, and the dangers posed by deepfake technology. Together, Jodi and Justin emphasize the importance of continuous education, collaboration, and proactive action to prepare businesses for the future of privacy and AI.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Baker Donelson
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way companies do business and to create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit www.redcloveradvisors.com.
Intro: 00:01
Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage, we will debate, evaluate and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels: 00:21
Hi, Jodi Daniels here. I’m the Founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and Certified Information Privacy Professional, providing practical privacy advice to overwhelmed companies.
Justin Daniels: 00:35
Hello, I am Justin Daniels. I am a shareholder and corporate M&A and tech transaction lawyer at the law firm Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.
Jodi Daniels: 01:01
And this episode is brought to you by... ding! For the people who can’t hear that, you pulled my hair. Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
In short, we use data privacy to transform the way companies do business. Together, we’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. And you’re pulling my hair again. So this is a special episode because this entire podcast started with The Jodi and Justin Show doing a variety of webinars and events and conferences together.
And then we decided, well, we should put it together in a podcast. Both of us. This week we are recording on a Friday and we’re really tired. This was the third take at trying to even get the recording button to work, and we both had our respective conferences. So Jodi went to DC, and Justin went... where’d you go?
Justin Daniels: 02:07
Five miles from the house.
Jodi Daniels: 02:09
But we had really big weeks, and so we thought I would share my top five takeaways from IAPP Global Privacy Summit. And what are you doing?
Justin Daniels: 02:18
What am I doing?
Jodi Daniels: 02:19
That’s where you fill in where you were.
Justin Daniels: 02:22
No, I went to the inaugural.
Jodi Daniels: 02:25
Inaugural Friday coming.
Justin Daniels: 02:28
Yes. Atlanta AI Week, a three-day conference about all things artificial intelligence.
Jodi Daniels: 02:34
All right, so my biggest takeaway. I love the Global Privacy Summit. And one of my favorite reasons for going is to hear directly from the regulators. And the state regulators were on multiple panels. And the biggest takeaway is that the state regulators are working together.
There was a recent announcement where eight different states have officially banded together. But more than just that, all of the states really are working together, and it can be anywhere from "we think there might be something interesting at a company," where they haven’t even actually begun an enforcement inquiry yet and just have an inkling, all the way through actively looking at a particular company and asking each other, how do you interpret a particular situation? What do you think? What are you seeing?
So the regulators are working together. What’s your top takeaway from the inaugural Atlanta AI Week?
Justin Daniels: 03:28
Well, my top takeaway was, as many of our listeners have heard, that I ripped off one of Jodi’s phrases for my keynote. The keynote presentation I did was around "just because you can doesn’t mean you should," and the theme of it was this tool from Snapchat called My AI, because it’s basically a virtual friend that’s available 24/7/365. And wow, isn’t that a great idea? But the point of what I was saying in the presentation, which is the key takeaway, is that we are living through a time where, for the big AI and tech companies, it’s an AI gold rush. They’re focused on profit.
At the same time, in my 35 years, I’m sorry, 53, I’m really struggling today, on this earth, I have never seen the legitimacy of government institutions under attack the way they are now. And last, the regulatory environment around AI is in its infancy at best. And so, you know, ethically, should we be doing these things? How are we thinking about the consequences of AI use cases? Because if we don’t deal with this now, what’s coming down the pike might have even worse adverse consequences for society, especially kids.
Jodi Daniels: 05:02
Do I get to charge you my licensing fee?
Justin Daniels: 05:05
Only if you pay mine for ripping off the peanut butter and jelly of technology. Which, you know, I just want our viewers, or our listeners, to know: when I give presentations, or when I did a huge workshop, I get people who come up to me in the middle and say, hey, aren’t you Jodi Daniels’ husband, from Red Clover Advisors? Like, I can’t even do my own gig. And they’re like, hey, aren’t you the husband of Jodi Daniels? I was like, yeah, I wear that role too.
Jodi Daniels: 05:32
Aren’t you so lucky? Okay, but back to takeaway number two from the IAPP Global Privacy Summit. Remember how I just said that the state regulators are working together? Well, one of the ways they are going to send you a letter is through your privacy rights process. Maybe you have in your privacy notice an email, privacy@company.com, and that’s how you receive requests. They’re testing that process. So if no one is actually reviewing the email, the inbox, or the workflow where your privacy rights requests come in, you’re going to miss a very important message from a state regulator. And the other piece to that is they often will give a certain time period in which they want a response, maybe within 30 days. If no one is looking at that mailbox until the day before, and you call them up and say, oh my gosh, we just got this, we’re going to need an extension, they’re not going to give you that extension.
They will give some extensions in certain situations, but you have to really work with them. The other piece to those responses is making sure you are not vague. Vague responses don’t work. So go test your privacy rights process. Make sure you know where requests are going and test often, so that if a regulator reaches out, you will actually get it.
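For listeners who want to turn that advice into something operational, here is a minimal sketch of what an automated check on a privacy rights intake workflow could look like. Everything in it is a hypothetical placeholder (the intake URL, the request payload, and the 45-day window), so swap in whatever your actual request-management process uses.

```typescript
// Hypothetical sketch: submit a test privacy rights request and flag it if no
// acknowledgment arrives within the response window. The intake endpoint,
// payload shape, and 45-day deadline are illustrative assumptions only.

const INTAKE_URL = "https://example.com/privacy-request"; // hypothetical endpoint
const RESPONSE_DEADLINE_DAYS = 45; // adjust to the applicable state law

interface TestRequest {
  email: string;
  submittedAt: Date;
}

async function submitTestRequest(): Promise<TestRequest> {
  // Use a tagged test address so the team can recognize and track the drill.
  const email = `privacy-test+${Date.now()}@example.com`;
  await fetch(INTAKE_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ requestType: "access", email }),
  });
  return { email, submittedAt: new Date() };
}

function isOverdue(request: TestRequest, acknowledged: boolean): boolean {
  const elapsedDays =
    (Date.now() - request.submittedAt.getTime()) / (1000 * 60 * 60 * 24);
  return !acknowledged && elapsedDays > RESPONSE_DEADLINE_DAYS;
}
```

Whether an acknowledgment actually arrived still has to be confirmed against the inbox or ticketing queue the request routes to, which is exactly the part Jodi says companies forget to watch.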
Justin Daniels: 06:56
That was so exciting.
Jodi Daniels: 06:57
It was exciting. Do you know how many people actually do that? Many, many, many companies put the email there and then forget about it, because they might not actually get a lot of requests, but they still might get an inquiry, in which case they’ll never receive it. It’s a test. It’s a specific test and activity that regulators are doing.
Justin Daniels: 07:16
I’m tested every day in my domestic skills and I fail spectacularly.
Jodi Daniels: 07:21
It’s true.
Justin Daniels: 07:22
Indeed.
Jodi Daniels: 07:23
All right. What’s number two for you?
Justin Daniels: 07:26
Number two for me is: think about what China does with your data when you engage with TikTok or DeepSeek. So I moderated another panel that really focused on AI and cybersecurity, and one of our panelists is former military intelligence. We had a fascinating discussion through him where he really laid out that just because something you use says, hey, our servers are in the UK, doesn’t mean that data isn’t ultimately going back to China. And he made a really great point, which is that China, through TikTok and now DeepSeek, has been gathering information about US citizens for years, because it can then use that to create disinformation and misinformation campaigns.
And a lot of people, you know, go about their daily lives and they’re not really aware of the fact that artificial intelligence can really weaponize a lot of these trends. And so as much as some people would like us to hold back on AI from a national security perspective, we really have to grapple with it. And the issue around TikTok, as well as DeepSeek, is a really important one from a national security perspective.
Jodi Daniels: 08:42
And number three from the IAPP Global Privacy Summit is the idea that many of these regulations have been out for a while, so there’s not really any gray area or confusion about what some of the expectations are. One of those was around things like the Global Privacy Control, or universal opt-out mechanisms, and making sure you can opt out of targeted advertising. So this means you’ve got to get your cookies right. Test your cookies, make sure you have a cookie consent management platform, and make sure it’s working, it’s functioning. If you hit reject, is it actually rejecting?
Are you honoring the Global Privacy Control signal? Are you honoring do-not-sell requests? Do opt-outs function properly? For some of the newer requirements, where maybe we’re still trying to sort out exactly what the expectations are, some of the regulators allowed some wiggle room. But for things that have been around for a while, they pretty much point blank said, nope, there’s really no wiggle room. So go test your cookies, test your cookie consent platform, and test it often. Keep that testing theme in mind. Test, test, test; that might be the former auditor in me.
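For readers who want to see what honoring that signal can look like in code, here is a minimal sketch of detecting the Global Privacy Control signal on the client and the server. The navigator.globalPrivacyControl property and the Sec-GPC request header come from the GPC specification; acting on the opt-out is left as a comment, since that depends on your consent management platform.

```typescript
// Minimal sketch: detecting the Global Privacy Control (GPC) signal.
// `navigator.globalPrivacyControl` and the `Sec-GPC: 1` request header are
// defined by the GPC specification; acting on the opt-out is left to your
// consent management platform (shown only as a commented-out call below).

import type { IncomingMessage } from "node:http";

// Client side: check the browser-exposed GPC flag before firing ad tags.
function gpcOptedOutClient(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

// Server side: treat `Sec-GPC: 1` on an incoming request as an opt-out of
// sale/share for that visitor.
function gpcOptedOutServer(req: IncomingMessage): boolean {
  return req.headers["sec-gpc"] === "1";
}

// Hypothetical usage, with `suppressTargetedAdvertising` standing in for your
// consent platform's opt-out handling:
// if (gpcOptedOutClient()) { suppressTargetedAdvertising(); }
```

Part of "test it often" is verifying that this signal, and the reject button in the banner, actually switch off the tags they are supposed to.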
Justin Daniels: 09:50
Guess you’re having a testing day.
Jodi Daniels: 09:52
Okay.
Justin Daniels: 09:53
So next up on my list, again, I’m going to go to cybersecurity and talk about cyber threats. And in my mind, foremost among them are deepfakes, because every time I do a presentation, I roll out my own deepfake from when I gave a TED Talk, and IBM helped me out to basically have me say the opposite of everything. Although it was funny, Jodi: when I did this thing, I basically told the audience that back when I first had IBM do this for me in December of 2023, all the LLMs were saying, hey, AI is so profound, you really need to regulate me.
And now, on the day that I gave the presentation, what I was saying was hands off, the innovators know best, which is what I said in the deepfake. I was like, heck, we should just have deepfake Justin create a fake resume using ChatGPT, and he might be able to get a job in the PR department of OpenAI or Meta. But all kidding aside, when we talk about cyber threats, the most present one, in my view, is the deepfake, because it really is going to completely transform how we think about facts. And then if you combine that with using it through DeepSeek or a platform like TikTok, think about the waves of misinformation you can now wreak on a society when people just, you know, look at a video and believe anything. And to make a finer point on this, Jodi, I basically told the crowd when I spoke: this is the conversation I have to have with my daughter.
Now, with deepfake technology, and I said, cue the eye roll, because she doesn’t want to listen to me; in my house, the eye roll is an art form. Well, what if they get sideways with a friend, and the friend, instead of getting mad, decides to get even, and takes the person’s head, puts it on the naked body of somebody else, and puts that out on social media? Before you even know what happened, the damage is done.
And that’s how I wanted to communicate how big a deal deepfakes really are.
Jodi Daniels: 11:58
That actually was talked about at the IAPP conference as well. All right, number four for me is one of my other favorite catchphrases, which is: say what you do and do what you say. And this was actually said verbatim by a regulator. You really want to make sure that your privacy notices reflect what is actually happening in your organization.
And then don’t forget to make sure that your organization is doing whatever it is that you just said in the privacy notice. So if one group is trying to move forward with something and it’s not disclosed, that’s a gap. That’s a problem. The other really important piece is that some states really want to make sure they are included.
For example, Oregon. They want to ensure that if you’re listing all the states, they’re not excluded. And if they’re excluded, they might come to you and ask, well, why did you exclude me? Did you have an analysis that determined their law is not in scope? At the same time, they didn’t necessarily specify that the state has to be listed by name.
So if you take the approach of listing out the states, make sure you include them. If you list, say, California and then everybody else, they didn’t seem to call out that they wanted their own name there. So say what you do and do what you say, and make sure you have the right states listed if you go with that approach.
Justin Daniels: 13:12
All right. For me, it’s: just start your AI journey. So when I present to people, I always tell them up front, I’ve learned a good bit about this, but there’s so much that I don’t know, because a lot of times you’ve just got to start with this stuff and start playing around with it. The use cases are out there, but it’s up to you to kind of figure them out. And I think a big thing was talking to people when I do my interactive prompting session; when you can see what the AI does, it demystifies it a little bit. But what’s more important is that you just say, hey, today I’m going to start my journey, because three weeks from now, I’ll be further ahead.
You just got to start.
Jodi Daniels: 13:52
Very important. We all do have to start somewhere. All right, number five for me is: read the enforcement advisories and the FAQs that regulators are putting out. These are a goldmine of information, and, hint, hint, this is what they care about. Connecticut recently put out an enforcement report, and it’s also actually important to note that’s not an obligation. They don’t have to put that report out. They’re choosing to because they want companies to be able to hear and understand what is important to them. There are also FAQs; in Oregon, for example, there is one that goes to companies and another that goes to consumers, because many of these states, California as well, are trying to educate consumers on what their rights actually are.
Guess what happens with educated consumers? They start reaching out to companies and exercising those privacy rights. So make sure you’re testing the privacy rights process; that was takeaway number two.
So very much be sure that you are reading the enforcement advisories and the FAQs from the regulators and take to heart what they are recommending, because that is a massive hint.
Justin Daniels: 14:58
My last key takeaway is: talk to everyone you can to learn about AI. No one knows everything. There are so many tools out there. Even when I talk to Jodi at night, she’ll say, hey, I found this tool that helps you make a PowerPoint, and I’m like, oh, I didn’t know you could do that. Or I talk to someone else and they have an LLM where you can, like, summarize articles or certain things. And my point is nobody knows everything. The opportunity here is that we all have to collaborate. So don’t be afraid to reach out to other people and ask questions, because as I like to tell everybody right now, there are no bad questions.
The only bad question is the one that you don’t ask.
Jodi Daniels: 15:38
Ooh, that was profound. Well.
Justin Daniels: 15:42
What can I say?
Jodi Daniels: 15:43
Well, we hope you enjoyed the Jodi and Justin Show, where I talked about my top five takeaways from the IAPP Global Privacy Summit 2025.
Justin Daniels: 15:53
Well, wait a second.
Jodi Daniels: 15:54
What? What am I doing?
Justin Daniels: 15:54
I want to know, in the last moments that we have, was there anything you learned from your panel, about how you prepared, or about public speaking from that event?
Jodi Daniels: 16:04
Well, I guess I would go with what you said, how you can always learn. You’re constantly learning. So I had fellow panelists who all have different experiences than me. There was a very technical privacy attorney, there was one of the senior directors of the IAB Tech Lab, and then there was in-house privacy counsel. So each one of them had a different flavor and take on it. We talked about ad tech and privacy. So I think it’s like you said: no one knows everything, and we’re all continuously learning from each other.
I see. Okay, so obviously you had something that you wanted to share about your preparation and learning from your event.
Justin Daniels: 16:44
No, I had a very different experience. In, like, 36 hours, I had to be a keynote speaker, I had to do a fireside chat where I interviewed somebody, I had to lead a prompting session, and then I was the moderator of a panel. Every one of those skill sets is different. And I guess the thing I will share is that it’s one thing to be the speaker, because the focus is on you.
But I think a key thing, and I think you would agree with this, Jodi, is when you’re the moderator, your job is to ask really good questions and set up your fellow speakers. I can’t tell you how many times I’ve been on panels where the moderator or someone else on the panel just thinks it’s their opportunity to talk and talk and talk, and people want to hear from everybody. So I always keep that in mind: hey, when I moderate, I’m interviewing; my job is to tee up and ask good questions so that people can focus on the person I’m interviewing or asking questions of.
Jodi Daniels: 17:40
It’s also kind of like on our podcast, where we ask all these really amazing questions of our guests. Look at how well four and a half years of podcasting has prepared you. Really, fellow listeners.
Justin Daniels: 17:50
You know what, fellow listeners, I think when we post something about this, you should give your opinion. Would you like to see a speak-off between Jodi and Justin? Because I have to tell you, I actually asked ChatGPT. I was like, who’s the better speaker, Justin or Jodi? And it went through it and was like, I think Jodi is, she’s done more things. And I’m like, well, wait a second, what about this? And then it was like, well, I can see your point, I can see how maybe Justin’s better. So I’m thinking we just need to settle it.
Jodi Daniels: 18:20
You’re just quoting ChatGPT. You know what? On that note, it has been fun. Would you like to summarize the... I said thank you for joining us for the top five takeaways from the IAPP Global Summit... Privacy... or did that not make sense? The Global Privacy Summit 2025. And that’s where you say yours.
Justin Daniels: 18:37
And thank you for listening to the top five takeaways I had from Atlanta AI Week here in the good old ATL.
Jodi Daniels: 18:45
Thanks for listening.
Outro: 18:51
Thanks for listening to the She Said Privacy/He Said Security podcast. If you haven’t already, be sure to click subscribe to get future episodes and check us out on LinkedIn. See you next time!
Privacy doesn’t have to be complicated.
As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.