Jodi Daniels is the Founder and CEO of Red Clover Advisors, a boutique data privacy consultancy and one of the few certified Women’s Business Enterprises focused solely on privacy. Since its launch, Red Clover Advisors has helped hundreds of companies create privacy programs, achieve GDPR, CCPA, and US privacy law compliance, and establish a secure online data strategy that their customers can count on.
Jodi is a Certified Information Privacy Professional (CIPP/US) with over 20 years of experience helping a range of businesses in privacy, marketing, strategy, and finance roles. She has worked with numerous companies throughout her corporate career, including Deloitte, The Home Depot, Cox Enterprises, Bank of America, and many more. Jodi is also a national keynote speaker, a member of the Forbes Business Council, and the co-host of the She Said Privacy, He Said Security podcast.
Justin Daniels is a cybersecurity subject matter expert and business attorney who helps his clients implement strategies to better manage and recover from data breaches. As outsourced general counsel at Baker Donelson, Justin advises executives on how to successfully navigate cyber business and legal concerns related to operations, M&A, incident response, and more.
In 2017, Justin founded and led the inaugural Atlanta Cyber Week, where multiple organizations held events that attracted more than 1,000 attendees. Justin is also a TEDx and keynote speaker and the co-host of the She Said Privacy, He Said Security podcast with his wife, Jodi.
Here’s a glimpse of what you’ll learn:
- Why are documentaries like The Great Hack and The Social Dilemma making such an explosive impact on society?
- Jodi and Justin Daniels share what parents should take away from these documentaries: addiction needs accountability
- Is it possible to safely use Facebook, TikTok, and other social media platforms—both as a child and as an adult?
- Jodi and Justin discuss potential legislative solutions to our current data crisis
- The key takeaway: what kind of society do we want to create?
In this episode…
How frequently do you visit platforms like Facebook, TikTok, or Google? Probably at least once a day, if not more. But did you know that every time you visit one of these platforms, your personal data is being collected, stored, and sold in the hopes of altering your behavior, your purchasing habits, and your voter profile?
Many articles, podcasts, and documentaries, including the recent Netflix hits The Social Dilemma and The Great Hack, have detailed the total lack of privacy and security on some of the most frequented platforms in the world. Though it's easy to think that we are safe and secure when using sites like Facebook, TikTok, or even Google, this unfortunately isn't the case. So, what can we do as parents, security and privacy professionals, and frequent digital consumers to protect ourselves and our loved ones?
In this episode of She Said Privacy, He Said Security, Rise25 Co-founder John Corcoran sits down with Justin and Jodi Daniels to discuss practical takeaways from the recent documentaries The Social Dilemma and The Great Hack. Listen in as Justin and Jodi talk about the reality of social media addiction, strategies for protecting your children's online profiles, and potential legislative solutions to personal data breaches. Stay tuned for more!
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- John Corcoran on LinkedIn
- Rise25
- The Great Hack
- The Social Dilemma
- Bark
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps their clients comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services.
Their free guide, “How to Increase Customer Engagement in a Private World,” is available here.
You can also learn more about Red Clover Advisors by visiting their website or sending an email to info@redcloveradvisors.com.
Jodi Daniels (00:27):
Hi, I'm Jodi Daniels. I'm a Certified Information Privacy Professional, and I provide practical advice to overwhelmed companies.
Justin Daniels (00:38):
Hi, Justin Daniels here. I'm a cybersecurity subject matter expert and business attorney. I'm the cyber quarterback, helping clients design and implement cyber plans, and helping them manage and recover from the inevitable data breach. I also provide cyber business consulting services to companies. We have John Corcoran here today, and we have flipped the script: he'll be interviewing us.
John Corcoran:
All right, you guys, I'm excited to dive into this topic with you, because we've got an interesting one today. Two really monumental documentaries have come out recently, The Social Dilemma and The Great Hack, and they both have a lot to say about privacy and security issues, and you two are privacy and security experts. So we're going to dive into some of the issues raised by those two documentaries. But first, before we get into that, this episode is brought to you by Red Clover Advisors, which helps companies comply with data privacy laws and establish customer trust.
John Corcoran (02:07):
So they can grow and nurture integrity. Red Clover works with companies in a variety of fields, including technology, SaaS, ecommerce, media agencies, professional services, and financial services. In short, they use data privacy to transform the way companies do business. Together, we're creating a future where there is greater trust between companies and consumers. To learn more, go to redcloveradvisors.com, or you can also email info@redcloveradvisors.com. All right, Justin, I'm going to start with you. Two monumental movies have come out, The Great Hack and The Social Dilemma. First of all, why are they making so much of a stir today?
Justin Daniels (02:49):
I think they're making such a stir today because they really lay bare the business model of big tech. There's a reason why Google and Facebook, among others, are the richest companies in human history, and it's for one reason only. In a word, it's data. What both documentaries make so clear is that people don't truly understand what is being done with their data. They're just having fun on Facebook, being part of the community, and they don't understand how, without their real knowledge or consent, all that data can be used and monetized on an unprecedented scale. That has some really significant societal impacts.
John Corcoran (03:41):
Jodi, I want to turn to you. The Great Hack is about Cambridge Analytica. For those who don't recall, tell us about what that company did.
Jodi Daniels (03:51):
Yeah, so Cambridge Analytica was essentially a data company, and it helped other organizations use data in digital campaigns. But where the data came from is the question at hand. Specifically, Cambridge Analytica helped political campaigns and political messages around the world, so not just here in the United States.
John Corcoran (04:18):
And you're both parents. What do these movies say to us as parents? Or, to put it another way, what should parents be thinking about these big tech companies right now? What should they be aware of?
Jodi Daniels (04:31):
Yeah, I'm going to start with this one. As parents, we know that once you hand over a device, kids and adults alike are addicted, and they're addicted for a reason: the device and the software are literally designed to keep us hooked. There is no end on Facebook, or on any of the social media platforms. You can just keep scrolling and scrolling, literally forever. So then you have to wonder, what am I scrolling through? What am I seeing? The content I'm going to get is different from the content he's going to get, and different from what you're going to get. Then the question is, what content am I getting, and how different is it? As Justin just described, these companies are data companies, and they utilize that data to customize the experience. Then you have all the ads. What type of ad message is coming in, and from whom? We're all commoditized here: someone is going to buy the opportunity to deliver their message to my little screen and my little space, and whether or not that message is accurate or real, we have no idea. So there's a lot, but that's my starting point.
John Corcoran (05:51):
Justin, your thoughts on what parents should be paying attention to?
Justin Daniels (05:55):
I think the most poignant part of the entire documentary The Social Dilemma was one scene. All of us on the podcast, and everyone listening, have been in a relationship that ended badly and upset us. Someone broke up with us, and that stinks. Now recall that social media runs on an algorithm. If you haven't been on Facebook for a while, they want to get you to come back. Haven't you ever gotten a picture saying, hey, this is what John was doing a year ago or five years ago? Usually it's a nice memory, so you look and maybe you stay. But what if it's a picture of you with an ex-boyfriend or ex-girlfriend you were struggling to get over, and you're finally over them, and now you see that picture? The algorithm is just numbers. It doesn't think about the emotional impact on someone who had a tough breakup and now has to see a picture of their ex. So when you see all these statistics about the increase in teen suicide and some of these other bad societal consequences, it's because this algorithm doesn't think about people's emotions. They just want you back looking at Facebook; whether it's something that's going to make you happy or sad is irrelevant. But to us as parents, the idea that my daughter or son may go on social media and see something that upsets them to the point where they're not talking to me, and you know how some people can react to things, that's a really big issue.
John Corcoran (07:39):
Yeah. So what can parents do? What's the answer for them? Is it no digital devices? Is it software that can help with this sort of thing? Or is it regulating which apps they're on, or the number of hours they spend on those apps?
Justin Daniels (07:56):
I think you should tell him the story about our daughter and TikTok.
Jodi Daniels (07:59):
Oh yeah, I'll tell you the TikTok story. First, I think it's a combination of everything you've just described. Another part of the addictive nature that The Social Dilemma emphasizes is the likes and the comments. People feel they have to go on because, did you like my photo? It's literally psychologically driven: we want to keep going back to figure out how many likes, comments, and shares we get. So people start measuring. That picture only got three likes, so it wasn't a great picture; I need to do something different to myself so it gets more likes. This is also highlighted in The Social Dilemma: when the girl makes the change and the picture is better received, she internalizes, oh, I have to look this certain way. That also starts to get at some of the issues of teen suicide, depression, and anxiety, because people are no longer able to think for themselves; we're just dependent on how many likes and comments we get. I also think it depends on the age of the child. Our ten-year-old thought TikTok should be fine. We don't agree, because we think there are a lot of security and privacy challenges on TikTok, so we've simply banned it. There is no TikTok here. At the same time, with older teens, banning TikTok might be quite a challenge; they might be independent enough to put it on their phones themselves. So there, I think it's education, really explaining: what is TikTok doing? What is Facebook doing?
Jodi Daniels (09:34):
What is Instagram doing? What are any of these social platforms doing? It's teaching people to limit their use, and teaching people to know their own value outside of the social media world, because if you just take it away, it doesn't actually teach the lesson. And we still have to work with technology. This is today's issue; tomorrow it might be a different issue, so we need to be able to teach to that. And there are definitely tools. You can talk about the VPN and the things you've put on our kids' devices, for the ones kind of in between, to limit what they do.
Justin Daniels (10:09):
Our daughter's iPad has software on it called Qustodio, so we know where she's going and what she's doing. But think about the challenge we have as parents nowadays: you have to spend extra time loading the software and monitoring the use of the devices. And as we talked about on another podcast we did, what happens when they go to somebody else's house with different rules? It's kind of like what TVs were, but on steroids. In other words, it takes a village. It takes parents being on the same page about the devices. And I think Jodi's example of TikTok is one where there's still a lot of unawareness about these apps and what some of them mean, because our daughter says all these other kids are doing it. That may be because their parents aren't as attuned to these issues as Jodi and I might be, since we're immersed in them.
John Corcoran (11:08):
Certainly. And Jodi, you mentioned a moment ago that TikTok has a number of privacy and security issues. This question goes to either of you, but what are some of those issues that people should be aware of?
Justin Daniels (11:21):
I like to joke that you really want the Ministry of State Security spying on you.
Jodi Daniels (11:27):
I mean, it's a Chinese-owned company with a U.S. arm, but numerous times it has said it was doing one thing and then done something else, extracting way more information than it either disclosed or needed. You think you're just putting up a dance, and you've downloaded the app to your phone, but instead the app is actually taking a lot of other information that it doesn't need for you to post a dance. And then it's owned by the Chinese government, so you don't own or have any control over that data.
John Corcoran (12:01):
Let me ask: as users of digital devices yourselves, there has to be a degree of mixed emotions toward some of these technologies. In some ways there are advantages. People always point to how Facebook reconnects you with, say, someone you went to high school with who you haven't been in touch with in a while. How do you feel about that? Where's the line between the good that comes from these platforms and the point where it crosses over and the bad parts outweigh the benefits?
Jodi Daniels (12:42):
Justin always jokes that I'm a prolific Facebook user. So yes, I use Facebook, and I firmly believe in what you just described. I've reconnected with many people I would never have been able to keep in touch with otherwise, and I like it for those reasons. I find a lot of value in the groups; I'm in a number of different business, personal, and local groups, and I find a lot of information there. I also restrict and know who I'm sharing data with. You can have different friends lists, and I'm aware of what's happening. I understand the advertising I get: if I click on one shoe ad, I get 20 shoe ads for the next three days. I understand that. At the same time, it's about minimizing, understanding what's happening, and not looking for validation from everyone else.
Jodi Daniels (13:37):
And I'm aware that my news feed is slanted. I get that I'm only seeing certain types of articles and certain content in my feed, and if I want more information, I'm going to have to work harder to go and find it. So I certainly think there's value, but like many other things, it's a seesaw. It's not perfection at all, and you have to balance it out; it's not just in one direction. Do I like that they sell and use my data in that capacity? Absolutely not. So I'm careful about what I put up and communicate and those types of things,
Jodi Daniels (14:18):
and I still live life and want to be able to connect with people. So it's about being educated and balanced.
John Corcoran (14:23):
Justin, your thoughts on that?
Justin Daniels (14:26):
So this is where we're going to have some fun. My view of Facebook starts with how you define a friend. I define a friend as somebody I can call at two in the morning who will come and help me. Facebook friends, for the most part, are not your true friends. But where I really think differently is that, in my personal opinion, social media has been weaponized over the last 10 years to divide us. I feel that the big technology companies, given the way their business model works, have zero incentive to police themselves or to really start having some profound conversations about the types of things that should or should not be put on social media. I recognize, as an attorney, that we have First Amendment issues. But I'd also point out that you're not hearing a whole lot about foreign interference in our elections this time, because the media companies banded together and US Cyber Command took some offensive actions against known actors to preemptively hamper their capabilities. When I see things like that, I think there are opportunities to start coming up with ground rules. And I have passionately come to the belief that Section 230 of the Telecommunications Act of 1996, which basically insulates a platform from liability for the content on it, either has to be eliminated or we have to find a better way, because the current state of affairs, in my view, cannot continue. I can't tell you how many people I talk to who, when you dig into what they're telling you, got it off of Facebook. I ask, well, how do you know that's true? And they say, well, it was on my social media feed. That's not really any way to verify it. That's my biggest concern, and to me it's one of the biggest negative consequences of Facebook that I'm very worried about for us as a society.
John Corcoran (16:34):
As you look at the social media landscape, there are a lot of different players out there. We've mentioned Facebook; there's Instagram, which is owned by Facebook; there's LinkedIn; there's Pinterest. If we have a legislative solution like that, will it solve the problem? Will it make these platforms more palatable? Or are we beyond that point? Do you think it would be impossible for these social media platforms to exist in a more heavily regulated world?
Justin Daniels (17:07):
Sure. I've had this discussion before, so I'm going to throw something out for you and the audience to consider. When television came along in the fifties, you had the FCC regulating what you could watch. Remember the Super Bowl, when Janet Jackson had a wardrobe malfunction? There are certain words you can't say on television. More importantly, there are rules about what you cannot sell to children watching cartoons on television, like cigarettes. Magically, when you took the cartoons and stuck them on YouTube, those regulations disappeared. John, I'm not naive enough to think there's some simple solution; there certainly isn't. But to me, it's no different from what happened in 2008, when you took away all the regulations, had basically unfettered capitalism, and saw what happened. By the same token, there can be common-sense regulation around prohibiting things that are patently false from being posted on social media, with some level of responsibility for the companies, so that if they don't police it, they can be held accountable. There's precedent for that with other types of technology we've used. I think the current situation is untenable and will continue to divide our society, which I think inevitably is going to make this a national issue. It's just a question of how quickly we can build the awareness to where people say, enough, we've got to deal with this, or else we're putting our children and our children's children at serious risk.
John Corcoran (18:48):
It is interesting, because maybe an analogy would be the music download industry. There was a period of time, the Napster era in the late nineties, when it was almost unfettered, the wild West, and people were downloading music like crazy. Then the pendulum swung back the other way, and it actually became a very productive industry; in many ways the music industry transformed itself. So let me turn to you, Jodi. What are your thoughts on that? With all the different social media platforms out there, is there a legislative solution that would make this more palatable for everyone?
Jodi Daniels (19:25):
I think the wild, wild West doesn't work; we're seeing what's happened now. The amount of data that was shared with Cambridge Analytica, quite honestly, was a huge privacy story, and it was very big news. But legislation won't cure it; it's only a first step. Tie it together with what The Social Dilemma highlights, which is the curated news feed and the algorithm. Maybe I can police the news and make sure it's, quote unquote, whatever we believe is accurate, but that doesn't change how people are encouraged to stay on a platform, and the algorithm continues to serve me a certain slice of the content. Take the division in the US, which Justin mentioned, as an example: if everything I see comes from one slice of the pie, it might all be accurate, and you wouldn't need legislation for that part, but it's still one slice of the pie. Then you have someone who's paid, because that's how the platforms make money, to serve ads, so I'm going to get that message reinforced. The current design will continue to foster that singular message, and I'm encouraged to spend hours on it. So the companies need to build some data ethics underneath any type of legislation that happens as well. You have to have all the different pieces combined, or it won't work.
Justin Daniels (21:09):
John, I think I've got a really good analogy that brings this into focus. Let me ask you a question: nowadays, when you get in your car, or when you're a passenger, how often do you not use your seatbelt?
John Corcoran (21:23):
Yeah, for me, never.
Justin Daniels (21:27):
So wind the clock back to the 1960s. Cars didn't have seatbelts. Then this guy came along, Ralph Nader, you may remember him, who said we should have seatbelts in cars for safety, because people get killed in cars. The auto industry fought it, but we finally got seatbelts. Yet even through the nineties, people weren't wearing them. Then in comes Mothers Against Drunk Driving, and they start to change the perception of why you need to buckle up. That led to legislation, so in most states now, if you don't have your seatbelt on, you can get fined, though they'd have to pull you over first. It was the combination of activism, saying this isn't right and here's why people's lives are at risk, plus legislation, that got us to a much better place. We haven't gotten rid of car accidents, but we've sure seen adoption of seatbelts skyrocket, to where it's a normal thing for my kids; they don't even bat an eyelash, we just put on our seatbelts. When I was growing up, my dad really wasn't that way. I point that out because, if we could do this for seatbelts in a generation, why don't we have the political will to do it for something that is just as pervasive as the seatbelt?
John Corcoran (22:51):
Yeah, it's an interesting analogy. Any final thoughts before we wrap up, or issues we haven't discussed that parents in particular should be aware of?
Jodi Daniels (23:02):
I think parents need to be active. They can't just assume that their kids are fine. They need to know who their kids are connected to and what platforms they're on, and make sure they have the right tools to monitor and prevent problems. A local shout-out to a company here called Bark, which can help monitor what's happening on children's accounts. Taking an active role, I think, is going to be the most important part.
Justin Daniels (23:31):
Okay. I guess my final thought, and I think the key takeaway from today's podcast, is: what kind of society do we want to be going forward? How do we want to think more reflectively about how we engage with social media? Because as Jodi and I sit here today in Atlanta, Georgia, the political epicenter of the United States sits in our state for the next two months, and we're going to be inundated with all kinds of ads and other things that, if you fact-check them, will often turn out to be outright false. A lot of that will be driven on social media. To me, as we're about to go through this whole period, there's got to be a better way, and all of us need to come together, because all of this divisiveness is corroding us as a country. If you've been outside the United States, you know we really have it good here, and we should be doing all we can to preserve this wonderful resource we have called democracy.
John Corcoran (24:43):
Absolutely. Well, Red Clover Advisors, redcloveradvisors.com. Anywhere else people should go to learn more about you, Jodi, and the work that you do?
Jodi Daniels (24:52):
Social media, ironically for our discussion: go find us on Facebook or LinkedIn. Responsible users only, ha, like on the beer can. Not on TikTok.
Privacy doesn’t have to be complicated.
As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.